
1529 Talend Jobs - Page 30

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: SnapLogic
Good-to-have skills: NA
Minimum 5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using SnapLogic. Your typical day will involve working with the development team, analyzing business requirements, and developing solutions to meet them.

Roles & Responsibilities:
- Design, develop, and maintain SnapLogic integrations and workflows to meet business requirements.
- Collaborate with cross-functional teams to analyze business requirements and develop solutions.
- Develop and maintain technical documentation for SnapLogic integrations and workflows.
- Troubleshoot and resolve issues with SnapLogic integrations and workflows.

Professional & Technical Skills:
- Must-have: strong experience in SnapLogic.
- Good to have: experience with other ETL tools such as Informatica, Talend, or DataStage.
- Experience designing, developing, and maintaining integrations and workflows using SnapLogic.
- Experience analyzing business requirements and developing solutions to meet them.
- Experience troubleshooting and resolving issues with SnapLogic integrations and workflows.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SnapLogic.
- The ideal candidate will have a strong educational background in computer science or a related field and a proven track record of delivering impactful solutions using SnapLogic.
- This position is based at our Pune office.

Posted 1 month ago

Apply

8.0 - 12.0 years

16 - 18 Lacs

Chennai

Work from Office

- Strong SQL skills and experience with ETL tools such as Informatica, IICS, Talend, or SSIS.
- Proven experience as a QA engineer or in a similar role focused on testing Informatica Intelligent Cloud Services.
- Experience with test-management tools such as JIRA.
- Strong analytical and problem-solving skills.
- Ability to work on multiple projects and prioritize work effectively.
- Hands-on exposure to risk-based testing concepts, processes, approaches, and implementation.
- Hands-on exposure to QA-to-QE transformation program concepts, processes, approaches, and implementation.

Posted 1 month ago

Apply

7.0 years

0 Lacs

Gandhinagar, Gujarat, India

On-site

Key Responsibilities
- Lead and mentor a high-performing data pod composed of data engineers, data analysts, and BI developers.
- Design, implement, and maintain ETL pipelines and data workflows to support real-time and batch processing.
- Architect and optimize data warehouses for scale, performance, and security.
- Perform advanced data analysis and modeling to extract insights and support business decisions.
- Lead data science initiatives including predictive modeling, NLP, and statistical analysis.
- Manage and tune relational and non-relational databases (SQL, NoSQL) for availability and performance.
- Develop Power BI dashboards and reports for stakeholders across departments.
- Ensure data quality, integrity, and compliance with data governance and security standards.
- Work with cross-functional teams (product, marketing, ops) to turn data into strategy.

Qualifications Required
- PhD in Data Science, Computer Science, Engineering, Mathematics, or a related field.
- 7+ years of hands-on experience across data engineering, data science, analysis, and database administration.
- Strong experience with ETL tools (e.g., Airflow, Talend, SSIS) and data warehouses (e.g., Snowflake, Redshift, BigQuery).
- Proficiency in SQL, Python, and Power BI.
- Familiarity with modern cloud data platforms (AWS/GCP/Azure).
- Strong understanding of data modeling, data governance, and MLOps practices.
- Exceptional ability to translate business needs into scalable data solutions.
(ref:hirist.tech)
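
This listing names Airflow alongside Talend and SSIS; for illustration, a minimal sketch of the kind of daily batch DAG such pipelines use, assuming Airflow 2.4+ and entirely hypothetical task logic:

```python
# Minimal, illustrative Airflow DAG for a daily batch ETL pipeline.
# DAG id, task bodies, and retry policy are hypothetical placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Pull the day's rows from the source system (stubbed here).
    print("extracting rows for", context["ds"])


def transform(**context):
    # Apply cleansing/standardization rules before loading.
    print("transforming batch", context["ds"])


def load(**context):
    # Write the cleaned batch into the warehouse fact table.
    print("loading batch", context["ds"])


with DAG(
    dag_id="daily_sales_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # `schedule` requires Airflow 2.4+
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```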

Posted 1 month ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Title: Decision Science Practitioner Analyst, S&C GN
Management Level: Senior Analyst
Location: Bangalore/Kolkata
Must-have skills: Collibra Data Quality (data profiling, anomaly detection, reconciliation, data validation), Python, SQL
Good-to-have skills: PySpark, Kubernetes, Docker, Git

Job Summary: We are seeking a highly skilled and motivated Data Science cum Data Engineer Senior Analyst to lead innovative projects and drive impactful solutions in domains such as Consumer Tech, Enterprise Tech, and Semiconductors. This role combines hands-on technical expertise with client delivery management to execute cutting-edge data science and data engineering projects.

Key Responsibilities

Data Science and Engineering
- Implement and manage end-to-end data quality frameworks using Collibra Data Quality (CDQ), including requirement gathering from the client, SQL development, unit testing, client demos, user acceptance testing, and documentation.
- Work extensively with business users, data analysts, and other stakeholders to understand data quality requirements and business use cases.
- Develop data validation, profiling, anomaly detection, and reconciliation processes.
- Write SQL queries for simple to complex data quality checks, and Python and PySpark scripts to support data transformation and ingestion.
- Deploy and manage solutions on Kubernetes workloads for scalable execution.
- Maintain comprehensive technical documentation of data quality processes and implemented solutions.
- Work in an Agile environment, using Jira for sprint planning and task management.
- Troubleshoot data quality issues and collaborate with engineering teams on resolution.
- Provide insights for continuous improvement in data governance and quality processes.
- Build and manage robust data pipelines using PySpark and Python that read from and write to databases such as Vertica and PostgreSQL.
- Optimize and maintain existing pipelines for performance and reliability.
- Build custom solutions using Python, including FastAPI applications and plugins for Collibra Data Quality.
- Oversee the Collibra application infrastructure in a Kubernetes environment, perform upgrades when required, and troubleshoot and resolve any Kubernetes issues that affect the application's operation.
- Deploy and manage solutions and optimize resources for Kubernetes deployments, including writing YAML files and managing configurations.
- Build and deploy Docker images for various use cases, ensuring efficient and reusable solutions.

Collaboration and Training
- Communicate effectively with stakeholders to align technical implementations with business objectives.
- Provide training and guidance to stakeholders on Collibra Data Quality usage and help them build and implement data quality rules.

Version Control and Documentation
- Use Git for version control to manage code and collaborate effectively.
- Document all implementations, including data quality workflows, data pipelines, and deployment processes, for easy reference and knowledge sharing.

Database and Data Model Optimization
- Design and optimize data models for efficient storage and retrieval.

Required Qualifications
- Experience: 4+ years in data science.
- Education: B.Tech or M.Tech in Computer Science, Statistics, Applied Mathematics, or a related field.
- Industry knowledge: experience in Consumer Tech, Enterprise Tech, or Semiconductors preferred but not mandatory.

Technical Skills
- Programming: proficiency in Python and SQL for data analysis and transformation.
- Tools: hands-on experience with Collibra Data Quality (CDQ) or similar data quality tools (e.g., Informatica DQ, Talend, Great Expectations, Ataccama); experience working with Kubernetes workloads; experience with Agile methodologies and task tracking using Jira.

Preferred Skills
- Strong analytical and problem-solving skills with a results-oriented mindset.
- Good communication, stakeholder management, and requirement-gathering capabilities.

Additional Information:
- The ideal candidate will have a strong educational background in a quantitative discipline and experience working with hi-tech clients.
- This position is based at our Bengaluru (preferred) or Kolkata office.

About Our Company | Accenture
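
For illustration, a minimal PySpark sketch of the Vertica/PostgreSQL pipeline work described above; the JDBC URLs, credentials, and table names are hypothetical placeholders:

```python
# Illustrative PySpark pipeline: read from PostgreSQL, apply a simple
# quality filter, write to Vertica. All URLs, credentials, and table
# names are invented placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("cdq_ingest_example").getOrCreate()

orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:postgresql://pg-host:5432/sales")
    .option("dbtable", "public.orders")
    .option("user", "etl_user")
    .option("password", "***")
    .load()
)

# Basic rule-style check: drop rows failing a not-null / range rule,
# and count them so the failure rate can feed a DQ dashboard.
valid = orders.filter(F.col("order_id").isNotNull() & (F.col("amount") >= 0))
failed = orders.count() - valid.count()
print(f"rows failing quality rules: {failed}")

(
    valid.write.format("jdbc")
    .option("url", "jdbc:vertica://vertica-host:5433/dwh")
    .option("dbtable", "staging.orders_clean")
    .option("user", "etl_user")
    .option("password", "***")
    .option("driver", "com.vertica.jdbc.Driver")
    .mode("append")
    .save()
)
```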

Posted 1 month ago

Apply

3.0 - 8.0 years

0 - 3 Lacs

Chennai

Work from Office

Job Summary: We are seeking a skilled and experienced Talend Developer to join our team. The ideal candidate will have expertise in Talend ETL tools and will be responsible for designing, developing, and maintaining data integration solutions that allow our business to effectively manage and process large volumes of data. You will collaborate with cross-functional teams to design scalable solutions and ensure data quality, performance, and security.

Key Responsibilities:
- Design and develop ETL processes using Talend Data Integration tools.
- Collaborate with data architects, business analysts, and other stakeholders to understand data integration requirements.
- Implement and maintain data pipelines for seamless data movement across multiple platforms.
- Perform data transformation, cleansing, and aggregation tasks to ensure high data quality.
- Optimize Talend jobs for performance, scalability, and reliability.
- Troubleshoot and resolve issues related to data integration, data flow, and Talend jobs.
- Monitor job execution and handle scheduling, error handling, and recovery.
- Ensure compliance with data security policies and best practices.
- Stay up to date with the latest developments and best practices in data integration and Talend tools.

Requirements:
- Proven experience with Talend Data Integration tools.
- Strong knowledge of SQL and experience with relational databases (MySQL, Oracle, SQL Server, etc.).
- Experience with data warehousing and large-scale data integration.
- Familiarity with cloud platforms and their integration with Talend.
- Strong analytical and problem-solving skills.
- Knowledge of data transformation techniques and best practices.
- Ability to work in a collaborative team environment.
- Excellent written and verbal communication skills.

Location: Chennai
Experience: 3-10 years
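
Talend jobs themselves are assembled in the graphical designer, so no job code is shown here; as a stand-in, this hypothetical Python sketch shows the kind of post-load row-count reconciliation a Talend developer typically scripts around a job run (databases and table names are placeholders):

```python
# Hypothetical post-load reconciliation check scheduled after a Talend
# job run: compare source and target row counts and fail loudly on drift.
import sqlite3  # stand-in for any DB-API driver (psycopg2, cx_Oracle, ...)


def row_count(conn, table: str) -> int:
    # NOTE: table names come from trusted config, never from user input.
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]


source = sqlite3.connect("source.db")  # placeholder source database
target = sqlite3.connect("target.db")  # placeholder warehouse database

src_rows = row_count(source, "orders")
tgt_rows = row_count(target, "orders_stg")

if src_rows != tgt_rows:
    # In a real deployment this would raise an alert / fail the job.
    raise SystemExit(f"reconciliation failed: {src_rows} source vs {tgt_rows} target rows")
print(f"reconciliation OK: {src_rows} rows")
```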

Posted 1 month ago

Apply

5.0 - 7.0 years

7 - 9 Lacs

Hyderabad

Work from Office

We are looking for a skilled Snowflake Developer with 5-7 years of experience to join our team at IDESLABS PRIVATE LIMITED. The ideal candidate will have expertise in designing, developing, and implementing data warehousing solutions using Snowflake.

Roles and Responsibilities
- Design and develop scalable data warehousing solutions using Snowflake.
- Collaborate with cross-functional teams to identify business requirements and design data models.
- Develop and maintain complex SQL queries for data extraction and manipulation.
- Implement data validation and quality checks to ensure accuracy and integrity.
- Optimize database performance and troubleshoot issues.
- Work closely with stakeholders to understand business needs and provide technical guidance.

Job Requirements
- Strong understanding of data modeling and data warehousing concepts.
- Proficiency in writing complex SQL queries and stored procedures.
- Experience with Snowflake development tools and technologies.
- Excellent problem-solving skills and attention to detail.
- Ability to work collaboratively in a team environment.
- Strong communication and interpersonal skills.
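
For illustration, a hypothetical incremental-load step of the kind this role involves, using the Snowflake Python connector to MERGE a staging table into a dimension; the account, credentials, and table names are invented:

```python
# Hypothetical incremental-load step: MERGE a staging table into a
# target dimension with the Snowflake Python connector.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",  # placeholder
    user="etl_user",       # placeholder
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)

merge_sql = """
MERGE INTO dim_customer AS t
USING stg_customer AS s
  ON t.customer_id = s.customer_id
WHEN MATCHED THEN UPDATE SET
  t.name = s.name, t.city = s.city, t.updated_at = CURRENT_TIMESTAMP()
WHEN NOT MATCHED THEN INSERT (customer_id, name, city, updated_at)
  VALUES (s.customer_id, s.name, s.city, CURRENT_TIMESTAMP())
"""

cur = conn.cursor()
cur.execute(merge_sql)
print("rows merged:", cur.rowcount)
cur.close()
conn.close()
```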

Posted 1 month ago

Apply

1.0 - 5.0 years

2 - 6 Lacs

Nagercoil

Work from Office

Job Summary: We are seeking a skilled Data Migration Specialist to support critical data transition initiatives, particularly involving Salesforce and Microsoft SQL Server. This role is responsible for end-to-end migration of data between systems, including data extraction, transformation, cleansing, loading, and validation. The ideal candidate will have a strong foundation in relational databases, a deep understanding of the Salesforce data model, and proven experience handling large-volume data loads.

Required Skills and Qualifications:
- 1+ years of experience in data migration, ETL, or database development roles.
- Strong hands-on experience with Microsoft SQL Server and T-SQL (complex queries, joins, indexing, and profiling).
- Proven experience using Salesforce Data Loader for bulk data operations.
- Solid understanding of Salesforce CRM architecture, including object relationships and schema design.
- Strong background in data transformation and cleansing techniques.

Nice to Have:
- Experience with large-scale data migration projects involving CRM or ERP systems.
- Exposure to ETL tools such as Talend, Informatica, MuleSoft, or custom scripts.
- Salesforce certifications (e.g., Administrator, Data Architecture & Management Designer) are a plus.
- Knowledge of Apex, Salesforce Flows, or other declarative tools is a bonus.

Key Responsibilities:
- Execute end-to-end data migration activities, including data extraction, transformation, and loading (ETL).
- Develop and optimize complex SQL queries, joins, and stored procedures for data profiling, analysis, and validation.
- Utilize Salesforce Data Loader and/or the Apex Data Loader CLI to manage high-volume data imports and exports.
- Understand and work with the Salesforce data model, including standard/custom objects and relationships (Lookup, Master-Detail).
- Perform data cleansing, de-duplication, and transformation to ensure quality and consistency (see the T-SQL sketch below).
- Troubleshoot and resolve data-related issues, load failures, and anomalies.
- Collaborate with cross-functional teams to gather data-mapping requirements and ensure accurate system integration.
- Ensure data integrity and adherence to compliance standards, and document migration processes and mappings.
- Independently analyze, troubleshoot, and resolve data-related issues.
- Follow best practices for data security, performance tuning, and migration efficiency.
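
The de-duplication work above often reduces to a windowed T-SQL pass over a staging table before the Salesforce load; a hypothetical sketch via pyodbc, with the connection string, table, and columns all placeholders:

```python
# Hypothetical de-duplication pass before a Salesforce load: keep the
# most recently modified row per email key in a SQL Server staging table.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=sql-host;"
    "DATABASE=migration;UID=etl_user;PWD=***"  # placeholder credentials
)

dedupe_sql = """
WITH ranked AS (
    SELECT *,
           ROW_NUMBER() OVER (
               PARTITION BY email
               ORDER BY last_modified DESC
           ) AS rn
    FROM stg_contacts
)
DELETE FROM ranked WHERE rn > 1;
"""

cur = conn.cursor()
cur.execute(dedupe_sql)
print("duplicate rows removed:", cur.rowcount)
conn.commit()
conn.close()
```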

Posted 1 month ago

Apply

3.0 - 6.0 years

14 - 30 Lacs

Delhi, India

On-site

Industry & Sector: A fast-growing services provider in the enterprise data analytics and business-intelligence sector, we deliver high-throughput data pipelines, warehouses, and BI insights that power critical decisions for global BFSI, retail, and healthcare clients. Our on-site engineering team in India ensures the reliability, accuracy, and performance of every dataset that reaches production.

Role & Responsibilities
- Design, execute, and maintain end-to-end functional, regression, and performance test suites for ETL workflows across multiple databases and file systems.
- Validate source-to-target mappings, data transformations, and incremental loads to guarantee 100% data integrity and reconciliation.
- Develop SQL queries, Unix shell scripts, and automated jobs to drive repeatable test execution, logging, and reporting.
- Identify, document, and triage defects using JIRA/HP ALM, partnering with data engineers to resolve root causes quickly.
- Create reusable test data sets and environment configurations that accelerate Continuous Integration/Continuous Deployment (CI/CD) cycles.
- Contribute to test strategy, coverage metrics, and best-practice playbooks while mentoring junior testers on ETL quality standards.

Skills & Qualifications
Must-Have:
- 3-6 years of hands-on ETL testing experience in data warehouse or big-data environments.
- Advanced SQL for complex joins, aggregations, and data profiling.
- Exposure to leading ETL tools such as Informatica, DataStage, or Talend.
- Proficiency in the Unix/Linux command line and shell scripting for job orchestration.
- Solid understanding of SDLC, STLC, and Agile ceremonies; experience with JIRA or HP ALM.
Preferred:
- Automation with Python, Selenium, or Apache Airflow for data pipelines.
- Knowledge of cloud data platforms (AWS Redshift, Azure Synapse, or GCP BigQuery).
- Performance testing of large datasets and familiarity with BI tools like Tableau or Power BI.

Benefits & Culture Highlights
- Merit-based growth path with dedicated ETL automation upskilling programs.
- Collaborative, process-mature environment that values quality engineering over quick fixes.
- Comprehensive health cover, on-site cafeteria, and generous leave policy to support work-life balance.

Workplace Type: On-site | Location: India | Title Used Internally: ETL Test Engineer
Skills: agile methodologies, AWS Redshift, JIRA, HP ALM, DataStage, Apache Airflow, test automation, Power BI, Selenium, advanced SQL, data warehouse, Unix/Linux, Azure Synapse, STLC, GCP BigQuery, shell scripting, SQL, performance testing, Agile, Python, SDLC, Tableau, defect tracking, Informatica, ETL testing, dimension modeling, Talend
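
For illustration, a minimal source-to-target reconciliation check of the kind such a suite automates, sketched in Python with sqlite3 standing in for the real source and warehouse drivers; table and column names are hypothetical:

```python
# Illustrative source-to-target reconciliation test: compare row count
# and a numeric checksum per load date between source and warehouse.
import sqlite3  # stand-in for the real source/warehouse drivers

CHECK_SQL = """
SELECT COUNT(*) AS row_cnt, COALESCE(SUM(amount), 0) AS amount_sum
FROM {table}
WHERE load_date = ?
"""


def profile(conn, table, load_date):
    # Table names come from the test config, not user input.
    return conn.execute(CHECK_SQL.format(table=table), (load_date,)).fetchone()


src = sqlite3.connect("source.db")     # placeholder
dwh = sqlite3.connect("warehouse.db")  # placeholder

src_stats = profile(src, "orders", "2024-06-01")
dwh_stats = profile(dwh, "fact_orders", "2024-06-01")

assert src_stats == dwh_stats, f"mismatch: source={src_stats} target={dwh_stats}"
print("source-to-target reconciliation passed:", src_stats)
```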

Posted 1 month ago

Apply

0 years

0 Lacs

Tiruvallur, Tamil Nadu, India

On-site

As a working student (Werkstudent) in Data Engineering, you will support our analytics team in developing and optimizing a modern data platform built on Microsoft Fabric. You will work closely with our data architects and developers and independently take on tasks in the following areas.

Your Tasks
- Build and maintain data pipelines with Microsoft Fabric (Dataflows, Pipelines, OneLake).
- Develop and run Python notebooks for data processing and analysis.
- Integrate and transform data from various sources (e.g., SQL, APIs, Excel).
- Help create lakehouse structures and connect them to Power BI.
- Document and automate data processes.

What We Offer
- Challenging tasks and room to grow.
- Guidance and onboarding from your teammates, plus targeted feedback for your personal development.
- Insight into the field of data engineering, with the chance to contribute actively and build references.
- A centrally located workplace with very good public transport connections.

Your Profile
- Enrolled student in (business) informatics, data science, (business) mathematics, or a related field of study.
- Preferably close to completing your bachelor's degree or already in a master's program.
- Initial practical experience with Python (e.g., Pandas, PySpark, Jupyter notebooks).
- Interest in cloud data platforms and modern BI technologies.
- Ideally, first exposure to Microsoft Fabric, Power BI, Talend, Tableau, or Azure.
- Good German and/or English skills.
- Willingness to work in a team as well as independently and to learn new things, plus a quick grasp and reliability.
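
For illustration, a minimal Pandas sketch of the notebook work described above; the file name, URL, and columns are hypothetical:

```python
# Minimal, hypothetical notebook cell: combine an Excel export with
# rows from a REST API, standardize columns, and write a clean table.
import pandas as pd
import requests

# Source 1: a department's Excel export (placeholder file/columns).
excel_df = pd.read_excel("sales_export.xlsx", usecols=["order_id", "amount"])

# Source 2: a JSON API returning a list of order records (placeholder URL).
api_rows = requests.get("https://example.com/api/orders", timeout=30).json()
api_df = pd.DataFrame(api_rows)[["order_id", "amount"]]

# Transform: align types, drop duplicates, combine both sources.
combined = (
    pd.concat([excel_df, api_df], ignore_index=True)
    .astype({"order_id": "string", "amount": "float64"})
    .drop_duplicates(subset="order_id")
)

# In Fabric this would land in a Lakehouse table; here, a Parquet file.
combined.to_parquet("orders_clean.parquet", index=False)
print(combined.describe())
```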

Posted 1 month ago

Apply

3.0 - 6.0 years

14 - 30 Lacs

Greater Kolkata Area

On-site

Industry & Sector: A fast-growing services provider in the enterprise data analytics and business-intelligence sector, we deliver high-throughput data pipelines, warehouses, and BI insights that power critical decisions for global BFSI, retail, and healthcare clients. Our on-site engineering team in India ensures the reliability, accuracy, and performance of every dataset that reaches production.

Role & Responsibilities
- Design, execute, and maintain end-to-end functional, regression, and performance test suites for ETL workflows across multiple databases and file systems.
- Validate source-to-target mappings, data transformations, and incremental loads to guarantee 100% data integrity and reconciliation.
- Develop SQL queries, Unix shell scripts, and automated jobs to drive repeatable test execution, logging, and reporting.
- Identify, document, and triage defects using JIRA/HP ALM, partnering with data engineers to resolve root causes quickly.
- Create reusable test data sets and environment configurations that accelerate Continuous Integration/Continuous Deployment (CI/CD) cycles.
- Contribute to test strategy, coverage metrics, and best-practice playbooks while mentoring junior testers on ETL quality standards.

Skills & Qualifications
Must-Have:
- 3-6 years of hands-on ETL testing experience in data warehouse or big-data environments.
- Advanced SQL for complex joins, aggregations, and data profiling.
- Exposure to leading ETL tools such as Informatica, DataStage, or Talend.
- Proficiency in the Unix/Linux command line and shell scripting for job orchestration.
- Solid understanding of SDLC, STLC, and Agile ceremonies; experience with JIRA or HP ALM.
Preferred:
- Automation with Python, Selenium, or Apache Airflow for data pipelines.
- Knowledge of cloud data platforms (AWS Redshift, Azure Synapse, or GCP BigQuery).
- Performance testing of large datasets and familiarity with BI tools like Tableau or Power BI.

Benefits & Culture Highlights
- Merit-based growth path with dedicated ETL automation upskilling programs.
- Collaborative, process-mature environment that values quality engineering over quick fixes.
- Comprehensive health cover, on-site cafeteria, and generous leave policy to support work-life balance.

Workplace Type: On-site | Location: India | Title Used Internally: ETL Test Engineer
Skills: agile methodologies, AWS Redshift, JIRA, HP ALM, DataStage, Apache Airflow, test automation, Power BI, Selenium, advanced SQL, data warehouse, Unix/Linux, Azure Synapse, STLC, GCP BigQuery, shell scripting, SQL, performance testing, Agile, Python, SDLC, Tableau, defect tracking, Informatica, ETL testing, dimension modeling, Talend

Posted 1 month ago

Apply

5.0 - 10.0 years

9 - 13 Lacs

Mumbai

Work from Office

Job Summary:
- Excellent authoring skills and the ability to independently build resources.
- Ability to solve complex business problems and deliver client delight.
- Strong analytical and writing skills to build viewpoints on industry trends.
- Excellent communication, interpersonal, and presentation skills.
- Cross-cultural competence and an ability to thrive in a dynamic environment.

Roles & Responsibilities: As part of our Supply Chain and Operations practice, you will help organizations reimagine and transform their supply chains for tomorrow, with a positive impact on business, society, and the planet. Together, let's innovate, build competitive advantage, and improve business and societal outcomes in an ever-changing, ever-challenging world. Help us make supply chains work better, faster, and more resiliently through the following initiatives:
- Support clients and teams in the design, development, and implementation of new and improved business processes, enabling technology in supply chain projects.
- Participate in supply chain planning and requirement discussions with the client, configuring the data structure or data model accordingly.
- Work with the client on the design, development, and testing of supply chain implementation projects.
- Design apt solutions that consider both the built-in and configurable capabilities of Kinaxis RapidResponse.
- Work with the client team to understand the system landscape.
- Run workshops with the single points of contact for each legacy system being integrated with Kinaxis RapidResponse.
- Provide data specification documents based on the Kinaxis RapidResponse configuration.
- Create namespaces or tables based on the client's current data flow.
- Create transformation workbooks, design test scripts for configuration testing, and train the integration team on the client's business solution.
- Ensure RapidResponse is integrated with the client's systems.

Qualification, Professional & Technical Skills:
- MBA from a Tier-1 B-school.
- 5+ years of experience working as an integration architect on Kinaxis RapidResponse.
- End-to-end implementation experience as an integration architect.
- Experience with Data Integration server or the Talend tool.
- Experience across industries such as Life Sciences, Auto, and Consumer Packaged Goods preferred.
- Knowledge of the scheduled tasks and scripts required to set up a consistent flow of data.
- Good understanding of Extraction, Transformation, and Load (ETL) concepts to proactively troubleshoot integration issues.
- Experience managing Kinaxis RapidResponse Administrator implementations and coordinating key stakeholders within the same project.
- Must be a certified RapidResponse Administrator, Level 2.

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 10 Lacs

Gurugram

Work from Office

Job Summary:
- Excellent authoring skills and the ability to independently build resources.
- Ability to solve complex business problems and deliver client delight.
- Strong analytical and writing skills to build viewpoints on industry trends.
- Excellent communication, interpersonal, and presentation skills.
- Cross-cultural competence and an ability to thrive in a dynamic environment.

Roles & Responsibilities: As part of our Supply Chain and Operations practice, you will help organizations reimagine and transform their supply chains for tomorrow, with a positive impact on business, society, and the planet. Together, let's innovate, build competitive advantage, and improve business and societal outcomes in an ever-changing, ever-challenging world. Help us make supply chains work better, faster, and more resiliently through the following initiatives:
- Support clients and teams in the design, development, and implementation of new and improved business processes, enabling technology in supply chain projects.
- Participate in supply chain planning and requirement discussions with the client, configuring the data structure or data model accordingly.
- Work with the client on the design, development, and testing of supply chain implementation projects.
- Design apt solutions that consider both the built-in and configurable capabilities of Kinaxis RapidResponse.
- Work with the client team to understand the system landscape.
- Run workshops with the single points of contact for each legacy system being integrated with Kinaxis RapidResponse.
- Provide data specification documents based on the Kinaxis RapidResponse configuration.
- Create namespaces or tables based on the client's current data flow.
- Create transformation workbooks, design test scripts for configuration testing, and train the integration team on the client's business solution.
- Ensure RapidResponse is integrated with the client's systems.

Qualification, Professional & Technical Skills:
- MBA from a Tier-1 B-school.
- 5+ years of experience working as an integration architect on Kinaxis RapidResponse.
- End-to-end implementation experience as an integration architect.
- Experience with Data Integration server or the Talend tool.
- Experience across industries such as Life Sciences, Auto, and Consumer Packaged Goods preferred.
- Knowledge of the scheduled tasks and scripts required to set up a consistent flow of data.
- Good understanding of Extraction, Transformation, and Load (ETL) concepts to proactively troubleshoot integration issues.
- Experience managing Kinaxis RapidResponse Administrator implementations and coordinating key stakeholders within the same project.
- Must be a certified RapidResponse Administrator, Level 2.

Posted 1 month ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Gurugram

Work from Office

About the Role: Grade Level (for internal use): 09

S&P Global Mobility. The Role: ETL Developer

The Team: The ETL team forms an integral part of Global Data Operations (GDO) and caters to the North America & EMEA automotive business line. Core responsibilities include translating business requirements into technical designs and ETL jobs, along with unit testing, integration testing, regression testing, deployments, and production operations. The team is an energetic and dynamic group of individuals, always looking to work through a challenge. Ownership, raising the bar, and innovation are what the team runs on!

The Impact: The ETL team, as part of GDO, caters to the automotive business line and helps stakeholders with optimal solutions for their data needs. The role requires close coordination with global teams such as other development teams, research analysts, quality assurance analysts, and architects. It is vital to the automotive business, providing highly efficient, highly accurate data solutions to various stakeholders, and forms a bridge between business and technical stakeholders.

What's in it for you:
- Constant learning, working in a dynamic and challenging environment!
- Total rewards: monetary, beneficial, and developmental!
- Work-life balance: you can't do a good job if your job is all you do!
- Diversity & inclusion. HeForShe!
- Internal mobility: grow with us!

Responsibilities:
- Using prior experience with file loading, cleansing, and standardization, translate business requirements into ETL designs and efficient ETL solutions using Informatica PowerCenter (mandatory) and Talend Enterprise (preferred). Knowledge of TIBCO is also preferred.
- Understand relational database technologies and data warehousing concepts and processes.
- Using prior experience with high-volume data processing, deal with complex technical issues.
- Work closely with all levels of management and employees across the automotive business line.
- Participate in cross-functional teams responsible for investigating issues, proposing solutions, and implementing corrective actions.
- Good communication skills are required for interfacing with various stakeholder groups; detail-oriented with analytical skills.

What We're Looking For: The ETL development team within the Mobility domain is looking for a Software Engineer to work on design, development, and operations in the ETL (Informatica) domain.

Primary skills and qualifications:
- Experience with Informatica and/or Talend ETL tools.
- Bachelor's degree in Computer Science, with at least 3+ years of development and maintenance of ETL systems on Informatica PowerCenter and 1+ years of SQL experience.
- 3+ years of Informatica design and architecture experience and 1+ years of optimization and performance tuning of ETL code on Informatica.
- 1+ years of Python development experience, plus SQL and XML experience.
- Working knowledge of cloud-based technologies, development, and operations is a plus.

About S&P Global Mobility: At S&P Global Mobility, we provide invaluable insights derived from unmatched automotive data, enabling our customers to anticipate change and make decisions with conviction. Our expertise helps them to optimize their businesses, reach the right consumers, and shape the future of mobility. We open the door to automotive innovation, revealing the buying patterns of today and helping customers plan for the emerging technologies of tomorrow.
For more information, visit www.spglobal.com/mobility.

What's In It For You

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People, Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global.
- Health & Wellness: health care coverage designed for the mind and body.
- Continuous Learning: access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: it's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: from retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country, visit https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domain, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, pre-employment training, or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity.

Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment.

If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

Posted 1 month ago

Apply

3.0 - 6.0 years

14 - 30 Lacs

Hyderabad, Telangana, India

On-site

Industry & Sector: A fast-growing services provider in the enterprise data analytics and business-intelligence sector, we deliver high-throughput data pipelines, warehouses, and BI insights that power critical decisions for global BFSI, retail, and healthcare clients. Our on-site engineering team in India ensures the reliability, accuracy, and performance of every dataset that reaches production.

Role & Responsibilities
- Design, execute, and maintain end-to-end functional, regression, and performance test suites for ETL workflows across multiple databases and file systems.
- Validate source-to-target mappings, data transformations, and incremental loads to guarantee 100% data integrity and reconciliation.
- Develop SQL queries, Unix shell scripts, and automated jobs to drive repeatable test execution, logging, and reporting.
- Identify, document, and triage defects using JIRA/HP ALM, partnering with data engineers to resolve root causes quickly.
- Create reusable test data sets and environment configurations that accelerate Continuous Integration/Continuous Deployment (CI/CD) cycles.
- Contribute to test strategy, coverage metrics, and best-practice playbooks while mentoring junior testers on ETL quality standards.

Skills & Qualifications
Must-Have:
- 3-6 years of hands-on ETL testing experience in data warehouse or big-data environments.
- Advanced SQL for complex joins, aggregations, and data profiling.
- Exposure to leading ETL tools such as Informatica, DataStage, or Talend.
- Proficiency in the Unix/Linux command line and shell scripting for job orchestration.
- Solid understanding of SDLC, STLC, and Agile ceremonies; experience with JIRA or HP ALM.
Preferred:
- Automation with Python, Selenium, or Apache Airflow for data pipelines.
- Knowledge of cloud data platforms (AWS Redshift, Azure Synapse, or GCP BigQuery).
- Performance testing of large datasets and familiarity with BI tools like Tableau or Power BI.

Benefits & Culture Highlights
- Merit-based growth path with dedicated ETL automation upskilling programs.
- Collaborative, process-mature environment that values quality engineering over quick fixes.
- Comprehensive health cover, on-site cafeteria, and generous leave policy to support work-life balance.

Workplace Type: On-site | Location: India | Title Used Internally: ETL Test Engineer
Skills: agile methodologies, AWS Redshift, JIRA, HP ALM, DataStage, Apache Airflow, test automation, Power BI, Selenium, advanced SQL, data warehouse, Unix/Linux, Azure Synapse, STLC, GCP BigQuery, shell scripting, SQL, performance testing, Agile, Python, SDLC, Tableau, defect tracking, Informatica, ETL testing, dimension modeling, Talend

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Chennai

Work from Office

Project Role: Quality Engineer (Tester)
Project Role Description: Enables full-stack solutions through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle. Performs continuous testing for security, API, and regression suites. Creates automation strategy and automated scripts and supports data and environment configuration. Participates in code reviews, monitors, and reports defects to support continuous improvement activities for the end-to-end testing process.
Must-have skills: Data Warehouse ETL Testing
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As a Quality Engineer (Tester), you will enable full-stack solutions through multi-disciplinary team planning and ecosystem integration to accelerate delivery and drive quality across the application lifecycle. You will perform continuous testing for security, API, and regression suites; create automation strategy and automated scripts; and support data and environment configuration. You will participate in code reviews, monitor, and report defects to support continuous improvement of the end-to-end testing process.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Develop and execute test cases, scripts, plans, and procedures.
- Collaborate with cross-functional teams to ensure quality throughout the software development lifecycle.

Professional & Technical Skills:
- Must-have: proficiency in Data Warehouse ETL Testing.
- Strong understanding of SQL and database concepts.
- Experience with ETL testing tools such as Informatica, Talend, or SSIS.
- Knowledge of data warehousing concepts and ETL processes.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Data Warehouse ETL Testing.
- This position is based at our Chennai office.
- A 15 years full-time education is required.

Posted 1 month ago

Apply

1.0 - 3.0 years

15 - 20 Lacs

Pune

Work from Office

Company Overview: With 80,000 customers across 150 countries, UKG is the largest U.S.-based private software company in the world. And we're only getting started. Ready to bring your bold ideas and collaborative mindset to an organization that still has so much more to build and achieve? Read on.

Here, we know that you're more than your work. That's why our benefits help you thrive personally and professionally, from wellness programs and tuition reimbursement to U Choose, a customizable expense reimbursement program that can be used for more than 200 needs that best suit you and your family, from student loan repayment to childcare to pet insurance. Our inclusive culture, active and engaged employee resource groups, and caring leaders value every voice and support you in doing the best work of your career. If you're passionate about our purpose (people) then we can't wait to support whatever gives you purpose. We're united by purpose, inspired by you.

Job Summary: The Analytics Consultant I is a business-intelligence-focused expert who participates in the delivery of analytics solutions and reporting for various UKG products such as Pro, UKG Dimensions, and UKG Datahub. The candidate is also responsible for interacting with business and technical project stakeholders to gather business requirements and ensure successful delivery, and should be able to leverage the strengths and capabilities of the software tools to provide an optimized solution to the customer. The Analytics Consultant I will also develop custom analytics solutions and reports to provided specifications and support the delivered solutions. The candidate must be able to communicate ideas effectively, both verbally and in writing, at all levels of the organization, from executive staff to technical resources. The role requires working with the Program/Project Manager, the Management Consultant, and the other Analytics Consultants to deliver the solution based on the defined design requirements and to ensure it meets the scope and customer expectations.

Key Responsibilities
- Interact with business and technical project stakeholders to gather business requirements.
- Deploy and configure the UKG Analytics and Data Hub products based on the design documents.
- Develop and deliver best-practice visualizations and dashboards using a BI tool such as Cognos, BIRT, or Power BI.
- Put together a test plan, validate the deployed solution, and document the results.
- Provide support during production cutover, and after go-live act as the first level of support for any requests from the customer or other consultants.
- Analyze the customer's data to spot trends and issues and present the results back to the customer.

Required Qualifications:
- 1-3 years of experience designing and delivering analytical/business-intelligence solutions.
- Experience with Cognos, BIRT, Power BI, or another business-intelligence toolset.
- ETL experience using Talend or other industry-standard ETL tools strongly preferred.
- Advanced SQL proficiency is a plus.
- Knowledge of Google Cloud Platform, Azure, or something similar is desired but not required.
- Knowledge of Python is desired but not required.
- Willingness to learn new technologies and adapt quickly.
- Strong interpersonal and problem-solving skills.
- Flexibility to support customers in different time zones.

Where we're going: UKG is on the cusp of something truly special. Worldwide, we already hold the #1 market share position for workforce management and the #2 position for human capital management. Tens of millions of frontline workers start and end their days with our software, with billions of shifts managed annually through UKG solutions today. Yet it's our AI-powered product portfolio, designed to support customers of all sizes, industries, and geographies, that will propel us into an even brighter tomorrow!

UKGCareers@ukg.com

Posted 1 month ago

Apply

3.0 - 8.0 years

15 - 20 Lacs

Noida

Work from Office

Company Overview: With 80,000 customers across 150 countries, UKG is the largest U.S.-based private software company in the world. And we're only getting started. Ready to bring your bold ideas and collaborative mindset to an organization that still has so much more to build and achieve? Read on.

Here, we know that you're more than your work. That's why our benefits help you thrive personally and professionally, from wellness programs and tuition reimbursement to U Choose, a customizable expense reimbursement program that can be used for more than 200 needs that best suit you and your family, from student loan repayment to childcare to pet insurance. Our inclusive culture, active and engaged employee resource groups, and caring leaders value every voice and support you in doing the best work of your career. If you're passionate about our purpose (people) then we can't wait to support whatever gives you purpose. We're united by purpose, inspired by you.

Job Summary: The Analytics Consultant II (Level 2) is a business-intelligence-focused expert who participates in the delivery of analytics solutions and reporting for various UKG products such as Pro, UKG Dimensions, and UKG Datahub. The candidate is also responsible for interacting with business and technical project stakeholders to gather business requirements and ensure successful delivery, and should be able to leverage the strengths and capabilities of the software tools to provide an optimized solution to the customer. The Analytics Consultant II will also develop custom analytics solutions and reports to provided specifications and support the delivered solutions. The candidate must be able to communicate ideas effectively, both verbally and in writing, at all levels of the organization, from executive staff to technical resources. The role requires working with the Program/Project Manager, the Management Consultant, and the other Analytics Consultants to deliver the solution based on the defined design requirements and to ensure it meets the scope and customer expectations.

Responsibilities include:
- Interact with business and technical project stakeholders to gather business requirements.
- Deploy and configure the UKG Analytics and Data Hub products based on the design documents.
- Develop and deliver best-practice visualizations and dashboards using a BI tool such as Cognos, BIRT, or Power BI.
- Put together a test plan, validate the deployed solution, and document the results.
- Provide support during production cutover, and after go-live act as the first level of support for any requests from the customer or other consultants.
- Analyze the customer's data to spot trends and issues and present the results back to the customer.

Qualifications:
- 3+ years of experience designing and delivering analytical/business-intelligence solutions.
- Experience with Cognos, BIRT, Power BI, or another business-intelligence toolset.
- ETL experience using Talend or other industry-standard ETL tools strongly preferred.
- Advanced SQL proficiency is a plus.
- Knowledge of Google Cloud Platform, Azure, or something similar is desired but not required.
- Knowledge of Python is desired but not required.
- Willingness to learn new technologies and adapt quickly.
- Strong interpersonal and problem-solving skills.
- Flexibility to support customers in different time zones.

Where we're going: UKG is on the cusp of something truly special. Worldwide, we already hold the #1 market share position for workforce management and the #2 position for human capital management. Tens of millions of frontline workers start and end their days with our software, with billions of shifts managed annually through UKG solutions today. Yet it's our AI-powered product portfolio, designed to support customers of all sizes, industries, and geographies, that will propel us into an even brighter tomorrow!

UKG is proud to be an equal-opportunity employer and is committed to promoting diversity and inclusion in the workplace, including the recruitment process.

Disability Accommodation: UKGCareers@ukg.com

Posted 1 month ago

Apply

3.0 - 6.0 years

14 - 30 Lacs

Pune, Maharashtra, India

On-site

Industry & Sector: A fast-growing services provider in the enterprise data analytics and business-intelligence sector, we deliver high-throughput data pipelines, warehouses, and BI insights that power critical decisions for global BFSI, retail, and healthcare clients. Our on-site engineering team in India ensures the reliability, accuracy, and performance of every dataset that reaches production.

Role & Responsibilities
- Design, execute, and maintain end-to-end functional, regression, and performance test suites for ETL workflows across multiple databases and file systems.
- Validate source-to-target mappings, data transformations, and incremental loads to guarantee 100% data integrity and reconciliation.
- Develop SQL queries, Unix shell scripts, and automated jobs to drive repeatable test execution, logging, and reporting.
- Identify, document, and triage defects using JIRA/HP ALM, partnering with data engineers to resolve root causes quickly.
- Create reusable test data sets and environment configurations that accelerate Continuous Integration/Continuous Deployment (CI/CD) cycles.
- Contribute to test strategy, coverage metrics, and best-practice playbooks while mentoring junior testers on ETL quality standards.

Skills & Qualifications
Must-Have:
- 3-6 years of hands-on ETL testing experience in data warehouse or big-data environments.
- Advanced SQL for complex joins, aggregations, and data profiling.
- Exposure to leading ETL tools such as Informatica, DataStage, or Talend.
- Proficiency in the Unix/Linux command line and shell scripting for job orchestration.
- Solid understanding of SDLC, STLC, and Agile ceremonies; experience with JIRA or HP ALM.
Preferred:
- Automation with Python, Selenium, or Apache Airflow for data pipelines.
- Knowledge of cloud data platforms (AWS Redshift, Azure Synapse, or GCP BigQuery).
- Performance testing of large datasets and familiarity with BI tools like Tableau or Power BI.

Benefits & Culture Highlights
- Merit-based growth path with dedicated ETL automation upskilling programs.
- Collaborative, process-mature environment that values quality engineering over quick fixes.
- Comprehensive health cover, on-site cafeteria, and generous leave policy to support work-life balance.

Workplace Type: On-site | Location: India | Title Used Internally: ETL Test Engineer
Skills: agile methodologies, AWS Redshift, JIRA, HP ALM, DataStage, Apache Airflow, test automation, Power BI, Selenium, advanced SQL, data warehouse, Unix/Linux, Azure Synapse, STLC, GCP BigQuery, shell scripting, SQL, performance testing, Agile, Python, SDLC, Tableau, defect tracking, Informatica, ETL testing, dimension modeling, Talend

Posted 1 month ago

Apply

5.0 - 7.0 years

10 - 13 Lacs

Bengaluru

Work from Office

Educational Qualification: Bachelor of Engineering, BTech, BSc, Bachelor of Pharmacy, MTech, MCA, Master of Engineering, Master of Pharmacy
Service Line: Engineering Services

Responsibilities
- At least 4-8 years of experience in LIMS (LV/LW) implementation, configuration, and customization using Java and JavaScript, plus integration with lab applications; should have implemented at least 2-3 projects in a development role on the LabVantage platform with Jasper/iReport/Java reporting tools.
- Interface with key stakeholders and apply your technical proficiency across different stages of the Software Development Life Cycle, including elicitation and translation into functional and/or design documentation for the LabVantage LIMS solution, application architecture definition and design, development, validation, and release.

Additional Responsibilities
- Ability to develop value-creating strategies and models that enable clients to innovate, drive growth, and increase their business profitability.
- Good knowledge of software configuration management systems.
- Awareness of the latest technologies and industry trends.
- Logical thinking and problem-solving skills, along with an ability to collaborate.
- Understanding of the financial processes for various types of projects and the various pricing models available.
- Ability to assess current processes, identify improvement areas, and suggest technology solutions.
- Knowledge of one or two industry domains.
- Client-interfacing skills.
- Project and team management.

Technical and Professional Skills
- Primary skills: Technology - Life Sciences - LIMS.
- Experience in developing instrument drivers using SDMS/Talend/Java is good to have.
- At least 5 years of experience in the software development life cycle.
- At least 5 years of experience in project life cycle activities on development and maintenance projects.
- At least 5 years of experience in design and architecture review.
- Good understanding of the sample management domain and exposure to life sciences projects.
- Ability to work in a team in a diverse, multi-stakeholder environment.
- Analytical skills.
- Very good communication skills.

Preferred Skills: Technology - Life Sciences - LIMS

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 9 Lacs

Navi Mumbai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Collibra Data Quality & Observability
Good-to-have skills: Collibra Data Governance
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications function optimally. You will also engage in problem-solving activities, providing support and enhancements to existing applications while maintaining a focus on quality and efficiency.

Key Responsibilities:
- Configure and implement Collibra Data Quality (CDQ) rules, workflows, dashboards, and data quality scoring metrics.
- Collaborate with data stewards, data owners, and business analysts to define data quality KPIs and thresholds.
- Develop data profiling and rule-based monitoring using CDQ's native rule engine or integrations (e.g., with Informatica, Talend, or BigQuery).
- Build and maintain data quality dashboards and issue-management workflows within Collibra.
- Integrate CDQ with Collibra Data Intelligence Cloud for end-to-end governance visibility.
- Drive root cause analysis and remediation plans for data quality issues.
- Support metadata and lineage enrichment to improve data traceability.
- Document standards, rule logic, and DQ policies in the Collibra Catalog.
- Conduct user training and promote data quality best practices across teams.

Required Skills and Experience:
- 3+ years of experience in data quality, metadata management, or data governance.
- Hands-on experience with the Collibra Data Quality & Observability (CDQ) platform.
- Knowledge of Collibra Data Intelligence Cloud, including Catalog, Glossary, and Workflow Designer.
- Proficiency in SQL and understanding of data profiling techniques.
- Experience integrating CDQ with enterprise data sources (Snowflake, BigQuery, Databricks, etc.).
- Familiarity with data governance frameworks and data quality dimensions (accuracy, completeness, consistency, etc.).
- Excellent analytical, problem-solving, and communication skills.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Collibra Data Quality & Observability.
- This position is based in Mumbai.
- A 15 years full-time education is required.
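
Rule-based profiling of the kind listed above typically reduces to completeness and validity queries over the source; a hypothetical Python sketch, with sqlite3 standing in for an enterprise driver and invented table and column names:

```python
# Hypothetical column-profiling query of the kind a DQ rule wraps:
# completeness (null rate) and validity (pattern match rate) per column.
import sqlite3  # stand-in for Snowflake/BigQuery/Databricks drivers

PROFILE_SQL = """
SELECT
    COUNT(*)                                               AS total_rows,
    SUM(CASE WHEN email IS NULL THEN 1 ELSE 0 END)         AS null_emails,
    SUM(CASE WHEN email LIKE '%_@_%._%' THEN 0 ELSE 1 END) AS invalid_emails
FROM customers
"""

conn = sqlite3.connect("crm.db")  # placeholder source
total, nulls, invalid = conn.execute(PROFILE_SQL).fetchone()

completeness = 1 - nulls / total
validity = 1 - invalid / total
print(f"completeness={completeness:.2%} validity={validity:.2%}")

# A DQ platform would compare these scores against thresholds and open
# an issue-management workflow when a rule breaches.
THRESHOLD = 0.95
assert completeness >= THRESHOLD, "completeness rule breached"
```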

Posted 1 month ago

Apply

8.0 - 13.0 years

5 - 10 Lacs

Hyderabad

Work from Office

Senior Developer with special emphasis on, and 8 to 10 years of experience in, PySpark and Python, along with ETL tools (Talend, Ab Initio, Informatica, or similar). Should also have good exposure to ETL tools in order to understand existing flows, rewrite them in Python and PySpark, and execute the test plans.
- 8-10 years of experience designing and developing PySpark applications and ETL jobs using ETL tools.
- 5+ years of sound knowledge of PySpark for implementing ETL logic.
- Strong understanding of frontend technologies such as HTML, CSS, React, and JavaScript.
- Proficiency in data modeling and design, including PL/SQL development.
- Creating test plans to understand current ETL flows and rewriting them in PySpark.
- Providing ongoing support and maintenance for ETL applications, including troubleshooting and resolving issues.
- Expertise in practices like Agile, peer reviews, and continuous integration.
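
Rewriting a tool-based mapping in PySpark usually means re-expressing its joins, filters, and aggregations in the DataFrame API; a minimal hypothetical sketch (paths and columns invented):

```python
# Hypothetical rewrite of a simple ETL-tool mapping (join + filter +
# aggregate) in the PySpark DataFrame API.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl_rewrite_example").getOrCreate()

orders = spark.read.parquet("/data/raw/orders")        # placeholder path
customers = spark.read.parquet("/data/raw/customers")  # placeholder path

# Equivalent of a lookup/join stage followed by a filter and aggregator.
daily_revenue = (
    orders.join(customers, on="customer_id", how="inner")
    .filter(F.col("status") == "COMPLETED")
    .groupBy("order_date", "region")
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("order_cnt"))
)

daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet(
    "/data/curated/daily_revenue"  # placeholder target
)
```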

Posted 1 month ago

Apply

5.0 - 10.0 years

5 - 10 Lacs

Hyderabad

Work from Office

Responsibilities
- Design, develop, and maintain ETL processes using Ab Initio and other ETL tools.
- Manage and optimize data pipelines on AWS.
- Write and maintain complex PL/SQL queries for data extraction, transformation, and loading.
- Provide Level 3 support for ETL processes, troubleshooting and resolving issues promptly.
- Collaborate with data architects, analysts, and other stakeholders to understand data requirements and deliver effective solutions.
- Ensure data quality and integrity through rigorous testing and validation.
- Stay updated with the latest industry trends and technologies in ETL and cloud computing.

Qualifications
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Certification in Ab Initio.
- Proven experience with AWS and cloud-based data solutions.
- Strong proficiency in PL/SQL and other ETL tools.
- Experience providing Level 3 support for ETL processes.
- Excellent problem-solving skills and attention to detail.
- Strong communication and teamwork abilities.

Preferred Qualifications
- Experience with other ETL tools such as Informatica, Talend, or DataStage.
- Knowledge of data warehousing concepts and best practices.
- Familiarity with scripting languages (e.g., Python, shell scripting).

Posted 1 month ago

Apply

3.0 - 7.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Talend
- Design, develop, and document existing Talend ETL processes, technical architecture, data pipelines, and performance scaling, using tools to integrate Talend data and ensure data quality in a big-data environment.

AWS / Snowflake
- Design, develop, and maintain data models using SQL and Snowflake/AWS Redshift-specific features.
- Collaborate with stakeholders to understand the requirements of the data warehouse.
- Implement data security, privacy, and compliance measures.
- Perform data analysis, troubleshoot data issues, and provide technical support to end users.
- Develop and maintain data warehouse and ETL processes, ensuring data quality and integrity.
- Stay current with new AWS/Snowflake services and features and recommend improvements to the existing architecture.
- Design and implement scalable, secure, and cost-effective cloud solutions using AWS/Snowflake services.
- Collaborate with cross-functional teams to understand requirements and provide technical guidance.

Posted 1 month ago

Apply

8.0 - 13.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Key Responsibilities
- Set up and maintain monitoring dashboards for ETL jobs using Datadog, including metrics, logs, and alerts.
- Monitor daily ETL workflows and proactively detect and resolve data pipeline failures or performance issues.
- Create Datadog monitors for job status (success/failure), job duration, resource utilization, and error trends (see the sketch below).
- Work closely with data engineering teams to onboard new pipelines and ensure observability best practices.
- Integrate Datadog with tools.
- Conduct root cause analysis of ETL failures and performance bottlenecks.
- Tune thresholds, baselines, and anomaly detection settings in Datadog to reduce false positives.
- Document incident-handling procedures and contribute to improving overall ETL monitoring maturity.
- Participate in on-call rotations or scheduled support windows to manage ETL health.

Required Skills & Qualifications
- 3+ years of experience in ETL/data pipeline monitoring, preferably in a cloud or hybrid environment.
- Proficiency in using Datadog for metrics, logging, alerting, and dashboards.
- Strong understanding of ETL concepts and tools (e.g., Airflow, Informatica, Talend, AWS Glue, or dbt).
- Familiarity with SQL and querying large datasets.
- Experience with Python, shell scripting, or Bash for automation and log parsing.
- Understanding of cloud platforms (AWS/GCP/Azure) and services like S3, Redshift, BigQuery, etc.
- Knowledge of CI/CD and DevOps principles related to data infrastructure monitoring.

Preferred Qualifications
- Experience with distributed tracing and APM in Datadog.
- Prior experience monitoring Spark, Kafka, or streaming pipelines.
- Familiarity with ticketing tools (e.g., Jira, ServiceNow) and incident-management workflows.
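
For illustration, a hypothetical job-duration monitor created with the official datadogpy client; it assumes the ETL jobs already emit a custom etl.job.duration metric, and the keys, tags, and thresholds are placeholders:

```python
# Hypothetical Datadog monitor for ETL job duration, created with the
# official datadogpy client. Metric name, tags, and thresholds are
# assumptions, not a prescribed setup.
from datadog import initialize, api

initialize(api_key="DD_API_KEY", app_key="DD_APP_KEY")  # placeholders

monitor = api.Monitor.create(
    type="metric alert",
    # Alert when the daily_sales job runs longer than 30 minutes.
    query="max(last_15m):avg:etl.job.duration{job:daily_sales} > 1800",
    name="ETL job duration: daily_sales",
    message=(
        "daily_sales exceeded its 30-minute duration SLO. "
        "@slack-data-eng check the pipeline logs."
    ),
    tags=["team:data-eng", "pipeline:daily_sales"],
    options={"thresholds": {"critical": 1800, "warning": 1500}},
)
print("created monitor id:", monitor["id"])
```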

Posted 1 month ago

Apply

4.0 - 9.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Skill: Data Engineer
Role: T3, T2

Key responsibilities: the Data Engineer must have 5+ years of experience in the skills below.
Must have: big data concepts, Python (core Python, able to write code), SQL, shell scripting, AWS S3.
Good to have: event-driven/AWS SQS, microservices, API development, Kafka, Kubernetes, Argo, Amazon Redshift, Amazon Aurora.
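
For illustration, a minimal boto3 sketch of the Python-plus-S3 work this stack implies; bucket names, keys, and columns are invented:

```python
# Hypothetical ingestion task: download a raw CSV from S3, filter it,
# and upload the cleaned file to a curated bucket.
import csv
import io

import boto3

s3 = boto3.client("s3")

# Read the raw object into memory (fine for small batches).
obj = s3.get_object(Bucket="raw-landing", Key="orders/2024-06-01.csv")
text = obj["Body"].read().decode("utf-8")
rows = list(csv.DictReader(io.StringIO(text)))

# Keep only completed orders with a positive amount.
clean = [r for r in rows if r["status"] == "COMPLETED" and float(r["amount"]) > 0]

# Write the cleaned rows back out as CSV.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=rows[0].keys())
writer.writeheader()
writer.writerows(clean)
s3.put_object(Bucket="curated", Key="orders/2024-06-01.csv", Body=buf.getvalue())
print(f"kept {len(clean)} of {len(rows)} rows")
```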

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
