
170 Snaplogic Jobs - Page 2

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

110.0 years

0 Lacs

Itanagar, Arunachal Pradesh, India

On-site


Our Company
We’re Hitachi Digital, a company at the forefront of digital transformation and the fastest-growing division of Hitachi Group. We’re crucial to the company’s strategy and ambition to become a premier global player in the massive and fast-moving digital transformation market. Our group companies, including GlobalLogic, Hitachi Digital Services, Hitachi Vantara and more, offer comprehensive services that span the entire digital lifecycle, from initial idea to full-scale operation and the infrastructure to run it on. Hitachi Digital represents One Hitachi, integrating domain knowledge and digital capabilities, and harnessing the power of the entire portfolio of services, technologies, and partnerships to accelerate synergy creation and make real-world impact for our customers and society as a whole. Imagine the sheer breadth of talent it takes to unleash a digital future. We don’t expect you to ‘fit’ every requirement – your life experience, character, perspective, and passion for achieving great things in the world are equally important to us.

The Team
We are seeking a highly skilled and experienced Senior API Integration Developer to join our team. The ideal candidate will have extensive experience in developing and managing API integrations, ensuring seamless data flow between systems. This role will be crucial in supporting our data fabric and data mesh hybrid data foundation.

The Role
In this role you will design, develop, and maintain API integrations to support business applications and data systems. You will collaborate with cross-functional teams to understand integration requirements and provide technical solutions, and ensure the security, scalability, and performance of API integrations. You will troubleshoot and resolve integration issues in a timely manner, and develop and maintain documentation for API integrations and processes; experience with Swagger-style API documentation is a must. You will stay updated with the latest API integration technologies and best practices, provide training and support to junior developers on API integration techniques, and work closely with the Data Integration Architect to ensure alignment with the overall data architecture.

What You’ll Bring
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Extensive experience in API development and integration.
- Proficiency in API management tools such as Postman, Swagger, or Apigee.
- Strong knowledge of RESTful and SOAP APIs.
- Experience with programming languages such as Java, Python, or JavaScript.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.
- Ability to work independently and as part of a team.
- Prior experience with the SnapLogic integration platform is a plus.
- Prior knowledge of creating APIs on a graph database is required.

Preferred Skills
- Experience with cloud-based integration platforms such as AWS, GCP, or Azure.
- Familiarity with microservices architecture.
- Knowledge of data integration tools such as MuleSoft or Dell Boomi.
- Certification in API development or integration.

About Us
We’re a global, 1,000-strong, diverse team of professional experts, promoting and delivering Social Innovation through our One Hitachi initiative (OT x IT x Product) and working on projects that have a real-world impact. We’re curious, passionate and empowered, blending our 110-year legacy of innovation with shaping our future. Here you’re not just another employee; you’re part of a tradition of excellence and a community working towards creating a digital future.

Championing Diversity, Equity, and Inclusion
Diversity, equity, and inclusion (DEI) are integral to our culture and identity. Diverse thinking, a commitment to allyship, and a culture of empowerment help us achieve powerful results. We want you to be you, with all the ideas, lived experience, and fresh perspective that brings. We support your uniqueness and encourage people from all backgrounds to apply and realize their full potential as part of our team.

How We Look After You
We help take care of your today and tomorrow with industry-leading benefits, support, and services that look after your holistic health and wellbeing. We’re also champions of life balance and offer flexible arrangements that work for you (role and location dependent). We’re always looking for new ways of working that bring out our best, which leads to unexpected ideas. Here you’ll experience a sense of belonging, and discover autonomy, freedom, and ownership as you work alongside talented people you enjoy sharing knowledge with. We’re proud to be an equal opportunity employer and welcome all applicants for employment without attention to race, colour, religion, sex, sexual orientation, gender identity, national origin, veteran status, age, disability status or any other protected characteristic. Should you need reasonable accommodations during the recruitment process, please let us know so that we can do our best to set you up for success.
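The role above centres on REST integrations with validation and Swagger-style documentation. As an illustration only (the schema and field names below are invented, not taken from the listing), a response-validation step in such an integration might look like this in Python:

```python
import json

# Hypothetical contract for an integration response, mirroring the kind of
# "required fields" constraint an OpenAPI/Swagger spec would define.
REQUIRED_FIELDS = {"id": int, "status": str, "payload": dict}

def validate_response(raw_body: str) -> dict:
    """Parse a JSON response body and check it against a simple schema.

    Raises ValueError when a required field is missing or has the wrong type,
    so the pipeline can route the record to error handling instead of loading it.
    """
    data = json.loads(raw_body)
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in data:
            raise ValueError(f"missing required field: {field}")
        if not isinstance(data[field], expected_type):
            raise ValueError(f"field {field!r} should be {expected_type.__name__}")
    return data

# A well-formed response passes validation and is returned as a dict.
ok = validate_response('{"id": 42, "status": "ACTIVE", "payload": {}}')
```

A malformed body (say, one missing `id`) would raise `ValueError`, which is the kind of early failure that keeps bad records out of downstream systems.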

Posted 1 week ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

Remote


Your work days are brighter here. At Workday, it all began with a conversation over breakfast. When our founders met at a sunny California diner, they came up with an idea to revolutionize the enterprise software market. And when we began to rise, one thing that really set us apart was our culture: a culture driven by our value of putting our people first. Ever since, the happiness, development, and contribution of every Workmate has been central to who we are. Our Workmates believe a healthy, employee-centric, collaborative culture is the essential mix of ingredients for success in business. That’s why we look after our people, communities and the planet while still being profitable. Feel encouraged to shine, however that manifests: you don’t need to hide who you are. You can feel the energy and the passion; it's what makes us unique. Inspired to make a brighter work day for all and transform with us to the next stage of our growth journey? Bring your brightest version of you and have a brighter work day here.

At Workday, we value our candidates’ privacy and data security. Workday will never ask candidates to apply to jobs through websites that are not Workday Careers. Please be aware of sites that may ask you to input your data in connection with a job posting that appears to be from Workday but is not. In addition, Workday will never ask candidates to pay a recruiting fee, or pay for consulting or coaching services, in order to apply for a job at Workday.

About The Team
The Business Technology Go-To-Market team works in close partnership with our business partners to help fuel growth and revenue goals for Workday, along with driving outstanding customer and employee experiences. The team is responsible for developing and supporting innovative architecture-led solutions for our Marketing, Sales, Services, Customer Support & Legal business functions, with Salesforce as the primary platform alongside other groundbreaking platforms such as SnapLogic for integrations, Conga/Apttus for CPQ and CLM, AWS as PaaS, the Coveo search platform, and Okta for SSO.

About The Role
We’re looking for a Senior Quality Engineer who is sharp, creative, and a relentless problem-solver to lead test automation efforts across our enterprise platforms – primarily Salesforce, third-party application integrations, and Adobe Experience Manager. You’ll take ownership of both strategic and hands-on aspects of quality assurance, focusing on building scalable automation frameworks using Tricentis Tosca. This role demands technical excellence, strategic thinking, and a strong understanding of quality engineering practices in Agile environments. You’ll lead testing for key releases, design reusable automation components, and influence quality at every step of the SDLC. You’ll also serve as a mentor and guide for other QA engineers to raise the overall bar for engineering quality.

Key Responsibilities
- Drive in-sprint automation delivery by collaborating closely with developers, business analysts, and product owners during Agile ceremonies, sprint planning, and story refinement.
- Lead end-to-end testing for Salesforce applications (Sales Cloud, Service Cloud, CPQ), including regression, UAT, smoke, and sanity tests.
- Design and develop Tosca automation scripts using reusable test blocks (RTBs), TCD, TDS, and TCP, and steer automation efforts in DEX environments.
- Translate business requirements into effective test strategies and scenarios, applying positive, negative, and edge-case design techniques to maximize coverage.
- Document test plans, test cases, and test data sets, and ensure traceability to user stories and acceptance criteria.
- Identify and manage defects through root cause analysis (RCA) and partner with cross-functional teams to validate and close defects.
- Lead test optimization and script maintainability by reusing assets, modularizing tests, and applying best practices for low-maintenance automation.
- Mentor and coach QA team members, conduct reviews of test assets, and foster a culture of continuous improvement and innovation.
- Integrate automation into CI/CD pipelines in collaboration with DevOps/DevSecOps teams to ensure rapid feedback and early defect detection.

About You
Required Qualifications
- 5+ years in software quality engineering, with a strong blend of automation and manual expertise in Salesforce, including at least 2 years in automation.
- Proven experience with Tosca automation or Selenium/Playwright for Salesforce and/or Adobe Experience Manager.
- Strong understanding of Salesforce architecture, including custom objects, workflows, CPQ, RBAC, and multi-cloud integrations. Salesforce experience is a must.
- Experience in Agile development environments, including working with JIRA, Azure DevOps, or similar; qTest or another test management tool; and version control tools like Git.

Preferred Qualifications
- ISTQB Advanced Test Analyst or equivalent certification.
- Hands-on experience with CI/CD tools (e.g., Jenkins, GitLab), cloud platforms (AWS – EC2, S3, RDS), and cloud-based testing strategies.
- Familiarity with REST APIs, OAuth, SSO, and tools such as Postman, JSON/XML, Selenium, Cucumber, or Cypress.
- Programming/scripting knowledge in C#, Java, JavaScript, Python, etc., is a strong plus.
- Demonstrated ability to mentor junior QA engineers, review automation artifacts, and troubleshoot complex testing challenges.
- Proven experience creating and maintaining robust Tosca automation frameworks, with a deep understanding of DEX execution and environment-agnostic test design.
- Solid understanding of test data management, especially for dynamic test generation and environment-agnostic scripting.
- TA1 Tricentis Tosca certification required.

Our Approach to Flexible Work
With Flex Work, we’re combining the best of both worlds: in-person time and remote. Our approach enables our teams to deepen connections, maintain a strong community, and do their best work. We know that flexibility can take shape in many ways, so rather than a number of required days in-office each week, we simply spend at least half (50%) of our time each quarter in the office or in the field with our customers, prospects, and partners (depending on role). This means you'll have the freedom to create a flexible schedule that caters to your business, team, and personal needs, while being intentional to make the most of time spent together. Those in our remote "home office" roles also have the opportunity to come together in our offices for important moments that matter. Are you being referred to one of our roles? If so, ask your connection at Workday about our Employee Referral process!
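The responsibilities above call for positive, negative, and edge-case test design. As a language-agnostic sketch only (the discount rule below is a made-up CPQ-style example, not Workday's or Salesforce's actual logic), the three categories applied to one function look like this:

```python
# Hypothetical CPQ-style pricing rule used purely to demonstrate test design.
def apply_discount(list_price: float, discount_pct: float) -> float:
    """Apply a percentage discount; reject out-of-range inputs."""
    if list_price < 0:
        raise ValueError("list_price must be non-negative")
    if not 0 <= discount_pct <= 100:
        raise ValueError("discount_pct must be between 0 and 100")
    return round(list_price * (1 - discount_pct / 100), 2)

# Positive case: a typical, valid discount.
assert apply_discount(100.0, 15) == 85.0

# Edge cases: boundary discounts of exactly 0% and 100%.
assert apply_discount(100.0, 0) == 100.0
assert apply_discount(100.0, 100) == 0.0

# Negative case: invalid input must be rejected, not silently accepted.
try:
    apply_discount(100.0, 150)
    raise AssertionError("expected ValueError")
except ValueError:
    pass
```

The same structure carries over to Tosca or Selenium suites: each requirement gets at least one typical scenario, its boundary values, and a deliberately invalid input.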

Posted 1 week ago

Apply

3.0 - 5.0 years

0 Lacs

Pune, Maharashtra, India

Remote


Your work days are brighter here. At Workday, it all began with a conversation over breakfast. When our founders met at a sunny California diner, they came up with an idea to revolutionize the enterprise software market. And when we began to rise, one thing that really set us apart was our culture: a culture driven by our value of putting our people first. Ever since, the happiness, development, and contribution of every Workmate has been central to who we are. Our Workmates believe a healthy, employee-centric, collaborative culture is the essential mix of ingredients for success in business. That’s why we look after our people, communities and the planet while still being profitable. Feel encouraged to shine, however that manifests: you don’t need to hide who you are. You can feel the energy and the passion; it's what makes us unique. Inspired to make a brighter work day for all and transform with us to the next stage of our growth journey? Bring your brightest version of you and have a brighter work day here.

At Workday, we value our candidates’ privacy and data security. Workday will never ask candidates to apply to jobs through websites that are not Workday Careers. Please be aware of sites that may ask you to input your data in connection with a job posting that appears to be from Workday but is not. In addition, Workday will never ask candidates to pay a recruiting fee, or pay for consulting or coaching services, in order to apply for a job at Workday.

About The Team
The Business Technology Go-To-Market team works in close partnership with our business partners to help fuel growth and revenue goals for Workday, along with driving exceptional customer and employee experiences. The team is responsible for developing and supporting innovative architecture-led solutions for our Marketing, Sales, Services, Customer Support & Legal business functions, with Salesforce as the primary platform alongside other cutting-edge platforms such as SnapLogic for integrations, Conga/Apttus CPQ or equivalent, CLM, AWS as PaaS, the Coveo search platform, and Okta for SSO.

About The Role
The Sr Associate Quality Assurance Engineer will develop, modify, and execute software test plans, automated scripts, and programs for testing. This role involves debugging software products through systematic tests to ensure and maintain quality standards for Workday’s products. The Quality Assurance Engineer will also ensure that system tests are effectively completed, documented, and resolved.

Key Responsibilities
- Design and implement automated and manual test cases for Salesforce applications and related systems.
- Develop, maintain, and execute functional, regression, and integration test cases for Salesforce applications.
- Create and run automation test scripts using tools such as Selenium, Playwright, Cypress, Tosca, or Provar.
- Conduct API testing using Postman, Swagger, or similar tools, validating request/response behavior across integrations.
- Actively participate in Agile ceremonies, sprint planning, and story refinement, contributing QA insights early in the development cycle.
- Document test plans, test cases, and test data sets, ensuring clear traceability to requirements.
- Identify, log, and track defects, perform root cause analysis (RCA), and collaborate with cross-functional teams to revalidate fixes.
- Support test optimization efforts through reusable components and automation improvements.
- Apply test design techniques (including positive, negative, and edge-case coverage) to improve overall quality outcomes.
- Validate Salesforce workflows, custom objects, and system integrations across different environments.
- Collaborate with developers and product teams to identify gaps, clarify requirements, and ensure robust test coverage.

About You
Basic Qualifications
- Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field from a reputed university.
- 3-5 years of experience in software quality assurance, with a balance of both test automation and manual testing.
- Hands-on experience working in an Agile environment, actively contributing to sprint planning and test-driven development.
- Strong hands-on automation experience with programming skills.

Other Qualifications
- Experience with Selenium WebDriver and automation test suites at scale using tools like JUnit, Bamboo, Silk Test, Selenium, or Cypress.
- Preferred certifications: Tosca certification (TA1 or equivalent) or a Salesforce certification such as Administrator, Marketing Cloud Associate or Administrator, CPQ Specialist, Business Analyst, or Platform App Builder.
- Familiarity with XSLT, REST APIs, and web services.
- Experience with end-to-end testing, including test planning, execution, UAT, and regression testing.
- Strong communication skills with the ability to collaborate effectively across teams.
- Self-motivated, enthusiastic, and curious, with a proactive approach to problem-solving and continuous learning.
- Strong analytical and problem-solving skills with a keen eye for detail.

Our Approach to Flexible Work
With Flex Work, we’re combining the best of both worlds: in-person time and remote. Our approach enables our teams to deepen connections, maintain a strong community, and do their best work. We know that flexibility can take shape in many ways, so rather than a number of required days in-office each week, we simply spend at least half (50%) of our time each quarter in the office or in the field with our customers, prospects, and partners (depending on role). This means you'll have the freedom to create a flexible schedule that caters to your business, team, and personal needs, while being intentional to make the most of time spent together. Those in our remote "home office" roles also have the opportunity to come together in our offices for important moments that matter. Are you being referred to one of our roles? If so, ask your connection at Workday about our Employee Referral process!

Posted 1 week ago

Apply

5.0 years

0 Lacs

India

Remote


About us
Founded in 2008, CitNOW is an innovative, enterprise-level software product suite that allows automotive dealerships globally to sell more vehicles and parts more profitably. CitNOW’s app-based platform provides a secure, brand-compliant solution for dealers to build trust, transparency and long-lasting relationships. CitNOW Group was formed in 2021 to unite a portfolio of 12 global software companies leveraging innovation to aid retailers and manufacturers in delivering an outstanding customer experience. We have over 300 employees worldwide who all contribute to our vision to provide market-leading automotive solutions to drive efficiencies, seamlessly transforming every customer moment. The CitNOW Group is no ordinary technology company; we live a series of One Team values, and this guiding principle forms the foundation of CitNOW Group’s award-winning, collaborative and inclusive culture. Recognised recently within the Top 25 Best Mid-Sized Companies to work for within the UK, we pride ourselves on being a great place to work.

About the role
We are seeking a highly skilled Technical Business Analyst to join our team, working with a leading Automotive Showroom Enquiry Management System. The ideal candidate will bring extensive experience in business analysis, a deep understanding of the automotive sector, and the ability to manage and deliver complex Agile projects from inception to completion. This role involves working closely with stakeholders to analyse business needs, create critical Agile artefacts, and support delivery throughout the project lifecycle. Automotive industry expertise is preferred. Experience working with public cloud-based enterprise data transform-and-load products and services (e.g., MuleSoft, SnapLogic, Boomi, Kafka) is preferred but not required. Proficiency in JIRA is also highly advantageous.

Key responsibilities include:
- Lead analysis projects: Drive and manage complex analysis initiatives, ensuring alignment with business objectives and technical requirements. Engage with internal and external stakeholders to gather, refine, and validate requirements. Analyse relational databases and web APIs.
- Agile artefact creation: Produce all standard Agile delivery artefacts, including user stories, process flows, wireframes, acceptance criteria, and business rules documentation.
- Project lifecycle support: Collaborate with delivery teams to support the full project lifecycle, from requirements gathering and planning to testing, deployment, and post-implementation reviews.
- Facilitate Agile ceremonies: Lead estimation sessions, sprint planning, backlog refinement, stand-ups, and retrospectives, ensuring a smooth and transparent delivery process.
- Business process mapping: Conduct detailed analysis and documentation of automotive business processes, with a focus on areas such as DMS and integration workflows. Identify opportunities for process improvement and optimisation.
- Integration analysis: Analyse and document system integration requirements and dependencies between Dealerweb Showroom and other platforms, ensuring seamless data flow and system performance. Work with the engineering teams to understand the technical aspects of the system to be considered when specifying requirements.
- Stakeholder engagement: Act as a bridge between business and technical teams, facilitating communication and ensuring clarity in objectives and outcomes.

We are looking for:
- Minimum of 5 years’ experience in business analysis in complex, Agile environments.
- Strong understanding of APIs and API authentication methods.
- Experience writing and maintaining API documentation for both internal and external stakeholders.
- In-depth knowledge of Agile methodologies and hands-on experience in Agile project delivery.
- Excellent stakeholder management and communication skills.
- Ability to work independently and collaboratively in a remote working environment.

In addition to a competitive salary, our benefits package is second to none. Employee wellbeing is at the heart of our people strategy, with a number of innovative wellness initiatives such as flexi-time, where employees can vary their start and finish times within our core business hours and/or extend their lunch break by up to 2 hours per day. Employees also benefit from an additional two half days of paid leave per year to focus on their personal wellbeing. We recognise the development of our people is vital to the ongoing success of the business and proudly promote a culture of continuous learning and improvement, along with opportunities to develop and progress a successful career with us. The CitNOW Group is an equal opportunities employer that celebrates diversity across our international teams. We are passionate about creating an inclusive workplace where everyone’s individuality is valued. View our candidate privacy policy here - CitNOW-Group-Candidate-Privacy-Policy.pdf (citnowgroup.com)
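The requirements above mention API authentication methods and API documentation. As a minimal sketch of one common method (Bearer tokens; the URL and token below are placeholders, not CitNOW endpoints), attaching an OAuth-style credential to a request using only the Python standard library might look like:

```python
import urllib.request

def build_authenticated_request(url: str, token: str) -> urllib.request.Request:
    """Build a GET request carrying an OAuth-style Bearer token header.

    No network call happens here; the Request object is only constructed,
    which is enough to show where the credential travels.
    """
    req = urllib.request.Request(url)
    req.add_header("Authorization", f"Bearer {token}")
    req.add_header("Accept", "application/json")
    return req

# Placeholder endpoint and token for illustration only.
req = build_authenticated_request("https://api.example.com/v1/enquiries", "TOKEN123")
```

Documenting exactly this (which header carries the credential, and what content type the client accepts) is the kind of internal/external API documentation the role describes.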

Posted 1 week ago

Apply

6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


6+ years of experience in data engineering. Strong knowledge of SQL. Expertise in Snowflake, DBT, and Python (minimum 2+ years). SnapLogic or Fivetran tool knowledge is an added advantage. The following skill is mandatory for an Astrid Data Engineer: knowledge of AWS Cloud (AWS S3 & Lambda).

Posted 1 week ago

Apply

110.0 years

0 Lacs

India

On-site


Our Company
We’re Hitachi Digital, a company at the forefront of digital transformation and the fastest-growing division of Hitachi Group. We’re crucial to the company’s strategy and ambition to become a premier global player in the massive and fast-moving digital transformation market. Our group companies, including GlobalLogic, Hitachi Digital Services, Hitachi Vantara and more, offer comprehensive services that span the entire digital lifecycle, from initial idea to full-scale operation and the infrastructure to run it on. Hitachi Digital represents One Hitachi, integrating domain knowledge and digital capabilities, and harnessing the power of the entire portfolio of services, technologies, and partnerships to accelerate synergy creation and make real-world impact for our customers and society as a whole. Imagine the sheer breadth of talent it takes to unleash a digital future. We don’t expect you to ‘fit’ every requirement – your life experience, character, perspective, and passion for achieving great things in the world are equally important to us.

The Team
We are seeking a highly skilled and experienced Senior API Integration Developer to join our team. The ideal candidate will have extensive experience in developing and managing API integrations, ensuring seamless data flow between systems. This role will be crucial in supporting our data fabric and data mesh hybrid data foundation.

The Role
- Design, develop, and maintain API integrations to support business applications and data systems.
- Collaborate with cross-functional teams to understand integration requirements and provide technical solutions.
- Ensure the security, scalability, and performance of API integrations.
- Troubleshoot and resolve integration issues in a timely manner.
- Develop and maintain documentation for API integrations and processes; experience with Swagger-style API documentation is a must.
- Stay updated with the latest API integration technologies and best practices.
- Provide training and support to junior developers on API integration techniques.
- Work closely with the Data Integration Architect to ensure alignment with the overall data architecture.

What You’ll Bring
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Extensive experience in API development and integration.
- Proficiency in API management tools such as Postman, Swagger, or Apigee.
- Strong knowledge of RESTful and SOAP APIs.
- Experience with programming languages such as Java, Python, or JavaScript.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.
- Ability to work independently and as part of a team.
- Prior experience with the SnapLogic integration platform is a plus.
- Prior knowledge of creating APIs on a graph database is required.

Preferred Skills
- Experience with cloud-based integration platforms such as AWS, GCP, or Azure.
- Familiarity with microservices architecture.
- Knowledge of data integration tools such as MuleSoft or Dell Boomi.
- Certification in API development or integration.

Why Join Us
- Be part of a forward-thinking company that values innovation and creativity.
- Work with a talented and dynamic team on cutting-edge projects.
- Competitive salary and benefits package.
- Opportunities for professional growth and development.

About Us
We’re a global, 1,000-strong, diverse team of professional experts, promoting and delivering Social Innovation through our One Hitachi initiative (OT x IT x Product) and working on projects that have a real-world impact. We’re curious, passionate and empowered, blending our 110-year legacy of innovation with shaping our future. Here you’re not just another employee; you’re part of a tradition of excellence and a community working towards creating a digital future.

Championing Diversity, Equity, and Inclusion
Diversity, equity, and inclusion (DEI) are integral to our culture and identity. Diverse thinking, a commitment to allyship, and a culture of empowerment help us achieve powerful results. We want you to be you, with all the ideas, lived experience, and fresh perspective that brings. We support your uniqueness and encourage people from all backgrounds to apply and realize their full potential as part of our team.

How We Look After You
We help take care of your today and tomorrow with industry-leading benefits, support, and services that look after your holistic health and wellbeing. We’re also champions of life balance and offer flexible arrangements that work for you (role and location dependent). We’re always looking for new ways of working that bring out our best, which leads to unexpected ideas. Here you’ll experience a sense of belonging, and discover autonomy, freedom, and ownership as you work alongside talented people you enjoy sharing knowledge with. We’re proud to be an equal opportunity employer and welcome all applicants for employment without attention to race, colour, religion, sex, sexual orientation, gender identity, national origin, veteran status, age, disability status or any other protected characteristic. Should you need reasonable accommodations during the recruitment process, please let us know so that we can do our best to set you up for success.

Posted 1 week ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Summary
We are seeking a skilled and proactive Data Engineer with a strong background in ETL development and a focus on integrating data quality frameworks. In this role, you will be responsible for designing, developing, and maintaining ETL pipelines while ensuring data quality is embedded throughout the process. You will play a crucial role in building robust and reliable data pipelines that deliver high-quality data to our data warehouse and other systems.

Responsibilities
- Design, develop, and implement ETL processes to extract data from various source systems, transform it according to business requirements, and load it into target systems (e.g., data warehouse, data lake).
- Implement data validation and error handling within ETL pipelines.
- Build and maintain scalable, reliable, and efficient data pipelines.
- Design and implement data quality checks, validations, and transformations within ETL processes.
- Automate data quality monitoring, alerting, and reporting within ETL pipelines.
- Develop and implement data quality rules and standards within ETL processes.
- Integrate data from diverse sources, including databases, APIs, flat files, and cloud-based systems.
- Utilize ETL tools and technologies (e.g., SnapLogic, Informatica PowerCenter, Talend, AWS Glue, Apache Airflow, Azure Data Factory).
- Write SQL queries to extract, transform, load, and validate data.
- Use scripting languages (e.g., Python) to automate ETL processes, data quality checks, and data transformations.
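The responsibilities above centre on embedding data quality rules inside the transform step of an ETL pipeline. A minimal sketch of that pattern in Python (the source records, field names, and rules are invented examples, not a real pipeline):

```python
# Each rule either rejects a row with a reason or lets it through to be
# normalized for the target schema, so bad data never reaches the warehouse.
def transform_with_quality_checks(rows):
    """Validate and transform raw rows; route failing rows to a reject list."""
    loaded, rejected = [], []
    for row in rows:
        # Quality rule 1: required fields must be present and non-empty.
        if not row.get("customer_id") or "amount" not in row:
            rejected.append((row, "missing required field"))
            continue
        # Quality rule 2: amounts must be numeric and non-negative.
        try:
            amount = float(row["amount"])
        except (TypeError, ValueError):
            rejected.append((row, "non-numeric amount"))
            continue
        if amount < 0:
            rejected.append((row, "negative amount"))
            continue
        # Transform: normalize the record for the target schema.
        loaded.append({"customer_id": str(row["customer_id"]).strip(),
                       "amount": round(amount, 2)})
    return loaded, rejected

loaded, rejected = transform_with_quality_checks([
    {"customer_id": " C1 ", "amount": "10.5"},   # valid, gets normalized
    {"customer_id": "", "amount": "3"},          # fails rule 1
    {"customer_id": "C2", "amount": "-4"},       # fails rule 2
])
```

In a production tool such as SnapLogic or Airflow the same idea appears as a validation stage between extract and load, with the reject list feeding the monitoring and alerting the posting mentions.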

Posted 1 week ago

Apply

6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


6 years of experience in data engineering. Expertise in Snowflake, DBT, and Python (minimum 2 years). Knowledge of AWS Cloud (AWS S3 & Lambda). SnapLogic or Fivetran tool knowledge is an added advantage. Knowledge of streaming data pipelines (Kafka, HiveMQ) will be helpful. Willing to learn and upskill in the Cognite data platform.

Posted 1 week ago

Apply

3.0 years

0 Lacs

Haryana

On-site

Haryana Job ID 30180893 Job Category Digital Technology Country: India Location: Capital Cyberscape, 2nd Floor, Ullahwas, Sector 59, Gurugram, Haryana 122102 Role: Data Engineer Location: Gurgaon Full/ Part-time: Full Time Build a career with confidence. Carrier Global Corporation, global leader in intelligent climate and energy solutions is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do.: Summary Established Data Science & Analytics professional. Creating data mining architectures/models/protocols, statistical reporting, and data analysis methodologies to identify trends in large data sets About the role: Work with data to solve business problems, building and maintaining the infrastructure to answer questions and improve processes Help streamline our data science workflows, adding value to our product offerings and building out the customer lifecycle and retention models Work closely with the data science and business intelligence teams to develop data models and pipelines for research, reporting, and machine learning Be an advocate for best practices and continued learning Key Responsibilities: Expert coding proficiency On Snowflake Exposure to SnowSQL, Snowpipe, Role based access controls, ETL / ELT tools like Nifi,Snaplogic,DBT Familiar with building data pipelines that leverage the full power and best practices of Snowflake as well as how to integrate common technologies that work with Snowflake (code CICD, monitoring, orchestration, data quality, monitoring) Designing Data Ingestion and Orchestration Pipelines using nifi, AWS, kafka, spark, control M Establish strategies for data extraction, ingestion, transformation, automation, and consumption. 
Role Responsibilities:
- Experience in data lake concepts with structured, semi-structured and unstructured data
- Experience in strategies for data testing, data quality, code quality and code coverage
- Hands-on expertise with Snowflake, preferably with SnowPro Core Certification
- Develop a data model/architecture providing an integrated data architecture that enables business services with strict quality management and provides the basis for future knowledge management processes
- Act as the interface between business and development teams to guide the solution end-to-end
- Define tools used for design specifications, data modelling and data management capabilities, with exploration into standard tools
- Good understanding of data technologies including RDBMS and NoSQL databases

Requirements:
- A minimum of 3 years of prior relevant experience
- Strong exposure to data modelling, data access patterns and SQL
- Knowledge of data storage fundamentals and networking

Good to Have:
- Exposure to AWS tools/services
- Ability to conduct testing at different levels and stages of the project
- Knowledge of scripting languages like Java and Python

Education: Bachelor's degree in computer systems, Information Technology, Analytics, or a related business area.

Benefits: We are committed to offering competitive benefits programs for all of our employees and enhancing our programs when necessary.
- Have peace of mind and body with our health insurance
- Drive forward your career through professional development opportunities
- Achieve your personal goals with our Employee Assistance Programme

Our commitment to you: Our greatest assets are the expertise, creativity and passion of our employees. We strive to provide a great place to work that attracts, develops and retains the best talent, promotes employee engagement, fosters teamwork and ultimately drives innovation for the benefit of our customers.
We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine to growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback and always challenging ourselves to do better. This is The Carrier Way. Join us and make a difference. Now! Carrier is An Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age or any other federally protected class.

Posted 1 week ago

Apply

6.0 years

0 Lacs

Haryana

On-site

Haryana Job ID 30180894 Job Category Digital Technology Country: India Location: Capital Cyberscape, 2nd Floor, Ullahwas, Sector 59, Gurugram, Haryana 122102
Role: Data Engineer
Location: Gurgaon
Full/Part-time: Full Time

Build a career with confidence. Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do.

Summary: Established Data Science & Analytics professional, creating data mining architectures/models/protocols, statistical reporting, and data analysis methodologies to identify trends in large data sets.

About the role:
- Work with data to solve business problems, building and maintaining the infrastructure to answer questions and improve processes
- Help streamline our data science workflows, adding value to our product offerings and building out the customer lifecycle and retention models
- Work closely with the data science and business intelligence teams to develop data models and pipelines for research, reporting, and machine learning
- Be an advocate for best practices and continued learning

Key Responsibilities:
- Expert coding proficiency on Snowflake
- Exposure to SnowSQL, Snowpipe, role-based access controls, and ETL/ELT tools such as NiFi, SnapLogic and DBT
- Familiarity with building data pipelines that leverage the full power and best practices of Snowflake, as well as integrating common technologies that work with Snowflake (code CI/CD, orchestration, data quality, monitoring)
- Design data ingestion and orchestration pipelines using NiFi, AWS, Kafka, Spark and Control-M
- Establish strategies for data extraction, ingestion, transformation, automation, and consumption.
Role Responsibilities:
- Experience in data lake concepts with structured, semi-structured and unstructured data
- Experience in strategies for data testing, data quality, code quality and code coverage
- Hands-on expertise with Snowflake, preferably with SnowPro Core Certification
- Develop a data model/architecture providing an integrated data architecture that enables business services with strict quality management and provides the basis for future knowledge management processes
- Act as the interface between business and development teams to guide the solution end-to-end
- Define tools used for design specifications, data modelling and data management capabilities, with exploration into standard tools
- Good understanding of data technologies including RDBMS and NoSQL databases

Requirements:
- A minimum of 6 years of prior relevant experience
- Strong exposure to data modelling, data access patterns and SQL
- Knowledge of data storage fundamentals and networking

Good to Have:
- Exposure to AWS tools/services
- Ability to conduct testing at different levels and stages of the project
- Knowledge of scripting languages like Java and Python

Education: Bachelor's degree in computer systems, Information Technology, Analytics, or a related business area.

Benefits: We are committed to offering competitive benefits programs for all of our employees and enhancing our programs when necessary.
- Have peace of mind and body with our health insurance
- Drive forward your career through professional development opportunities
- Achieve your personal goals with our Employee Assistance Programme

Our commitment to you: Our greatest assets are the expertise, creativity and passion of our employees. We strive to provide a great place to work that attracts, develops and retains the best talent, promotes employee engagement, fosters teamwork and ultimately drives innovation for the benefit of our customers.
We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine to growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback and always challenging ourselves to do better. This is The Carrier Way. Join us and make a difference. Now! Carrier is An Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age or any other federally protected class.

Posted 1 week ago

Apply

15.0 years

0 Lacs

Bengaluru

On-site

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: SnapLogic
Good to have skills: NA
Minimum 2 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will collaborate with teams to ensure seamless integration and functionality.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Develop and implement efficient SnapLogic solutions.
- Collaborate with cross-functional teams to integrate applications.
- Troubleshoot and resolve application issues effectively.
- Stay updated with the latest SnapLogic trends and technologies.
- Provide technical guidance and support to junior team members.

Professional & Technical Skills:
- Must-have skills: Proficiency in SnapLogic.
- Strong understanding of ETL processes and data integration.
- Experience in designing and implementing complex data pipelines.
- Knowledge of API integration and web services.
- Hands-on experience with cloud-based integration platforms.

Additional Information:
- The candidate should have a minimum of 2 years of experience in SnapLogic.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

0 years

0 Lacs

Andhra Pradesh, India

On-site

Linkedin logo

Strong working knowledge of modern programming languages, ETL/data integration tools (preferably SnapLogic) and an understanding of cloud concepts: SSL/TLS, SQL, REST, JDBC, JavaScript, JSON.
- Strong hands-on experience in SnapLogic design/development.
- Good working experience using various Snaps for JDBC, SAP, Files, REST, SOAP, etc.
- Good to have the ability to build complex mappings with JSON path expressions, flat files and Python scripting.
- Good to have experience in Groundplex and Cloudplex integrations.
- Should be able to deliver the project by leading a team of 6-8 members.
- Should have experience in integration projects with heterogeneous landscapes.
- Experience in one or more RDBMS (Oracle, DB2, SQL Server, PostgreSQL and Redshift).
- Real-time experience working in OLAP & OLTP database models (dimensional models).
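The JSON path expression mappings mentioned above can be illustrated with a minimal plain-Python sketch. This is not SnapLogic itself, only the general idea of mapping a nested source record to flat target fields via dotted paths; the record, paths and target field names here are hypothetical:

```python
# Minimal illustration of JSON-path-style field mapping, as used in
# integration pipelines. All record/path/field names are hypothetical.

def get_path(record, path, default=None):
    """Walk a nested dict/list with a dotted path like 'order.items.0.sku'."""
    current = record
    for part in path.split("."):
        if isinstance(current, list):
            try:
                current = current[int(part)]
            except (ValueError, IndexError):
                return default
        elif isinstance(current, dict):
            current = current.get(part, default)
        else:
            return default
    return current

source = {"order": {"id": "A-100", "items": [{"sku": "X1", "qty": 2}]}}

# Target mapping: target field name -> dotted source path
mapping = {"order_id": "order.id", "first_sku": "order.items.0.sku"}
target = {field: get_path(source, path) for field, path in mapping.items()}
print(target)  # {'order_id': 'A-100', 'first_sku': 'X1'}
```

In a real pipeline the same mapping table would be applied per record as data flows between Snaps; the sketch only shows the path-resolution step.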

Posted 1 week ago

Apply

15.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Job Title: Analytics and Reporting (A&R) – Biometrics DevOps Lead
Career Level: E

Introduction to role: Are you prepared to apply your expertise in biometrics and DevOps to make a real impact? As an A&R Biometrics DevOps Lead, you will be a key member of the A&R Product IT DevOps team, managing core A&R applications such as Entimice and SAS GRID. Collaborate with Platform Managers, Product Managers/Owners, Senior Platform Engineers, Release Managers, and Solution/Integration Architects to establish standard processes in software development, business analysis, solution design, enterprise software integration, and project management within the BizDevOps IT Platform Delivery Model.

Accountabilities:
- Collaborate with vendors and internal collaborators to define requirements and develop implementation plans.
- Monitor system change requests and ensure alignment to IT standards.
- Serve as Technical Subject Matter Expert (SME) for SAS-related applications.
- Provide guidance and assist in the preparation of system-related specifications and documentation.
- Maintain day-to-day application systems, identifying and solving issues.
- Ensure all production changes align with life-cycle methodology and risk guidelines.
- Work on application enhancements and upgrades as needed.
- Liaise with internal teams/vendors to address application issues.
- Communicate effectively with users during planned/unplanned outages.
- Follow good documentation practices by creating and publishing Knowledge Base Articles (KBAs).
- Create technical backlogs/Stories/Epics in JIRA based on product priority.

Essential Skills/Experience - Biometrics (A&R) Domain/Technology:
- Bachelor's or master's degree in computer science or a life-science-related field, with IT/CRO/Pharma experience totalling 15 years.
- Industry experience working on or implementing solutions based on CDISC clinical reporting standards (SDTM, ADaM, TLF reporting).
- Minimum 7-8 years' working experience in SAS GRID-related administration.
- Accredited certification in SAS Administration from the vendor.
- Good understanding of SAS programming & shell scripting, reporting tools, and infrastructure.
- Ensure system security and control procedures are implemented and maintained.
- Maintain configuration specification documentation for functional and integration configuration.
- Work with software vendors on product requirements and issues related to the platform.
- Knowledge of client-server networking and database management.
- Administration background in SAS Viya 3.5/4.0 environments.
- Experience with SAS Visual Analytics & Power BI highly preferred.
- Familiarity with shell scripting & Groovy scripting.
- Experience in R and Python is an added advantage.
- End-to-end knowledge of clinical trial development processes and the associated system landscape.
- Solid grasp of common industry-standard business practices related to clinical trials.
- Strong knowledge of ICH/G guidelines, 21 CFR Part 11, clinical trial methodology, and software development lifecycle activities.
- Knowledge of support processes like Incident Management, Problem Management and Change Management.
- Knowledge of or experience with JMP Clinical, JReview, StatXact, CR Toolkit, Pinnacle 21.

IT Engineering:
- Expertise and an engineering attitude to help design and implement Clinical Data Solutions adhering to the Products on Platform strategy.
- Demonstrable ability to handle diverse collaborators and ensure high-satisfaction delivery.
- Ability to work independently in a dynamic environment.
- Good communication and interpersonal skills to lead customers in urgent situations.
- Experience in GxP-validated systems implementation and maintenance of applications.
- Knowledge of identity management solutions based on Azure, Okta, OAuth, Ping Federate/AD technologies.
- Understanding of database concepts for optimizing reporting, data mapping, and programming.
- Experience in developing interfaces/integrations between on-premises and SaaS platform-enabled products using APIs, web services (MuleSoft), and ETL (SnapLogic) tech enablers.
- Knowledge of using/creating build tools and CI/CD (Maven, Ant, Gradle, Sonar).
- Experience with cloud platforms such as AWS and Azure.
- Experience working with JIRA, Confluence, Bitbucket and automated testing tools.
- Experience working in agile teams using methodologies such as Scrum and SAFe, leading a team of 5+, and working in an Agile DevOps model.
- Experience programming in Java or other object-oriented programming languages.
- Experience delivering or developing workshop material, training, gap analyses, or requirements-gathering sessions with business and system collaborators.
- Certification in an IT delivery framework, Scrum Master, or DevOps Lead role.
- Participate in business requirement gathering and design activities with business & IT collaborators as part of a scrum team.
- Perform delivery activities through the Design/Build/Test/Deploy phases for regular releases on the A&R platform.

When we put unexpected teams in the same room, we fuel bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and daring world.

At AstraZeneca, we apply technology to impact patients' lives by developing life-changing medicines. We are a purpose-led global organization that pushes the boundaries of science. Our work directly impacts patients by redefining our ability to develop life-changing medicines. We empower our business to perform at its peak by combining groundbreaking science with leading digital technology platforms and data.

Join us at a crucial stage of our journey to become a digital and data-led enterprise. Here you can innovate, take ownership, experiment with groundbreaking technology, and solve challenges that have never been addressed before. Ready to make a difference? Apply now!

Posted 1 week ago

Apply

0 years

0 Lacs

Visakhapatnam, Andhra Pradesh, India

On-site

Linkedin logo

Company Description: Ritefit AI Pvt. Ltd. is a dynamic Talent and Technology Enablement firm dedicated to empowering organizations with AI-driven solutions and comprehensive human capital management services. Founded in 2025 and based in Visakhapatnam, Andhra Pradesh, the company leverages the extensive industry expertise of its founders to optimize workforce management and enhance operational efficiency. Ritefit AI Pvt. Ltd. specializes in HCM implementations, non-HCM consulting, and advanced analytics, offering tailored solutions to streamline HR processes and enable data-driven decision-making. Committed to excellence, the company provides exceptional staff augmentation, project management capabilities, and ongoing support to ensure clients maximize their technology investments.

Role Description: This is a full-time on-site role for a SnapLogic Developer. The SnapLogic Developer will be responsible for designing, developing, and maintaining high-quality integration solutions using SnapLogic. Day-to-day tasks include creating pipelines, debugging integration issues, and collaborating with cross-functional teams to assess business requirements. The developer will work on optimizing performance, ensuring data security, and providing technical support for SnapLogic solutions. This role is located in Visakhapatnam.

Qualifications:
- Computer science background, preferably with a degree in Computer Science or a related field
- Strong skills in back-end web development and software development
- Proficiency in programming and object-oriented programming (OOP)
- Experience with integration platforms, preferably SnapLogic
- Excellent problem-solving and analytical skills
- Strong communication and team collaboration skills
- Familiarity with data security best practices
- Previous experience in the technology or AI-driven solutions industry is a plus

Posted 1 week ago

Apply

8.0 - 13.0 years

15 - 30 Lacs

Bengaluru

Remote

Naukri logo

About us: Cloud Raptor is one of Australia's fastest-growing and most dynamic cloud service advisory partners. With global Big 4 consulting DNA, our key focus is to bring cool, world-class cloud solutions and a customer experience that is second to none. Experience and people are at the heart of everything we do! A specialised technology services consultancy working across Australia, APAC, EMEA and NA, we are headquartered in Australia, with our offshore development centre in Bangalore. Cloud Raptor is a Salesforce, MuleSoft, Freshworks, and ServiceNow partner: a multi-technology consultancy specializing in Cloud, Data, Cyber security, programming, SaaS, Infra, testing, and project services. We're currently working with several major brands, both locally and internationally, to help them improve the way they work by leveraging the power, and pushing the boundaries, of what cloud technology can achieve. Not only do we provide amazing cloud services capability, but we are also extremely focused on providing a work environment that is supportive, nurturing, safe and above all equal. Everyone deserves to be recognised for their hard work and given an opportunity to progress in their career. If you're up to the challenge and don't mind hanging out with some cool people while providing awesome solutions, Cloud Raptor is the place for you!

Job Description: We are seeking a dynamic and experienced SnapLogic Solution Designer to join our team. The ideal candidate will have a strong background in ETL development, data integration, and cloud services. As a SnapLogic Solution Designer, you will be responsible for designing, developing, and implementing integration solutions using the SnapLogic platform.

Key Responsibilities:
- Design and develop integration solutions using SnapLogic to meet business requirements.
- Collaborate with stakeholders to understand business needs and translate them into technical solutions.
- Develop, document, and test ETL interfaces and data pipelines.
- Optimize and troubleshoot existing SnapLogic integrations.
- Provide technical expertise and support for SnapLogic-related projects.
- Ensure best practices in ETL development and data integration.
- Collaborate with cross-functional teams to deliver high-quality solutions.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 5 years of experience in ETL development and data integration.
- Proficiency in SnapLogic and other ETL tools.
- Experience with cloud services such as AWS, Azure, or Google Cloud.
- Strong SQL and PL/SQL skills.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
- Ability to work independently and as part of a team.

Preferred Qualifications:
- Experience with other integration tools such as Informatica or Talend.
- Knowledge of data warehousing concepts and best practices.
- Certification in SnapLogic or related technologies.

Posted 1 week ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Linkedin logo

IaC: Deploy scalable, secure, and high-performing Snowflake environments, in line with data governance and security requirements, using Terraform and other automation scripts. Automate infrastructure provisioning, testing, and deployment for seamless operations.

Strong SQL & DBT Expertise: Experience building and maintaining scalable data models in DBT. Proficient in modular SQL, Jinja templating, testing strategies, and DBT best practices.

Data Warehouse Proficiency: Hands-on experience with Snowflake, including dimensional and data vault modeling (star/snowflake schemas), performance optimization and query tuning, and role-based access and security management.

Data Pipeline & Integration Tools: Experience with Kafka (or similar event streaming tools) for ingesting real-time data. Familiarity with SnapLogic for ETL/ELT workflow design, orchestration, and monitoring.

Version Control & Automation: Proficient in Git and GitHub for code versioning and collaboration. Experience with GitHub Actions or other CI/CD tools to automate DBT model testing, deployment, and documentation updates.

Data Quality & Governance: Strong understanding of data validation, testing (e.g., dbt tests), and lineage tracking. Emphasis on maintaining data trust across pipelines and models.

Stakeholder Management: Partner with business and technical stakeholders to define data needs and deliver insights. Ability to explain complex data concepts in clear, non-technical terms.

Documentation & Communication: Maintain clear documentation for models, metrics, and data transformations (using DBT docs or similar). Strong verbal and written communication skills; able to work cross-functionally across teams.

Problem-Solving & Ownership: Proactive in identifying and resolving data gaps or issues. Self-starter with a continuous improvement mindset and a focus on delivering business value through data.

Posted 1 week ago

Apply

8.0 years

0 Lacs

Andhra Pradesh, India

On-site

Linkedin logo

At PwC, we connect people with diverse backgrounds and skill sets to solve important problems together and lead with purpose—for our clients, our communities and for the world at large. It is no surprise therefore that 429 of 500 Fortune global companies engage with PwC. Acceleration Centers (ACs) are PwC’s diverse, global talent hubs focused on enabling growth for the organization and value creation for our clients. The PwC Advisory Acceleration Center in Bangalore is part of our Advisory business in the US. The team is focused on developing a broader portfolio with solutions for Risk Consulting, Management Consulting, Technology Consulting, Strategy Consulting, Forensics as well as vertical specific solutions. PwC's high-performance culture is based on passion for excellence with focus on diversity and inclusion. You will collaborate with and receive support from a network of people to achieve your goals. We will also provide you with global leadership development frameworks and the latest in digital technologies to learn and excel in your career. At the core of our firm's philosophy is a simple construct: We care for our people. Globally PwC is ranked the 3rd most attractive employer according to Universum. Our commitment to Responsible Business Leadership, Diversity & Inclusion, work-life flexibility, career coaching and learning & development makes our firm one of the best places to work, learn and excel. PwC US ADVISORY - MANAGEMENT CONSULTING Our Management Consulting team works with our global clients to design and implement growth, operational and customer focused strategies for sustainable competitive advantage. Our thought leadership and unparalleled experience help clients turn formidable challenges into market advantage across the value chain and around the globe. 
Additionally, our extensive expertise in various industries allows us to serve clients with consulting services focused on the most profitable elements of the value chain, to create scalable businesses that will deliver increased sustainable profits. Our Management Consultants work with the client and project teams to support global engagements from India through activities that are driven towards delivering results: conduct analyses and develop insights, prepare Excel models, analyze large sets of data, capture as-is processes, prepare work plans, and design to-be processes. In addition, consultants also support the project lead, work directly with client teams and facilitate meetings to enable decision making, organize and prepare recommendations on client issues, and participate actively in new business development, thought leadership and firm-building activities.

COMPETENCY OVERVIEW: OPERATIONS. The Operations team works with clients across industry verticals, supporting engagements in: Differentiated Supply Chain (Planning, Inventory and Logistics), Strategic Supply Management (Sourcing), Competitive Manufacturing, Innovation & Development (Product Strategy & Development, PLM, R&D Operations), and Capital Projects & Infrastructure.

Position Requirements - Knowledge Preferred: Candidates should demonstrate substantial experience and/or knowledge in any sector (experience in the automotive, aerospace & defense, health industries including pharmaceuticals and medical devices, industrial products, energy, chemicals, utilities, oil & gas, consumer markets, technology & telecom and retail sectors would be an added bonus) in the following areas.

General Sourcing / Procurement: Spend analysis, category savings assessment, procurement diagnostics, operating model design, procurement process design, end-to-end procurement transformation & modernization, design and execution of strategic sourcing initiatives including RFI/RFP (Request for Information / Request for Proposal) design,
development and analysis, negotiation strategy, supplier management, third-party lifecycle management, supplier risk management, contract management, M&A procurement synergy assessments, source-to-pay assessment, design & implementation, and category management with knowledge of various direct & indirect categories. Candidates with experience in setting up large-scale procurement COEs for clients would be a plus. Experience with Procure-to-Pay (P2P) platforms such as GEP SMART, Coupa, SAP Ariba/Fieldglass, Celonis and/or Ivalua as an implementer and administrator; ERP knowledge (SAP / PeopleSoft / Oracle or others) is preferred.
- Ability to take responsibility for all technical phases of an implementation project, including architecture, design, development, test, customization, documentation and data migration solutions for business applications in SaaS and other ERP systems.
- Understanding of the fundamental principles of P2P, including spend and commodity taxonomies, requisitioning, PO generation, receiving, invoice processing & payment processes.
- Understanding of the Sourcing and Contracts modules, including GEP price books and the Supplier module, along with their integration to ERP (SAP S/4HANA, Oracle, etc.).
- Ability to configure & customize the fields in GEP transactional documents like POs, ASNs and invoices.
- Ability to design approval workflows in GEP transactional documents like POs, service confirmations and invoices.
- Understanding of GEP REST & Bulk APIs and their selection for integration with ERP systems like SAP S/4HANA and other edge systems in the procurement space.
- Functional understanding of the SAP Material Management module for creation and update of transactional documents (PRs, POs, GRs, invoices).
- Understanding of SAP S/4HANA APIs for creation and update of transactional documents (PRs, POs, GRs, invoices).
- Ability to gather technical requirements and conduct data mapping sessions to design the integrations between GEP, SAP and other third-party edge systems like Enverus, ENFOS, SailPoint, Horizon, On Call, Storm Force etc.
- Ability to work with middleware tools like GEP Click, IICS, SAP CI, SAP PO and Dell Boomi to implement middleware logic as part of the integrations between GEP SMART/QUANTUM and ERP systems.
- Ability to load master data and transactional data using the GEP Bulk APIs and interface utility tool.
- Ability to estimate the tasks and resources required to design, build, and test the integration design.
- Ability to work with clients during the hypercare phase to resolve integration issues between GEP and edge systems.
- Ability to provide expert advice to clients on integration API and flat-file best practices, delivering efficient and simple integration designs for easy maintenance of the system post go-live.

Skills Preferred:
- Delivering significant business results that utilize strategic and creative thinking, problem solving, and individual initiative.
- Leading project work streams, providing oversight, delegating work to aid efficiency, and working autonomously with minimal direction.
- Collaborating with leadership to deliver client engagements and associated outputs, supporting the team in day-to-day client management, project management, and the development of high-quality client deliverables.
- Building solid and collaborative relationships with team members, and taking initiative to build relationships with client stakeholders.
- Communicating effectively (written and verbal) across various situations and audiences.
- Managing and conducting or overseeing quantitative and qualitative benchmarking and primary and secondary analyses of large and complex data.
- Intermediate skills in developing integrations with web services, XML, JSON, SQL and/or other integration technologies are mandatory.
- Hands-on experience facilitating application integration architecture discussions and workshops.
- Proven experience in developing integrations using REST APIs, SOAP web services and other integration architectures.
- Hands-on experience working with middleware platforms such as GEP Click, IICS, SAP CI, SAP PO, Dell Boomi, MuleSoft, SnapLogic etc.
- Proven skills as a team member, team lead or project manager on at least three full life-cycle implementations.
- Functional knowledge of procurement business processes is preferred.
- Technical aptitude and comfort in understanding web-based applications, the SaaS, PaaS and IaaS cloud models, and integration concepts, as well as the application of these technologies in the enterprise environment.
- Highly organized, with the ability to manage multiple simultaneous projects.
- Excellent written and verbal communication skills; experience presenting a clear and concise technical narrative.

Educational Background: Bachelor's degree in computer science or equivalent preferred. A full-time master's degree/equivalent is a bonus.

Additional Information - Travel Requirements: Travel to client locations may be required as per project requirements.
Line of Service: Advisory Industry: Management Consulting Designation: Senior Associate Location: Bangalore, India Past Experience: 8 - 12 years of prior relevant work experience, including 3+ years of experience in an Integration/Lead Technical Architect capacity in an Agile/Hybrid development environment for digital transformation in Source-to-Pay programs or equivalent Preferred Work Experience: Experience in any of the following industries will be preferred: Hi-Tech, Consumer Electronics & Hardware, Software/IT, Semiconductors, Telecommunications

Posted 1 week ago

Apply

4.0 - 6.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Role: Sr. Analyst - Programmer Experience: 4 - 6 Years Location: Gurgaon Notice: Immediate Joiners Job Description - Essential Skills and Experience: 4-8 years' experience in development and strong analytical skills from working on data warehousing projects. Strong hands-on experience with advanced SQL queries, stored procedures, packages, and views on an RDBMS like Oracle. Develop and maintain data pipelines, ETL processes, and data transformations using an ETL tool like Informatica, SnapLogic, etc. The candidate should have hands-on experience with the Control-M job scheduler and Unix shell scripting. Hands-on experience with and understanding of CI/CD pipelines to automate code deployment into multiple environments using Jenkins. Optimize data workflows for performance and efficiency. Knowledge of Python, Java, or Kubernetes will be a plus. Create and maintain technical documentation, including system configurations and best practices. Ability to analyse complex problems in a structured manner and demonstrate multitasking capabilities. Flexible and approachable team worker. Ability to operate under pressure and deliver to demanding deadlines. Strong verbal and written communication skills.
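The posting above centres on scheduler-driven ETL (Control-M, shell scripting, Jenkins). A minimal Python sketch of the underlying pattern might look like this: each step logs and propagates failures, and the scheduler judges the job by its exit code. All step names and data here are invented for illustration, not taken from the posting.

```python
# Hedged sketch of a scheduler-friendly ETL entry point. Control-M and
# similar schedulers decide success/failure from the process exit code,
# so each step must propagate errors rather than swallow them.
import logging

logging.basicConfig(level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("etl")

def extract():
    # Placeholder for a real source query/read.
    return [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": "7.25"}]

def transform(rows):
    # Cast amounts to float; bad data raises and fails the job loudly.
    return [{**r, "amount": float(r["amount"])} for r in rows]

def load(rows):
    # Placeholder for a real target write.
    log.info("loaded %d rows", len(rows))
    return len(rows)

def run_job():
    """Run extract -> transform -> load; return a scheduler exit code."""
    try:
        loaded = load(transform(extract()))
        log.info("job OK (%d rows)", loaded)
        return 0
    except Exception:
        log.exception("job FAILED")
        return 1  # non-zero lets the scheduler mark the job NOTOK

exit_code = run_job()  # a real entry point would call sys.exit(exit_code)
```

A wrapper like this keeps the failure signal in one place, so the same script behaves identically whether Control-M, Jenkins, or a developer's shell invokes it.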

Posted 1 week ago

Apply

8.0 - 12.0 years

27 - 42 Lacs

Chennai

Work from Office

Job Summary We are seeking a highly skilled Sr. Developer with 8 to 12 years of experience to join our team. The ideal candidate will have expertise in Kubernetes, Java, JavaScript, and the SnapLogic Elastic Platform, along with domain experience in data models for Finance and Accounting. This hybrid role requires a proactive individual who can contribute to our projects and help achieve our company's goals. Responsibilities Develop and maintain high-quality software solutions using Java and JavaScript. Implement and manage containerized applications using Kubernetes. Utilize the SnapLogic Elastic Platform to integrate various data sources and applications. Design and optimize data models to support financial and accounting processes. Collaborate with cross-functional teams to gather and analyze requirements. Provide technical guidance and support to team members. Ensure code quality through code reviews and automated testing. Troubleshoot and resolve complex technical issues. Stay updated with the latest industry trends and technologies. Contribute to the continuous improvement of development processes. Document technical specifications and project requirements. Participate in agile development practices, including sprint planning and retrospectives. Communicate effectively with stakeholders to ensure project alignment. Qualifications Must have strong experience in Kubernetes for container orchestration. Must have proficiency in Java and JavaScript for software development. Must have experience with the SnapLogic Elastic Platform for data integration. Must have expertise in designing and optimizing data models. Must have domain knowledge in Finance and Accounting. Nice to have experience with hybrid work models. Nice to have excellent problem-solving and analytical skills. Nice to have strong communication and collaboration abilities. Certifications Required: Certified Kubernetes Administrator (CKA), Oracle Certified Professional Java SE, SnapLogic Certified Professional.

Posted 1 week ago

Apply

8.0 - 13.0 years

10 - 20 Lacs

Pune, Chennai, Bengaluru

Work from Office

Role & responsibilities Availability: Immediate Joiners Only Job Description: We are looking for an experienced SnapLogic Developer to join our team. The ideal candidate should have a strong background in designing and developing pipelines, along with performance optimization and fault tolerance techniques. Key Responsibilities: Design and develop new data pipelines using SnapLogic. Enhance and optimize existing pipelines to meet design and performance requirements. Handle configuration and routing design within SnapLogic. Implement strategies to work around the 10-request limit. Ensure fault tolerance, parallelism, and effective error handling in pipelines. Utilize SnapLogic debugging tools for issue resolution. Must-Have Skills: Deep expertise in SnapLogic pipeline development Strong understanding of pipeline configuration, routing, and error percolation Experience in implementing parallelism and fault tolerance Good problem-solving and debugging skills
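The "10-request limit" and parallelism requirements above are configured inside SnapLogic itself, but the underlying pattern is generic: cap in-flight requests at a fixed limit and retry transient failures with backoff. A hedged Python sketch of that pattern (the endpoint and all names are invented; this is not SnapLogic code):

```python
# Generic throttle-and-retry pattern: never more than MAX_IN_FLIGHT
# requests at once, each call retried with exponential backoff.
from concurrent.futures import ThreadPoolExecutor
import time

MAX_IN_FLIGHT = 10  # the request cap being worked around
MAX_RETRIES = 3

def call_endpoint(item):
    # Placeholder for a real HTTP call; returns a result per item.
    return {"item": item, "status": "ok"}

def call_with_retry(item):
    for attempt in range(1, MAX_RETRIES + 1):
        try:
            return call_endpoint(item)
        except Exception:
            if attempt == MAX_RETRIES:
                raise  # surface the error at the fault-tolerance boundary
            time.sleep(2 ** attempt * 0.1)  # simple exponential backoff

def process_all(items):
    # max_workers bounds concurrency, so at most 10 requests
    # are ever in flight at the same time.
    with ThreadPoolExecutor(max_workers=MAX_IN_FLIGHT) as pool:
        return list(pool.map(call_with_retry, items))

results = process_all(range(25))
```

In a SnapLogic pipeline the same effect comes from snap and pipeline settings (batching, pooling, error views) rather than code, but the design trade-off is identical: bounded concurrency plus explicit retry keeps the downstream system safe without serializing the whole workload.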

Posted 1 week ago

Apply

15.0 - 25.0 years

17 - 27 Lacs

Mumbai

Work from Office

Project Role: Integration Engineer Project Role Description: Provide consultative Business and System Integration services to help clients implement effective solutions. Understand and translate customer needs into business and technology solutions. Drive discussions and consult on transformation, the customer journey, and functional/application designs, and ensure technology and business solutions represent business requirements. Must have skills: SnapLogic Good to have skills: NA Minimum 15 year(s) of experience is required Educational Qualification: 15 years of full-time education Summary: As an Integration Engineer, you will provide consultative Business and System Integration services to help clients implement effective solutions. You will understand and translate customer needs into business and technology solutions, drive discussions and consult on transformation, the customer journey, and functional/application designs, and ensure technology and business solutions represent business requirements. Your typical day will involve collaborating with clients, analyzing their needs, designing integration solutions, and ensuring successful implementation. Roles & Responsibilities: Expected to be an SME with deep knowledge and experience. Should have influencing and advisory skills. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Expected to provide solutions to problems that apply across multiple teams. Collaborate with clients to understand their integration needs. Design and develop integration solutions using SnapLogic. Ensure successful implementation of integration solutions. Provide guidance and support to junior integration engineers. Professional & Technical Skills: Must-Have Skills: Proficiency in SnapLogic. Good-to-Have Skills: Experience with other integration platforms. Strong understanding of integration concepts and best practices. Experience in designing and implementing complex integration solutions. 
Knowledge of API management and web services. Familiarity with data transformation and mapping techniques. Additional Information: The candidate should have a minimum of 15 years of experience in SnapLogic. This position is based in Mumbai. Qualifications: 15 years of full-time education is required.

Posted 1 week ago

Apply

7.0 - 10.0 years

10 - 20 Lacs

Bengaluru

Work from Office

Job location: Bangalore Experience: 7-10 Years Job Description: Must have hands-on experience (min 6-8 years) in SnapLogic pipeline development with good debugging skills. Experience migrating ETL jobs into SnapLogic, platform moderation, and cloud exposure on AWS. Good to have: SnapLogic developer certification and hands-on experience in Snowflake. Should be strong in SQL, PL/SQL, and RDBMS. Should be strong in ETL tools like DataStage, Informatica, etc., with data quality. Proficiency in configuring SnapLogic components, including snaps, pipelines, and transformations. Designing and developing data integration pipelines using the SnapLogic platform to connect various systems, applications, and data sources. Building and configuring SnapLogic components such as snaps, pipelines, and transformations to handle data transformation, cleansing, and mapping. Experience in designing, developing, and deploying reliable solutions. Ability to work with business partners and provide long-lasting solutions. SnapLogic integration - pipeline development. Staying updated with the latest SnapLogic features, enhancements, and best practices to leverage the platform effectively.
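The "ETL tools ... with data quality" requirement above usually reduces to SQL checks that a migrated pipeline must keep enforcing. A small, self-contained illustration using SQLite; the table, data, and rules are invented for the sketch:

```python
# Two typical data-quality rules expressed as SQL: no NULL emails,
# and customer ids must be unique. An ETL migration (DataStage or
# Informatica to SnapLogic) has to preserve checks like these.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER, email TEXT);
    INSERT INTO customers VALUES (1, 'a@x.com'), (2, NULL), (2, 'b@x.com');
""")

# Rule 1: count rows with NULL email.
null_emails = conn.execute(
    "SELECT COUNT(*) FROM customers WHERE email IS NULL").fetchone()[0]

# Rule 2: count ids that appear more than once.
dup_ids = conn.execute("""
    SELECT COUNT(*) FROM (
        SELECT id FROM customers GROUP BY id HAVING COUNT(*) > 1
    )""").fetchone()[0]

issues = {"null_emails": null_emails, "duplicate_ids": dup_ids}
```

Keeping the rules in plain SQL makes them portable: the same statements can run inside the legacy tool, inside a SnapLogic SQL-execute step, or in a standalone validation job during cutover testing.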

Posted 1 week ago

Apply

7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title: Senior Consultant Career Level: D Introduction to role Are you ready to disrupt an industry and change lives? As a Senior Consultant within our Operations IT team, you'll be at the forefront of transforming our ability to develop life-changing medicines. Our work directly impacts patients, empowering the business to perform at its peak by combining innovative science with leading digital technology platforms and data. Join us in our journey to become a digital and data-led enterprise, where you'll collaborate with a diverse team of UI/UX designers, architects, full-stack engineers, data engineers, and DevOps engineers. Together, we'll develop high-impact, scalable, and innovative products that deliver actionable insights and support our business goals. Are you ready to make the impossible possible? Accountabilities As a Full Stack Data Engineer, you will: Develop, maintain, and optimize scalable data pipelines for data integration, transformation, and analysis, ensuring high performance and reliability. Demonstrate proficiency in ETL/ELT processes, including writing, testing, and maintaining high-quality code for data ingestion and transformation. Improve the efficiency and performance of data pipelines and workflows, applying advanced data engineering techniques and standard methodologies. Develop and maintain data models that represent data structure and relationships, ensuring alignment with business requirements and enhancing data usability. Develop APIs and microservices for seamless data integration across platforms and collaborate with software engineers to integrate front-end and back-end components with the data platform. Optimize and tune databases and queries for maximum performance and reliability and maintain existing data pipelines to improve performance and quality. Mentor other developers on standard methodologies, conduct peer programming, code reviews, and help evolve systems architecture to consistently improve development efficiency. 
Ensure compliance with data security and privacy regulations and implement data validation and cleansing techniques to maintain consistency. Stay updated with emerging technologies, standard methodologies in data engineering and software development, and contribute to all phases of the software development lifecycle (SDLC) processes. Work closely with data scientists, analysts, partners, and product managers to understand requirements, deliver high-quality data solutions, and support alignment of data sources and specifications. Perform unit testing, system integration testing, regression testing, and assist with user acceptance testing to ensure data solutions meet quality standards. Work with the QA Team to develop testing protocols and identify and correct challenges. Maintain clear documentation for Knowledge Base Articles (KBAs), data models, pipeline documentation, and deployment release notes. Diagnose and resolve complex issues related to data pipelines, backend services, and frontend applications, ensuring smooth operation and user satisfaction. Use and manage cloud-based services (e.g., AWS) for data storage and processing, and implement and manage CI/CD pipelines, version control, and deployment processes. Liaise with internal teams and third-party vendors to address application issues and project needs effectively. Create and maintain data visualizations and dashboards to provide actionable insights. Essential Skills/Experience Minimum 7+ years of experience in developing and delivering software engineering and data engineering solutions. Extensive experience with ELT/ETL tools such as SnapLogic, FiveTran, or similar. Deep expertise in Snowflake, DBT (Data Build Tool), and similar data warehousing technologies. Proficient in designing and optimizing data models and transformations for large-scale data systems. Strong knowledge of data pipeline principles, including dimensional modelling, schema design, and data integration patterns. 
Familiarity with Data Mesh and Data Product concepts, including experience in delivering and managing data products. Strong data orchestration skills to effectively manage and streamline data workflows and processes. Proficiency in data visualization technologies, with experience in advanced use of tools such as Power BI or similar. Solid understanding of DevOps practices, including CI/CD pipelines and version control systems like GitHub. Ability to implement and maintain automated deployment and integration processes. Experience with containerization technologies like Docker and orchestration tools like Kubernetes. Automated testing frameworks (unit testing, system integration testing, regression testing, and data testing). Strong proficiency in programming languages such as Python, Java, or similar. Experience with both relational (e.g., MySQL, PostgreSQL) and NoSQL databases. Deep technical expertise in building software and analytical solutions with a modern JavaScript stack (Node.js, ReactJS, AngularJS). Strong knowledge of cloud-based data, compute, and storage services, including AWS (S3, EC2, RDS, EBS, Lambda), orchestration services (e.g., Airflow, MWAA), and containerization services (e.g., ECS, EKS). Excellent communication and interpersonal skills, with a proven ability to manage partner expectations, gather requirements, and translate them into technical solutions. Experience working in Agile development environments, with a strong understanding of Agile principles and practices. Ability to adapt to changing requirements and contribute to iterative development cycles. Advanced SQL skills for data analysis. Expert problem-solving skills, with a focus on finding innovative solutions to complex data challenges. Strong analytical, reasoning, and strategic-thinking skills, with the ability to visualize processes and outcomes. 
Desirable Skills/Experience Bachelor's or Master's degree in Health Sciences, Life Sciences, Data Management, Information Technology, or a related field, or equivalent experience. Significant experience working in the pharmaceuticals industry, with a deep understanding of industry-specific data requirements. Demonstrated ability to manage and collaborate with a diverse range of partners, ensuring high levels of satisfaction and successful project delivery. Proven capability to work independently and thrive in a dynamic, fast-paced environment, managing multiple tasks and adapting to evolving conditions. Experience working in large multinational organizations, especially within pharmaceutical or similar environments, demonstrating familiarity with global data systems and processes. Certification in AWS Cloud or other relevant data engineering or software engineering certifications, showcasing advanced knowledge and technical proficiency. Awareness of use-case-specific GenAI tools available in the market and their application in day-to-day work scenarios. Working knowledge of basic prompting techniques and commitment to continuous improvement of these skills. Ability to stay up to date with developments in AI and GenAI, applying new insights to work-related situations. When we put unexpected teams in the same room, we unleash bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace, and challenge perceptions. That's why we work on average a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique, ambitious world. AstraZeneca is where innovation meets impact! We couple technology with an inclusive approach to cross international boundaries, developing a leading ecosystem. 
Our diverse teams work multi-functionally at scale, bringing together the best minds from across the globe to uncover new solutions. We think holistically about applying technology, building partnerships inside and out, driving simplicity and efficiencies, and making a real difference. Ready to make your mark? Apply now!
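The posting above leans on dimensional modelling for its Snowflake/DBT warehouse work. One core idea can be sketched in a few lines of Python: facts reference dimensions through surrogate keys rather than raw business keys. All table and field names below are invented for illustration:

```python
# Hedged sketch of surrogate-key assignment in a star schema:
# a dimension table maps each distinct natural key to a stable
# surrogate key, and fact rows carry the surrogate key only.
def build_dimension(records, natural_key):
    """Assign one surrogate key per distinct natural key."""
    dim, next_sk = {}, 1
    for rec in records:
        nk = rec[natural_key]
        if nk not in dim:  # repeats reuse the existing key
            dim[nk] = {"sk": next_sk, **rec}
            next_sk += 1
    return dim

def build_facts(events, dim, natural_key):
    """Replace each event's natural key with the dimension surrogate key."""
    return [
        {"customer_sk": dim[e[natural_key]]["sk"], "amount": e["amount"]}
        for e in events
    ]

customers = [
    {"customer_id": "C1", "name": "Acme"},
    {"customer_id": "C2", "name": "Globex"},
    {"customer_id": "C1", "name": "Acme"},  # repeat: no new key
]
dim = build_dimension(customers, "customer_id")
facts = build_facts([{"customer_id": "C2", "amount": 40.0}],
                    dim, "customer_id")
```

In practice DBT models express this same join logic in SQL over Snowflake tables; the Python above only shows why the indirection exists, namely that source-system keys can change or collide while surrogate keys stay stable.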

Posted 1 week ago

Apply


4.0 - 8.0 years

0 Lacs

Gurugram, Haryana, India

On-site

About the Role The role involves designing, developing, and maintaining scalable Python applications and APIs, as well as managing data integration workflows using SnapLogic. Experience: 4 to 8 Years Location: Gurugram (Hybrid) Responsibilities Design, develop, and maintain scalable Python applications and APIs. Develop and manage data integration workflows using SnapLogic. Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications. Create and maintain detailed technical documentation. Ensure code quality through unit testing, code reviews, and best development practices. Troubleshoot and debug issues in a timely manner. Optimize performance of data pipelines and Python applications. Follow Agile development methodologies and participate in sprint planning and reviews. Qualifications Bachelor's degree in Computer Science, Information Technology, or a related field. 4 to 8 years of experience in Python development. Hands-on experience with SnapLogic for at least 1–2 years. Strong understanding of RESTful APIs, data structures, and algorithms. Experience in data integration, ETL processes, and working with large datasets. Proficiency in working with databases such as SQL Server, PostgreSQL, or MongoDB. Familiarity with version control systems such as Git. Good problem-solving skills and attention to detail. Excellent communication and collaboration skills.
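Much of the Python-plus-SnapLogic work described above reduces to mapping one system's payload onto another's schema. A hedged sketch of that kind of field mapping with basic validation follows; every field name is invented for the example:

```python
# Rename source fields to the target schema, fail fast on missing
# fields, and silently drop anything unmapped -- the shape of work a
# mapper step in an integration pipeline performs.
import json

FIELD_MAP = {"emp_id": "employeeId", "full_name": "name", "dept": "department"}

def map_record(source: dict) -> dict:
    """Rename fields per FIELD_MAP; raise if a required field is absent."""
    missing = [k for k in FIELD_MAP if k not in source]
    if missing:
        raise ValueError(f"missing fields: {missing}")
    return {target: source[src] for src, target in FIELD_MAP.items()}

payload = json.loads('{"emp_id": 7, "full_name": "A. Rao", "dept": "IT", "extra": 1}')
mapped = map_record(payload)
```

Failing fast on missing fields (rather than emitting partial records) keeps data-quality problems visible at the integration boundary, where they are cheapest to diagnose.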

Posted 1 week ago

Apply