8.0 - 12.0 years
0 Lacs
chennai, tamil nadu
On-site
You should have 8 to 12 years of experience working with MuleSoft or SnapLogic, including integration experience, full-stack development, the .NET Framework, and Power Platforms. Your expertise in these technologies will be crucial to successfully fulfilling the responsibilities of the role.
Posted 10 hours ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
As the Lead Consultant, Bespoke Engineering, you will have the exciting opportunity to lead a team that disrupts the industry and changes lives. We are seeking a seasoned, innovative professional to spearhead our bespoke engineering team within the Enterprise IT organization. If you possess deep technical expertise in Python, SQL, AI/ML technologies, MuleSoft, SnapLogic, .NET, and MS Power Platforms, this role is for you. You will design, develop, and deliver custom engineering solutions to address complex business challenges, all while fostering a culture of innovation and collaboration. If you have a proven track record of leading diverse engineering teams and driving the adoption of modern technologies, we are eager to hear from you!

Your key responsibilities will include:
- Owning the design and development of bespoke engineering solutions using technologies such as Python, SQL, AI/ML, MuleSoft, SnapLogic, .NET, and MS Power Platforms.
- Driving the adoption of modern development frameworks and cloud-native architectures.
- Collaborating with enterprise architects to design scalable and secure software architectures.
- Overseeing the software development lifecycle and partnering with stakeholders across IT and business units to align engineering solutions with organizational goals.
- Championing the integration of enterprise systems using MuleSoft and SnapLogic.
- Exploring and promoting the use of MS Power Platforms for rapid development of low-code/no-code applications.
- Establishing and enforcing development standards and compliance requirements, and conducting regular code reviews and performance evaluations.

To be successful in this role, you should hold a Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field. You must have expertise in Python and SQL for application development, data processing, and analytics; a strong understanding of AI/ML frameworks; proficiency in MuleSoft and SnapLogic; solid experience with .NET Framework/Core; and hands-on knowledge of MS Power Platforms. Familiarity with cloud platforms and DevOps practices, strategic thinking, problem-solving ability, and a passion for innovation are essential for this role. Desirable skills and experience include certification in AI/ML, MuleSoft, or Microsoft Power Platform; experience with enterprise-level integrations and data engineering projects; and familiarity with data visualization tools such as Tableau or Power BI.

Join us in our unique and ambitious world, where we make a direct impact on patients by transforming our ability to develop life-changing medicines. Apply now to be part of a dynamic environment that offers countless opportunities to learn, grow, and make a meaningful impact. At AstraZeneca, we are driving cross-company change to disrupt the entire industry.
Posted 2 days ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we are a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth - bringing real positive change in an increasingly virtual world - and it drives us beyond generational gaps and the disruptions of the future. We are looking forward to hiring ETL (Extract, Transform, Load) professionals with the following requirements:

Experience: 8-10 years

Job Description:
- 8 to 10 years of experience in designing and developing reliable solutions.
- Ability to work with business partners and provide long-lasting solutions.
- Minimum 5 years of experience in Snowflake.
- Strong knowledge of any ETL tool, data modeling, and data warehousing.
- Minimum 2 years of work experience in Data Vault modeling.
- Strong knowledge of SQL, PL/SQL, and RDBMS.
- Domain knowledge in the Manufacturing, Supply Chain, Sales, or Finance areas.
- Good to have: SnapLogic knowledge or project experience.
- Good to have: cloud platform knowledge (AWS or Azure).
- Good to have: knowledge of Python/PySpark.
- Experience in data migration/modernization projects.
- Zeal to pick up new technologies and do POCs.
- Ability to lead a team to deliver the expected business results.
- Good analytical and strong troubleshooting skills.
- Excellent communication and strong interpersonal skills.

At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided by technology, for continuous learning, unlearning, and relearning at a rapid pace and scale.
Our Hyperlearning workplace is grounded in four principles:
- Flexible work arrangements, free spirit, and emotional positivity.
- Agile self-determination, trust, transparency, and open collaboration.
- All support needed for the realization of business goals.
- Stable employment with a great atmosphere and an ethical corporate culture.
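The Data Vault modeling experience called for above can be made concrete with a small sketch: hubs and links in Data Vault are commonly keyed by a deterministic hash of the normalized business key, so the same real-world entity resolves to the same key across loads. This is a generic illustration, not any particular employer's standard; the normalization rules (trim, uppercase, `||` delimiter) and the function name are assumptions.

```python
import hashlib

def hub_hash_key(*business_keys) -> str:
    # One common Data Vault convention: normalize each business key
    # (trim whitespace, uppercase), join with a fixed delimiter, then
    # hash, so incidental formatting differences in source systems do
    # not produce duplicate hub rows.
    normalized = "||".join(str(k).strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# Identical entities hash identically regardless of source formatting:
hub_hash_key("ACME Corp", "US") == hub_hash_key(" acme corp ", "us")
```

In practice teams vary the hash algorithm (MD5 vs. SHA-1/SHA-256) and delimiter; the essential property is determinism across loads and source systems.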
Posted 2 days ago
12.0 - 20.0 years
0 Lacs
chennai, tamil nadu
On-site
You have an exciting opportunity to join as a Lead Consultant specializing in Reltio MDM in Chennai. With 12-20 years of experience, you will leverage your expertise in master and reference data management, with a particular focus on the Reltio MDM tool. Your proficiency in MDM/RDM technical design, industry standards, and unit test automation practices will be key in this role. Additionally, your operational understanding of data models, data standards, and vocabularies will support the centralized repository implementation. Your skills in Java/J2EE technologies and development languages such as Java and Python will be essential for success. As a Reltio Certified Developer/Architect, you will collaborate with stakeholders, provide leadership to project teams, and report progress to leadership. Your ability to mentor teams and deliver high-quality written documentation will be crucial. Desirable skills include experience in a regulated environment, MDM/RDM architecture, and data modelling principles. Familiarity with ITIL or similar landscapes, as well as hands-on experience building REST APIs using technologies like MuleSoft or SnapLogic, will be advantageous. Knowledge of automated deployment tools such as Jenkins CI/CD, along with DevOps experience, will further strengthen your profile for this role.
Posted 2 days ago
0 years
0 Lacs
India
Remote
Title: SnapLogic Developer
Experience: 6+ years
Timings: 8:30 PM to 5:30 AM (EST timezone)
Location: Remote
Salary: Up to 1 Lakh/month (depending on experience)
*This is a freelancing role, not a permanent position.

Role: We are seeking a Senior SnapLogic Developer to lead the design, development, and maintenance of complex data integration pipelines using SnapLogic. This role plays a key part in managing all incoming and outgoing data flows across the enterprise, with a strong emphasis on EDI (X12) parsing, Salesforce integrations, and SnapLogic best practices. The ideal candidate is a technical expert who can also mentor junior developers and contribute to the evolution of our integration standards and architecture.

Key Responsibilities:
- Lead and own SnapLogic pipeline development for various enterprise integration needs.
- Design, build, and maintain scalable integration workflows involving EDI X12 formats, Salesforce Snaps, REST/SOAP APIs, and file-based transfers (SFTP, CSV, etc.).
- Parse and transform EDI documents, particularly X12 837, 835, 834, and 270/271, into target system formats such as Salesforce, databases, or flat files.
- Manage and monitor SnapLogic dataflows for production and non-production environments.
- Collaborate with business and technical teams to understand integration requirements and deliver reliable solutions.
- Lead a team of SnapLogic developers, providing technical guidance, mentorship, and code reviews.
- Document integration flows, error handling mechanisms, retry logic, and operational procedures.
- Establish and enforce SnapLogic development standards and reusable components (Snap Packs, pipelines, assets).
- Collaborate with DevOps/SecOps to ensure deployments are automated and compliant.
- Troubleshoot issues in existing integrations and optimize performance where needed.

Required Skills and Experience:
- Proven expertise in parsing and transforming EDI X12 transactions (especially 837, 835, 834, 270/271).
- Strong experience using Salesforce Snaps, including data sync between Salesforce and external systems.
- Deep understanding of SnapLogic architecture, pipeline execution patterns, error handling, and best practices.
- Experience working with REST APIs, SOAP services, OAuth, JWT, and token management in integrations.
- Knowledge of JSON, XML, XSLT, and data transformation logic.
- Strong leadership and communication skills; ability to mentor junior developers and lead a small team.
- Comfortable working in Agile environments with tools like Jira, Confluence, and Git.
- Experience with data privacy and security standards (HIPAA, PHI) is a plus, especially in healthcare integrations.
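To make the EDI X12 parsing requirement above concrete, here is a minimal, hypothetical sketch of splitting an X12 interchange into segments and elements. Real 837/835 transactions require a standards-aware parser or a platform such as SnapLogic; the `*` element separator and `~` segment terminator used here are assumed defaults, since in practice both are declared in the ISA interchange header.

```python
def parse_x12(raw: str, elem_sep: str = "*", seg_term: str = "~"):
    # Split the interchange into segments, then each segment into
    # elements. The first element of every segment is its identifier
    # (ST, BHT, SE, ...), which downstream mapping logic dispatches on.
    segments = [s for s in raw.strip().split(seg_term) if s]
    return [seg.split(elem_sep) for seg in segments]

# A tiny, invented fragment of an 837 transaction set:
sample = "ST*837*0001~BHT*0019*00*123~SE*3*0001~"
for seg in parse_x12(sample):
    print(seg[0], seg[1:])
```

A production pipeline would additionally validate control numbers (ST/SE, GS/GE, ISA/IEA), handle repetition and component separators, and map loops/segments to the target schema.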
Posted 3 days ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Operations Management Level Associate Job Description & Summary At PwC, our people in audit and assurance focus on providing independent and objective assessments of financial statements, internal controls, and other assurable information, enhancing the credibility and reliability of this information with a variety of stakeholders. They evaluate compliance with regulations, including assessing governance and risk management processes and related controls. Those in internal audit at PwC help build, optimise and deliver end-to-end internal audit services to clients in all industries. This includes IA function setup and transformation, co-sourcing, outsourcing and managed services, using AI and other risk technology and delivery models. IA capabilities are combined with other industry and technical expertise, in areas like cyber, forensics and compliance, to address the full spectrum of risks. This helps organisations to harness the power of IA to help the organisation protect value and navigate disruption, and obtain confidence to take risks to power growth. Why PwC? At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

We are looking for a Senior Data Engineer with deep experience in SnapLogic, SQL, ETL pipelines, and data warehousing, along with at least 3-4 years of hands-on experience with Databricks. The ideal candidate has a strong background in designing scalable data solutions and working across cloud and big data environments. Familiarity with Python is a strong plus.

Responsibilities:
- Design, build, and maintain data integration and ETL pipelines using SnapLogic
- Develop and optimize complex SQL queries to support business analytics and reporting
- Work with structured and unstructured data in large-scale data warehouse environments
- Leverage Databricks for advanced data processing, transformation, and analytics
- Collaborate with data analysts, data scientists, and business stakeholders to gather and understand data requirements
- Ensure data quality, integrity, and governance across platforms
- Create clear documentation for data workflows, architecture, and processes
- Participate in code reviews and promote best practices in data engineering

Required Qualifications:
- 5+ years of experience with SnapLogic in enterprise-level data integration projects
- 6+ years of experience with ETL pipeline development and data warehousing
- Strong proficiency in SQL (performance tuning, complex joins, stored procedures, etc.)
- 3+ years of hands-on experience with Databricks (Spark, Delta Lake, etc.)
- Solid understanding of cloud data ecosystems and data modeling principles
- Excellent problem-solving and communication skills

Preferred / Nice-to-Have Skills:
- Experience with Python for scripting or data processing tasks
- Familiarity with CI/CD practices
- Knowledge of data governance, privacy, and compliance best practices

SAC JD:
- Solution Design & Development:
  - Design, develop, and implement SAP SAC solutions.
  - Create data models, stories, and dashboards in SAC.
  - Develop custom SAC applications using scripting and advanced analytics features.
- Data Integration & Management:
  - Integrate SAC with various data sources including SAP HANA, BW, S/4HANA, and other external sources.
  - Ensure data accuracy, consistency, and quality in SAC solutions.
- Stakeholder Collaboration:
  - Work closely with business stakeholders to gather requirements and translate them into technical specifications.
  - Collaborate with cross-functional teams to deliver end-to-end analytics solutions.
- Performance Optimization:
  - Optimize SAC solutions for performance and scalability.
  - Troubleshoot and resolve issues related to SAC solutions.
- Documentation & Training:
  - Document SAC solutions, including data models, design specifications, and user manuals.
  - Provide training and support to end-users and other team members.
- Proficiency in SAP SAC, including data modeling, story creation, and dashboard development.
- Strong understanding of SAC data connectivity options and integration with various data sources.
- Experience with SAP HANA, SAP BW, and S/4HANA.
- Proficient in SAC scripting and advanced analytics capabilities.
- Solid understanding of data visualization principles and best practices.
Mandatory Skill Sets: CSV
Preferred Skill Sets: LIMS/QMS
Years of Experience Required: 4-8 years
Education Qualifications: B.Tech/MBA
Degrees/Field of Study Required: Master of Business Administration, Bachelor of Engineering
Required Skills: Creating Shared Value (CSV)
Optional Skills: Accepting Feedback, Accounting and Financial Reporting Standards, Active Listening, Artificial Intelligence (AI) Platform, Auditing, Auditing Methodologies, Business Process Improvement, Communication, Compliance Auditing, Corporate Governance, Data Analysis and Interpretation, Data Ingestion, Data Modeling, Data Quality, Data Security, Data Transformation, Data Visualization, Emotional Regulation, Empathy, Financial Accounting, Financial Audit, Financial Reporting, Financial Statement Analysis, Generally Accepted Accounting Principles (GAAP) {+ 19 more}
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Posted 3 days ago
4.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
The role of Support Engineer requires monitoring and maintaining integration pipelines, data flows, and jobs to ensure system uptime, performance, and stability. You will be responsible for troubleshooting issues promptly and ensuring timely resolution to minimize business impact. Additionally, you will monitor and maintain data models to ensure data accuracy and consistency. Utilizing ITIL best practices, you will efficiently manage incidents to ensure minimal disruption to operations. In Digital Transformation (DT) projects, you will triage incidents to identify and resolve issues promptly. Handling access requests, you will ensure proper authorization and security protocols are followed. For Change and Problem Management, you will raise and manage Change Requests (CRs) for any system modifications or updates. It will be essential to conduct root cause analysis for recurring issues and document Problem Tickets for long-term solutions. Adherence to ITIL processes for managing changes and resolving problems effectively is crucial. Your role will also involve Pipeline Validation and Analysis where you will apply SnapLogic knowledge to troubleshoot issues within SnapLogic pipelines, APIs, and other integration points. Collaboration with stakeholders to understand integration requirements and recommend solutions will be necessary. In terms of Service Delivery and Improvement, you will be responsible for developing, implementing, and maintaining service delivery processes in accordance with ITIL best practices. Identifying opportunities for process improvements and automation to enhance service delivery will be a continuous effort. Providing regular updates and reports on ongoing initiatives to stakeholders and PMO is also a key aspect of the role. Collaboration with team members and stakeholders to understand requirements and provide effective support solutions will be crucial. 
Communication with stakeholders, including senior management, business users, and other teams, to provide updates on incident status and resolution efforts is essential. Facilitating User Acceptance Testing (UAT) of projects and Change Requests will also be part of your responsibilities. Qualifications required for this role include a Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field, and a minimum of 4 years of experience in a support engineer role, including 2 years of relevant SnapLogic experience, preferably in the pharmaceutical or a related domain. Proven experience in monitoring and maintaining jobs, schedules, and data models is required, as is strong hands-on experience with the SnapLogic integration platform and proficiency with integration technologies. Knowledge of common data formats and various databases, diagnostic and troubleshooting skills, and strong ITIL skills are also necessary, along with excellent communication, collaboration, problem-solving, and organizational skills.
Posted 3 days ago
0 years
4 - 6 Lacs
Chennai
On-site
You have experience in ETL pipeline development and data warehouse (DWH) design, with hands-on expertise in Snowflake, including Snowflake SQL, Snowflake scripts using Unix and Python, and Snowflake utilities. Your background includes strong data modeling skills, with the ability to create and work with data models and ER diagrams using both Star Schema and Snowflake Schema approaches. You are proficient in SQL and PL/SQL, including the development and maintenance of stored procedures. You also have experience with Jenkins for CI/CD pipelines and SnapLogic for data integration. Familiarity with modern data architecture concepts, such as Data Mesh and Data Products, is a valuable addition, along with a good understanding of data security principles.

About Virtusa: teamwork, quality of life, and professional and personal development are values that Virtusa is proud to embody. When you join us, you join a global team of 27,000 people that cares about your growth, one that seeks to provide you with exciting projects, opportunities, and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status, or any other basis covered by applicable law. All employment is decided on the basis of qualifications, merit, and business need.
Posted 4 days ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
You have experience in ETL pipeline development and data warehouse (DWH) design, with hands-on expertise in Snowflake, including Snowflake SQL, Snowflake scripts using Unix and Python, and Snowflake utilities. Your background includes strong data modeling skills, with the ability to create and work with data models and ER diagrams using both Star Schema and Snowflake Schema approaches. You are proficient in SQL and PL/SQL, including the development and maintenance of stored procedures. You also have experience with Jenkins for CI/CD pipelines and SnapLogic for data integration. Familiarity with modern data architecture concepts, such as Data Mesh and Data Products, is a valuable addition, along with a good understanding of data security principles.
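The Star Schema modeling mentioned above can be sketched with a tiny in-memory example: a fact table carrying a foreign key into a single dimension, queried with a join and an aggregation. This is a generic illustration using SQLite from the Python standard library; the table and column names are invented, and Snowflake's SQL dialect differs in details.

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Minimal star schema: one fact table keyed into one date dimension.
cur.executescript("""
CREATE TABLE dim_date  (date_key INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE fact_sales (date_key INTEGER, amount REAL);
INSERT INTO dim_date   VALUES (20230101, 2023), (20240101, 2024);
INSERT INTO fact_sales VALUES (20240101, 100.0), (20240101, 50.0),
                              (20230101, 25.0);
""")

# Typical star-schema query: join fact to dimension, aggregate by a
# dimension attribute.
cur.execute("""
SELECT d.year, SUM(f.amount)
FROM fact_sales f
JOIN dim_date d ON f.date_key = d.date_key
GROUP BY d.year
ORDER BY d.year
""")
rows = cur.fetchall()  # [(2023, 25.0), (2024, 150.0)]
```

A Snowflake Schema variant would further normalize `dim_date` into sub-dimensions; the join-and-aggregate pattern stays the same.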
Posted 4 days ago
4.0 - 9.0 years
6 - 10 Lacs
Navi Mumbai, Pune, Bengaluru
Work from Office
We are looking for an SAP MDG Consultant to join our SAP Practice in India. Location: Bangalore (Whitefield), Mumbai (Airoli), Pune (Talwade), Chennai (Siruseri).

Responsibilities:
- Design and conduct the replication of Vendor/Customer master data B and C segments between 33 ECC systems and the S/4 MDG hub solution for Business Partner.
- Sound functional knowledge of vendor and customer master data and their conversion/mapping into the S/4HANA Business Partner.
- Consult the customer's internal IT organization in the development of XSLT to map SAP Web Services to EDIFACT messages for use on SnapLogic Groundplex.
- Technical solution design for IDoc inbound and outbound, field mapping, integration, end-to-end testing, and coordination.
- Business-fluent English; ability to work in an international, remote team environment, effectively interacting with others.

Requirements:
- Education: Bachelor's or Master's degree.
- 4+ years of work experience as an MDG consultant.
- Experience in data replication using SOAP services and ALE/IDocs with DRF+.
- Experience with BRF+, AIF integration, EDI, and middleware solutions like SnapLogic.
- Knowledge of S/4HANA MDG.

What we offer: Competitive salary package. Leave policies: 10 days of public holiday (including 2 optional days), 22 days of earned leave (EL), and 11 days for sick or caregiving leave. Office requirement: 3 days WFO.
Posted 5 days ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-Have Skills: SnapLogic
Good-to-Have Skills: NA
Minimum Experience Required: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and guidance to your team members while continuously seeking opportunities for improvement in application design and functionality.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in SnapLogic.
- Good-to-Have Skills: Experience with cloud integration platforms.
- Strong understanding of application development methodologies.
- Experience with API management and integration.
- Familiarity with data transformation and ETL processes.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SnapLogic.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
Posted 5 days ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Operations Management Level Associate Job Description & Summary At PwC, our people in audit and assurance focus on providing independent and objective assessments of financial statements, internal controls, and other assurable information, enhancing the credibility and reliability of this information with a variety of stakeholders. They evaluate compliance with regulations, including assessing governance and risk management processes and related controls. Those in internal audit at PwC help build, optimise and deliver end-to-end internal audit services to clients in all industries. This includes IA function setup and transformation, co-sourcing, outsourcing and managed services, using AI and other risk technology and delivery models. IA capabilities are combined with other industry and technical expertise, in areas like cyber, forensics and compliance, to address the full spectrum of risks. This helps organisations to harness the power of IA to help the organisation protect value and navigate disruption, and obtain confidence to take risks to power growth. Why PwC? At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

We are looking for a Senior Data Engineer with deep experience in SnapLogic, SQL, ETL pipelines, and data warehousing, along with at least 3-4 years of hands-on experience with Databricks. The ideal candidate has a strong background in designing scalable data solutions and working across cloud and big data environments. Familiarity with Python is a strong plus.

Responsibilities:
- Design, build, and maintain data integration and ETL pipelines using SnapLogic
- Develop and optimize complex SQL queries to support business analytics and reporting
- Work with structured and unstructured data in large-scale data warehouse environments
- Leverage Databricks for advanced data processing, transformation, and analytics
- Collaborate with data analysts, data scientists, and business stakeholders to gather and understand data requirements
- Ensure data quality, integrity, and governance across platforms
- Create clear documentation for data workflows, architecture, and processes
- Participate in code reviews and promote best practices in data engineering

Required Qualifications:
- 5+ years of experience with SnapLogic in enterprise-level data integration projects
- 6+ years of experience with ETL pipeline development and data warehousing
- Strong proficiency in SQL (performance tuning, complex joins, stored procedures, etc.)
- 3+ years of hands-on experience with Databricks (Spark, Delta Lake, etc.)
- Solid understanding of cloud data ecosystems and data modeling principles
- Excellent problem-solving and communication skills

Preferred / Nice-to-Have Skills:
- Experience with Python for scripting or data processing tasks
- Familiarity with CI/CD practices
- Knowledge of data governance, privacy, and compliance best practices

SAC JD:
- Solution Design & Development:
  - Design, develop, and implement SAP SAC solutions.
  - Create data models, stories, and dashboards in SAC.
  - Develop custom SAC applications using scripting and advanced analytics features.
- Data Integration & Management:
  - Integrate SAC with various data sources including SAP HANA, BW, S/4HANA, and other external sources.
  - Ensure data accuracy, consistency, and quality in SAC solutions.
- Stakeholder Collaboration:
  - Work closely with business stakeholders to gather requirements and translate them into technical specifications.
  - Collaborate with cross-functional teams to deliver end-to-end analytics solutions.
- Performance Optimization:
  - Optimize SAC solutions for performance and scalability.
  - Troubleshoot and resolve issues related to SAC solutions.
- Documentation & Training:
  - Document SAC solutions, including data models, design specifications, and user manuals.
  - Provide training and support to end-users and other team members.
- Proficiency in SAP SAC, including data modeling, story creation, and dashboard development.
- Strong understanding of SAC data connectivity options and integration with various data sources.
- Experience with SAP HANA, SAP BW, and S/4HANA.
- Proficient in SAC scripting and advanced analytics capabilities.
- Solid understanding of data visualization principles and best practices.
Mandatory Skill Sets: CSV
Preferred Skill Sets: LIMS/QMS
Years of Experience Required: 4-8 years
Education Qualifications: B.Tech/MBA
Degrees/Field of Study Required: Master of Business Administration, Bachelor of Engineering
Required Skills: Creating Shared Value (CSV)
Optional Skills: Accepting Feedback, Accounting and Financial Reporting Standards, Active Listening, Artificial Intelligence (AI) Platform, Auditing, Auditing Methodologies, Business Process Improvement, Communication, Compliance Auditing, Corporate Governance, Data Analysis and Interpretation, Data Ingestion, Data Modeling, Data Quality, Data Security, Data Transformation, Data Visualization, Emotional Regulation, Empathy, Financial Accounting, Financial Audit, Financial Reporting, Financial Statement Analysis, Generally Accepted Accounting Principles (GAAP) {+ 19 more}
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Posted 5 days ago
7.0 years
7 - 8 Lacs
Bengaluru
On-site
We are looking for an experienced and motivated Senior Teamcenter Developer to join our ITL PLM Development Team. ITL is the integrated tool and process landscape for Siemens Healthineers' product data management. The ideal candidate will bring deep technical expertise in Siemens Teamcenter implementation and customization, along with strong hands-on experience in T4S integration with SAP. This role will serve as a key technical partner to the Product Owner for BOM Management and interfaces, and you will play a key role in defining BOM Management and integration solutions.
Key Responsibilities:
• Act as a competent technical partner for the Product Owner in refining and defining optimal requirements from business stakeholders and SMEs.
• Lead the definition of solution design and the implementation of SAFe Features.
• Mentor and support junior team members in best practices, code quality, and system understanding.
• Perform impact assessments for new features and enhancements in the Teamcenter ecosystem, including integrations with SAP and Azure DevOps.
• Collaborate with other system Points of Contact (POCs), such as the SAP and Azure DevOps teams, to ensure robust, scalable, and flexible system integrations.
Must-Have Skills:
• 7+ years of hands-on experience in Teamcenter implementation and customization.
• Strong understanding of T4S/T4EA installation and upgrades.
• 5+ years of expertise in T4S (Teamcenter for SAP integration), including key object integrations (e.g., DIR, MM, BOM, BOP, Variant Rules).
• In-depth understanding of Teamcenter BOM management (dBOM, eBOM).
• Strong understanding of the CBA2/CBA3 (CAD BOM Alignment) methodology.
• Proficiency in Active Workspace customization and deployment.
• Excellent communication skills to illustrate solution concepts.
Nice-to-Have Skills:
• Agile methodology certification (e.g., SAFe).
• Familiarity with SnapLogic/Snowflake or similar middleware tools for system integration.
• Experience with NX-Teamcenter integration and Integrated Materials Management.
Posted 6 days ago
6.0 - 10.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Senior Developer specializing in SnapLogic and Apache Airflow, you will be responsible for designing, developing, and maintaining enterprise-level data integration solutions. Your expertise in ETL development, workflow orchestration, and cloud technologies will be crucial for automating data workflows, optimizing performance, and ensuring the reliability and scalability of data systems. Your key responsibilities will include designing, developing, and managing ETL pipelines using SnapLogic to ensure efficient data transformation and integration across various systems and applications. You will leverage Apache Airflow for workflow automation, job scheduling, and task dependencies to ensure optimized execution and monitoring. Collaboration with cross-functional teams such as Data Engineering, DevOps, and Data Science will be essential to understand data requirements and deliver effective solutions. In this role, you will be involved in designing and implementing data pipeline architectures to support large-scale data processing in cloud environments like AWS, Azure, and GCP. Developing reusable SnapLogic pipelines, integrating with third-party applications and data sources, optimizing pipeline performance, and providing guidance to junior developers will be part of your responsibilities. Additionally, troubleshooting pipeline failures, implementing automated testing, continuous integration (CI), and continuous delivery (CD) practices for data pipelines will be crucial for maintaining high data quality and minimal downtime. The required skills and experience for this role include at least 6 years of hands-on experience in data engineering with a focus on SnapLogic and Apache Airflow. Proficiency in SnapLogic Designer, SnapLogic cloud environment, and Apache Airflow for building data integrations and ETL pipelines is essential. 
You should have a strong understanding of ETL concepts, data integration, cloud platforms like AWS, Azure, or Google Cloud, data storage systems such as S3, Azure Blob, and Google Cloud Storage, as well as experience with SQL, relational databases, NoSQL databases, REST APIs, and CI/CD pipelines. Your problem-solving skills, ability to work in an Agile development environment, and strong communication and collaboration skills will be valuable assets in this role. By staying current with new SnapLogic features, Airflow upgrades, and industry best practices, you will contribute to the continuous improvement of data integration solutions. Join our team at Virtusa, where teamwork, quality of life, and professional development are values we embody. Be part of a global team that cares about your growth and provides exciting projects, opportunities, and exposure to state-of-the-art technologies throughout your career with us. At Virtusa, great minds come together to nurture new ideas and foster excellence in a dynamic environment.
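The workflow orchestration this role describes, running extract, transform, and load tasks only after their dependencies complete, can be sketched in plain Python. This is a toy illustration of the dependency ordering an Airflow scheduler resolves for a DAG, not SnapLogic or Airflow API code, and the task names are hypothetical:

```python
from graphlib import TopologicalSorter

# Hypothetical ETL task graph: each task maps to the set of tasks it
# depends on, mirroring how a DAG wires extract >> transform >> load.
dag = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform_join": {"extract_orders", "extract_customers"},
    "load_warehouse": {"transform_join"},
}

def execution_order(graph):
    """Return one valid run order that respects every dependency."""
    return list(TopologicalSorter(graph).static_order())

order = execution_order(dag)
print(order)
```

Both extract tasks come before the join, and the warehouse load always runs last; an orchestrator adds scheduling, retries, and monitoring on top of exactly this ordering.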
Posted 6 days ago
10.0 - 14.0 years
0 Lacs
chennai, tamil nadu
On-site
The ideal candidate should have hands-on experience in Collibra Workflows, asset model creation, cataloguing, and assessment creation. Additionally, exposure to AI platforms such as OpenAI and Bedrock, and to integration platforms like SnapLogic and MuleSoft, is required. A deep understanding and practical knowledge of IDEs such as Eclipse/PyCharm or any Workflow Designer is essential. Experience with one or more of the following languages is preferred: Java, JavaScript, Groovy, Python. Moreover, a deep understanding and hands-on experience of CI/CD processes and tooling (e.g., GitHub) is necessary. Candidates should have experience working in DevOps teams based on Kubernetes tools and converting a business workflow into an automated set of actions. Proven knowledge of scripting and a willingness to learn new languages are expected. Excellent communication skills in written and spoken English, interpersonal skills, and a collaborative approach to delivery are crucial. An enthusiasm for great documentation, including high-level designs, low-level designs, coding standards, and Knowledge Base Articles, is highly appreciated. Desirable qualifications include an Engineering Degree in IT/Computer Science with a minimum of 10 years of experience. Knowledge and experience of the Collibra Data Governance platform, exposure to AI models, AI governance, data policies, and governance are advantageous. Basic AWS knowledge is a plus. Familiarity with integration technologies like MuleSoft and SnapLogic is beneficial. Excellent Jira skills, including the ability to rapidly generate JQL on-the-fly and save JQL queries/filters/views, etc., for publishing to fellow engineers and senior stakeholders, are desired. Candidates should have experience in the creation of documentation in Confluence and in Agile practices, preferably having been part of an Agile team for several years. Joining Virtusa means becoming part of a team that values teamwork, quality of life, and professional and personal development.
With a global team of 27,000 people, Virtusa aims to provide exciting projects, opportunities, and work with state-of-the-art technologies throughout your career with us. Great minds, great potential, and a dynamic environment await you at Virtusa, where collaboration and excellence are nurtured.
Posted 6 days ago
3.0 - 6.0 years
12 - 16 Lacs
Bengaluru
Hybrid
Job Summary: We are looking for a SnapLogic Integration Developer to build robust and scalable data integration pipelines and optimize enterprise workflows.
Key Responsibilities:
• Design and implement data pipelines using SnapLogic.
• Integrate various systems (SAP, Salesforce, Workday, AWS, Azure).
• Manage APIs and troubleshoot data integration issues.
• Collaborate with business teams to understand data requirements.
Required Skills:
• 3+ years' experience with SnapLogic or other iPaaS platforms.
• Strong command of JSON, XML, REST/SOAP APIs, and SQL.
• Experience integrating cloud/on-premise applications.
• Debugging, performance tuning, and automation experience.
Preferred:
• Knowledge of scripting (JavaScript/Python) in SnapLogic.
• Background in ETL, data warehousing, and API management.
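Since the role leans on JSON, XML, and REST/SOAP payload handling, here is a minimal, stdlib-only sketch of the kind of JSON-to-XML mapping an integration pipeline performs when bridging a REST source and an XML/SOAP target. The order fields are invented for illustration:

```python
import json
import xml.etree.ElementTree as ET

# Hypothetical order payload as it might arrive from a REST source system.
payload = json.loads('{"order_id": "SO-1001", "customer": "Acme", "qty": 5}')

def json_to_xml(record, root_tag="order"):
    """Map a flat JSON record to an XML document string: one child
    element per key, text content from the value."""
    root = ET.Element(root_tag)
    for key, value in record.items():
        child = ET.SubElement(root, key)
        child.text = str(value)
    return ET.tostring(root, encoding="unicode")

xml_doc = json_to_xml(payload)
print(xml_doc)
```

A real iPaaS mapper adds schema validation, type coercion, and nested-structure handling, but the shape of the transformation is the same.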
Posted 6 days ago
6.0 years
0 Lacs
Andhra Pradesh, India
On-site
We are seeking a Senior Developer with expertise in SnapLogic and Apache Airflow to design, develop, and maintain enterprise-level data integration solutions. This role requires strong technical expertise in ETL development, workflow orchestration, and cloud technologies. You will be responsible for automating data workflows, optimizing performance, and ensuring the reliability and scalability of our data systems. Key Responsibilities include designing, developing, and managing ETL pipelines using SnapLogic, ensuring efficient data transformation and integration across various systems and applications. Leverage Apache Airflow for workflow automation, job scheduling, and task dependencies, ensuring optimized execution and monitoring. Work closely with cross-functional teams such as Data Engineering, DevOps, and Data Science to understand data requirements and deliver solutions. Collaborate in designing and implementing data pipeline architectures to support large-scale data processing in cloud environments like AWS, Azure, and GCP. Develop reusable SnapLogic pipelines and integrate with third-party applications and data sources including databases, APIs, and cloud services. Optimize SnapLogic pipeline performance to handle large volumes of data with minimal latency. Provide guidance and mentoring to junior developers in the team, conducting code reviews and offering best practice recommendations. Troubleshoot and resolve pipeline failures, ensuring high data quality and minimal downtime. Implement automated testing, continuous integration (CI), and continuous delivery (CD) practices for data pipelines. Stay current with new SnapLogic features, Airflow upgrades, and industry best practices. Required Skills & Experience include 6+ years of hands-on experience in data engineering, focusing on SnapLogic and Apache Airflow. Strong experience with SnapLogic Designer and SnapLogic cloud environment for building data integrations and ETL pipelines. 
Proficient in Apache Airflow for orchestrating, automating, and scheduling data workflows. Strong understanding of ETL concepts, data integration, and data transformations. Experience with cloud platforms like AWS, Azure, or Google Cloud and data storage systems such as S3, Azure Blob, and Google Cloud Storage. Strong SQL skills and experience with relational databases like PostgreSQL, MySQL, Oracle, and NoSQL databases. Experience working with REST APIs, integrating data from third-party services, and using connectors. Knowledge of data quality, monitoring, and logging tools for production pipelines. Experience with CI/CD pipelines and tools such as Jenkins, GitLab, or similar. Excellent problem-solving skills with the ability to diagnose issues and implement effective solutions. Ability to work in an Agile development environment. Strong communication and collaboration skills to work with both technical and non-technical teams.
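The data-quality and SQL skills called out above can be illustrated with a small self-contained sketch. It uses an in-memory SQLite table purely as a stand-in for a warehouse; the table and column names are hypothetical, and a production pipeline would run probes like this against its actual target before promoting a load:

```python
import sqlite3

# In-memory stand-in for a warehouse table; names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL, region TEXT)")
conn.executemany(
    "INSERT INTO orders (amount, region) VALUES (?, ?)",
    [(120.0, "EMEA"), (75.5, "APAC"), (None, "EMEA"), (200.0, None)],
)

def null_rate(connection, table, column):
    """Fraction of rows where `column` is NULL: a basic quality probe.
    Identifiers are interpolated directly, so they must be trusted names,
    never user input."""
    total = connection.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    nulls = connection.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL"
    ).fetchone()[0]
    return nulls / total if total else 0.0

print(null_rate(conn, "orders", "amount"))  # 1 of 4 amounts is NULL
```

Checks like this (null rates, row counts, referential spot checks) are typically wired into the CI/CD gates the posting mentions, failing the deployment when a threshold is breached.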
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: Senior Consultant - Cloud Data Engineer
Introduction to role
Are you ready to disrupt an industry and change lives? Join us at a crucial stage of our journey in becoming a digital and data-led enterprise. As a Senior Consultant - Cloud Data Engineer, you will have the opportunity to lead and innovate, transforming our ability to develop life-changing medicines. Your work will directly impact patients, empowering the business to perform at its peak by combining groundbreaking science with leading digital technology platforms and data.
Accountabilities
• Lead the design, development, and maintenance of reliable, scalable data pipelines and ETL processes using tools such as SnapLogic, Snowflake, DBT, Fivetran, Informatica, and Python.
• Work closely with data scientists to understand model requirements and prepare the right data pipelines for training and deploying machine learning models.
• Collaborate with data scientists, analysts, and business teams to understand and optimize data requirements and workflows.
• Use Power BI, Spotfire, Domo, and Qlik Sense to create actionable data visualizations and reports that drive business decisions.
• Implement standard methodologies for version control and automation using GitHub Actions, Liquibase, Flyway, and CI/CD tools.
• Optimize data storage, processing, and integration, leveraging AWS data engineering tools (e.g., AWS Glue, Amazon Redshift, Amazon S3, Amazon Kinesis, AWS Lambda, Amazon EMR).
• Troubleshoot, debug, and resolve issues related to existing data pipelines and architectures.
• Ensure data security, privacy, and compliance with industry regulations and organizational policies.
• Provide mentorship to junior engineers, offering guidance on best practices and supporting technical growth within the team.
Essential Skills/Experience
• SnapLogic: Expertise in SnapLogic for building, managing, and optimizing both batch and real-time data pipelines.
• Proficiency in using SnapLogic Designer for designing, testing, and deploying data workflows. In-depth experience with SnapLogic Snaps (e.g., REST, SOAP, SQL, AWS S3) and Ultra Pipelines for real-time data streaming and API management.
• AWS: Strong experience with AWS data engineering tools, including AWS Glue, Amazon Redshift, Amazon S3, AWS Lambda, Amazon Kinesis, AWS DMS, and Amazon EMR. Expertise in cloud data architectures, data migration strategies, and real-time data processing on AWS platforms.
• Snowflake: Extensive experience in Snowflake cloud data warehousing, including data modeling, query optimization, and managing ETL pipelines using DBT and Snowflake-native tools.
• Fivetran: Proficient in Fivetran for automating data integration from various sources to cloud-based data warehouses, optimizing connectors for data replication and transformation.
• Real-Time Messaging and Stream Processing: Experience with real-time data processing frameworks (e.g., Apache Kafka, Amazon Kinesis, RabbitMQ, Apache Pulsar).
Desirable Skills/Experience
• Exposure to other cloud platforms such as Azure or Google Cloud Platform (GCP).
• Familiarity with data governance, data warehousing, and data lake architectures.
When we put unexpected teams in the same room, we unleash bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and ambitious world. At AstraZeneca, we combine technology skills with a scientific mindset to make a meaningful impact. Our dynamic environment offers countless opportunities to learn and grow while working on cutting-edge technologies. We are committed to driving cross-company change to disrupt the entire industry.
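The real-time stream-processing experience this posting asks for centers on per-event aggregation. A minimal sketch of that shape in plain Python, with no broker client (the events and sensor names are invented stand-ins for a Kafka or Kinesis topic):

```python
from collections import deque

# Toy event stream standing in for a message topic; the windowed
# aggregation logic, not the broker client, is the point of the sketch.
events = [{"sensor": "s1", "value": v} for v in (10, 12, 11, 50, 13, 12)]

def rolling_mean(stream, window=3):
    """Emit the mean of the last `window` values as each event arrives:
    the shape of a simple real-time aggregation."""
    buf = deque(maxlen=window)  # oldest value drops out automatically
    out = []
    for event in stream:
        buf.append(event["value"])
        out.append(round(sum(buf) / len(buf), 2))
    return out

means = rolling_mean(events)
print(means)
```

In a real pipeline the loop body would be a consumer callback and the results would flow to a sink, but the sliding-window state management is the same.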
Ready to take on this exciting challenge? Apply now! Date Posted 16-Jul-2025 Closing Date 30-Jul-2025 AstraZeneca embraces diversity and equality of opportunity. We are committed to building an inclusive and diverse team representing all backgrounds, with as wide a range of perspectives as possible, and harnessing industry-leading skills. We believe that the more inclusive we are, the better our work will be. We welcome and consider applications to join our team from all qualified candidates, regardless of their characteristics. We comply with all applicable laws and regulations on non-discrimination in employment (and recruitment), as well as work authorization and employment eligibility verification requirements.
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Title: Consultant - DevOps Platform Engineer
Career Level - C3
Introduction to role
AstraZeneca is a global, innovation-driven biopharmaceutical business that focuses on the discovery, development, and commercialization of prescription medicines for some of the world's most serious diseases. We span the entire value chain of a medicine from discovery, early- and late-stage development to manufacturing and distribution, and the global commercialization of primary care, specialty care-led and specialty care medicines that transform lives. At AstraZeneca, we pride ourselves on fostering an outstanding workplace culture that drives innovation and collaboration. Here, we encourage our teams to express different perspectives - making you feel valued, energized, and rewarded for your ideas and creativity. AstraZeneca is currently looking for a Platform Engineer to join our R&D IT Development Platform Management team to manage and maintain our Analytics & Reporting platform, which is GxP compliant and supports the important function of running clinical trials both internally and across external partners. This platform includes EntimICE, SAS Grid, SAS LSAF, SAS Viya, and the Visual Analytics software suite, with a mixture of hosted and on-prem solutions based primarily on SAS technology. You will be part of our core product team operating under the BizDevOps model, working multi-functionally with other team members such as the Product Lead, DevOps Lead, Release Manager, Business Analysts, and QMs, and key team members such as Business Partners and Product Owners.
Accountabilities
• Participate in business requirement gathering and design activities with business and IT customers as part of the product team
• Perform delivery activities through the Design/Build/Test/Deploy phases for regular releases on the A&R product
• Deploy patches and version upgrades for on-prem applications, and coordinate with the vendor for hosted solutions that fall under the A&R landscape
• Develop SAS macros based on APIs and perform unit testing
• Maintain configuration specification documentation for both functional and integration configurations
• Prepare SDLC documentation, KB articles, and Confluence documentation
• Ensure that all system security and control procedures are implemented and maintained
• Generate and implement ideas to streamline our integration landscape and simplify BAU support
• Resolve issues, monitor key metrics, and maintain the overall health of the platform
• Serve as an SME on the A&R platform for IT and business partners across regions and business areas
• Work with the centralized integration team and Globalscape teams to maintain legacy integration interfaces
• Work with software vendors on product requirements and issues related to the platform, security set-up, and functional configuration
• Use JIRA for requirements, tasks, validation/testing, and deployment activities during releases
• Resolve day-to-day incidents and service requests, and work towards incident reduction and automation of service-related activities.
• Update ServiceNow (ticket management tool) for all events with respect to incidents, service requests, changes, and problems
Essential Skills/Experience
• Proven experience in engineering and software architecture design
• Experience programming with Base SAS
• Experience working in agile teams using methodologies such as Scrum, Kanban, and SAFe
• Experience with security, authentication, and SSO topics (access to source data, access to data within SAS on the SAS servers across different business organizations, and SAS integration with tools such as Active Directory/LDAP/Kerberos)
• Excellent communication skills and the ability to work independently
• Ability to provide technical system solutions, resolve overall design direction, and provide hardware recommendations for sophisticated technical issues
• Experience planning and developing support processes and adhering to standard methodologies
• Knowledge of support processes such as Incident Management, Problem Management, and Change Management, and experience working in support teams
• Experience working with JIRA, Confluence, Git, and ServiceNow
• Willingness to take on different roles within the product team as and when the opportunity arises
Desirable Skills/Experience
• Knowledge of cloud technologies (AWS and Azure)
• Understanding of SAS Viya architecture
• SAS platform administration in a multi-node GRID environment on Linux with SAS 9.4M7
• Knowledge of file systems, storage devices, and ACLs
• Experience working with R and Python programming
• Knowledge of CI/CD practices and tools such as Jenkins, Docker, and Kubernetes
• Experience with automated testing tools such as JIRA (X-Ray), UFT, or Leapwork
• Knowledge of Splunk, SnapLogic, and reporting with Power BI
• Familiarity with GxP systems and working in regulated environments
• Familiarity with relational databases (MySQL, PostgreSQL) and non-SQL databases such as MongoDB
• A passion for learning, innovating, and delivering valuable software to people
Date Posted 14-Jul-2025 Closing Date 25-Jul-2025 AstraZeneca embraces diversity and equality of opportunity. We are committed to building an inclusive and diverse team representing all backgrounds, with as wide a range of perspectives as possible, and harnessing industry-leading skills. We believe that the more inclusive we are, the better our work will be. We welcome and consider applications to join our team from all qualified candidates, regardless of their characteristics. We comply with all applicable laws and regulations on non-discrimination in employment (and recruitment), as well as work authorization and employment eligibility verification requirements.
Posted 1 week ago
100.0 years
0 Lacs
Delhi, India
On-site
Department: Technology Location: India Compensation: ₱1,000,000 - ₱1,989,675 / year Description 🚢 Discover OTG: Ocean Technologies Group (OTG) is a leading provider of maritime software solutions. Our foundation is rooted in historic and iconic maritime brands with over 100 years of collective experience, including Seagull, Videotel, Marlins, MTS, Tero Marine, and COMPAS. These brands were founded on the principle of delivering advanced performance through superior technology. At OTG, we're more than a company; we're a collective of maritime enthusiasts, tech innovators, and visionaries. With a century-long legacy, we have been guiding the industry toward safety and operational excellence. From fleet management to unparalleled learning resources, OTG is shaping the future of maritime solutions and forming strategic alliances with global organizations. 🎯 Our Mission: Our mission is clear: to provide comprehensive software and training solutions to diverse organizations in the global maritime sector. Recognizing the maritime industry's global significance, our goal is to empower its professionals by equipping them with the skills and tools to maximize their potential, optimize ship performance, and ensure the safe and efficient operation of marine assets. To date, we have built a strong network, serving over 1,400 clients, reaching 20,000 vessels, and positively impacting the lives of more than 1,000,000 seafarers. Join us on our journey to make a significant difference in the maritime industry. Our portfolio includes Learning & Assessment, Fleet Management, and Crew Management, uniting seven iconic maritime brands with over a century of collective experience. 🔍 Why Join OTG's Crew? Legacy & Innovation: A century of maritime prowess meets cutting-edge solutions. Global Impact: Serving 1,400+ clients, 20,000 vessels, and over a million seafarers. Inclusive Culture: United by passion, join an impact-driven crew and bask in our inclusive cultural tide.
Backed & Bold: Powered by Private Equity, we're charting a thrilling course to reshape the industry. Growth Aboard: Sail into opportunities with our culture of continuous learning and internal progression. Tech meets Maritime: Dive into a vibrant atmosphere where passion for Maritime and technology merges seamlessly. 🧭 Navigating the position: Database Specialist. In this role you will design, develop, and maintain data ingestion, transformation, and modelling pipelines as part of the organisation's analytics landscape and wider data ecosystem. In this critical role you will provide the right data at the right time to enable decision making by both internal stakeholders and our customers. We're on an exciting journey to reimagine our approach to data and reporting. Our goal is to revolutionize how we deliver value from data across the business and to our customers. We're seeking innovative and forward-thinking data professionals to help us turn this vision into reality. If you're ready to make an impact and thrive in a dynamic, evolving data environment, we'd love to hear from you! 🚢 Your Voyage Ahead:
• Create and maintain optimal data pipeline architecture from multiple data sources.
• Assemble large, complex data sets that meet functional/non-functional business requirements.
• Identify, design, and implement internal process improvements: automating manual processes, optimising data delivery, re-designing infrastructure for greater scalability, etc.
• Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources.
• Ensure robust data validation and verification processes to ensure the correctness of data for reporting.
• Develop and maintain data warehousing approaches, including the modelling and pipelining of data for storage and retrieval.
• Keep data separated and secure as per best-practice guidelines.
• Ensure that the data taken from the information sources is accurate, liaising with other business functions when necessary.
• Proactively keep up to date with the latest cloud data technology.
• Produce supporting documentation, such as specifications, data models, and relations between data and other data objects, required for the effective development, usage, and communication of the data operations solutions with different business units, as and when required to ensure that overall objectives are met.
• Provide data exports for customer verification and sunsetting customers.
• Provide data exports as seed data for external systems integrations.
• Data mining and analysis.
• Documents migration.
🚢 Recommended to bring on board:
• Experience working with databases in on-premises and/or cloud-based environments such as MSSQL, MySQL, PostgreSQL, SQL Azure, Aurora MySQL, etc.
• A Microsoft-accredited or industry-recognized qualification for Microsoft SQL Server is preferred.
• Good understanding of how data management, cleansing, and query optimisation influence pipeline and data model design.
• Experience using cloud technologies (AWS preferred).
• Experience with scripting languages (e.g., JavaScript, C#).
• Demonstrably strong problem-solving skills to help design and implement solutions to data problems.
• Highly numerate and logical.
• Experience with CI/CD technologies and source control.
• Understand and apply infrastructure knowledge to develop and maintain an efficient environment.
• Modelling datasets in a way applicable for use in different visualisation tools (e.g., Power BI).
• Strong stakeholder management skills.
• Ability to clearly communicate and present outputs to stakeholders.
• Experience of working in fast-paced environments to strict deadlines.
• Team player who can communicate effectively with technical and non-technical colleagues.
• Passionate about continuous improvement, collaboration, and knowledge sharing.
• The ability to manage time, prioritise tasks, self-review your work, and produce deliverables of a high quality under tight client deadlines in time-pressured environments.
• It would be useful if you've got experience with some of these technologies: AWS Glue, AWS RDS, AWS Aurora, AWS S3, Amazon Athena, Amazon EMR, PostgreSQL, Python, Bitbucket, Liquibase, Jenkins, Octopus.
• Experience working with REST APIs.
• Good command of the English language, written and oral.
• Good to have: experience with building and maintaining ETL/ELT pipelines with SnapLogic as part of an automated workflow on high-volume, high-dimensionality data from varying sources.
• Experience using project tracking and reporting systems such as Monday and Zendesk.
• Maritime sector experience.
🛳️ Navigating Life with OTG: Unveil a Treasure Trove of Benefits
Safeguard your tomorrow: Future Security with SSS & HDMF: Secure your financial future and home ownership dreams with contributions to Social Security and Home Development Mutual Fund. Healthcare Assurance with Philhealth: Rest easy knowing your health is safeguarded by the Philippine Health Insurance Corporation. Extra Allowances for Daily Comfort: Boost Your Budget: Receive a monthly allowance of 2,500 PHP for your everyday expenses. Top-Tier Health and Wellness Coverage: Comprehensive Medical Insurance: After your initial 3 months, enjoy Maxicare's extensive medical insurance for you and one dependent, including up to 200,000 PHP per illness annually. Around-the-Clock Teleconsult: Have 24/7 access to medical consultations, ensuring you and your family's health concerns are promptly addressed. Annual Health Maintenance: Benefit from regular check-ups and dental care to maintain your health year-round.
Life Insurance Peace of Mind: Gain additional security with a life insurance policy valued at 250,000 PHP, protecting what's most important to you. Employee Support and Wellbeing: Employee Assistance Programme (EAP): Our EAP, provided by Health Assured and ComPsych, offers confidential counseling, financial, and legal support via phone, ensuring your well-being is always a priority. ⭐ OTG's Guiding Stars: ⭐ Pioneering: Constantly charting new courses in innovation. ⭐ Caring: Keeping the maritime community's safety and sustainability at the helm. ⭐ Collaborating: Navigating together with clear communication and shared goals. ⭐ Optimizing: Always in pursuit of excellence and constructive evolution. 🚢 Will You Navigate the Next Chapter with Us? Join us on a journey that transcends a traditional job: it's a mission to innovate at the intersection of technology and education. Discover more about our vision at Ocean Technologies Group and see if your path aligns with our pioneering direction. We're eager to welcome aboard our next visionary Database Specialist, ready to make a significant impact on both tech and educational fronts. Cast your resume into our waters and embark on a journey that transcends a mere job: it's an adventure in innovation and support. Visit us at Ocean Technologies Group and see if your compass aligns with ours. We're excited to welcome our next Oceaneer onboard. ✊🏼 All Hands on Deck: We steer with equality and celebrate the diversity of our Oceaneers. OTG is a proud equal opportunity employer, where passion unites and differences are celebrated.
Posted 1 week ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job description:
Experience: 5+ Years
Shift: 9 AM to 6 PM
Hybrid Work Mode
Job Description: We are looking for a ServiceNow ITSM and Integration Specialist to identify, design, and deliver larger or more complex architectures, strategies, and specific solutions through the ServiceNow platform. The role holder will bring:
• In-depth knowledge of the ServiceNow platform, technology, development, integration, and modules, to design, develop, implement, and test modules in line with agreed timelines, budget, costs, quality, and development standards.
• Hands-on ITSM process implementation experience on ITSM modules: Incident Management, Problem Management, Change Management, Knowledge Management, etc.
• Hands-on integration implementation experience and in-depth knowledge of designing and developing complex integrations of ServiceNow with external tools (e.g., Salesforce, SCOM, MuleSoft, SnapLogic, Perspectium) and ServiceNow-to-ServiceNow integration as per business needs.
• In-depth knowledge of inbound and outbound integrations and of REST and SOAP service concepts.
• Hands-on implementation experience with bi-directional and custom integrations.
• Hands-on experience implementing custom REST and SOAP services.
• Identification and design of ServiceNow integration requirements, including:
  o ServiceNow MID Server setup
  o Event Management integration
  o Service Mapping & Discovery integration (CMDB)
• Develop and configure the ServiceNow ITSM application and Service Portal in accordance with customer requirements and best practices.
• Collaborate with business stakeholders to gather and analyze requirements and translate them into technical specifications and solutions.
• Develop UI forms, UI Actions, notifications, workflows, Transform Maps, and Flows via Flow Designer.
• Create JavaScript server/client code and components: Script Includes, Business Rules, Client Scripts, ACLs, etc.
• Create, modify, and publish service catalog items and Record Producers.
• Suggest and implement proposals for process improvements within projects, once agreed. • Provide technical documentation on modules / products, following company standards on customer projects. • Maintain high level of product and solution knowledge and maintains an up-to-date skills profile in ServiceNow and Java Scripting, HTML. • Ability to drive business requirements and take an active/leading role with customer stakeholders including C-level / director level / users. • Advise, mentor, and provide hands-on assistance to other team members regarding the technical solution design and execution. • Develop and document best practice approaches and reusable assets for the deployment team. • Create and design user stories as well as assess solution built by developers and Support and Perform system testing and systems integration testing. • Support user acceptance testing • Support go-live and early life/warranty support periods. • Stay up to date with the latest ServiceNow features and functionalities and make recommendations for improvement.
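Inbound REST integrations of the kind this role covers commonly go through the ServiceNow Table API (e.g., POST to `/api/now/table/incident` to create an incident). A minimal Python sketch of building such a request; the instance name is a placeholder, and a real call would also need authentication headers:

```python
import json
from urllib import request

def build_incident_request(instance: str, short_description: str, urgency: int = 2):
    """Build a ServiceNow Table API request that creates an incident record."""
    url = f"https://{instance}.service-now.com/api/now/table/incident"
    payload = json.dumps({"short_description": short_description,
                          "urgency": str(urgency)}).encode()
    return request.Request(url, data=payload, method="POST",
                           headers={"Content-Type": "application/json",
                                    "Accept": "application/json"})

# "dev12345" is a hypothetical instance; basic-auth or OAuth headers would be
# added before actually sending the request with urllib or requests.
req = build_incident_request("dev12345", "Email service degraded")
print(req.full_url)      # https://dev12345.service-now.com/api/now/table/incident
print(req.get_method())  # POST
```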
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
Rojo Integrations, a comprehensive SAP integration leader, was founded in 2011. Partnering with top software vendors like SAP, Coupa, SnapLogic, and Solace, Rojo specializes in seamless enterprise integration and data analytics solutions. Trusted by global blue-chip companies such as Heineken and Siemens, Rojo delivers tailored services to meet unique business needs. The company is headquartered in the Netherlands and operates globally from offices in the Netherlands, Spain, and India, focusing on SAP integration modernization and business processes to improve data integration and business strategies. Rojo's portfolio includes consultancy, software development, and managed services to streamline integration, enhance observability, and drive growth.

The Rojo Managed Services team ensures customer satisfaction with real-time monitoring, error reporting, troubleshooting, and active performance improvements. The team aims to prevent incidents and provide sustainable solutions promptly while tackling new challenges daily. Join the team of puzzlers and contribute to solving the next big challenge.

To succeed in this role, you should have 3-6 years of experience within an IT organization, preferably in integration support. Knowledge of monitoring/observability tools like Splunk or Datadog and of leading integration platforms such as SAP CI, SnapLogic, or MuleSoft is essential. A passion for technology and programming, professional English proficiency, a strong customer service orientation, and the ability to work in a diverse, global 24/7 team are required. Familiarity with event-driven architecture and past experience with applications like Salesforce, AWS, Snowflake, MS Dynamics CRM, and other ERP tools is beneficial. Basic programming experience, familiarity with JIRA Service Desk, and flexibility to work rotational/flexible/weekend shifts in a hybrid work environment are necessary. Immediate joiners are preferred.
Additional desired skills include a bachelor's degree in Computer Science, Software Engineering, or equivalent; analytical skills; a continuous-improvement mindset; the ability to work according to procedures and best practices; and autonomy in decision-making. Rojo offers the opportunity to gain work experience in a dynamic environment with growth opportunities, innovative projects, and a supportive learning environment. Training, mentoring, an international atmosphere, and a diverse working climate are provided, along with exciting region-specific benefits. If you are interested in this opportunity, apply now; Rojo values diversity and encourages applicants who may not meet all criteria to still apply.
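The real-time monitoring and error-reporting duties described above boil down to watching integration runs and flagging abnormal failure rates before they become incidents. A minimal, tool-agnostic Python sketch; the record shape and threshold are illustrative, not taken from Splunk, Datadog, or any specific platform:

```python
from collections import Counter

def error_rate(runs):
    """Fraction of integration runs that ended in 'failed' status."""
    counts = Counter(r["status"] for r in runs)
    total = sum(counts.values())
    return counts.get("failed", 0) / total if total else 0.0

def should_alert(runs, threshold=0.05):
    """Alert when the failure rate crosses the threshold (default 5%)."""
    return error_rate(runs) > threshold

# 2 failures out of 20 runs: 10% failure rate, above the 5% threshold.
runs = [{"status": "success"}] * 18 + [{"status": "failed"}] * 2
print(round(error_rate(runs), 2))  # 0.1
print(should_alert(runs))          # True
```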
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
chennai, tamil nadu
On-site
You should have experience working with BigID or Collibra, along with knowledge of data classification and data products. It is important to have an understanding of data loss and personal information security. Exposure to platforms such as Snowflake, S3, Redshift, SharePoint, and Box is required, as is knowledge of connecting to various source systems. A deep understanding and practical knowledge of IDEs like Eclipse, PyCharm, or any workflow designer is essential. Experience with one or more of the following languages is preferred: Java, JavaScript, Groovy, Python. Hands-on experience with CI/CD processes and tooling such as GitHub is necessary, as is working experience in DevOps teams based on Kubernetes tools. Proficiency in database concepts and a basic understanding of data classification, lineage, and storage would be advantageous. Excellent written and spoken English, interpersonal skills, and a collaborative approach to delivery are essential.

Desirable Skills and Experience:
- A total of 8 to 12 years of overall IT experience
- A technical degree to support your experience
- Deep technical expertise
- Demonstrated understanding of the required technology and problem-solving skills
- Analytical, focused, and capable of working independently with minimal supervision
- Good collaborator management and a team player
- Exposure to platforms like Talend Data Catalog, BigID, or Snowflake is beneficial
- Basic knowledge of AWS is a plus
- Knowledge of and experience with integration technologies such as MuleSoft and SnapLogic
- Proficiency in Jira, including the ability to quickly generate JQL queries and save them for reference
- Proficient in creating documentation in Confluence
- Experience with Agile practices, preferably having been part of an Agile team for several years
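Data classification of the kind this role describes is often bootstrapped with pattern-based column scanning before a tool like BigID or Collibra refines the labels. A minimal Python sketch; the patterns, labels, and 80% match threshold are illustrative, not any vendor's rules:

```python
import re

# Illustrative detection patterns; real classifiers use far richer rule sets.
PATTERNS = {
    "email": re.compile(r"^[\w.+-]+@[\w-]+\.[\w.]+$"),
    "us_ssn": re.compile(r"^\d{3}-\d{2}-\d{4}$"),
}

def classify_column(values):
    """Label a column by the first pattern matching most of its sample values."""
    for label, pattern in PATTERNS.items():
        hits = sum(bool(pattern.match(v)) for v in values)
        if values and hits / len(values) >= 0.8:
            return label
    return "unclassified"

print(classify_column(["a@x.com", "b@y.org", "c@z.net"]))  # email
print(classify_column(["123-45-6789", "987-65-4321"]))     # us_ssn
print(classify_column(["hello", "world"]))                 # unclassified
```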
Posted 1 week ago
12.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Category: Software Development/Engineering
Main location: India, Karnataka, Bangalore
Position ID: J0725-1154
Employment Type: Full Time

Position Description:
Job Title: Senior SnapLogic ETL Engineer
Position: Senior Software Engineer (SSE)
Experience: 12+ years (minimum 10 years of relevant experience); should be able to work independently and lead the project

Company Profile:
At CGI, we're a team of builders. We call our employees members because all who join CGI are building their own company, one that has grown to 72,000 professionals located in 40 countries. Founded in 1976, CGI is a leading IT and business process services firm committed to helping clients succeed. We have the global resources, expertise, stability, and dedicated professionals needed to achieve results for our clients and for our members. Come grow with us. Learn more at www.cgi.com.

This is a great opportunity to join a winning team. CGI offers a competitive compensation package with opportunities for growth and professional development. Benefits for full-time, permanent members start on the first day of employment and include a paid time-off program and profit participation and stock purchase plans. We wish to thank all applicants for their interest and effort in applying for this position; however, only candidates selected for interviews will be contacted. No unsolicited agency referrals, please.

Responsibilities: These resources will lead the integration work, including designing, developing, and implementing SnapLogic-based point-to-point pipelines. We are looking for senior-level talent with the ability to:
• Work independently and lead technical delivery
• Collaborate as part of an integrated team with client-side stakeholders
• Communicate clearly and confidently
• Use tools like Jira and Confluence for tracking, documentation, and updates

Deep understanding of SnapLogic components: Snaps, pipelines, accounts, tasks, SnapLogic Designer for designing integrations, Snaplexes for executing pipelines, and API Management for exposing and governing APIs.

Proficiency in pipeline development: senior developers should be highly skilled in building, optimizing, and troubleshooting complex SnapLogic pipelines for various integration scenarios.

Skills: Confluence, ETL, MuleSoft, Data Engineering

What you can expect from us: Together, as owners, let's turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect, and belonging. Here, you'll reach your full potential because you are invited to be an owner from day 1 as we work together to bring our Dream to life. That's why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company's strategy and direction. Your work creates value. You'll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You'll shape your career by joining a company built to grow and last. You'll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons.
Come join our team—one of the largest IT and business consulting services firms in the world.
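SnapLogic itself is low-code, but the point-to-point pipelines this role centers on follow the same extract, transform, load shape that can be sketched in plain Python (the record fields and three stages below are illustrative stand-ins for Snaps, not SnapLogic's API):

```python
def extract():
    """Source stage: pull raw records (here, an in-memory stand-in)."""
    return [{"id": 1, "amount": "10.50"}, {"id": 2, "amount": "3.25"}]

def transform(records):
    """Mapper stage: cast string amounts to floats, keeping other fields."""
    return [{**r, "amount": float(r["amount"])} for r in records]

def load(records):
    """Target stage: deliver records downstream (here, just return them)."""
    return records

# Chaining the stages mirrors connecting Snaps into a pipeline.
pipeline = load(transform(extract()))
print(pipeline[0]["amount"])  # 10.5
```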
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
maharashtra
On-site
At PwC, we specialise in consulting services for a variety of business applications, assisting clients in optimising operational efficiency. As a member of the SAP data and analytics team, you will focus on providing consulting services for data and analytics solutions using SAP technologies. Your responsibilities will include analysing client requirements, designing and implementing data management and analytics solutions, and offering training and support to ensure effective utilisation of SAP tools. By closely collaborating with clients, you will develop data models, perform data analysis, and create visualisations and reports to facilitate data-driven decision-making, ultimately helping clients enhance data management processes, improve data quality, and derive valuable insights for achieving strategic objectives. Moreover, expertise in data migration and familiarity with SAP BODS will be advantageous for this role. You will work alongside SAP functional and technical consultants to analyse data migration requirements and legacy data structures. Your tasks will involve designing suitable data migration solutions, including transformation routines where necessary, and conducting data source analysis and profiling within SAP ECC to ensure data quality and conformity to the target S/4 system. Furthermore, you will be responsible for building, testing, executing, and managing data migrations using tools such as SAP LSMW and SAP Data Services. Experience with SAP Migration Cockpit, particularly LTMC, and customisation of LTMOM for adding new fields will be essential. You will oversee pre-load and post-load validations of migration results, address dropouts, and provide guidance on data cleansing needs. Additionally, you will conduct migration dress rehearsals and cutover tasks, and offer post-go-live support.
In your advisory capacity, you will guide businesses in understanding data management challenges and recommend appropriate strategies and techniques. Your background should include extensive experience in data management and migration activities within an SAP environment. Proficiency in designing and implementing SAP's ETL solutions, such as SAP Data Services and SAP LSMW, as well as integration technologies like IDocs and BAPIs, is crucial. You will be required to implement a data migration framework according to the data migration architecture using SAP Data Services and Information Steward, and create technical specification documents for Data Services jobs based on data migration experience and programming knowledge. Furthermore, you will support data cleansing activities using tools like Information Steward, Excel, Data Services, and SQL, and assist in data validation exercises during system integration testing (SIT), user acceptance testing (UAT), and production cut-over phases. Familiarity with SnapLogic for data migration to S/4HANA will be considered a valuable asset for this role.

Positional Requirement:
- Strong technical knowledge of data migration and prior project-lead experience

Experience:
- 5 to 9 years of experience

Preferred Skills:
- SnapLogic, SAP BODS, LTMC, Information Steward

Preferred Knowledge:
- Technical knowledge of SnapLogic

Professional & Educational Background:
- B.Tech, B.E., or any graduate degree
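The pre-load and post-load validation work mentioned above usually reduces to reconciling source and target extracts by key to surface dropouts. A minimal Python sketch; the key field and record shape are illustrative, not tied to any SAP tool:

```python
def reconcile(source, target, key="id"):
    """Compare source and target extracts; report dropouts and extras by key."""
    src_keys = {r[key] for r in source}
    tgt_keys = {r[key] for r in target}
    return {"dropouts": sorted(src_keys - tgt_keys),   # in source, missing in target
            "extras": sorted(tgt_keys - src_keys)}     # in target, not in source

# Record 2 was dropped during load; record 4 appeared unexpectedly.
source = [{"id": 1}, {"id": 2}, {"id": 3}]
target = [{"id": 1}, {"id": 3}, {"id": 4}]
print(reconcile(source, target))  # {'dropouts': [2], 'extras': [4]}
```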
Posted 1 week ago