
178 SnapLogic Jobs - Page 5

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

2.0 - 6.0 years

10 - 14 Lacs

Pune

Work from Office

Source: Naukri

A Snapshot of Your Day
You are proficient in SnapLogic and poised to take on the role of Integration Architect. Your work will be recognized globally, since you will support several business units with ~70K users, and you will be given an atmosphere in which to learn and develop your skills across many dimensions and technologies.

How You'll Make an Impact
- Implement integration scenarios based on the provided inputs and advise on possible improvements to the proposed solutions.
- Partner with other architects, vendors, partners, and business and technical teams to understand business needs, translate them into capability/platform roadmaps, and work toward realizing those roadmaps.
- Work across geographies to support Siemens Energy's internal and external customers.
- Implement standard processes and design patterns, and provide continuous coaching and mentoring to team members.
- Act as the primary developer evangelist for the Data Integration team across integration patterns involving SnapLogic and APIs, optimizing adoption and use of the service.
- Follow guidelines and collaborate with the team to build innovative solutions with an emphasis on quality code.
- Build and maintain effective working relationships with key technology team members.

What You Bring
- 2-6 years of experience in application integration, with at least 3 years in SnapLogic.
- Bachelor's or master's degree in Engineering (Computer Science/IT) or equivalent experience.
- Experience in technology consulting, architecture, and design focused on enterprise integration, SOA, and application integration; Business-to-Business (B2B) experience is a plus.
- Expertise in integration modernization and in recommending leading iPaaS products (webMethods, SnapLogic).
- Experience developing integration pipelines in SnapLogic or alternative platforms, and working as an integration developer across multiple integration projects.
- Command of the core Snaps, with hands-on experience designing, developing, and debugging pipelines, plus knowledge of Ultra and triggered tasks and the deployment process.
- Good understanding of web methodologies: REST architecture, SOA, ESB, APIs, web services, WSDL, OpenAPI, XML, JSON, EDIFACT, X12, and flat files.
- Protocols: AS2, HTTP(S), SOAP, JDBC, JMS, Mail, (S)FTP; SAP adapters: IDoc, RFC, XI.
- Good to have: interface architecture knowledge of one or more on-premises/SaaS packaged solutions for ERP, CRM, or PLM systems (e.g., SAP, S/4HANA, Salesforce, Teamcenter, Oracle).
- Experience in Snap development and Snap operations.
- Good knowledge of SAP, especially IDoc generation.
- Knowledge of the EDIFACT standard for B2B and of EDIFACT message types such as ORDRSP, ORDERS, and INVOIC.
- Languages: Python and JavaScript.

About the Team
Who is Siemens Energy? At Siemens Energy, we are more than just an energy technology company. With ~100,000 dedicated employees in more than 90 countries, we develop the energy systems of the future, ensuring that the growing energy demand of the global community is met reliably and sustainably. The technologies created in our research departments and factories drive the energy transition and provide the base for one sixth of the world's electricity generation. Our global team is committed to making sustainable, reliable, and affordable energy a reality by pushing the boundaries of what is possible. We uphold a 150-year legacy of innovation that encourages our search for people who will support our focus on decarbonization, new technologies, and energy transformation. Find out how you can make a difference at Siemens Energy: https://www.siemens-energy.com/employeevideo

Our Commitment to Diversity
Lucky for us, we are not all the same. Through diversity we generate power. We run on inclusion, and our combined creative energy is fueled by over 130 nationalities. Siemens Energy celebrates character, no matter the ethnic background, gender, age, religion, identity, or disability. We energize society, all of society, and we do not discriminate based on our differences.

Rewards/Benefits
- All employees are automatically covered under medical insurance, including a company-paid family floater covering the employee, spouse, and two dependent children up to 25 years of age.
- Siemens Energy offers all employees the option of a meal card, per the terms and conditions prescribed in company policy, as a tax-saving measure.
- As part of CTC, Flexi Pay empowers employees to customize the amounts of certain salary components within a defined range, optimizing tax benefits so that each employee can decide on the best possible net income from the same fixed base pay each month.
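SnapLogic pipelines are assembled from Snaps rather than hand-written code, but the JSON-to-EDIFACT mapping work this posting describes can be sketched in plain Python. The input document shape and the segment qualifiers below are illustrative, not taken from any real interface:

```python
# Illustrative JSON-to-EDIFACT mapping of the kind an integration
# pipeline performs. The input shape is hypothetical; segment tags
# (BGM, DTM, LIN, QTY) follow the EDIFACT ORDERS message type.

def order_to_edifact(order: dict) -> str:
    """Render a simple order document as EDIFACT-style segments."""
    segments = [
        f"BGM+220+{order['order_id']}+9",     # document/message number
        f"DTM+137:{order['date']}:102",       # message date, CCYYMMDD
    ]
    for i, line in enumerate(order["lines"], start=1):
        segments.append(f"LIN+{i}++{line['sku']}:EN")  # line item by EAN
        segments.append(f"QTY+21:{line['qty']}")       # ordered quantity
    return "'".join(segments) + "'"                    # segment terminator

order = {
    "order_id": "PO-1001",
    "date": "20240115",
    "lines": [{"sku": "4000862141404", "qty": 5}],
}
print(order_to_edifact(order))
```

In a SnapLogic pipeline the same mapping would typically live in a Mapper Snap rather than a script.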

Posted 2 weeks ago


5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Source: LinkedIn

Dentsu’s master data management (MDM) team uses Semarchy to master critical enterprise data domains such as client and customer, driving data governance and efficient operations and improving the quality of, and trust in, our data and insights. The Lead Semarchy Developer will work closely with business and technical teams to design, develop, test, deploy, and maintain data products that meet functional and non-functional requirements. The lead developer will also lead other developers, shaping best practices and development standards, conduct process and code reviews, and manage the technical team's implementation tasks.

Job Description:
Core Requirements:
- Knowledge of master data management (MDM).
- Experience working with MDM systems (such as Informatica, IBM DataStage, Trillium, or Semarchy).
- A Computer Science or numerate degree.
- Minimum 5 years of SQL experience in a data warehouse, analytics, or data migration environment.
- Technical leadership and people management.
- Excellent communication skills, with the ability to document and present design patterns, code reviews, and runbooks.
- Knowledge of record matching and/or data quality issues.
- Experience working with integration tools (such as Azure Data Factory, SnapLogic, or BizTalk).
- Understands project management principles.
- Has performed demonstrations to stakeholders.
- Understanding of how to implement algorithms.
- Database design using normalisation techniques; experienced in designing entity relationship diagrams.
- Has worked in a technical team to deliver team goals, in an Agile environment (using Jira, Azure DevOps, or other agile tooling), on internal stakeholder or customer projects.
- Understands the technical development lifecycle and the difference between good and bad design; uses coding standards; has created test plans/scripts.
- Must be a team player and a strong problem solver.

Preferred Requirements:
- Experience with programming languages (such as C, C++, C#, or Python).
- Experience with reporting tools (Tableau or Power BI).

Location: DGS India - Mumbai - Thane Ashar IT Park
Brand: Dentsu
Time Type: Full time
Contract Type: Permanent
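The record-matching skill called out above can be illustrated with a minimal fuzzy-match sketch. The fields compared and the 0.85 threshold are hypothetical choices for illustration, not Semarchy defaults:

```python
# Minimal record-matching sketch of the kind MDM platforms automate.
# Field names and the similarity threshold are illustrative only.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Case-insensitive string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def is_match(rec_a: dict, rec_b: dict, threshold: float = 0.85) -> bool:
    """Treat two customer records as candidate duplicates when both
    name and city clear the similarity threshold."""
    return (similarity(rec_a["name"], rec_b["name"]) >= threshold
            and similarity(rec_a["city"], rec_b["city"]) >= threshold)

a = {"name": "Acme Corporation", "city": "Mumbai"}
b = {"name": "Acme Corporation Ltd", "city": "Mumbai"}
print(is_match(a, b))  # True
```

Production matching engines add blocking, phonetic keys, and survivorship rules on top of this basic pairwise comparison.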

Posted 3 weeks ago


5.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Source: Naukri

As a Python Developer, you will play a critical role in our software development and data engineering initiatives. You will work closely with data engineers, architects, and other developers to build and maintain our applications and data pipelines. Your expertise in Python development, API design, and cloud technologies will be essential to your success.

Responsibilities:
- Design, develop, and maintain applications using the latest Python frameworks and technologies (Django, Flask, FastAPI).
- Utilize Python libraries and tools (Pandas, NumPy, SQLAlchemy) for data manipulation and analysis.
- Develop and maintain RESTful APIs, ensuring security, authentication, and authorization (OAuth, JWT).
- Deploy, manage, and scale applications on AWS services (EC2, S3, RDS, Lambda).
- Utilize infrastructure-as-code tools (Terraform, CloudFormation) for infrastructure management (good to have).
- Design and develop database solutions using PL/SQL (packages, functions, ref cursors).
- Implement data normalization and Oracle performance optimization techniques.
- Design and develop data warehouse solutions, including data marts and ODS concepts, and implement low-level designs of warehouse solutions.
- Work with Kubernetes for container orchestration: deploying, managing, and scaling applications on Kubernetes clusters.
- Utilize the SnapLogic cloud-native integration platform to design and implement integration pipelines.

Required Skills:
- Expertise in Python frameworks (Django, Flask, FastAPI).
- Proficiency in Python libraries (Pandas, NumPy, SQLAlchemy).
- Strong experience designing, developing, and maintaining RESTful APIs.
- Familiarity with API security, authentication, and authorization mechanisms (OAuth, JWT).
- Good hands-on knowledge of PL/SQL (packages, functions, ref cursors).
- Knowledge of data normalization and Oracle performance optimization techniques.
- Experience in the development and low-level design of warehouse solutions.
- Familiarity with data warehouse, data mart, and ODS concepts.
- Proficiency in AWS services (EC2, S3, RDS, Lambda).

Good-to-Have Skills:
- Kubernetes: hands-on experience with Kubernetes for container orchestration.
- Infrastructure as code: experience with tools such as Terraform and CloudFormation.
- Integration platforms: experience with the SnapLogic cloud-native integration platform.

Experience: 5 to 8 years as a Python Developer.
Location: Bangalore or Gurgaon
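The API authentication requirement above (OAuth, JWT) rests on signed tokens. This stdlib-only sketch shows the HS256 signing scheme that JWT libraries such as PyJWT implement; a real service should use a maintained library rather than this illustration:

```python
# Hedged sketch of HS256 token signing/verification (the mechanism
# behind JWTs). For production, use a vetted library such as PyJWT.
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWT requires."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_token(payload: dict, secret: str) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    sig = hmac.new(secret.encode(), f"{header}.{body}".encode(),
                   hashlib.sha256).digest()
    return f"{header}.{body}.{b64url(sig)}"

def verify_token(token: str, secret: str) -> bool:
    header, body, sig = token.split(".")
    expected = hmac.new(secret.encode(), f"{header}.{body}".encode(),
                        hashlib.sha256).digest()
    # Constant-time comparison prevents timing side channels.
    return hmac.compare_digest(b64url(expected), sig)

token = sign_token({"sub": "user-1"}, "demo-secret")
print(verify_token(token, "demo-secret"))   # True
print(verify_token(token, "wrong-secret"))  # False
```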

Posted 3 weeks ago


8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Job Title: Middleware and API Developer
Location: Chennai, India
Job Type: Full-Time
Experience Level: [Specify: Mid-Level / Senior-Level]

Job Overview:
We are looking for a skilled Middleware and API Developer to join our technology team in Chennai. The ideal candidate will have hands-on experience with RESTful APIs, iPaaS platforms, SOAP web services, and data transformation tools such as XSLT, Data Mapper, and Expression Language. This role involves designing, developing, and supporting enterprise integration solutions that enable seamless communication across internal and external systems.

Key Responsibilities:
- Design, develop, and manage integration solutions using iPaaS platforms.
- Build and maintain RESTful and SOAP-based APIs for enterprise applications.
- Implement data transformation and routing using XSLT, Data Mapper, and Expression Language.
- Integrate third-party systems, cloud services, and legacy applications through middleware solutions.
- Collaborate with architects, business analysts, and development teams to understand integration requirements.
- Optimize API and middleware performance; ensure high availability and security.
- Prepare and maintain technical documentation and integration specifications.
- Support production incidents and implement necessary fixes and improvements.

Required Skills:
- Strong experience with RESTful APIs and SOAP web services.
- Hands-on experience with iPaaS platforms (e.g., Dell Boomi, MuleSoft, SnapLogic, Workato, Jitterbit).
- Proficiency in XSLT for XML transformations and in Data Mapper tools.
- Familiarity with the Expression Language used in integration platforms.
- Experience with API design, authentication (OAuth2, JWT), and security best practices.
- Understanding of enterprise integration patterns (EIP).
- Working knowledge of message queuing systems (e.g., JMS, RabbitMQ, Kafka) is a plus.

Preferred Skills:
- Experience with cloud environments (AWS, Azure, GCP).
- Exposure to microservices architecture and containerized environments (Docker, Kubernetes).
- Familiarity with Agile/Scrum development practices.
- Experience in performance tuning and monitoring of integrations.

Education & Experience:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- 6-8 years of relevant experience in middleware and API development.

Soft Skills:
- Excellent problem-solving and debugging skills.
- Strong verbal and written communication abilities.
- Ability to work independently and in a collaborative team environment.
- Commitment to quality and timely delivery.
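True XSLT requires a library such as lxml, but the field-mapping step a Data Mapper performs can be sketched with the standard library alone. The source document and the target field names here are invented for illustration:

```python
# Stdlib sketch of an XML-to-JSON field mapping, the kind of work a
# Data Mapper (or XSLT stylesheet) does in an integration platform.
# Document shape and target field names are hypothetical.
import json
import xml.etree.ElementTree as ET

SOURCE = """
<order id="PO-77">
  <customer>Initech</customer>
  <total currency="INR">1250.00</total>
</order>
"""

def map_order(xml_text: str) -> str:
    root = ET.fromstring(xml_text)
    mapped = {
        "orderId": root.get("id"),                     # attribute -> field
        "customerName": root.findtext("customer"),     # element text
        "amount": float(root.findtext("total")),       # string -> number
        "currency": root.find("total").get("currency"),
    }
    return json.dumps(mapped, sort_keys=True)

print(map_order(SOURCE))
```

The same mapping expressed in XSLT would be declarative; the imperative version makes the individual field routes explicit.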

Posted 3 weeks ago


9.0 years

0 Lacs

Greater Bengaluru Area

On-site

Source: LinkedIn

What if the work you did every day could impact the lives of people you know? Or all of humanity? At Illumina, we are expanding access to genomic technology to realize health equity for billions of people around the world. Our efforts enable life-changing discoveries that are transforming human health through the early detection and diagnosis of diseases and new treatment options for patients. Working at Illumina means being part of something bigger than yourself. Every person, in every role, has the opportunity to make a difference. Surrounded by extraordinary people, inspiring leaders, and world-changing projects, you will do more and become more than you ever thought possible.

Job Summary
This position is an exciting opportunity to join the Data Integration & Analytics team within the GIS Application & Platform Services department. The team's scope includes data services on enterprise data platforms such as the Snowflake cloud data platform, SAP HANA analytics, and Denodo data virtualization, and it is responsible for managing the full software development lifecycle of data, its quality, and its operations. This role supports strategic solutions such as the enterprise data lake on AWS/Snowflake and the enterprise data warehouse on Snowflake, collaborating with cross-functional teams, planning and coordinating requirements, providing data engineering services, and helping build trust in the data being managed.

Job Duties
- Translate business requirements into data requirements, data warehouse design, and sustainable data management strategies on enterprise data platforms such as Snowflake.
- Work with project leads, stakeholders, and business SMEs to define technical specifications, develop data modeling requirements, and maintain data infrastructure that provides business users with the tools and data they need.
- Architect, design, and develop large-scale, optimized analytics solutions.
- Gather and analyze requirements; plan and coordinate development in collaboration with stakeholder teams.
- Understand data requirements and data latency to design and develop data ingestion pipelines.
- Design and architect data lake solutions on AWS S3 and Snowflake, considering scalability, automation, security, and performance; build and optimize the data lake to store, process, and analyze large volumes of structured and unstructured data.
- Understand data architecture and solution design; design and develop dimensional/semantic data models in an enterprise data warehouse environment.
- Develop and automate enterprise data transformation pipelines.
- Work with cross-functional teams and process owners to develop test cases and scripts, and test models and solutions to verify that requirements are met, ensuring high levels of data quality; develop and apply quality assurance best practices.
- Design and apply data engineering best practices for data lakes and data warehouses.
- Analyze data and data behaviors to support business user queries, with an excellent understanding of the impact of changes to data platforms, data models, and data behaviors.
- Apply excellent problem-solving skills to troubleshoot complex data engineering issues.
- Benchmark application operational performance periodically, track metrics, and fix issues.
- Understand and comply with data governance and compliance practices as defined for risk management, including data encryption practices, RBAC, and security policies.
- Promote and apply metadata management best practices supporting enterprise data catalogs.
- Support change and release management processes; support incident and response management, including problem solving, root cause analysis, and documentation; support automation and on-call processes (Tier 1/Tier 2).

Specific Skills or Other Requirements
Requires 9 years of experience, with:
- Primary experience: data integration and data warehousing.
- Data platforms: Snowflake cloud data platform required; SAP HANA analytics preferred.
- Data integration: 8+ years of experience with integration technologies and ETL/ELT patterns of data delivery; strong understanding and experience implementing SDLC practices.
- Snowflake: 3+ years of required expertise with SnowSQL, Snowpipe (integrated with AWS S3), Streams and Tasks, stored procedures, MERGE statements, functions, RBAC, security policies, and compute/storage usage and performance optimization techniques.
- dbt Cloud: 2+ years of expertise with the dbt Cloud platform; a very good understanding of data models (views, materializations, incremental loads, snapshots), cross-functional references, DAGs and their impact, job scheduling, audit and monitoring, code repositories, and deployments.
- AWS: 2+ years of required expertise with services such as S3, Glue, Lambda, and Athena.
- Apache software (preferred): data processing with Spark and Flink, message brokering with Kafka, orchestration with Airflow, and processing high data volumes in open table formats such as Iceberg.
- HVR (Fivetran): experience or knowledge of data replication with HVR is a plus.
- SnapLogic: experience or knowledge of data integrations with SnapLogic is a plus.
- Certifications: Snowflake and dbt Cloud data engineering certifications are a plus.
- Source systems (required): knowledge and experience integrating data from SAP ERP (on-premises and cloud), Salesforce CRM, Workday, ServiceNow, relational databases, REST APIs, flat files, and cloud storage.
- Data orchestration (required): Control-M, Apache Airflow.
- Cloud storage platforms: Amazon Web Services required; Microsoft Azure preferred.
- Programming/scripting: Snowflake scripting (Snowpipes, Tasks, Streams, MERGE statements, stored procedures, functions, security policies), SQL, Python, PySpark.
- Code management (required): excellent understanding of code repositories such as GitHub and GitLab, code version management, and branching and merging patterns in a central repository managing cross-functional code and deployments.
- Data operations: excellent understanding of DataOps practices for data management.
- Solution design: good understanding of end-to-end solution architecture and design practices; able to document solutions and maintain diagrams.
- Stakeholder engagement: able to lead and drive project activities in collaboration with analytics stakeholders and see requirements through to completion.
- Data warehousing: excellent grasp of fundamental dimensional modeling concepts; experience with data warehouse solutions, requirement gathering, design and build, data analysis, data quality, data validations, and developing data transformations using ELT/ETL patterns.
- Data as a product (preferred): knowledge and experience working with data treated as a product; Illumina follows a hybrid data mesh architecture that promotes data as a product for data lifecycle management.
- Governance: good understanding of working in companies with regulated systems and processes for data; adherence to data protection practices using tagging, security policies, and object-, column-, and row-level data security; promotion of best practices for data catalogs, data classification, and metadata management for data products within your scope.
- Operating systems: Windows, Linux.

Education & Experience
Bachelor’s degree in Computer Science/Engineering or equivalent.

#illuminacareers Illumina believes that everyone has the ability to make an impact, and we are proud to be an equal opportunity employer committed to providing employment opportunity regardless of sex, race, creed, color, gender, religion, marital status, domestic partner status, age, national origin or ancestry, physical or mental disability, medical condition, sexual orientation, pregnancy, military or veteran status, citizenship status, and genetic information.
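The MERGE-driven incremental loads this role calls for boil down to an upsert: update rows whose keys match, insert the rest. A plain-Python sketch of that semantics (in Snowflake it would be a MERGE statement fed by a stream of change records):

```python
# Plain-Python sketch of the upsert semantics behind a Snowflake
# MERGE statement: matched keys are updated, unmatched keys inserted.
def merge(target: dict, changes: list, key: str = "id") -> dict:
    """Apply a batch of change records into the target, keyed by `key`."""
    for row in changes:
        target[row[key]] = row   # matched -> update; not matched -> insert
    return target

warehouse = {1: {"id": 1, "status": "open"}}
stream = [{"id": 1, "status": "closed"},   # update existing row 1
          {"id": 2, "status": "open"}]     # insert new row 2
merge(warehouse, stream)
print(warehouse[1]["status"], len(warehouse))  # closed 2
```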

Posted 3 weeks ago


6.0 years

0 Lacs

Greater Bengaluru Area

On-site

Source: LinkedIn

What if the work you did every day could impact the lives of people you know? Or all of humanity? At Illumina, we are expanding access to genomic technology to realize health equity for billions of people around the world. Our efforts enable life-changing discoveries that are transforming human health through the early detection and diagnosis of diseases and new treatment options for patients. Working at Illumina means being part of something bigger than yourself. Every person, in every role, has the opportunity to make a difference. Surrounded by extraordinary people, inspiring leaders, and world-changing projects, you will do more and become more than you ever thought possible.

Job Summary
This position is an exciting opportunity to join the Data Integration & Analytics team within the GIS Application & Platform Services department. The team's scope includes data services on enterprise data platforms such as the Snowflake cloud data platform, SAP HANA analytics, and Denodo data virtualization, and it is responsible for managing the full software development lifecycle of data, its quality, and its operations. This role supports strategic solutions such as the enterprise data lake on AWS/Snowflake and the enterprise data warehouse on Snowflake, collaborating with cross-functional teams, planning and coordinating requirements, providing data engineering services, and helping build trust in the data being managed.

Job Duties
- Translate business requirements into data requirements, data warehouse design, and sustainable data management strategies on enterprise data platforms such as Snowflake.
- Work with project leads, stakeholders, and business SMEs to define technical specifications, develop data modeling requirements, and maintain data infrastructure that provides business users with the tools and data they need.
- Architect, design, and develop large-scale, optimized analytics solutions.
- Gather and analyze requirements; plan and coordinate development in collaboration with stakeholder teams.
- Understand data requirements and data latency to design and develop data ingestion pipelines.
- Design and architect data lake solutions on AWS S3 and Snowflake, considering scalability, automation, security, and performance; build and optimize the data lake to store, process, and analyze large volumes of structured and unstructured data.
- Understand data architecture and solution design; design and develop dimensional/semantic data models in an enterprise data warehouse environment.
- Develop and automate enterprise data transformation pipelines.
- Work with cross-functional teams and process owners to develop test cases and scripts, and test models and solutions to verify that requirements are met, ensuring high levels of data quality; develop and apply quality assurance best practices.
- Design and apply data engineering best practices for data lakes and data warehouses.
- Analyze data and data behaviors to support business user queries, with an excellent understanding of the impact of changes to data platforms, data models, and data behaviors.
- Apply excellent problem-solving skills to troubleshoot complex data engineering issues.
- Benchmark application operational performance periodically, track metrics, and fix issues.
- Understand and comply with data governance and compliance practices as defined for risk management, including data encryption practices, RBAC, and security policies.
- Promote and apply metadata management best practices supporting enterprise data catalogs.
- Support change and release management processes; support incident and response management, including problem solving, root cause analysis, and documentation; support automation and on-call processes (Tier 1/Tier 2).

Specific Skills or Other Requirements
Requires 6 years of experience, with:
- Primary experience: data integration and data warehousing.
- Data platforms: Snowflake cloud data platform required; SAP HANA analytics preferred.
- Data integration: 5+ years of experience with integration technologies and ETL/ELT patterns of data delivery; strong understanding and experience implementing SDLC practices.
- Snowflake: 2+ years of required expertise with SnowSQL, Snowpipe (integrated with AWS S3), Streams and Tasks, stored procedures, MERGE statements, functions, RBAC, security policies, and compute/storage usage and performance optimization techniques.
- dbt Cloud: 2+ years of expertise with the dbt Cloud platform; a very good understanding of data models (views, materializations, incremental loads, snapshots), cross-functional references, DAGs and their impact, job scheduling, audit and monitoring, code repositories, and deployments.
- AWS: 1+ years of required expertise with services such as S3, Glue, Lambda, and Athena.
- Apache software (preferred): data processing with Spark and Flink, message brokering with Kafka, orchestration with Airflow, and processing high data volumes in open table formats such as Iceberg.
- HVR (Fivetran): experience or knowledge of data replication with HVR is a plus.
- SnapLogic: experience or knowledge of data integrations with SnapLogic is a plus.
- Certifications: Snowflake and dbt Cloud data engineering certifications are a plus.
- Source systems (required): knowledge and experience integrating data from SAP ERP (on-premises and cloud), Salesforce CRM, Workday, ServiceNow, relational databases, REST APIs, flat files, and cloud storage.
- Data orchestration (required): Control-M, Apache Airflow.
- Cloud storage platforms: Amazon Web Services required; Microsoft Azure preferred.
- Programming/scripting: Snowflake scripting (Snowpipes, Tasks, Streams, MERGE statements, stored procedures, functions, security policies), SQL, Python, PySpark.
- Code management (required): excellent understanding of code repositories such as GitHub and GitLab, code version management, and branching and merging patterns in a central repository managing cross-functional code and deployments.
- Data operations: excellent understanding of DataOps practices for data management.
- Solution design: good understanding of end-to-end solution architecture and design practices; able to document solutions and maintain diagrams.
- Stakeholder engagement: able to lead and drive project activities in collaboration with analytics stakeholders and see requirements through to completion.
- Data warehousing: excellent grasp of fundamental dimensional modeling concepts; experience with data warehouse solutions, requirement gathering, design and build, data analysis, data quality, data validations, and developing data transformations using ELT/ETL patterns.
- Data as a product (preferred): knowledge and experience working with data treated as a product; Illumina follows a hybrid data mesh architecture that promotes data as a product for data lifecycle management.
- Governance: good understanding of working in companies with regulated systems and processes for data; adherence to data protection practices using tagging, security policies, and object-, column-, and row-level data security; promotion of best practices for data catalogs, data classification, and metadata management for data products within your scope.
- Operating systems: Windows, Linux.

Education & Experience
Bachelor’s degree in Computer Science/Engineering or equivalent.

#illuminacareers Illumina believes that everyone has the ability to make an impact, and we are proud to be an equal opportunity employer committed to providing employment opportunity regardless of sex, race, creed, color, gender, religion, marital status, domestic partner status, age, national origin or ancestry, physical or mental disability, medical condition, sexual orientation, pregnancy, military or veteran status, citizenship status, and genetic information.

Posted 3 weeks ago


6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

- 6+ years of experience in data engineering.
- Strong knowledge of SQL.
- Expertise in Snowflake, dbt, and Python (minimum 2+ years).
- Knowledge of SnapLogic or Fivetran is an added advantage.
- Mandatory for an Astrid Data Engineer: knowledge of AWS cloud (AWS S3 and Lambda).
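The AWS S3 and Lambda requirement typically means event-driven ingestion: Lambda fires when a file lands in a bucket. A minimal handler sketch; the bucket/key extraction follows the documented S3 event shape, and boto3 calls are omitted so the sketch runs locally. Bucket and key names are hypothetical:

```python
# Hedged sketch of an AWS Lambda handler reacting to an S3 put event.
# Field paths (Records[0].s3.bucket.name, .object.key) follow the
# documented S3 event structure; a real handler would use boto3 to
# fetch the object and load it downstream.
def handler(event: dict, context=None) -> dict:
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]
    # Here a real handler would download and process the object.
    return {"status": "received", "source": f"s3://{bucket}/{key}"}

event = {"Records": [{"s3": {"bucket": {"name": "astrid-landing"},
                             "object": {"key": "orders/2024/01.csv"}}}]}
print(handler(event))
```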

Posted 3 weeks ago


8.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Source: LinkedIn

Job Description As a recognized authority and leading contributor within their practice, this senior-level consulting position provides consistent high quality and innovative solution leadership to a project team. Provides expertise to project team(s) ensuring high quality, integrated software solutions within constraints of time and budget. Analyzes business needs to help ensure Oracle solution meets the customer’s objectives by combining industry best practices, product knowledge, and business acumen. Exercises judgment and business acumen in selecting methods and techniques to deliver technical solutions on non-routine and very complex aspects of applications and technology installations. Provides direction to project teams, and effectively influences customer leadership on key decisions. Resolves complex customer issues by recommending and implementing solutions. Demonstrates expertise in multiple business processes across two or more product families or ability to architect and design technology solutions encompassing multiple products and make decisions based on impact across the stack. 8+ years of experience relevant to this position including 5 years of consulting experience. Undergraduate degree or equivalent experience preferred. Product or technical expertise relevant to practice focus. Ability to communicate effectively. Ability to build rapport with team members and clients. Overview of ACS Technical Team: Is an acknowledged authority within the Oracle NetSuite Global Business Unit (GBU), providing subject matter expertise and consulting services to the GBU's most significant, strategic, and most challenging customers around the globe. Supports customers' full life cycle, including services targeted to ensure the success of complex, large-scale NetSuite implementations and post-go-live services to ensure the ongoing success of NetSuite solutions, mitigating the technical risks commonly seen for large-scale and/or complex implementations. 
Career Level - IC4 Responsibilities As a trusted advisor, technical solution architect, and technical consultant, the TECHNICAL ARCHITECT role provides technical architect consulting services, including: Holistic technical design reviews Performance and scalability Optimization of integrations and NetSuite customizations Data management consultation and guidance Consultative guidance on ERP leading practices Leveraging deep technical experience, TECHNICAL ARCHITECTs analyze customers' business & technical requirements to ensure appropriate and long-term scalable use of NetSuite and work with partners to implement recommendations. TECHNICAL ARCHITECTs work with their customers and partners to review technical feature gaps that may arise and devise appropriate solutions across the NetSuite ecosystem. TECHNICAL ARCHITECTs lead customers and partners through the appropriate use of NetSuite environments, design and optimizing considerations for integrations and customizations, and practices for successful data migrations. TECHNICAL ARCHITECTs form the core of the ACS Technical Team subject-matter expertise and are leveraged across accounts when required. TECHNICAL ARCHITECTs are the thought leaders within their area of expertise and work with the Product organization to ensure new product technical changes and capabilities are understood and adopted by customers and partners. 
Preferred Qualifications include: Ability to be self-directed, multi-task, and lead others with minimal supervision Minimum of 5 years of technical consulting experience Strong written and verbal communication Adept at getting hands-on with technology and presenting concepts effectively at various levels within a customer's and/or partner's organization Strong analytical skills Demonstrated expertise in one or more of the following: performance, integrations, technical architecture or software development Demonstrated experience in end-to-end business process flows Hands-on experience in the following areas is required: Performance and scalability of ERP systems (Oracle EBS, Oracle Fusion, PeopleSoft, JD Edwards, NetSuite) Orchestrating and executing load and performance testing Tuning of SQL statements ODBC / JDBC data extraction strategy, design and tuning Data Modeling SaaS/Cloud architectures Oracle database architecture Architecting and tuning integrations (with products like Oracle Data Integrator, Boomi, MuleSoft, Celigo, Workato or SnapLogic) ETL tools and techniques Experience in the following areas is desired: Advanced understanding of: Software development Database concepts ERP technology frameworks and stack Infrastructure (hardware, operating system and networking) Performance assessment and tuning activities Strong analytical skills Strong communication (written and verbal) and presentation skills To be self-directed and motivated Release management and/or Agile scrum master experience Developing and optimizing NetSuite SuiteTalk, SOAP integrations, or REST integrations Travel: Modest to moderate, as appropriate Oracle is committed to creating an inclusive workplace and welcomes candidates from all backgrounds. Learn more about NetSuite Advanced Customer Support (ACS) – video on YouTube NetSuite channel About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. 
We’ve partnered with industry-leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 3 weeks ago

Apply


0.0 - 5.0 years

2 - 5 Lacs

Bengaluru

Work from Office

Naukri logo

Job Title: SnapLogic Experience: 0-5 Years Location: Bangalore Skills: SnapLogic

Posted 3 weeks ago

Apply

6.0 - 8.0 years

3 - 6 Lacs

Bengaluru

Work from Office

Naukri logo

Job Title: ETL Developer (SnapLogic) Experience: 6-8 Years Location: Bangalore Technical Skills: Design, develop, and maintain SnapLogic pipelines to support integration projects. Build and manage APIs using SnapLogic to connect various data sources and systems. Leverage SnapLogic agent functionality to enable secure and efficient data integration. Collaborate with cross-functional teams to gather requirements and ensure solutions meet business needs. Troubleshoot and optimize existing SnapLogic integrations to improve performance and reliability. Document integration processes and provide guidance to team members on best practices. Proven experience with SnapLogic, including API builds and agent functionality. Strong understanding of integration patterns and best practices. Proficiency in data integration and ETL processes. Expertise in relational databases (Oracle, SSMS) and familiarity with NoSQL databases (MongoDB). Knowledge of data warehousing concepts and data modelling. Experience performing validations on large-scale data. Strong experience with REST APIs, JSON, and data transformations. Experience with unit testing and integration testing. Familiarity with large language models (LLMs) and their integration with data pipelines. Experience in database architecture and optimization. Knowledge of U.S. healthcare systems, data standards (e.g., HL7, FHIR), and compliance requirements (e.g., HIPAA). Behavioral Skills: Excellent documentation and presentation skills, analytical and critical thinking skills, and the ability to identify needs and take initiative. Follow engineering best practices and principles within your organisation. Work closely with a Lead Software Engineer. Be an active member of the MMC Technology community – contribute, collaborate, and learn. Build strong relationships with members of your engineering squad.
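The REST API/JSON transformation work this listing asks for is, at its core, about reshaping one JSON document into another, which SnapLogic handles with a Mapper-style step. As a rough, platform-free sketch in plain Python (the patient record and field names are invented, loosely echoing the HL7/FHIR-flavoured data the role mentions):

```python
import json

# Hypothetical source record, e.g. as returned by an upstream REST API.
source = json.loads("""
{
  "patient_id": "P123",
  "name": {"given": "Asha", "family": "Rao"},
  "dob": "1990-04-12"
}
""")

def map_record(src):
    """Map a nested source JSON record onto a flat target schema,
    analogous to an input-to-output field mapping step in an ETL tool."""
    return {
        "id": src["patient_id"],
        "full_name": "{} {}".format(src["name"]["given"], src["name"]["family"]),
        "birth_date": src["dob"],
    }

target = map_record(source)
print(target["full_name"])  # Asha Rao
```

In SnapLogic itself this mapping would be configured in the platform's UI rather than written by hand; the sketch only illustrates the shape of the transformation.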

Posted 3 weeks ago

Apply

0 years

0 Lacs

Bengaluru East, Karnataka, India

On-site

Linkedin logo

Primary skills: SnapLogic A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation, design and deployment. You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs. You will create requirement specifications from the business needs, and define the to-be processes and detailed functional designs based on requirements. You will support configuring solution requirements on the products; understand any issues, diagnose the root cause of such issues, seek clarifications, and then identify and shortlist solution alternatives. You will also contribute to unit-level and organizational initiatives with an objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! Ability to work with clients to identify business challenges and contribute to client deliverables by refining, analyzing, and structuring relevant data. Awareness of the latest technologies and trends. Logical thinking and problem-solving skills along with an ability to collaborate. Ability to assess current processes, identify improvement areas and suggest technology solutions. Knowledge of one or two industry domains.

Posted 3 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Linkedin logo

P2-C2-TSTS JD This role is for a Senior Database Developer who can understand complex functional and technical requirements and the implications associated with the chosen technical strategy, well aligned with the business context. The successful candidate will lead and develop Oracle modules with a high-performing team to deliver successful products, and will be expected to interact with all levels of the business and technical community. Along with Oracle, knowledge of Python or Java will be a plus. Design, develop and maintain scalable data pipelines to support data ingestion, integration and distribution. Be accountable for technical delivery and take ownership of solutions. Develop and unit test the solutions. Provide L3 support in case the production support team needs any help to investigate or fix a production issue. Collaborate with various upstream and downstream systems/stakeholders. Other data and non-prod environment management tasks, as needed. 4-8 years' experience in development and strong analytical skills from working on data warehousing projects. Strong hands-on experience with advanced SQL queries, stored procedures, packages, and views on an RDBMS like Oracle. Develop and maintain data pipelines, ETL processes, and data transformations using an ETL tool like Informatica, SnapLogic, etc. The candidate should have hands-on experience with the job scheduler tool Control-M and Unix shell scripting. Hands-on experience and understanding of CI/CD pipelines to automate code deployment into multiple environments using Jenkins. Optimize data workflows for performance and efficiency. Knowledge of Python, Java or Kubernetes will be a plus. Create and maintain technical documentation, including system configurations and best practices. Ability to analyse complex problems in a structured manner and demonstrate multitasking capabilities. A flexible and approachable team worker. 
Ability to operate under pressure and deliver to demanding deadlines. Strong verbal and written communication skills.
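The pipeline work described in this role follows the classic extract-transform-load pattern. A minimal stand-in in plain Python (real projects would use an ETL tool such as Informatica or SnapLogic and a scheduler like Control-M; the employee rows below are invented):

```python
# Minimal extract-transform-load sketch. Each stage is a stand-in for
# what an ETL tool would do against real source and target databases.

def extract():
    # Stand-in for reading raw rows from a source RDBMS.
    return [
        {"emp_id": 1, "salary": "50000"},
        {"emp_id": 2, "salary": "72000"},
    ]

def transform(rows):
    # Cast string fields to proper types, as a mapping step would.
    return [{"emp_id": r["emp_id"], "salary": int(r["salary"])} for r in rows]

def load(rows, target):
    # Stand-in for bulk-inserting into a warehouse table.
    target.extend(rows)

warehouse = []
load(transform(extract()), warehouse)
print(len(warehouse))  # 2
```

In production the scheduler (Control-M in this listing) would trigger each stage and handle retries; the sketch only shows the data flow.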

Posted 3 weeks ago

Apply

4.0 - 8.0 years

0 - 1 Lacs

Bengaluru

Work from Office

Naukri logo

We're Hiring: Sr. Software Engineer SnapLogic | Bangalore | 4-8 Years Experience Job Title: Sr. Software Engineer – SnapLogic Location: Bangalore Experience: 4–8 Years Client & Budget: Will be discussed during the call Notice Period: Immediate to 30 Days preferred Key Responsibilities -Design and develop SnapLogic pipelines for enterprise data integration -Migrate ETL jobs into SnapLogic and manage platform moderation on AWS -Work closely with cross-functional teams to gather integration requirements -Configure SnapLogic components (snaps, pipelines, transformations) for optimized performance -Ensure data quality and reliability through well-structured ETL processes -Keep up with new SnapLogic features and best practices to enhance platform usage -Collaborate with business stakeholders to deliver long-term, sustainable solutions Required Skills -SnapLogic: 2–4 years of hands-on experience in pipeline development & debugging -ETL Tools: Experience with tools like DataStage, Informatica -Cloud & Data Warehousing: AWS Cloud exposure and hands-on Snowflake experience -Databases: Strong in SQL, PL/SQL, and RDBMS concepts -ETL Best Practices: Data transformation, cleansing, and mapping Bonus: SnapLogic Developer Certification is a big plus! Why Join Us? -Work on cutting-edge integration projects with modern tech stacks -Be part of a collaborative and forward-thinking engineering team -Opportunity to work with enterprise clients and mission-critical data platforms Ready to Apply? Send your CV to [ YourEmail@example.com ] or DM me to learn more.

Posted 3 weeks ago

Apply

5.0 - 7.0 years

8 - 16 Lacs

Bengaluru

Work from Office

Naukri logo

We are looking for a Senior Data Engineer with deep experience in SnapLogic, SQL, ETL pipelines, and data warehousing, along with hands-on experience with Databricks, for designing scalable data solutions and working across cloud and big data platforms.
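The SQL and warehousing side of a role like this centres on set-based aggregation. A compact illustration using Python's built-in sqlite3 as a stand-in for a warehouse engine such as Databricks SQL (the sales table and its rows are hypothetical):

```python
import sqlite3

# In-memory database as a stand-in for a warehouse engine;
# the sales schema below is purely illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("south", 100.0), ("south", 250.0), ("north", 80.0)],
)

# A typical warehouse-style aggregation query.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('north', 80.0), ('south', 350.0)]
conn.close()
```

The same GROUP BY pattern carries over to Snowflake, Databricks SQL, or Oracle; only the connection and dialect details change.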

Posted 3 weeks ago

Apply

8.0 - 12.0 years

17 - 25 Lacs

Bengaluru

Work from Office

Naukri logo

Dear Candidate, We have a job opening for a SnapLogic Developer with one of our clients. If you are interested in this position, please share your updated resume to this email id: shaswati.m@bct-consulting.com Job location: Bangalore Experience: 7-10 Years Job Description Must have hands-on experience (min 6-8 years) in SnapLogic pipeline development with good debugging skills. Experience migrating ETL jobs into SnapLogic, platform moderation and cloud exposure on AWS. Good to have: SnapLogic developer certification and hands-on experience in Snowflake. Should be strong in SQL, PL/SQL and RDBMS. Should be strong in ETL tools like DataStage, Informatica, etc., with data quality. Proficiency in configuring SnapLogic components, including snaps, pipelines, and transformations. Designing and developing data integration pipelines using the SnapLogic platform to connect various systems, applications, and data sources. Building and configuring SnapLogic components such as snaps, pipelines, and transformations to handle data transformation, cleansing, and mapping. Experience in design, development and deployment of reliable solutions. Ability to work with business partners and provide long-lasting solutions. SnapLogic Integration - Pipeline Development. Staying updated with the latest SnapLogic features, enhancements, and best practices to leverage the platform effectively.

Posted 3 weeks ago

Apply

6.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

Country: India Location: Capital Cyberscape, 2nd Floor, Ullahwas, Sector 59, Gurugram, Haryana 122102 Role: Data Engineer Location: Gurgaon Full/Part-time: Full Time Build a career with confidence. Summary Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do. An established Data Science & Analytics professional, creating data mining architectures/models/protocols, statistical reporting, and data analysis methodologies to identify trends in large data sets. About The Role Work with data to solve business problems, building and maintaining the infrastructure to answer questions and improve processes Help streamline our data science workflows, adding value to our product offerings and building out the customer lifecycle and retention models Work closely with the data science and business intelligence teams to develop data models and pipelines for research, reporting, and machine learning Be an advocate for best practices and continued learning Key Responsibilities Expert coding proficiency on Snowflake Exposure to SnowSQL, Snowpipe, role-based access controls, and ETL/ELT tools like NiFi, SnapLogic, DBT Familiarity with building data pipelines that leverage the full power and best practices of Snowflake, as well as how to integrate common technologies that work with Snowflake (code CI/CD, monitoring, orchestration, data quality) Designing data ingestion and orchestration pipelines using NiFi, AWS, Kafka, Spark, Control-M Establish strategies for data extraction, ingestion, transformation, automation, and consumption. 
Role Responsibilities Experience in Data Lake concepts with structured, semi-structured and unstructured data Experience in strategies for Data Testing, Data Quality, Code Quality, Code Coverage Hands-on expertise with Snowflake, preferably with SnowPro Core Certification Develop a data model/architecture providing an integrated data architecture that enables business services with strict quality management and provides the basis for future knowledge management processes. Act as an interface between business and development teams to guide the solution end-to-end. Define tools used for design specifications, data modelling and data management capabilities, with exploration into standard tools. Good understanding of data technologies including RDBMS and NoSQL databases. Requirements A minimum of 6 years of prior relevant experience Strong exposure to Data Modelling, Data Access Patterns and SQL Knowledge of Data Storage Fundamentals, Networking Good to Have Exposure to AWS tools/services Ability to conduct testing at different levels and stages of the project Knowledge of scripting languages like Java, Python Education Bachelor's degree in computer systems, Information Technology, Analytics, or a related business area. Benefits We are committed to offering competitive benefits programs for all of our employees and enhancing our programs when necessary. Have peace of mind and body with our health insurance Drive forward your career through professional development opportunities Achieve your personal goals with our Employee Assistance Programme Our commitment to you Our greatest assets are the expertise, creativity and passion of our employees. We strive to provide a great place to work that attracts, develops and retains the best talent, promotes employee engagement, fosters teamwork and ultimately drives innovation for the benefit of our customers. 
We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine to growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback and always challenging ourselves to do better. This is The Carrier Way. Join us and make a difference. Apply Now! Carrier is An Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age or any other federally protected class. Job Applicant's Privacy Notice Click on this link to read the Job Applicant's Privacy Notice
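The "Data Testing, Data Quality" strategies this listing mentions often start as simple row-level rules applied to ingested data. A minimal sketch in plain Python (the id/amount rules are invented examples, not the company's actual checks):

```python
def check_quality(rows):
    """Run simple data-quality rules over ingested rows and
    return a list of human-readable violations."""
    errors = []
    for i, row in enumerate(rows):
        # Rule 1: every row must carry a non-null id.
        if row.get("id") is None:
            errors.append("row {}: missing id".format(i))
        # Rule 2: amount must be numeric for downstream aggregation.
        if not isinstance(row.get("amount"), (int, float)):
            errors.append("row {}: amount is not numeric".format(i))
    return errors

good = [{"id": 1, "amount": 9.5}]
bad = [{"id": None, "amount": "n/a"}]
print(check_quality(good))  # []
print(len(check_quality(bad)))  # 2
```

In a Snowflake pipeline the same rules would typically live as SQL constraints or validation queries; the Python form just makes the rule-then-report pattern explicit.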

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

Country: India Location: Capital Cyberscape, 2nd Floor, Ullahwas, Sector 59, Gurugram, Haryana 122102 Role: Data Engineer Location: Gurgaon Full/Part-time: Full Time Build a career with confidence. Summary Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do. An established Data Science & Analytics professional, creating data mining architectures/models/protocols, statistical reporting, and data analysis methodologies to identify trends in large data sets. About The Role Work with data to solve business problems, building and maintaining the infrastructure to answer questions and improve processes Help streamline our data science workflows, adding value to our product offerings and building out the customer lifecycle and retention models Work closely with the data science and business intelligence teams to develop data models and pipelines for research, reporting, and machine learning Be an advocate for best practices and continued learning Key Responsibilities Expert coding proficiency on Snowflake Exposure to SnowSQL, Snowpipe, role-based access controls, and ETL/ELT tools like NiFi, SnapLogic, DBT Familiarity with building data pipelines that leverage the full power and best practices of Snowflake, as well as how to integrate common technologies that work with Snowflake (code CI/CD, monitoring, orchestration, data quality) Designing data ingestion and orchestration pipelines using NiFi, AWS, Kafka, Spark, Control-M Establish strategies for data extraction, ingestion, transformation, automation, and consumption. 
Role Responsibilities Experience in Data Lake concepts with structured, semi-structured and unstructured data Experience in strategies for Data Testing, Data Quality, Code Quality, Code Coverage Hands-on expertise with Snowflake, preferably with SnowPro Core Certification Develop a data model/architecture providing an integrated data architecture that enables business services with strict quality management and provides the basis for future knowledge management processes. Act as an interface between business and development teams to guide the solution end-to-end. Define tools used for design specifications, data modelling and data management capabilities, with exploration into standard tools. Good understanding of data technologies including RDBMS and NoSQL databases. Requirements A minimum of 3 years of prior relevant experience Strong exposure to Data Modelling, Data Access Patterns and SQL Knowledge of Data Storage Fundamentals, Networking Good to Have Exposure to AWS tools/services Ability to conduct testing at different levels and stages of the project Knowledge of scripting languages like Java, Python Education Bachelor's degree in computer systems, Information Technology, Analytics, or a related business area. Benefits We are committed to offering competitive benefits programs for all of our employees and enhancing our programs when necessary. Have peace of mind and body with our health insurance Drive forward your career through professional development opportunities Achieve your personal goals with our Employee Assistance Programme Our commitment to you Our greatest assets are the expertise, creativity and passion of our employees. We strive to provide a great place to work that attracts, develops and retains the best talent, promotes employee engagement, fosters teamwork and ultimately drives innovation for the benefit of our customers. 
We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine to growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback and always challenging ourselves to do better. This is The Carrier Way. Join us and make a difference. Apply Now! Carrier is An Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age or any other federally protected class. Job Applicant's Privacy Notice Click on this link to read the Job Applicant's Privacy Notice

Posted 3 weeks ago

Apply

2.0 - 6.0 years

13 - 17 Lacs

Mumbai

Work from Office

Naukri logo

At Siemens Energy, we can. Our technology is key, but our people make the difference. Brilliant minds innovate. They connect, create, and keep us on track towards changing the world's energy systems. Their spirit fuels our mission. Our culture is defined by caring, agile, respectful, and accountable individuals. We value excellence of any kind. Sounds like you? Software Developer - Data Integration Platform - Mumbai or Pune, Siemens Energy, Full Time. Looking for a challenging role? If you really want to make a difference - make it with us. We make real what matters. About the role Technical Skills (Mandatory): Python (Data Ingestion Pipelines): Proficiency in building and maintaining data ingestion pipelines using Python. Blazegraph: Experience with Blazegraph technology. Neptune: Familiarity with Amazon Neptune, a fully managed graph database service. Knowledge Graph (RDF, Triples): Understanding of RDF (Resource Description Framework) and triple stores for knowledge graph management. AWS Environment (S3): Experience working with AWS services, particularly S3 for storage solutions. Git: Proficiency in using Git for version control. Optional and good-to-have skills: Azure DevOps: Experience with Azure DevOps for CI/CD pipelines and project management (optional but preferred). Metaphactory by Metaphacts: Familiarity with Metaphactory, a platform for knowledge graph management (very optional). LLM / Machine Learning: Experience with Large Language Models (LLMs) and machine learning techniques. Big Data Solutions: Experience with big data solutions is a plus. SnapLogic / Alteryx / ETL Know-How: Familiarity with ETL tools like SnapLogic or Alteryx is optional but beneficial. We don't need superheroes, just super minds. A degree in Computer Science, Engineering, or a related field is preferred. Professional Software Development: Demonstrated experience in professional software development practices. 
Years of Experience: 3-5 years of relevant experience in software development and related technologies. Soft Skills: Strong problem-solving skills. Excellent communication and teamwork abilities. Ability to work in a fast-paced and dynamic environment. Strong attention to detail and commitment to quality. Fluent in English (spoken and written). We've got quite a lot to offer. How about you? This role is based in Pune or Mumbai, where you'll get the chance to work with teams impacting entire cities, countries, and the shape of things to come. We're Siemens. A collection of over 379,000 minds building the future, one day at a time in over 200 countries. We're dedicated to equality, and we welcome applications that reflect the diversity of the communities we work in. All employment decisions at Siemens are based on qualifications, merit and business need. Bring your curiosity and imagination and help us shape tomorrow. Find out more about Siemens careers at: www.siemens.com/careers
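The knowledge-graph requirement in this listing (RDF, triples) comes down to storing and matching subject-predicate-object facts. A toy in-memory version in plain Python (production systems in this role would use Blazegraph or Amazon Neptune; the example triples are invented):

```python
# Toy in-memory triple store holding subject-predicate-object facts.
triples = {
    ("turbine:T1", "locatedIn", "city:Mumbai"),
    ("turbine:T1", "type", "GasTurbine"),
    ("turbine:T2", "locatedIn", "city:Pune"),
}

def query(s=None, p=None, o=None):
    """Return triples matching a (subject, predicate, object) pattern,
    where None acts as a wildcard, like a basic SPARQL triple pattern."""
    return sorted(
        t for t in triples
        if (s is None or t[0] == s)
        and (p is None or t[1] == p)
        and (o is None or t[2] == o)
    )

# Which entities are located somewhere?
print(query(p="locatedIn"))
```

Real RDF stores add URIs, named graphs, and SPARQL query planning on top, but the wildcard-matching idea is the same.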

Posted 3 weeks ago

Apply

5.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Naukri logo

About The Role: As a Python Developer, you will play a critical role in our software development and data engineering initiatives. You will work closely with data engineers, architects, and other developers to build and maintain our applications and data pipelines. Your expertise in Python development, API design, and cloud technologies will be essential to your success.

Responsibilities:
- Design, develop, and maintain applications using the latest Python frameworks and technologies (Django, Flask, FastAPI).
- Utilize Python libraries and tools (Pandas, NumPy, SQLAlchemy) for data manipulation and analysis.
- Develop and maintain RESTful APIs, ensuring security, authentication, and authorization (OAuth, JWT).
- Deploy, manage, and scale applications on AWS services (EC2, S3, RDS, Lambda).
- Utilize infrastructure-as-code tools (Terraform, CloudFormation) for infrastructure management (good to have).
- Design and develop database solutions using PL/SQL (packages, functions, ref cursors).
- Implement data normalization and Oracle performance optimization techniques.
- Design and develop data warehouse solutions, including data marts and ODS concepts.
- Implement low-level design of warehouse solutions.
- Work with Kubernetes for container orchestration, deploying, managing, and scaling applications on Kubernetes clusters.
- Utilize the SnapLogic cloud-native integration platform for designing and implementing integration pipelines.

Required Skills:
- Expertise in Python frameworks (Django, Flask, FastAPI).
- Proficiency in Python libraries (Pandas, NumPy, SQLAlchemy).
- Strong experience in designing, developing, and maintaining RESTful APIs.
- Familiarity with API security, authentication, and authorization mechanisms (OAuth, JWT).
- Good hands-on knowledge of PL/SQL (packages/functions/ref cursors).
- Knowledge of data normalization and Oracle performance optimization techniques.
- Experience in development and low-level design of warehouse solutions.
- Familiarity with Data Warehouse, Data Mart and ODS concepts.
- Proficiency in AWS services (EC2, S3, RDS, Lambda).

Good to Have Skills:
- Kubernetes: Hands-on experience with Kubernetes for container orchestration.
- Infrastructure as Code: Experience with infrastructure-as-code tools (Terraform, CloudFormation).
- Integration Platforms: Experience with the SnapLogic cloud-native integration platform.

Experience: 5 to 8 years of experience as a Python Developer.
Location: Bangalore or Gurgaon
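The token-based API security skills listed above (OAuth, JWT) rest on one idea: a signed token the server can verify without a database lookup. A stdlib-only sketch of signing and verifying a token follows; the key and payload are illustrative, and a real service would use a proper JWT library rather than this hand-rolled scheme:

```python
# Minimal stdlib-only sketch of HMAC-signed tokens (JWT-style, header
# omitted). Illustrative only; use a real JWT library in production.
import base64
import hashlib
import hmac
import json

SECRET = b"demo-secret"  # hypothetical key; never hardcode in production

def issue_token(payload: dict) -> str:
    """Sign a JSON payload with HMAC-SHA256 and append the signature."""
    body = base64.urlsafe_b64encode(json.dumps(payload).encode())
    sig = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    return body.decode() + "." + sig

def verify_token(token: str):
    """Return the payload if the signature checks out, else None."""
    body, _, sig = token.rpartition(".")
    expected = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None  # tampered or forged token
    return json.loads(base64.urlsafe_b64decode(body))

tok = issue_token({"user": "alice"})
print(verify_token(tok))  # {'user': 'alice'}
```

The constant-time `hmac.compare_digest` comparison matters: comparing signatures with `==` would leak timing information an attacker could exploit.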

Posted 3 weeks ago

Apply

6.0 - 11.0 years

18 - 33 Lacs

Pune

Hybrid

Naukri logo

Job Requirements: Any of the following SnapLogic development skills is acceptable; none is a strict must-have.
1. Strong knowledge of SnapLogic pipeline development and architecture.
2. Hands-on experience using the various Snaps available in SnapLogic, such as REST Snaps, Transform Snaps, Database Snaps, Script Snaps, etc.
3. Knowledge of task creation, such as scheduled tasks, triggered tasks, etc.
4. Experience working in Agile.
5. Pipeline monitoring and troubleshooting experience.
6. Knowledge of integration development using AWS or other cloud technologies. Work on Microsoft Dynamics (schema/connect browser), JDBC, ServiceNow, Google BigQuery Snaps, Oracle, REST, SOAP.
7. Building complex mappings with JSON path expressions and Python scripting.

Qualifications:
- 6-10 years of overall IT experience.
- 2-3 years of development experience building SnapLogic pipelines, error handling, scheduling tasks and alerts.
- Ability to analyze and translate functional specifications/user stories into technical specifications.
- Experience with end-to-end implementations in SnapLogic (develop/test/implement).
- Integration experience working with third-party/external vendors across all modules and providing solutions for SnapLogic design.
- Good written and verbal communication capabilities.
- Strong experience coordinating with Business Analysts to understand business and functional requirements and convert business rules into technical specifications.
- Proven ability to work independently or in conjunction with a team.
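The JSON-path mappings mentioned in point 7 above boil down to resolving nested paths through dict/list documents. A minimal stdlib sketch of the idea (dotted paths only, not SnapLogic's full expression syntax; the sample document is hypothetical):

```python
# Minimal sketch of dotted-path resolution over nested JSON-like data,
# the core idea behind JSON-path mappings (not full JSONPath syntax).

def resolve(doc, path: str, default=None):
    """Walk a dotted path like 'order.items.0.sku' through dicts/lists."""
    current = doc
    for part in path.split("."):
        if isinstance(current, list):
            try:
                current = current[int(part)]  # numeric parts index lists
            except (ValueError, IndexError):
                return default
        elif isinstance(current, dict):
            if part not in current:
                return default
            current = current[part]
        else:
            return default  # scalar reached before path ended
    return current

doc = {"order": {"items": [{"sku": "A-1"}, {"sku": "B-2"}]}}
print(resolve(doc, "order.items.1.sku"))  # B-2
```

Returning a caller-supplied default on any miss mirrors how mapping expressions typically tolerate optional fields instead of failing the whole pipeline.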

Posted 3 weeks ago

Apply

5 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

About Client: Our client is a market-leading company with over 30 years of experience in the industry. As one of the world's leading professional services firms, with $19.7B and 333,640 associates worldwide, it helps its clients modernize technology, reimagine processes, and transform experiences, enabling them to remain competitive in our fast-paced world. Its specialties include Intelligent Process Automation, Digital Engineering, Industry & Platform Solutions, Internet of Things, Artificial Intelligence, Cloud, Data, Healthcare, Banking, Finance, Fintech, Manufacturing, Retail, Technology, and Salesforce.

We are hiring for the below position.

Job Title: SnapLogic
Key Skills: SnapLogic, NetSuite, ETL, SAP
Job Locations: Pan India
Experience: 5-14 Years
Budget: 1-23 LPA
Education Qualification: Any Graduation
Work Mode: Hybrid
Employment Type: Contract
Notice Period: Immediate - 15 Days

Job Description:
- At least 5+ years of hands-on experience in SnapLogic.
- Experience building integrations with Salesforce, NetSuite, SAP, etc.; working with Ultra would be an added advantage.
- Implementation experience in ETL/ELT.
- Experience with the SnapLogic public API and creating APIs with SnapLogic.
- In-depth knowledge of JSON/XML/XSD/XPath/XSLT; working with APIM would be an added advantage.
- Hands-on experience with web services (RESTful/SOAP).
- Hands-on experience with JavaScript.
- Working experience in Agile methodology.
- Basic core Java concepts are good to have.
- Prior experience handling customers and managing a team.
- Experience working with at least one of the design patterns ETL/API/ESB/Cloud Native.

Interested candidates, please share your CV with sushma.n@people-prime.com

Posted 4 weeks ago

Apply

14 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

About Client: Our client is a market-leading company with over 30 years of experience in the industry. As one of the world's leading professional services firms, with $19.7B and 333,640 associates worldwide, it helps its clients modernize technology, reimagine processes, and transform experiences, enabling them to remain competitive in our fast-paced world. Its specialties include Intelligent Process Automation, Digital Engineering, Industry & Platform Solutions, Internet of Things, Artificial Intelligence, Cloud, Data, Healthcare, Banking, Finance, Fintech, Manufacturing, Retail, Technology, and Salesforce.

We are hiring for the below position.

Job Title: SnapLogic Developer
Key Skills: SnapLogic, ETL, Salesforce, NetSuite, API, JSON, XML, SOAP, JavaScript, Agile Methodology
Job Locations: Pan India
Experience: 5-14 Years
Budget: Based on current CTC; a 30-40% hike will be offered
Education Qualification: Any Graduation
Work Mode: Hybrid
Employment Type: Contract
Notice Period: Immediate - 15 Days
Interview Mode: Virtual

Job Description:
- At least 5+ years of hands-on experience in SnapLogic.
- Experience building integrations with Salesforce, NetSuite, SAP, etc.; working with Ultra would be an added advantage.
- Implementation experience in ETL/ELT.
- Experience with the SnapLogic public API and creating APIs with SnapLogic.
- In-depth knowledge of JSON/XML/XSD/XPath/XSLT; working with APIM would be an added advantage.
- Hands-on experience with web services (RESTful/SOAP).
- Hands-on experience with JavaScript.
- Working experience in Agile methodology.
- Basic core Java concepts are good to have.
- Prior experience handling customers and managing a team.
- Experience working with at least one of the design patterns ETL/API/ESB/Cloud Native.

Interested candidates, please share your CV with vamsi.v@people-prime.com

Posted 4 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

Remote

Linkedin logo

Your work days are brighter here. At Workday, it all began with a conversation over breakfast. When our founders met at a sunny California diner, they came up with an idea to revolutionize the enterprise software market. And when we began to rise, one thing that really set us apart was our culture. A culture which was driven by our value of putting our people first. And ever since, the happiness, development, and contribution of every Workmate is central to who we are. Our Workmates believe a healthy, employee-centric, collaborative culture is the essential mix of ingredients for success in business. That's why we look after our people, communities and the planet while still being profitable. Feel encouraged to shine, however that manifests: you don't need to hide who you are. You can feel the energy and the passion; it's what makes us unique. Inspired to make a brighter work day for all and transform with us to the next stage of our growth journey? Bring your brightest version of you and have a brighter work day here.

At Workday, we value our candidates' privacy and data security. Workday will never ask candidates to apply to jobs through websites that are not Workday Careers. Please be aware of sites that may ask you to input your data in connection with a job posting that appears to be from Workday but is not. In addition, Workday will never ask candidates to pay a recruiting fee, or pay for consulting or coaching services, in order to apply for a job at Workday.

About The Team

Come be a part of something big! Workday is embarking on our next growth adventure. As our Business Technology team continues its mission to deliver unparalleled value to our business partners and customers, we are expanding our presence in the Asia-Pacific region with a new Business Technology office in Pune, India.
This new office will be an essential development centre to propel the growth of our company through transformational programs for the Go-To-Market and Enterprise Data Analytics teams. If you want to be a part of building something big that will drive value throughout the entire global organization, then this is the opportunity for you. You will be working on top-priority initiatives that span new and existing technologies, all to deliver outstanding results and experiences for our customers and employees. Our Go-To-Market (GTM) Enterprise Applications team / Enterprise Architecture and Data Services team is currently looking for a Sr. Quality Assurance Engineer.

About Go-To-Market Team: The Business Technology Go-To-Market team works in close partnership with our business partners to help fuel growth and revenue goals for Workday, along with driving exceptional customer and employee experiences. The team is responsible for developing and supporting innovative architecture-led solutions for our Marketing, Sales, Services, Customer Support & Legal business functions, with Salesforce being the primary platform alongside other cutting-edge platforms like SnapLogic for integrations, Conga/Apttus CPQ or equivalent, CLM, AWS as PaaS, the Coveo search platform, OKTA for SSO, and others.

About The Role: The Sr. Associate Quality Assurance Engineer will develop, modify, and execute software test plans, automated scripts, and programs for testing. This role involves debugging software products through systematic tests to ensure and maintain quality standards for Workday's products. The Quality Assurance Engineer will also ensure that system tests are effectively completed, documented, and resolved.

Key Responsibilities
- Design and implement automated and manual test cases for Salesforce applications and related systems.
- Develop, maintain, and execute functional, regression, and integration test cases for Salesforce applications.
- Create and run automation test scripts using tools such as Selenium, Playwright, Cypress, TOSCA, or Provar.
- Conduct API testing using Postman, Swagger, or similar tools, validating request/response behavior across integrations.
- Actively participate in Agile ceremonies, sprint planning, and story refinement, contributing QA insights early in the development cycle.
- Document test plans, test cases, and test data sets, ensuring clear traceability to requirements.
- Identify, log, and track defects, perform root cause analysis (RCA), and collaborate with cross-functional teams to revalidate fixes.
- Support test optimization efforts through reusable components and automation improvements.
- Provide input on story refinement and share QA insights during sprint ceremonies.
- Apply test design techniques (including positive, negative, and edge case coverage) to improve overall quality outcomes.
- Validate Salesforce workflows, custom objects, and system integrations across different environments.
- Collaborate with developers and product teams to identify gaps, clarify requirements, and ensure robust test coverage.

About You

Basic Qualifications:
- 3-5 years of experience in Software Quality Assurance, with a balance of both test automation and manual testing.
- Hands-on experience working in an Agile environment, actively contributing to sprint planning and test-driven development.
- Strong hands-on automation experience with programming skills.

Other Qualifications:
- TOSCA Certification (TA1 or equivalent) or a Salesforce certification such as Administrator, Marketing Cloud Associate or Administrator, CPQ Specialist, Business Analyst, or Platform App Builder.
- Familiarity with XSLT, REST APIs, and web services.
- Experience with end-to-end testing, including test planning, execution, UAT, and regression testing.
- Strong communication skills with the ability to collaborate effectively across teams.
Self-motivated, enthusiastic, and curious, with a proactive approach to problem-solving and continuous learning. Strong analytical and problem-solving skills with a keen eye for detail.

Our Approach to Flexible Work

With Flex Work, we're combining the best of both worlds: in-person time and remote. Our approach enables our teams to deepen connections, maintain a strong community, and do their best work. We know that flexibility can take shape in many ways, so rather than a number of required days in-office each week, we simply spend at least half (50%) of our time each quarter in the office or in the field with our customers, prospects, and partners (depending on role). This means you'll have the freedom to create a flexible schedule that caters to your business, team, and personal needs, while being intentional to make the most of time spent together. Those in our remote "home office" roles also have the opportunity to come together in our offices for important moments that matter.

Are you being referred to one of our roles? If so, ask your connection at Workday about our Employee Referral process!
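The request/response validation described in this posting (Postman-style API checks) reduces to asserting on status codes and body shape. A minimal sketch follows; the field names and the stubbed response dict are illustrative stand-ins for a real HTTP call:

```python
# Minimal sketch of API response validation, mirroring Postman-style
# test assertions; the stub dict stands in for a real HTTP response.

REQUIRED_FIELDS = {"id": int, "status": str}  # hypothetical contract

def validate_response(resp: dict) -> list:
    """Return a list of validation failures (empty list means pass)."""
    errors = []
    if resp.get("status_code") != 200:
        errors.append(f"unexpected status {resp.get('status_code')}")
    body = resp.get("json", {})
    for field, ftype in REQUIRED_FIELDS.items():
        if field not in body:
            errors.append(f"missing field: {field}")
        elif not isinstance(body[field], ftype):
            errors.append(f"wrong type for {field}")
    return errors

stub = {"status_code": 200, "json": {"id": 7, "status": "ACTIVE"}}
print(validate_response(stub))  # []
```

Collecting all failures instead of stopping at the first gives the tester a complete picture per request, which is how Postman test tabs typically report results.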

Posted 4 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra

Remote

Indeed logo

About the Team

The vision of our Business Technology Organization is to be the trusted partner that fuels Workday's business technology innovations and products to enable company growth at scale. Our team cultivates relationships built on collaboration and trust to support a rapidly growing business. We strive to improve efficiency and operational effectiveness through technology, innovation and inspiration.
About the Role

Responsibilities:
- Software application troubleshooting at the application, database, network and integration layers
- Lead and deliver automation of tasks/service requests
- Perform trend analysis and develop action plans for improving SLAs and reducing case volume and problems
- Incident troubleshooting, resolution and technical root cause analysis to address problems permanently
- Identify business risks, inefficiencies, issues and opportunities related to the Salesforce platform
- Document, maintain standardization and look for ways to constantly improve processes and procedures
- Develop end-to-end expertise in Workday Go-To-Market business applications
- Develop domain expertise in Workday's Enterprise Applications, including integrations

About You

Basic Qualifications:
- 5+ years in enterprise software application development
- Bachelor's degree or equivalent experience in Computer Science, Information Technology, or a related field

Other Qualifications:
- Deep technical knowledge of enterprise software application development and enterprise application integrations (Salesforce, Apttus, MuleSoft/SnapLogic)
- Hands-on experience troubleshooting technical issues on the Salesforce platform end-to-end (application, database, network and integration layers)
- Knowledge of IT service management tools and best practices (preferred)
- Self-motivated, flexible teammate with proven multi-tasking, time management and organization expertise, with the ability to handle multiple and often changing priorities
- Attention to detail with the ability to analyze and tackle sophisticated problems, as well as provide documentation, mentorship and instruction to users
- Proven ability to learn and embrace new technologies, applications, and solutions

Our Approach to Flexible Work

With Flex Work, we're combining the best of both worlds: in-person time and remote. Our approach enables our teams to deepen connections, maintain a strong community, and do their best work.
We know that flexibility can take shape in many ways, so rather than a number of required days in-office each week, we simply spend at least half (50%) of our time each quarter in the office or in the field with our customers, prospects, and partners (depending on role). This means you'll have the freedom to create a flexible schedule that caters to your business, team, and personal needs, while being intentional to make the most of time spent together. Those in our remote "home office" roles also have the opportunity to come together in our offices for important moments that matter. Are you being referred to one of our roles? If so, ask your connection at Workday about our Employee Referral process!

Posted 4 weeks ago

Apply
cta

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies