Home
Jobs

244 Data Transformation Jobs - Page 8

JobPe aggregates listings for easy application access; you apply directly on the original job portal.

5.0 - 10.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Data Build Tool (DBT)
Good-to-have skills: Data Engineering
Minimum 12 years of experience is required.
Educational Qualification: 15 years full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives. Your role will require you to navigate complex data environments, providing insights and recommendations that drive effective data management and governance practices.

Key Responsibilities:
- Overall 12+ years of data experience, including 5+ years on Snowflake and 3+ years on DBT (Core and Cloud)
- Played a key role in DBT-related discussions with teams and clients to understand business problems and solution requirements
- As a DBT SME, liaise with clients on business/technology/data transformation programs; orchestrate the implementation of planned initiatives and the realization of business outcomes
- Spearhead the team to translate business goals and challenges into practical data transformation and technology roadmaps and data architecture designs
- Strong experience in designing, architecting, and managing (admin) Snowflake solutions and deploying data analytics solutions in Snowflake
- Strong inclination for practice building, including spearheading thought leadership discussions and managing team activities

Technical Experience:
- Strong experience working as a Snowflake-on-cloud DBT Data Architect with thorough knowledge of different services
- Ability to architect solutions from on-prem to cloud and create end-to-end data pipelines using DBT
- Excellent process knowledge in one or more of the following areas: Finance, Healthcare, Customer Experience
- Experience working on client proposals (RFPs), estimation, and POCs/POVs on new Snowflake features
- DBT (Core and Cloud) end-to-end migration experience, including refactoring SQL for modularity, DBT modeling (.sql or .py files), and DBT job scheduling on at least 2 projects
- Knowledge of the Jinja template language (macros) is an added advantage
- Knowledge of special features such as DBT documentation, semantic layer creation, webhooks, etc.
- DBT and cloud certification is important
- Develop, fine-tune, and integrate LLM models (OpenAI, Anthropic, Mistral, etc.) into enterprise workflows via Cortex AI
- Deploy AI agents capable of reasoning, tool use, chaining, and task orchestration for knowledge retrieval and decision support
- Guide the creation and management of GenAI assets such as prompts, embeddings, semantic indexes, agents, and custom bots
- Collaborate with data engineers, ML engineers, and the leadership team to translate business use cases into GenAI-driven solutions
- Provide mentorship and technical leadership to a small team of engineers working on GenAI initiatives
- Stay current with advancements in Snowflake, LLMs, and generative AI frameworks to continuously enhance solution capabilities
- Should have a good understanding of SQL and Python, along with clear knowledge of Snowflake's architectural concepts

Professional Attributes:
- Client management, stakeholder management, collaboration, interpersonal, and relationship-building skills
- Ability to create innovative solutions for key business challenges
- Eagerness to learn and develop continuously
- Structured written, verbal, and presentational communication

Educational Qualification: MBA (Technology/Data-related specializations)/MCA/advanced degrees in STEM; 15 years full-time education
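The posting above calls for DBT modeling in .sql or .py files on Snowflake. As a rough illustration of what a dbt Python model can look like, here is a minimal sketch; the upstream model name stg_orders and its columns are hypothetical, and a Snowflake/Snowpark dbt setup is assumed:

```python
# models/marts/customer_order_totals.py
# Minimal dbt Python model sketch for Snowflake (dbt Core/Cloud with Snowpark).
# The upstream model "stg_orders" and its column names are illustrative assumptions.
import snowflake.snowpark.functions as F

def model(dbt, session):
    # Materialize as a table; dbt handles the DDL and schema placement.
    dbt.config(materialized="table")

    # dbt.ref resolves lineage to the upstream model and, on Snowflake,
    # returns a Snowpark DataFrame.
    orders = dbt.ref("stg_orders")

    # Aggregate order amounts per customer -- a typical transformation step.
    return (
        orders.group_by("CUSTOMER_ID")
              .agg(F.sum("ORDER_AMOUNT").alias("TOTAL_AMOUNT"),
                   F.count("ORDER_ID").alias("ORDER_COUNT"))
    )
```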

Posted 4 weeks ago

Apply

5.0 - 10.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: Data Engineering
Minimum 12 years of experience is required.
Educational Qualification: 15 years full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.

Key Responsibilities:
- Overall 12+ years of data experience, including 5+ years on any ETL tool, 3+ years on Snowflake, and 1-3 years on Fivetran
- Played a key role in Fivetran-related discussions with teams and clients to understand business problems and solution requirements
- As a Fivetran SME, liaise with clients on business/technology/data transformation programs; orchestrate the implementation of planned initiatives and the realization of business outcomes
- Spearhead the team to translate business goals and challenges into practical data transformation and technology roadmaps and data architecture designs
- Strong experience in designing, architecting, and managing (admin) Snowflake solutions and deploying data analytics solutions in Snowflake
- Strong inclination for practice building, including spearheading thought leadership discussions and managing team activities

Technical Experience:
- Strong experience working as a Fivetran Data Architect with thorough knowledge of different services
- Ability to architect solutions from on-prem to cloud and create end-to-end data pipelines using Fivetran
- Excellent process knowledge in one or more of the following areas: Finance, Healthcare, Customer Experience
- Experience working on client proposals (RFPs), estimation, and POCs/POVs on new Snowflake features
- Fivetran end-to-end migration experience
- Fivetran and any one cloud certification is good to have

Professional Attributes:
- Project management, stakeholder management, collaboration, interpersonal, and relationship-building skills
- Ability to create innovative solutions for key business challenges
- Eagerness to learn and develop continuously
- Structured written and verbal communication and presentation skills

Educational Qualification: MBA (Technology/Data-related specializations)/MCA/advanced degrees in STEM; 15 years full-time education
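Fivetran pipelines are usually managed through its REST API rather than code. As a rough sketch of the automation side of the role above, this triggers a connector sync; the API key/secret and connector id are placeholders, and the endpoint path follows the public Fivetran v1 API but should be verified against current documentation:

```python
# Minimal sketch: triggering a Fivetran connector sync via its REST API.
# Credentials and connector_id are placeholders, not real values.
import requests
from requests.auth import HTTPBasicAuth

FIVETRAN_API = "https://api.fivetran.com/v1"
auth = HTTPBasicAuth("YOUR_API_KEY", "YOUR_API_SECRET")

def trigger_sync(connector_id: str) -> dict:
    # POST /connectors/{id}/sync asks Fivetran to start a sync immediately.
    resp = requests.post(f"{FIVETRAN_API}/connectors/{connector_id}/sync",
                         auth=auth, json={"force": True}, timeout=30)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    print(trigger_sync("my_connector_id"))  # hypothetical connector id
```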

Posted 4 weeks ago

Apply

5.0 - 10.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Data Build Tool (DBT)
Good-to-have skills: Data Engineering
Minimum 5 years of experience is required.
Educational Qualification: 15 years full-time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.

Key Responsibilities:
- Overall 12+ years of data experience, including 5+ years on Snowflake and 3+ years on DBT (Core and Cloud)
- Played a key role in DBT-related discussions with teams and clients to understand business problems and solution requirements
- As a DBT SME, liaise with clients on business/technology/data transformation programs; orchestrate the implementation of planned initiatives and the realization of business outcomes
- Spearhead the team to translate business goals and challenges into practical data transformation and technology roadmaps and data architecture designs
- Strong experience in designing, architecting, and managing (admin) Snowflake solutions and deploying data analytics solutions in Snowflake
- Strong inclination for practice building, including spearheading thought leadership discussions and managing team activities

Technical Experience:
- Strong experience working as a Snowflake-on-cloud DBT Data Architect with thorough knowledge of different services
- Ability to architect solutions from on-prem to cloud and create end-to-end data pipelines using DBT
- Excellent process knowledge in one or more of the following areas: Finance, Healthcare, Customer Experience
- Experience working on client proposals (RFPs), estimation, and POCs/POVs on new Snowflake features
- DBT (Core and Cloud) end-to-end migration experience, including refactoring SQL for modularity, DBT modeling (.sql or .py files), and DBT job scheduling on at least 2 projects
- Knowledge of the Jinja template language (macros) is an added advantage
- Knowledge of special features such as DBT documentation, semantic layer creation, webhooks, etc.
- DBT and cloud certification is important
- Develop, fine-tune, and integrate LLM models (OpenAI, Anthropic, Mistral, etc.) into enterprise workflows via Cortex AI
- Deploy AI agents capable of reasoning, tool use, chaining, and task orchestration for knowledge retrieval and decision support
- Guide the creation and management of GenAI assets such as prompts, embeddings, semantic indexes, agents, and custom bots
- Collaborate with data engineers, ML engineers, and the leadership team to translate business use cases into GenAI-driven solutions
- Provide mentorship and technical leadership to a small team of engineers working on GenAI initiatives
- Stay current with advancements in Snowflake, LLMs, and generative AI frameworks to continuously enhance solution capabilities
- Should have a good understanding of SQL and Python, along with clear knowledge of Snowflake's architectural concepts

Professional Attributes:
- Client management, stakeholder management, collaboration, interpersonal, and relationship-building skills
- Ability to create innovative solutions for key business challenges
- Eagerness to learn and develop continuously

Educational Qualification: MBA (Technology/Data-related specializations)/MCA/advanced degrees in STEM; 15 years full-time education
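The LLM-integration duties above run through Snowflake Cortex. As a minimal sketch of invoking a Cortex LLM function from Python, assuming a Cortex-enabled account and the snowflake-connector-python package (connection parameters are placeholders):

```python
# Minimal sketch: calling SNOWFLAKE.CORTEX.COMPLETE from Python.
# Account, user, password, and warehouse are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",
    warehouse="ANALYTICS_WH",
)

prompt = "Summarize last quarter's order trends in two sentences."
cur = conn.cursor()
# SNOWFLAKE.CORTEX.COMPLETE(model, prompt) runs the LLM inside Snowflake,
# so no data leaves the platform.
cur.execute("SELECT SNOWFLAKE.CORTEX.COMPLETE('mistral-large', %s)", (prompt,))
print(cur.fetchone()[0])
conn.close()
```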

Posted 4 weeks ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Pune

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: SAP CPI for Data Services
Good-to-have skills: NA
Minimum 3 years of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the solutions align with business objectives. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking opportunities for improvement and innovation in application development processes.

Roles & Responsibilities:
- Expected to perform independently and become an SME
- Active participation/contribution in team discussions is required
- Contribute to providing solutions to work-related problems
- Assist in the documentation of application specifications and user guides
- Engage in code reviews to ensure quality and adherence to best practices

Professional & Technical Skills:
- Must-have skills: Proficiency in SAP CPI for Data Services
- Strong understanding of application development methodologies
- Experience with integration tools and techniques
- Familiarity with data transformation and mapping processes
- Ability to troubleshoot and resolve application issues efficiently

Additional Information:
- The candidate should have a minimum of 3 years of experience in SAP CPI for Data Services
- This position is based at our Pune office
- 15 years of full-time education is required

Posted 4 weeks ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Veeva Vault
Good-to-have skills: NA
Minimum 5 years of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Offshore Migration Lead, you will oversee and coordinate offshore migration execution into the Veeva Vault platform. You will lead a team of migration specialists, analysts, and BAs, apply hands-on expertise in Vault migrations, SQL, and RDBMS, and collaborate with onshore counterparts to execute plans, manage timelines, resolve issues, and ensure compliance with quality standards.

Roles & Responsibilities:
- Lead and mentor a team of offshore migration specialists handling execution of document and metadata migration tasks
- Review deliverables and ensure adherence to migration standards, best practices, and compliance expectations
- Manage work allocation, backlog tracking, and progress reporting for offshore migration tasks
- Monitor the completion of daily/weekly migration targets, ensuring on-time and accurate delivery
- Perform root cause analysis on migration errors and coordinate with technical teams to resolve Vault Loader or API issues
- Validate output quality through spot checks, sampling, and test case validations
- Provide hands-on support when needed for migration jobs, SQL-driven data transformation, and validation checks
- Troubleshoot migration errors using Vault logs and work with developers or Vault SMEs to resolve blockers
- Act as the primary offshore contact for the onshore Migration Lead or Project Manager
- Ensure the offshore team follows controlled migration procedures and documentation protocols
- Maintain audit trails, job trackers, and version-controlled artifacts

Professional & Technical Skills:
- Must-have skills: Hands-on experience with Vault Loader and Vault REST APIs for document and object migration
- Strong command of SQL for data extraction, transformation, and validation
- Experience working with CSV, XML, and JSON payloads and migration packaging
- Strong leadership and coordination skills in an offshore delivery model
- Excellent communication skills for daily sync-ups, reporting, and issue escalations
- Attention to detail, quality orientation, and ability to manage workload under deadlines
- Familiarity with regulatory requirements in GxP and 21 CFR Part 11 contexts
- Familiarity with Vault metadata models, document types, lifecycles, and object structures
- Experience with PromoMats, MedComms, Quality Suite, RIMS, Clinical, and other Vault domains
- Proficiency in working with RDBMS like Oracle, SQL Server, PostgreSQL, or MySQL
- Experience in writing complex joins, subqueries, case statements, and data cleansing scripts
- Familiarity with legacy content/document systems such as Documentum, SharePoint, Calyx Insight, OpenText
- Experience leading offshore migration teams for Veeva Vault projects
- Prior experience in regulated environments (GxP, 21 CFR Part 11) is required
- A minimum of 3-5 years of experience in Vault migrations is expected

Additional Information:
- The candidate should have a minimum of 3 years of experience in Computer System Validation (CSV)
- This position is PAN-India based
- 15 years of full-time education is required
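Migration validation against Vault typically goes through its REST API and VQL. As a rough sketch under stated assumptions (the Vault DNS, API version, credentials, and query are placeholders; endpoint shapes should be checked against current Vault API documentation):

```python
# Minimal sketch: authenticating to the Veeva Vault REST API and running a
# VQL query for migration spot checks. All identifiers are placeholders.
import requests

VAULT = "https://myvault.veevavault.com/api/v24.1"  # hypothetical vault + version

def vault_session(username: str, password: str) -> str:
    # POST /auth returns a sessionId used as the Authorization header.
    r = requests.post(f"{VAULT}/auth",
                      data={"username": username, "password": password},
                      timeout=30)
    r.raise_for_status()
    return r.json()["sessionId"]

def vql(session_id: str, query: str) -> dict:
    r = requests.post(f"{VAULT}/query",
                      headers={"Authorization": session_id,
                               "Accept": "application/json"},
                      data={"q": query}, timeout=60)
    r.raise_for_status()
    return r.json()

sid = vault_session("migration.user@example.com", "...")
# Spot-check migrated documents, e.g. approved documents by name.
print(vql(sid, "SELECT id, name__v FROM documents WHERE status__v = 'Approved'"))
```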

Posted 4 weeks ago

Apply

3.0 - 8.0 years

10 - 14 Lacs

Ahmedabad

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: SAP ABAP Development for HANA
Good-to-have skills: NA
Minimum 7.5 years of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will oversee the development process and ensure successful project delivery.

Roles & Responsibilities:
- Expected to perform independently and become an SME
- Active participation/contribution in team discussions is required
- Contribute to providing solutions to work-related problems
- Lead the design, development, and implementation of applications
- Collaborate with cross-functional teams to ensure project success
- Provide technical guidance and mentorship to junior team members
- Identify and address technical challenges proactively
- Stay updated on industry trends and best practices

Professional & Technical Skills:
- Must-have skills: Proficiency in SAP CPI for Data Services
- Strong understanding of data integration and data transformation
- Experience in designing and implementing scalable solutions
- Hands-on experience with SAP Cloud Platform Integration tools
- Knowledge of SAP Cloud Platform services

Additional Information:
- The candidate should have a minimum of 3 years of experience in SAP CPI for Data Services
- This position is based at our Hyderabad office
- 15 years of full-time education is required

Posted 4 weeks ago

Apply

5.0 - 6.0 years

12 - 16 Lacs

Chennai

Work from Office

Project description: We are seeking an experienced Senior Flexera Data Analyst to join our team. This role focuses on managing internal data structures, automating data flows, and delivering actionable reporting within the Flexera ecosystem. You should have deep experience in data modeling, API integrations, and reporting, along with a strong understanding of software asset management principles.

Responsibilities:

Internal Data Structures & Modeling:
- Design, maintain, and optimize internal data models and structures within the Flexera environment
- Map business asset data to Flexera's normalized software models with precision and accuracy
- Ensure accurate data classification, enrichment, and normalization to support software lifecycle tracking
- Partner with infrastructure, operations, and IT teams to ingest and reconcile data from various internal systems

Reporting & Analytics:
- Design and maintain reports and dashboards in Flexera or via external BI tools such as Power BI or Tableau
- Provide analytical insights on software usage, compliance, licensing, optimization, and risk exposure
- Automate recurring reporting processes and ensure timely delivery to business stakeholders
- Work closely with business users to gather requirements and translate them into meaningful reports and visualizations

Automated Data Feeds & API Integrations:
- Develop and support automated data feeds using Flexera REST/SOAP APIs
- Integrate Flexera with enterprise tools (e.g., CMDB, SCCM, ServiceNow, ERP) to ensure reliable and consistent data flow
- Monitor, troubleshoot, and resolve issues related to data extracts and API communication
- Implement robust logging, alerting, and exception handling for integration pipelines

Skills - Must have:
- Minimum 6+ years of working with Flexera or similar software
- Flexera expertise: strong hands-on experience with Flexera One, FlexNet Manager Suite, or similar tools
- Technical skills: proficiency in REST/SOAP API development and integration; strong SQL skills and familiarity with data transformation/normalization concepts; experience using reporting tools like Power BI, Tableau, or Excel for data visualization; familiarity with enterprise systems such as SCCM, ServiceNow, ERP, CMDBs, etc.
- Process & problem solving: strong analytical and troubleshooting skills for data inconsistencies and API failures; understanding of license models, software contracts, and compliance requirements

Skills - Nice to have:
- Soft skills: excellent communication skills to translate technical data into business insights

Other: Languages: English (C1 Advanced). Seniority: Senior.
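To make the "automated data feeds" part concrete, here is a minimal sketch of paginated extraction from a REST endpoint into a DataFrame for reconciliation. The base URL, route, and token handling are hypothetical placeholders; the actual Flexera One API routes must come from its documentation:

```python
# Minimal sketch of an automated data feed: pull inventory rows page by page
# and normalize them for matching. Endpoint and auth are hypothetical.
import requests
import pandas as pd

BASE = "https://api.flexera.example.com"  # hypothetical base URL
TOKEN = "..."  # bearer token from your auth flow (placeholder)

def fetch_inventory(page_size: int = 500) -> pd.DataFrame:
    rows, offset = [], 0
    while True:
        r = requests.get(f"{BASE}/inventory",  # hypothetical route
                         headers={"Authorization": f"Bearer {TOKEN}"},
                         params={"limit": page_size, "offset": offset},
                         timeout=60)
        r.raise_for_status()
        batch = r.json().get("items", [])
        if not batch:
            break
        rows.extend(batch)
        offset += page_size
    return pd.DataFrame(rows)

df = fetch_inventory()
# Normalize vendor names before matching against the software catalog.
df["vendor"] = df["vendor"].str.strip().str.title()
```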

Posted 4 weeks ago

Apply

7.0 - 9.0 years

13 - 18 Lacs

Bengaluru

Work from Office

Job Title: Module Lead - Power BI Full Stack
Location: Bangalore, India
Experience Required: 7-12 years
Employment Type: Full-time

We are seeking an experienced Power BI Lead with solid hands-on experience in requirement gathering, documentation, designing, developing, and maintaining robust BI solutions. The ideal candidate should possess expertise in Power BI, along with deep knowledge of backend development, database management, and data modeling to support end-to-end BI project lifecycles.

Key Responsibilities:
- Develop and maintain Power BI reports, dashboards, and interactive visualizations to meet business requirements
- Collaborate with business stakeholders to understand data needs, translating them into technical solutions
- Design, implement, and optimize SQL queries, stored procedures, and data pipelines for efficient data retrieval and transformation
- Develop and maintain ETL processes to ensure data integrity and accuracy across various data sources
- Work with APIs and other third-party integrations to gather and visualize external data within Power BI reports
- Ensure the security, scalability, and performance of BI solutions by following best practices in data governance
- Conduct data modeling and design complex, multi-source data structures to support reporting needs
- Perform data validation and troubleshoot issues to ensure accurate reporting and data representation
- Continuously optimize Power BI solutions for better performance and user experience
- Provide training and technical support to end users on Power BI tools and features

Required Skills & Qualifications:
- At least 5 years of hands-on experience in Power BI development, including creating dashboards, visualizations, and reports
- Proficiency in DAX and Power Query for data transformations
- Strong understanding of SQL Server, T-SQL, and other relational databases
- Experience with ETL processes and tools like SSIS, Azure Data Factory, or similar
- Experience in data modeling and working with large datasets in a business intelligence environment
- Hands-on experience in backend development with programming languages like Python, .NET, or JavaScript is a plus
- Ability to work with various data sources (SQL, NoSQL, cloud-based sources)
- Familiarity with the Power BI service, including publishing, scheduling, and managing reports
- Understanding of cloud technologies like Azure is a plus
- Strong analytical and problem-solving skills

Additional Requirements:
- Demonstrated ability to successfully complete multiple, complex technical projects
- A rational and organized approach to tasks and an awareness of the need to achieve quality
- High standards of professional behaviour in dealings with clients, colleagues, and staff
- Strong written communication skills; effective and persuasive in both written and oral communication
- Experience with gathering end-user requirements and writing technical documentation
- Time management and multitasking skills to effectively meet deadlines under time-to-market pressure
- May require occasional travel

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field
- Power BI or relevant Microsoft certifications are preferred

Additional Information: Prior experience in industrial settings, especially with laboratory processes, is a plus.
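Managing published reports in the Power BI service often involves scripted refreshes. As a minimal sketch using the Power BI REST API, assuming an already-acquired Azure AD token (token, workspace id, and dataset id are placeholders):

```python
# Minimal sketch: queue a Power BI dataset refresh via the REST API.
# TOKEN, GROUP_ID, and DATASET_ID are placeholders.
import requests

TOKEN = "..."  # Azure AD token with Dataset.ReadWrite.All scope (placeholder)
GROUP_ID = "00000000-0000-0000-0000-000000000000"    # workspace id (placeholder)
DATASET_ID = "11111111-1111-1111-1111-111111111111"  # dataset id (placeholder)

url = (f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
       f"/datasets/{DATASET_ID}/refreshes")
resp = requests.post(url, headers={"Authorization": f"Bearer {TOKEN}"},
                     timeout=30)
resp.raise_for_status()  # 202 Accepted means the refresh was queued
print("Refresh queued:", resp.status_code)
```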

Posted 4 weeks ago

Apply

0.0 years

1 - 3 Lacs

Gurugram

Work from Office

Education: BE, 2024 pass-out (except BE Computer)

Role & Responsibility: A Fiber Engineer is a professional who designs, installs, maintains, and troubleshoots fiber optic networks used for high-speed internet, telecommunications, and data transmission. Their work is essential in building and maintaining the infrastructure that supports broadband service.

Contact Details:
Mobile: 9145591415
Email: rutuja.kumbhar@in.experis.com

Posted 4 weeks ago

Apply

6.0 - 8.0 years

18 - 20 Lacs

Bengaluru

Work from Office

The Development Lead will oversee the design, development, and delivery of advanced data solutions using Azure Databricks, SQL, and data visualization tools like Power BI. The role involves leading a team of developers, managing data pipelines, and creating insightful dashboards and reports to drive data-driven decision-making across the organization. The individual will ensure best practices are followed in data architecture, development, and reporting while maintaining alignment with business objectives.

Key Responsibilities:
- Data Integration & ETL Processes: Design, build, and optimize ETL pipelines to manage the flow of data from various sources into data lakes, data warehouses, and reporting platforms
- Data Visualization & Reporting: Lead the development of interactive dashboards and reports using Power BI, ensuring that business users have access to actionable insights and performance metrics
- SQL Development & Optimization: Write, optimize, and review complex SQL queries for data extraction, transformation, and reporting, ensuring high performance and scalability across large datasets
- Azure Cloud Solutions: Implement and manage cloud-based solutions using Azure services (Azure Databricks, Azure SQL Database, Data Lake) to support business intelligence and reporting initiatives
- Collaboration with Stakeholders: Work closely with business leaders and cross-functional teams to understand reporting and analytics needs, translating them into technical requirements and actionable data solutions
- Quality Assurance & Best Practices: Implement and maintain best practices in development, ensuring code quality, version control, and adherence to data governance standards
- Performance Monitoring & Tuning: Continuously monitor the performance of data systems, reporting tools, and dashboards to ensure they meet SLAs and business requirements
- Documentation & Training: Create and maintain comprehensive documentation for all data solutions, including architecture diagrams, ETL workflows, and data models; provide training and support to end users on Power BI reports and dashboards

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field
- Proven experience as a Development Lead or Senior Data Engineer with expertise in Azure Databricks, SQL, Power BI, and data reporting/visualization
- Hands-on experience in Azure Databricks for large-scale data processing and analytics, including Delta Lake, Spark SQL, and integration with Azure Data Lake
- Strong expertise in SQL for querying, data transformation, and database management
- Proficiency in Power BI for developing advanced dashboards, data models, and reporting solutions
- Experience in ETL design and data integration across multiple systems, with a focus on performance optimization
- Knowledge of Azure cloud architecture, including Azure SQL Database, Data Lake, and other relevant services
- Experience leading agile development teams, with a strong focus on delivering high-quality, scalable solutions
- Strong problem-solving skills, with the ability to troubleshoot and resolve complex data and reporting issues
- Excellent communication skills, with the ability to interact with both technical and non-technical stakeholders

Preferred Qualifications:
- Knowledge of additional Azure services (e.g., Azure Synapse, Data Factory, Logic Apps) is a plus
- Experience in Power BI for data visualization and custom calculations
Keywords / Mandatory Key Skills: Data Factory, Power BI*, Spark SQL, Logic Apps, Azure Databricks*, ETL design, agile development, SQL*, Synapse, data reporting*, Delta Lake, Azure Data Lake, Azure cloud architecture
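The ETL duties above center on Databricks and Delta Lake. A minimal sketch of such a batch step, assuming a Databricks runtime with Delta available; the storage paths and column names are illustrative:

```python
# Minimal sketch of a Databricks-style ETL step: read raw JSON from Azure
# Data Lake, aggregate with Spark SQL functions, and write a Delta table.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

raw = spark.read.json("abfss://landing@myaccount.dfs.core.windows.net/orders/")

cleaned = (
    raw.filter(F.col("order_id").isNotNull())
       .withColumn("order_date", F.to_date("order_ts"))
       .groupBy("order_date", "region")
       .agg(F.sum("amount").alias("daily_revenue"))
)

# Delta gives ACID writes and time travel for downstream Power BI models.
(cleaned.write.format("delta")
        .mode("overwrite")
        .partitionBy("order_date")
        .save("abfss://curated@myaccount.dfs.core.windows.net/daily_revenue/"))
```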

Posted 1 month ago

Apply

2.0 - 5.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: IBM Sterling B2B Integrator
Good-to-have skills: NA
Minimum 5 years of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions, and ensuring that applications function seamlessly within the existing infrastructure. You will engage in problem-solving activities, contribute to key decisions, and manage the development process to deliver high-quality applications that align with business objectives.

Roles & Responsibilities:
- Expected to be an SME
- EDI analysts to work on IBM B2Bi Integrator (Sterling Integrator); this work includes EDI design, development, and testing for customer onboarding, and requires proficiency in ANSI X12 and EDIFACT EDI specifications
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Facilitate knowledge-sharing sessions to enhance team capabilities
- Monitor project progress and ensure timely delivery of application features
- Should be available for on-call and weekend support as required for P1/P2 incidents

Professional & Technical Skills:
- Must-have skills: Proficiency in IBM Sterling B2B Integrator
- Strong understanding of application development methodologies
- Experience with integration solutions and data transformation processes
- Familiarity with troubleshooting and debugging techniques
- Ability to work with cross-functional teams to gather requirements

Additional Information:
- The candidate should have a minimum of 5 years of experience in IBM Sterling B2B Integrator
- This position is based in Hyderabad
- 15 years of full-time education is required
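ANSI X12 interchanges are plain delimited text, which is what makes the debugging described above scriptable. A minimal sketch of splitting an interchange into segments and reading the ISA envelope (the sample interchange is synthetic and truncated, not real trading-partner data):

```python
# Minimal sketch: split an ANSI X12 interchange into segments and read
# envelope metadata, the kind of inspection done while debugging B2Bi maps.
SAMPLE = (
    "ISA*00*          *00*          *ZZ*SENDERID      *ZZ*RECEIVERID    "
    "*240101*1200*U*00401*000000001*0*P*>~"
    "GS*PO*SENDER*RECEIVER*20240101*1200*1*X*004010~"
)

def parse_segments(interchange: str, seg_sep: str = "~", elem_sep: str = "*"):
    # Each segment is delimited by '~'; elements within it by '*'.
    return [seg.split(elem_sep) for seg in interchange.split(seg_sep) if seg]

segments = parse_segments(SAMPLE)
isa = segments[0]
# ISA06 = sender ID, ISA08 = receiver ID, ISA13 = interchange control number.
print("Sender:", isa[6].strip(), "Receiver:", isa[8].strip(),
      "Control #:", isa[13])
```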

Posted 1 month ago

Apply

2.0 - 4.0 years

6 - 10 Lacs

Hyderabad

Work from Office

Primary Responsibilities:
- Gather and analyze requirements for clinical data conversion projects
- Collaborate with clients and vendors to define project scope, timelines, and deliverables
- Prepare and transform clinical data for conversion activities
- Address and resolve data-related issues reported by clients
- Develop and maintain documentation and specifications for data conversion processes
- Monitor project progress and ensure timely completion of milestones
- Troubleshoot common database issues and provide technical support
- Ensure compliance with US healthcare regulations and standards
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications:
- Familiarity with US healthcare systems and regulations
- Knowledge of standard EHR/EMR clinical data workflows
- Understanding of healthcare clinical dictionaries
- Proficiency in EHR database architecture and data extraction/transformation using MS SQL Server
- Solid knowledge of stored procedures, triggers, and functions
- Proven excellent problem-solving and troubleshooting skills
- Solid communication and collaboration abilities
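The extraction side of the conversion work above typically means querying a legacy SQL Server database. A minimal sketch with pyodbc; the server, database, table, and column names are placeholders, not any specific vendor's schema:

```python
# Minimal sketch: extract clinical rows from a legacy EHR SQL Server
# database for conversion staging. Identifiers are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=legacy-ehr;DATABASE=clinical;Trusted_Connection=yes;"
)
cur = conn.cursor()

# Pull allergy records updated since the last conversion checkpoint.
cur.execute(
    "SELECT patient_id, allergen_code, reaction, updated_at "
    "FROM dbo.patient_allergies WHERE updated_at >= ?",
    "2024-01-01",
)
for row in cur.fetchmany(5):
    print(row.patient_id, row.allergen_code)
conn.close()
```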

Posted 1 month ago

Apply

5.0 - 7.0 years

13 - 17 Lacs

Bengaluru

Work from Office

Role Proficiency: Act creatively to develop applications and select appropriate technical options, optimizing application development, maintenance, and performance by employing design patterns and reusing proven solutions; account for others' development activities.

Outcomes:
- Interpret the application/feature/component design and develop it in accordance with specifications
- Code, debug, test, document, and communicate product/component/feature development stages
- Validate results with user representatives; integrate and commission the overall solution
- Select appropriate technical options for development, such as reusing, improving, or reconfiguring existing components, or creating own solutions
- Optimize efficiency, cost, and quality
- Influence and improve customer satisfaction
- Set FAST goals for self/team; provide feedback on FAST goals of team members

Measures of Outcomes:
- Adherence to engineering processes and standards (coding standards)
- Adherence to project schedule/timelines
- Number of technical issues uncovered during project execution
- Number of defects in the code
- Number of defects post-delivery
- Number of non-compliance issues
- On-time completion of mandatory compliance trainings

Outputs Expected:
- Code: Code as per design; follow coding standards, templates, and checklists; review code for team and peers
- Documentation: Create/review templates, checklists, guidelines, and standards for design/process/development; create/review deliverable documents, design documentation, requirements, and test cases/results
- Configure: Define and govern the configuration management plan; ensure compliance from the team
- Test: Review and create unit test cases, scenarios, and execution; review the test plan created by the testing team; provide clarifications to the testing team
- Domain relevance: Advise software developers on the design and development of features and components with a deep understanding of the business problem being addressed for the client; learn more about the customer domain, identifying opportunities to add value for customers; complete relevant domain certifications
- Manage Project: Manage delivery of modules and/or manage user stories
- Manage Defects: Perform defect RCA and mitigation; identify defect trends and take proactive measures to improve quality
- Estimate: Create and provide input for effort estimation for projects
- Manage Knowledge: Consume and contribute to project-related documents, SharePoint libraries, and client universities; review reusable documents created by the team
- Release: Execute and monitor the release process
- Design: Contribute to the creation of design (HLD, LLD, SAD)/architecture for applications/features/business components/data models
- Interface with Customer: Clarify requirements and provide guidance to the development team; present design options to customers; conduct product demos
- Manage Team: Set FAST goals and provide feedback; understand aspirations of team members and provide guidance and opportunities; ensure the team is engaged in the project
- Certifications: Obtain relevant domain/technology certifications

Skill Examples:
- Explain and communicate the design/development to the customer
- Perform and evaluate test results against product specifications
- Break down complex problems into logical components
- Develop user interfaces and business software components
- Use data models
- Estimate time, effort, and resources required for developing/debugging features/components
- Perform and evaluate tests in the customer or target environment
- Make quick decisions on technical/project-related challenges
- Manage a team, mentor, and handle people-related issues in the team
- Maintain high motivation levels and positive dynamics in the team
- Interface with other teams, designers, and other parallel practices
- Set goals for self and team; provide feedback to team members
- Create and articulate impactful technical presentations
- Follow a high level of business etiquette in emails and other business communication
- Drive conference calls with customers, addressing customer questions
- Proactively ask for and offer help
- Ability to work under pressure, determine dependencies and risks, facilitate planning, and handle multiple tasks
- Build confidence with customers by meeting deliverables on time and with quality
- Make appropriate use of software and hardware
- Strong analytical and problem-solving abilities

Knowledge Examples:
- Appropriate software programs/modules
- Functional and technical design
- Programming languages: proficiency in multiple skill clusters
- DBMS, operating systems, and software platforms
- Software Development Life Cycle
- Agile: Scrum or Kanban methods
- Integrated development environments (IDE)
- Rapid application development (RAD)
- Modelling technology and languages
- Interface definition languages (IDL)
- Knowledge of the customer domain and deep understanding of the sub-domain where the problem is solved

Additional Comments:

Mandatory Skills, Knowledge, and Experience:
- Python Development (6+ years): Strong backend development experience, including RESTful APIs and FastAPI.
- Generative AI & OpenAI: Practical experience working with Gen AI models and integrating OpenAI APIs into real-world applications.
- API Development: Proven track record of building and maintaining REST APIs with FastAPI, including authentication, authorization, and rate limiting.
- Data Engineering: Expertise in ETL processes, data transformation, and analysis using Pandas.
- LLM Prompt Engineering: Experience in prompt design and optimization for large language models.
- Python Data Science Libraries: Proficient in Pandas, NumPy, and other data tools for processing and analysis.
- Version Control & CI/CD: Proficient with Git and CI/CD pipelines for automated deployment and testing.
- Agile/Scrum: 3+ years of experience working in Agile/Scrum environments.
- Testing & Automation: Experience in unit, integration, and automated testing with pytest and unittest.
- Communication: Strong verbal and written communication, with the ability to explain technical concepts to diverse stakeholders.
- Non-Functional Requirements: Experience with performance optimization, scalability, and security in data-centric applications.

Nice to Have Skills:
- Cloud Platforms: Familiarity with AWS or GCP, particularly in scalable APIs, serverless architecture, and data storage.
- Data Pipelines: Knowledge of Apache Airflow, Kafka, or similar tools for data workflow orchestration.
- ML Frameworks: Experience with scikit-learn, TensorFlow, or PyTorch for model training and deployment.
- Code Quality Tools: Familiarity with SonarQube, ESLint, or similar tools for maintaining high code quality.

Required Skills: Python, Generative AI, API, ETL Tools
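The mandatory skills above combine FastAPI with OpenAI API integration. A minimal sketch of how the two fit together; the model name is a placeholder, the API key is read from the environment, and auth/error handling are omitted for brevity:

```python
# Minimal sketch: a FastAPI endpoint that forwards a prompt to an LLM.
# Model name is a placeholder; OPENAI_API_KEY is expected in the environment.
from fastapi import FastAPI
from pydantic import BaseModel
from openai import OpenAI

app = FastAPI()
client = OpenAI()  # reads OPENAI_API_KEY from the environment

class SummariseRequest(BaseModel):
    text: str

@app.post("/summarise")
def summarise(req: SummariseRequest) -> dict:
    completion = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system",
             "content": "Summarise the user's text in one sentence."},
            {"role": "user", "content": req.text},
        ],
    )
    return {"summary": completion.choices[0].message.content}

# Run with: uvicorn app:app --reload
```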

Posted 1 month ago

Apply

6.0 - 10.0 years

19 - 25 Lacs

Gurugram

Work from Office

Key Responsibilities:
- Lead technology solution design and delivery
- Create and maintain optimal data solutions architecture and AI models
- Work with business partners to document complex company-wide acceptance test plans
- Work concurrently on several projects, each with specific instructions that may differ
- Assemble large, complex data sets that meet functional and non-functional business requirements
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and cloud 'big data' technologies
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
- Work with stakeholders including the Executive, Product, Data, and Design teams to assist with business-critical data insights and technical issues and to support their data infrastructure needs
- Keep our data separated and secure across national boundaries through multiple data centers and cloud regions
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader
- Work with data and analytics experts to strive for greater functionality in our data systems
- Build processes supporting data transformation, data structures, metadata, dependency, and workload management
- Troubleshoot production support issues post release deployment and come up with solutions
- Explain, socialize, and vet designs with internal and external stakeholders

Required Qualifications:
- Undergraduate degree in Engineering or equivalent experience
- Over 7 years of experience in Data Engineering and Advanced Analytics
- Strong experience building Generative AI based solutions for data management (data pipelines, data standardization, data quality) and data analytics
- Advanced working SQL knowledge and experience working with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases
- Experience building and optimizing 'big data' data pipelines, architectures, and data sets
- Experience in cloud technologies and Snowflake
- Experience in Kafka development
- Experience in Python/Java programming
- Experience in creating business data models
- Experience in report development and dashboarding
- Strong experience in driving customer experience
- Experience working with agile teams
- Experience in healthcare clinical domains
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
- Strong analytic skills related to working with unstructured datasets
- A successful history of manipulating, processing, and extracting value from large, disconnected datasets
- Working knowledge of message queuing, stream processing, and highly scalable 'big data' data stores
- Strong project management and organizational skills
- Experience supporting and working with cross-functional teams in a dynamic environment

Careers with Optum. Here's the idea. We built an entire organization around one giant objective: make the health system work better for everyone. So when it comes to how we use the world's large accumulation of health-related information, or guide health and lifestyle choices, or manage pharmacy benefits for millions, our first goal is to leap beyond the status quo and uncover new ways to serve. Optum, part of the UnitedHealth Group family of businesses, brings together some of the greatest minds and most advanced ideas on where health care has to go in order to reach its fullest potential. For you, that means working on high performance teams against sophisticated challenges that matter. Optum: incredible ideas in one incredible company, and a singular opportunity to do your life's best work.

Diversity creates a healthier atmosphere: UnitedHealth Group is an Equal Employment Opportunity/Affirmative Action employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, age, national origin, protected veteran status, disability status, sexual orientation, gender identity or expression, marital status, genetic information, or any other characteristic protected by law. UnitedHealth Group is a drug-free workplace. Candidates are required to pass a drug test before beginning employment.
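The posting above lists Kafka development among its requirements. A minimal sketch of publishing a change event with the kafka-python package; the broker address, topic name, and payload are placeholders:

```python
# Minimal sketch: publish a JSON change event to Kafka before it lands in
# a warehouse such as Snowflake. Broker and topic are placeholders.
import json
from kafka import KafkaProducer  # kafka-python package

producer = KafkaProducer(
    bootstrap_servers="broker1:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

event = {"member_id": 123, "event": "claim_submitted", "amount": 250.0}
producer.send("claims-events", value=event)  # hypothetical topic
producer.flush()  # block until the broker acknowledges the write
```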

Posted 1 month ago

Apply

4.0 - 7.0 years

9 - 14 Lacs

Noida

Work from Office

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities (these are some of the basic responsibilities; a more detailed list will be shared with the Talent Executive during the hiring process):
- Data modeling, big data development, Extract, Transform and Load (ETL) development, storage engineering, data warehousing, and data provisioning
- Platform-as-a-Service and cloud solutions with a focus on data stores and associated ecosystems
- Partner with stakeholders to understand requirements and develop business intelligence or database tools to fetch data, provide insights, and present recommendations
- Create specifications and transformation jobs to bring data into a proper structure and conduct analysis to validate the accuracy and quality of the data
- Create segmentation, dashboards, data visualizations, decision aids, and business case analysis to support the organization
- Collaborate with stakeholders on ad hoc and standard reporting requests
- Identify appropriate data sources, metrics, and tools for providing required information according to clients' requests
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications:
- Graduate or postgraduate, preferably with a major in Computer Science
- Experience with a cloud platform tool such as Azure
- Experience with visualization tools such as Tableau, DOMO, or QlikView
- Experience in a similar role/domain: HR Analytics and/or any HR vertical, and/or experience in a BI/Data Analytics team with exposure to HR data and PeopleSoft tables
- Hands-on knowledge of data transformations and data quality
- Knowledge of database architecture, engineering, design, optimization, security, and administration
- Knowledge of various HR Analytics datasets and metrics, including but not restricted to demographics, hires & turnover, DEI, surveys, etc.
- Well versed in data engineering insights and data analysis

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location, and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups, and those with lower incomes.
We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes — an enterprise priority reflected in our mission.

Posted 1 month ago

Apply

1.0 - 5.0 years

10 - 13 Lacs

Bengaluru

Work from Office

About the Role: Grade Level (for internal use): 08

The Role: Data Transformation Analyst I - Well Logs

The Team: The global subsurface operations team is responsible for regional formation tops studies, structural maps, international data, pressure data, and directional surveys. Specifically, the Well Logs team is responsible for digitizing, log editing, and petrophysical data analysis. The team also manages log data collection and publication, accuracy, customer feedback, and digital and raster sales. We value shared contributions, client satisfaction, and being part of the team.

Responsibilities and Impact:
- The Data Transformation Analyst will be responsible for sourcing, analysis, digitizing, data entry, maintenance, and quality control of the exploration and production well log data within the S&P Global US Energy database
- Well log identification, splicing, scoring, and quality assurance of high business-value well-log curves into a composite log curve set, then distributing the data to interpretation applications
- Petrophysical processing of well log data using different log software packages, including Powerlog and Kingdom, with editing of bad data and depth alignment
- Resolving well log escalations and providing solutions; managing historical entries in the database
- Participate in data improvement projects through global, country, basin, or area reviews conducted by the team
- Ensure consistency, currency, and correctness of the data captured from various sources
- Support the team in day-to-day activities to achieve the set goals

What We're Looking For:

Basic Required Qualifications:
- Bachelor's or master's degree in Geology/Applied Geology/Petroleum Engineering/Earth Science
- Good computer skills and basic knowledge of the MS Office suite
- Good understanding of petroleum geology and well logging
- Experience in the oil and gas industry

Additional Preferred Qualifications:
- Experience in Powerlog or Kingdom software is preferred
- Interest in managing and handling geological information
- Ability to convert technical information into a usable format for entry into databases
- Confident user of MS Excel's main functions
- Good written and oral communication skills in English
- Good team player with proactive behavior

About S&P Global Commodity Insights: At S&P Global Commodity Insights, our complete view of global energy and commodities markets enables our customers to make decisions with conviction and create long-term, sustainable value. We're a trusted connector that brings together thought leaders, market participants, governments, and regulators to co-create solutions that lead to progress. Vital to navigating the Energy Transition, S&P Global Commodity Insights' coverage includes oil and gas, power, chemicals, metals, agriculture, and shipping. S&P Global Commodity Insights is a division of S&P Global (NYSE: SPGI). S&P Global is the world's foremost provider of credit ratings, benchmarks, analytics, and workflow solutions in the global capital, commodity, and automotive markets. With every one of our offerings, we help many of the world's leading organizations navigate the economic landscape so they can plan for tomorrow, today. For more information, visit http://www.spglobal.com/commodity-insights.

What's In It For You:

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all.

Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global. Our benefits include:
- Health & Wellness: Health care coverage designed for the mind and body
- Flexible Downtime: Generous time off helps keep you energized for your time on
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference

For more information on benefits by country, visit https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf
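For the well-log curve handling described in the responsibilities above, digital logs are commonly exchanged as LAS files. A minimal sketch using the open-source lasio library (not Powerlog or Kingdom, which are the desktop tools named in the posting); the file path and depth window are placeholders:

```python
# Minimal sketch: load a LAS well-log file, inspect curves, and trim to a
# depth interval before splicing/QC. File path is a placeholder.
import lasio

las = lasio.read("example_well.las")  # placeholder path
print([c.mnemonic for c in las.curves])  # e.g. DEPT, GR, RHOB, NPHI

df = las.df()  # DataFrame indexed by depth
# Keep a depth window and drop fully-null readings before compositing.
window = df.loc[1500:2500].dropna(how="all")
print(window.describe())
```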

Posted 1 month ago

Apply

3.0 - 7.0 years

12 - 17 Lacs

Bengaluru

Work from Office

Job Summary:
Synechron is seeking an experienced Senior Data Engineer with expertise in AWS, Apache Airflow, and DBT to design and implement scalable, reliable data pipelines. The role involves collaborating with data teams and business stakeholders to develop data solutions that enable actionable insights and support organizational decision-making. The ideal candidate will bring deep data engineering experience, strong technical skills, strategic thinking, and the ability to thrive in a fast-paced, evolving environment.

Software Requirements:
Required:
- Strong hands-on proficiency in AWS services including S3, Redshift, Lambda, and Glue
- Expertise in Apache Airflow for workflow orchestration and pipeline management
- Extensive experience with DBT for data transformation and modeling
- Solid knowledge of SQL for data querying and manipulation
Preferred:
- Familiarity with Hadoop, Spark, or other big data technologies
- Experience with NoSQL databases (e.g., DynamoDB, Cassandra)
- Knowledge of data governance and security best practices within cloud environments

Overall Responsibilities:
- Lead the design, development, and maintenance of scalable, efficient data pipelines and workflows using AWS, Airflow, and DBT
- Collaborate with data scientists, analysts, and business teams to gather requirements and translate them into technical solutions
- Optimize Extract, Transform, Load (ETL) processes to improve data quality, integrity, and timeliness
- Monitor pipeline performance, troubleshoot issues, and implement improvements to ensure operational excellence
- Enforce data management, governance, and security protocols across all data flows
- Mentor junior data engineers and promote best practices within the team
- Stay current with emerging data technologies and industry trends, recommending innovations for the data ecosystem

Technical Skills (By Category):
Programming Languages:
- Essential: SQL, Python (preferred for scripting and automation)
- Preferred: Spark, Scala, Java (for big data integration)
Databases/Data Management:
- Extensive experience with data warehouses (Redshift, Snowflake, or similar) and relational databases (MySQL, PostgreSQL)
- Familiarity with NoSQL databases such as DynamoDB or Cassandra is a plus
Cloud Technologies:
- AWS platform, leveraging services such as S3, Lambda, Glue, Redshift, and IAM security features
Frameworks and Libraries:
- Apache Airflow, DBT, and related data orchestration and transformation tools
Development Tools and Methodologies:
- Git, Jenkins, CI/CD pipelines, and experience in Agile/Scrum environments
Security Protocols:
- Knowledge of data encryption, access control, and compliance standards in cloud data engineering

Experience Requirements:
- At least 8 years of professional experience in data engineering or related roles, with a focus on cloud ecosystems and big data pipelines
- Demonstrated experience designing and managing end-to-end data workflows in AWS environments
- Proven success collaborating with cross-functional teams and translating business requirements into technical solutions
- Prior experience mentoring junior engineers and leading data projects is highly desirable

Day-to-Day Activities:
- Develop, deploy, and monitor scalable data pipelines using AWS, Airflow, and DBT (a minimal orchestration sketch follows this listing)
- Collaborate regularly with data scientists, analysts, and business stakeholders to refine data requirements and deliver impactful solutions
- Troubleshoot production data pipeline issues to resolve data quality or performance bottlenecks
- Conduct code reviews, optimize existing workflows, and implement automation to improve efficiency
- Document data architecture, pipelines, and governance practices for knowledge sharing and compliance
- Keep abreast of emerging data tools and industry best practices, proposing enhancements to existing systems

Qualifications:
- Bachelor's degree in Computer Science, Data Science, Engineering, or a related field; Master's degree preferred
- Professional certifications such as AWS Certified Data Analytics – Specialty or related credentials are advantageous
- Commitment to continuous professional development and staying current with industry trends

Professional Competencies:
- Strong analytical, problem-solving, and critical thinking skills
- Excellent communication abilities to liaise effectively with technical and business teams
- Proven leadership in mentoring team members and managing project deliverables
- Ability to work independently, prioritize tasks, and adapt to changing business needs
- Innovative mindset focused on scalable, efficient, and sustainable data solutions
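To illustrate the kind of orchestration this role involves, here is a minimal, hypothetical Airflow DAG that lands files in S3 and then runs and tests a dbt project. The bucket, project paths, and schedule are assumptions for illustration, not details from the posting.

```python
# Minimal sketch: orchestrating a dbt run with Apache Airflow.
# All names (bucket, paths, schedule) are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_transform",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # assumed cadence
    catchup=False,
) as dag:
    # Stage raw files into the warehouse's external location (hypothetical bucket).
    land_raw_data = BashOperator(
        task_id="land_raw_data",
        bash_command="aws s3 sync /data/exports s3://example-raw-bucket/daily/",
    )

    # Run dbt models against the warehouse; paths are placeholders.
    run_dbt_models = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
    )

    # Validate the transformed models with dbt's built-in tests.
    test_dbt_models = BashOperator(
        task_id="test_dbt_models",
        bash_command="dbt test --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
    )

    land_raw_data >> run_dbt_models >> test_dbt_models
```

Chaining a `dbt test` task after `dbt run` is one common way to make data-quality checks a gating step in the pipeline rather than an afterthought.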

Posted 1 month ago


5.0 - 10.0 years

10 - 20 Lacs

Hyderabad, Bengaluru

Work from Office


- Database migration experience: Oracle to PostgreSQL, MongoDB, Cosmos DB
- Migrate workloads from AWS to Azure
- Automate schema/data migration (a minimal migration sketch follows this listing)
- Tune performance and support cloud-native database solutions

Required Candidate profile:
- Database administration and migration experience, hands-on with Oracle to PostgreSQL, MongoDB, Cosmos DB, Azure, and AWS
- Strong skills in schema conversion, automation, and cloud-native databases
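To illustrate the migration automation this listing calls for, here is a minimal, hypothetical Python sketch that bulk-copies one table from Oracle into PostgreSQL. Connection details, table and column names are placeholders, and a real migration would typically lean on dedicated tooling (e.g., ora2pg or a cloud migration service) plus type mapping and validation.

```python
# Minimal sketch: bulk-copying one table from Oracle to PostgreSQL.
# Connection strings and table names are hypothetical; production
# migrations need type mapping, batching strategy, and validation.
import oracledb  # python-oracledb driver
import psycopg2
from psycopg2.extras import execute_values

BATCH_SIZE = 10_000

src = oracledb.connect(user="app", password="***", dsn="oracle-host/ORCLPDB1")
dst = psycopg2.connect("host=pg-host dbname=appdb user=app password=***")

with src.cursor() as read_cur, dst.cursor() as write_cur:
    read_cur.execute("SELECT id, name, created_at FROM customers")
    while True:
        rows = read_cur.fetchmany(BATCH_SIZE)
        if not rows:
            break
        # execute_values expands the VALUES list into one efficient round trip.
        execute_values(
            write_cur,
            "INSERT INTO customers (id, name, created_at) VALUES %s",
            rows,
        )
        dst.commit()

src.close()
dst.close()
```

Batched reads with `fetchmany` keep memory bounded on large tables, and committing per batch makes a failed run resumable from the last committed chunk.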

Posted 1 month ago


5.0 - 7.0 years

5 - 9 Lacs

Bengaluru

Work from Office


We are seeking an experienced SQL Developer with expertise in SQL Server Analysis Services (SSAS) and AWS to join our growing team. The successful candidate will be responsible for designing, developing, and maintaining SQL Server-based OLAP cubes and SSAS models for business intelligence purposes. You will work with multiple data sources, ensuring data integration, optimization, and performance of the reporting models. This role offers an exciting opportunity to work in a hybrid environment and collaborate with cross-functional teams.

Posted 1 month ago


8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office


Dear Candidate,

We are hiring a Data Platform Engineer to build scalable infrastructure for data ingestion, processing, and analysis.

Key Responsibilities:
- Architect distributed data systems
- Enable data discoverability and quality
- Develop data tooling and platform APIs

Required Skills & Qualifications:
- Experience with Spark, Kafka, and Delta Lake (a minimal streaming sketch follows this listing)
- Proficiency in Python, Scala, or Java
- Familiarity with cloud-based data platforms

Soft Skills:
- Strong troubleshooting and problem-solving skills
- Ability to work independently and in a team
- Excellent communication and documentation skills

Note: If interested, please share your updated resume and a preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa Reddy
Delivery Manager
Integra Technologies
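As context for the Spark/Kafka/Delta Lake requirement, here is a minimal, hypothetical PySpark sketch that streams events from a Kafka topic into a Delta table. The broker address, topic, schema, and paths are assumptions for illustration.

```python
# Minimal sketch: streaming Kafka events into a Delta Lake table with PySpark.
# Broker, topic, schema, and paths are hypothetical; requires the
# spark-sql-kafka and delta-spark packages on the classpath.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("events-to-delta").getOrCreate()

# Assumed shape of the JSON payload on the topic.
event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("user_id", StringType()),
    StructField("event_time", TimestampType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "kafka:9092")
    .option("subscribe", "user-events")
    .load()
)

# Kafka delivers raw bytes; parse the value column into typed fields.
events = raw.select(
    from_json(col("value").cast("string"), event_schema).alias("e")
).select("e.*")

query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/delta/_checkpoints/user_events")
    .outputMode("append")
    .start("/delta/user_events")
)
query.awaitTermination()
```

The checkpoint location is what gives the stream exactly-once semantics across restarts, which is why it is kept separate from the table path itself.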

Posted 1 month ago


6.0 - 11.0 years

3 - 7 Lacs

Bengaluru

Work from Office


Primary Skills:
- Workato Platform Expertise: Proficient in using the Workato integration platform to design, build, and manage end-to-end automation workflows across cloud and on-premises applications.
- Recipe Development and Orchestration: Skilled in creating Workato recipes using triggers, actions, conditional logic, and loops to automate complex business processes and data flows.
- Application and API Integration: Experience integrating a wide range of SaaS and enterprise applications (e.g., Salesforce, NetSuite, Workday, ServiceNow, Slack) using Workato connectors and custom HTTP connectors.
- Data Transformation and Mapping: Strong understanding of data transformation techniques within Workato, including formula mode, list processing, and JSON/XML manipulation for seamless data exchange.
- Error Handling and Monitoring: Ability to implement robust error-handling strategies, including exception management, retries, and alerts, as well as monitoring recipe performance and logs.
- API Management and Webhooks: Experience in exposing and consuming APIs using Workato's API platform, including setting up API endpoints, managing authentication, and handling webhooks (a minimal sketch of calling such an endpoint follows this listing).
- Security and Governance: Knowledge of Workato workspace management, role-based access control (RBAC), and secure handling of credentials and sensitive data using encrypted properties.
- Collaboration and Lifecycle Management: Familiarity with Workato's versioning, cloning, and environment promotion features to manage the recipe lifecycle across development, staging, and production.

Secondary Skills:
- Understanding of business process automation and workflow optimization
- Experience with scripting or coding (JavaScript, Python) for custom logic in Workato
- Familiarity with iPaaS concepts and other platforms such as MuleSoft, Dell Boomi, or Zapier
- Exposure to Agile methodologies and tools like Jira or Confluence
- Basic knowledge of databases and SQL for data querying and integration
- Experience with cloud platforms (AWS, Azure, GCP) and SaaS ecosystems
- Strong communication and documentation skills for working with business and technical teams
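Workato recipes are built in the platform's UI rather than in code, but a recipe exposed through its API platform is consumed like any authenticated REST endpoint. The sketch below shows a hypothetical client invoking such an endpoint; the URL, token, and payload shape are assumptions, not Workato documentation.

```python
# Minimal sketch: invoking a recipe exposed via Workato's API platform.
# The URL, token, and payload shape are hypothetical placeholders.
import requests

API_URL = "https://apim.workato.com/example-org/orders-v1/sync-order"  # hypothetical
API_TOKEN = "***"  # assumed to be issued when the API client is provisioned

response = requests.post(
    API_URL,
    headers={
        "Authorization": f"Bearer {API_TOKEN}",
        "Content-Type": "application/json",
    },
    json={"order_id": "SO-1042", "source": "netsuite"},  # assumed payload
    timeout=30,
)
response.raise_for_status()  # surface 4xx/5xx instead of failing silently
print(response.json())
```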

Posted 1 month ago


4.0 - 6.0 years

3 - 7 Lacs

Hyderabad

Work from Office


Primary Skills:
- Workato Platform Expertise: Proficient in using the Workato integration platform to design, build, and manage end-to-end automation workflows across cloud and on-premises applications.
- Recipe Development and Orchestration: Skilled in creating Workato recipes using triggers, actions, conditional logic, and loops to automate complex business processes and data flows.
- Application and API Integration: Experience integrating a wide range of SaaS and enterprise applications (e.g., Salesforce, NetSuite, Workday, ServiceNow, Slack) using Workato connectors and custom HTTP connectors.
- Data Transformation and Mapping: Strong understanding of data transformation techniques within Workato, including formula mode, list processing, and JSON/XML manipulation for seamless data exchange.
- Error Handling and Monitoring: Ability to implement robust error-handling strategies, including exception management, retries, and alerts, as well as monitoring recipe performance and logs.
- API Management and Webhooks: Experience in exposing and consuming APIs using Workato's API platform, including setting up API endpoints, managing authentication, and handling webhooks.
- Security and Governance: Knowledge of Workato workspace management, role-based access control (RBAC), and secure handling of credentials and sensitive data using encrypted properties.
- Collaboration and Lifecycle Management: Familiarity with Workato's versioning, cloning, and environment promotion features to manage the recipe lifecycle across development, staging, and production.

Secondary Skills:
- Understanding of business process automation and workflow optimization
- Experience with scripting or coding (JavaScript, Python) for custom logic in Workato
- Familiarity with iPaaS concepts and other platforms such as MuleSoft, Dell Boomi, or Zapier
- Exposure to Agile methodologies and tools like Jira or Confluence
- Basic knowledge of databases and SQL for data querying and integration
- Experience with cloud platforms (AWS, Azure, GCP) and SaaS ecosystems
- Strong communication and documentation skills for working with business and technical teams

Posted 1 month ago


3.0 - 8.0 years

6 - 12 Lacs

Kolkata

Work from Office


Job Title: AI/ML Data Engineer
Location: Kolkata, India
Experience: 3+ Years
Industry: IT / AI & Data Analytics

Job Summary:
We are hiring an experienced AI/ML Data Engineer to design and build scalable data pipelines and ETL processes to support analytics and machine learning projects. The ideal candidate will have strong Python and SQL skills, hands-on experience with tools like Apache Airflow and Kafka, and working knowledge of cloud platforms (AWS, GCP, or Azure). A strong understanding of data transformation, feature engineering, and data automation is essential (a minimal feature-engineering sketch follows this listing).

Key Skills Required:
- ETL & Data Pipeline Development
- Python & SQL Programming
- Apache Airflow / Kafka / Spark / Hadoop
- Cloud Platforms: AWS / GCP / Azure
- Data Cleaning & Feature Engineering
- Strong Problem-Solving & Business Understanding

Preferred Profile:
Candidates with a B.Tech / M.Tech / MCA in Computer Science or Data Engineering and 3+ years of hands-on experience building data solutions, who can work closely with cross-functional teams and support AI/ML initiatives.
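As a small illustration of the data cleaning and feature engineering this role mentions, here is a hypothetical pandas sketch that derives model-ready features from raw event data; the file, column names, and rules are assumptions.

```python
# Minimal sketch: cleaning raw event data and deriving simple per-user
# features with pandas. Column names and rules are hypothetical.
import pandas as pd

raw = pd.read_csv("events.csv", parse_dates=["event_time"])  # assumed input file

# Basic cleaning: drop exact duplicates and rows missing the key fields.
clean = raw.drop_duplicates().dropna(subset=["user_id", "event_time"])

# Feature engineering: per-user activity aggregates plus derived ratios.
features = (
    clean.groupby("user_id")
    .agg(
        event_count=("event_time", "size"),
        first_seen=("event_time", "min"),
        last_seen=("event_time", "max"),
    )
    .reset_index()
)
features["days_active"] = (features["last_seen"] - features["first_seen"]).dt.days
features["events_per_day"] = features["event_count"] / features["days_active"].clip(lower=1)

features.to_parquet("user_features.parquet", index=False)  # handoff to ML training
```

Writing the result to Parquet keeps column types intact for the downstream training job, which CSV would not guarantee.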

Posted 1 month ago


6.0 - 8.0 years

3 - 6 Lacs

Bengaluru

Work from Office


Job Title: ETL Developer – SnapLogic
Experience: 6-8 Years
Location: Bangalore

Technical Skills:
- Design, develop, and maintain SnapLogic pipelines to support integration projects
- Build and manage APIs using SnapLogic to connect various data sources and systems
- Leverage SnapLogic agent functionality to enable secure and efficient data integration
- Collaborate with cross-functional teams to gather requirements and ensure solutions meet business needs
- Troubleshoot and optimize existing SnapLogic integrations to improve performance and reliability
- Document integration processes and provide guidance to team members on best practices
- Proven experience with SnapLogic, including API builds and agent functionality
- Strong understanding of integration patterns and best practices
- Proficiency in data integration and ETL processes
- Expertise in relational databases (Oracle, SSMS) and familiarity with NoSQL databases (MongoDB)
- Knowledge of data warehousing concepts and data modelling
- Experience performing validations on large-scale data
- Strong REST API, JSON, and data transformation experience (a minimal sketch follows this listing)
- Experience with unit testing and integration testing
- Familiarity with large language models (LLMs) and their integration with data pipelines
- Experience in database architecture and optimization
- Knowledge of U.S. healthcare systems, data standards (e.g., HL7, FHIR), and compliance requirements (e.g., HIPAA)

Behavioral Skills:
- Excellent documentation and presentation skills, analytical and critical thinking skills, and the ability to identify needs and take initiative
- Follow engineering best practices and principles within your organisation
- Work closely with a Lead Software Engineer
- Be an active member of the MMC Technology community – contribute, collaborate, and learn
- Build strong relationships with members of your engineering squad
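SnapLogic pipelines are configured visually, but the REST/JSON validation and transformation work the listing describes looks roughly like the following hypothetical Python sketch; the endpoint, field names, and validation rules are illustrative only.

```python
# Minimal sketch: pulling JSON from a REST endpoint, validating required
# fields, and reshaping records. Endpoint and field names are hypothetical.
import requests

REQUIRED_FIELDS = {"patient_id", "visit_date", "provider"}

resp = requests.get("https://api.example.com/v1/visits", timeout=30)
resp.raise_for_status()
records = resp.json()  # assumed to be a list of flat JSON objects

valid, rejected = [], []
for rec in records:
    missing = REQUIRED_FIELDS - rec.keys()
    if missing:
        # Quarantine incomplete records rather than dropping them silently.
        rejected.append({"record": rec, "missing": sorted(missing)})
        continue
    # Rename/reshape into the target schema expected downstream.
    valid.append({
        "patientId": rec["patient_id"],
        "visitDate": rec["visit_date"],
        "providerName": rec["provider"],
    })

print(f"{len(valid)} valid, {len(rejected)} rejected")
```

Keeping a rejected-records list with the reason for rejection mirrors the error-views pattern integration platforms use, and makes validation failures auditable.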

Posted 1 month ago


6.0 - 8.0 years

2 - 6 Lacs

Bengaluru

Work from Office


Job Title: MuleSoft Developer
Experience: 6-8 Years
Location: Bangalore

Skills:
- Hands-on experience with the MuleSoft Anypoint Platform, Anypoint development tools, and debugging techniques
- Knowledge of MuleSoft ESB (Enterprise Service Bus), including message routing, data transformation, and API-led connectivity
- Ability to implement RESTful APIs and connectors within MuleSoft
- Experience in developing and integrating API portals, identity stores, and API gateways
- Understanding of architectural concepts for MuleSoft implementations, including encryption, security, logging, throttling, scalability, clustering, and securing solutions
- Proficient in programming languages such as Java or JavaScript
- Familiarity with database technologies such as SQL and NoSQL
- Nice to have: Certified MuleSoft Associate Developer or MuleSoft Developer

Responsibilities:
- Analyze, design, and develop services and integrations using the MuleSoft Anypoint Platform
- Design and implement API-led connectivity strategies, including the creation of reusable APIs, data services, and enterprise application integrations
- Manage, deploy, and monitor Mule applications in cloud or on-premises environments
- Create and maintain technical design documents and diagrams
- Implement APIs as per the design specification
- Design and develop MUnit tests for APIs and integrations
- Assist in troubleshooting and resolving technical issues
- Collaborate with other software developers, business analysts, and software architects to plan, design, develop, test, and maintain web-based business applications
- Participate in code reviews to ensure adherence to coding standards
- Stay updated with the latest MuleSoft and related technology trends

Process Skills:
- Agile development processes (Scrum)
- Proven experience delivering complex software solutions

Behavioral Skills:
- Excellent oral and written communication
- Planning, execution, and issue-resolution skills
- Quick learner
- Excellent attitude and a mindset to get things done

Posted 1 month ago
