
552 CSV Jobs

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 10.0 years

2 - 6 Lacs

bengaluru

Work from Office

We are currently seeking a SQL Developer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Once You Are Here, You Will:
- Develop in SQL (PostgreSQL/ETL) and perform data analysis, applying Agile process knowledge.
- Act as the first point of escalation for daily service issues along with the PM, and be a primary point of contact for stakeholders.
- Apply proficiency in SQL, data environments, and data transformation tools (Python).
- Apply a strong understanding of ETL data pipelines, including integration with APIs and databases.
- Use hands-on experience with cloud-based data warehousing solutions (Snowflake).
- Apply knowledge of SDLC and Agile development techniques.
- Use practical experience with source control (Git, SVN, etc.).
- Apply knowledge of design, development, and data linkages inside RDBMS and file data stores for MS SQL Server databases (CSV, XML, JSON, etc.).
- Apply a thorough understanding of development methods for batch and real-time system integration.
- Prepare/review test scripts and unit-test changes.
- Provide training, support, and leadership to the larger project team.

Required Qualifications:
- 5+ years of experience in SQL (PostgreSQL/ETL), data analysis, and Agile processes, in a consulting role that included completing at least 4 projects as a developer.

Preferred Experience:
- Prior experience with a software development methodology, Agile preferred.
- Experience with data migration using Data Loader.

Ideal Mindset:
- Problem Solver: You are creative but also practical in finding solutions to problems that may arise in the project, to avoid potential escalations.
- Analytical: You like to dissect complex processes and can help forge a path based on your findings.
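For illustration, the kind of CSV extract-and-transform step this role describes might look like the following Python sketch; the column names and cleaning rules are hypothetical, not taken from the actual role:

```python
import csv
import io

def transform_rows(csv_text):
    """Normalize a raw CSV extract into rows ready for loading.

    A minimal extract-transform illustration: parse, trim
    whitespace, cast types, and drop incomplete records.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    cleaned = []
    for row in reader:
        name = row.get("name", "").strip()
        amount = row.get("amount", "").strip()
        if not name or not amount:
            continue  # drop incomplete records before loading
        cleaned.append({"name": name, "amount": float(amount)})
    return cleaned

raw = "name,amount\n Alice ,100.5\nBob,\nCarol,42\n"
print(transform_rows(raw))
```

In a real pipeline the cleaned rows would then be loaded into PostgreSQL or Snowflake rather than printed.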

Posted 17 hours ago

Apply

4.0 - 7.0 years

7 - 11 Lacs

pune, chennai, bengaluru

Work from Office

Responsibilities:
- Understand requirements and translate them into product features.
- Develop technical solutions for complex business problems using MuleSoft or other cloud integration services and related technologies.
- Design microservices- and serverless-based architectures.
- Be hands-on in MuleSoft and implement APIs using best practices.
- Work with vendors on the integration of multiple systems.
- Follow an API-driven/API-led connectivity approach using API principles and standards/specifications.
- Design applications involving asynchronous programming, multithreading, mutability, and concurrency control/recovery when dealing with persistent data stores.
- Drive the entire development team by defining and upholding high coding standards, following best practices and principles aligned to the solution.
- Work with the DevOps team to implement CI/CD architecture.
- Be hands-on in developing and implementing best practices and writing clean, smart code.

The Role Offers:
- An outstanding opportunity to re-imagine, redesign, and apply technology to add value to the business and operations.
- End-to-end project exposure across multiple technical stacks and cloud platforms.
- Very good exposure to multiple integrations using MuleSoft, cutting across different message formats and different systems.
- Growth for an individual with a passion to learn, adapt to new technologies quickly, and scale to the next level easily.
- Exposure to multiple platforms and teams, working with them collaboratively to get the technical solution implemented.
- High visibility and the opportunity to interact with multiple groups within the organization, technology vendors, and implementation partners.

Essential Skills:
- Overall 4 to 7+ years of experience, with 3+ years implementing integration/API solutions based on MuleSoft, RAML, OpenAPI, and API management.
- Experience developing integration services in MuleSoft ESB using Anypoint Studio, with sound experience in various Mule connectors/adapters and integration technologies such as API, REST, JMS, and SOAP.
- Good understanding of data formats such as XML, CSV, EDI, and JSON.
- Good understanding of typical integration technologies such as HTTP, XML/XSLT, JMS, JDBC, REST, SOAP, web services, and APIs.
- Experience across the end-to-end API lifecycle (i.e., designing APIs using RAML/OpenAPI; API implementation; API management - proxy, policy, monitoring).
- Experience applying various enterprise integration patterns, error handling, sync/async models, batch models, etc.
- Unit testing using MUnit and JUnit is mandatory.
- 2+ years of Java/JEE experience preferred.
- Holding a valid developer certification is mandatory.
- Good analytical skills and verbal and written communication are mandatory.
- Good to have: Boomi, IBM Integration, Apigee.

Essential Qualifications:
- MCA or an equivalent master's degree in computing is a must.
- MuleSoft Developer Certification is a must.
- Any cloud certification is a plus.

Posted 20 hours ago

Apply

7.0 - 9.0 years

9 - 14 Lacs

pune, chennai, bengaluru

Work from Office

- Overall 7 to 9 years of experience in designing, developing, deploying, and maintaining integration processes between applications across cloud and on-premises systems using the Dell Boomi AtomSphere platform, OpenAPI, API management, and enterprise integration patterns.
- Strong understanding of data formats such as XML, CSV, EDI, flat files, and JSON.
- Good understanding of typical integration technologies and protocols such as HTTP, XML/XSLT, JMS, JDBC, REST, SOAP, web services, and APIs.
- Experience applying various enterprise integration patterns, error handling, synchronous/asynchronous models, batch models, etc.
- Expertise in end-to-end API lifecycle management (e.g., designing APIs using OpenAPI; API implementation; API management - proxy, policy, monitoring).
- Strong experience in debugging, performance tuning, and application security.
- Strong analytical, team leadership, and technical problem-solving skills.
- Participation in proof-of-concept development, demos, and post-deployment support of cross-team integration efforts.
- Excellent verbal and written communication skills are mandatory.
- Holding valid developer and lead certifications is mandatory.
- Ability to interact with different functional areas, with excellent interpersonal and communication skills.
- Good to have: experience with other modern integration platforms, e.g., MuleSoft, IBM Integration, Azure Integration Services, and API management (Apigee, IBM APIM, Kong, etc.).

Posted 20 hours ago

Apply

2.0 - 6.0 years

0 Lacs

bangalore, karnataka

On-site

As a Techno-Functional BA at NTT DATA in Bangalore, Karnataka, India, you will play a crucial role in requirement gathering, system configuration, and verification. Your expertise in system testing and evidence collection for compliance will be essential, and your familiarity with SQL and Python programming will be a strong advantage. You will also be exposed to CSV processes and validation documentation, requiring experience in preparing traceability matrices and test scripts. Working closely with QA teams to ensure audit readiness, and providing strong troubleshooting and post-deployment support, will be part of your responsibilities.

Qualifications Required:
- Expertise in requirement gathering, system configuration, and verification
- Skilled in system testing and evidence collection for compliance
- Familiarity with SQL and Python programming
- Experience with CSV processes and validation documentation
- Ability to prepare traceability matrices and test scripts
- Strong troubleshooting and post-deployment support capabilities

NTT DATA is a trusted global innovator of business and technology services, serving 75% of the Fortune Global 100. As a Global Top Employer with experts in more than 50 countries, we are committed to helping clients innovate, optimize, and transform for long-term success. Our services include business and technology consulting, data and artificial intelligence, industry solutions, and the development, implementation, and management of applications, infrastructure, and connectivity. NTT DATA is a leading provider of digital and AI infrastructure globally and part of the NTT Group, investing over $3.6 billion annually in R&D to drive organizations and society confidently into the digital future. Visit us at us.nttdata.com.
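The traceability matrices this role mentions (CSV here referring to computer system validation rather than the file format) can be illustrated with a small Python sketch; the requirement and test-case IDs below are hypothetical:

```python
def build_traceability_matrix(requirements, test_cases):
    """Map each requirement ID to the test cases that cover it.

    test_cases maps a test-case ID to the list of requirement
    IDs it verifies; uncovered requirements are flagged so the
    validation documentation can show complete coverage.
    """
    matrix = {req: [] for req in requirements}
    for tc_id, covered in test_cases.items():
        for req in covered:
            if req in matrix:
                matrix[req].append(tc_id)
    uncovered = [req for req, tcs in matrix.items() if not tcs]
    return matrix, uncovered

reqs = ["REQ-001", "REQ-002", "REQ-003"]
tests = {"TC-01": ["REQ-001"], "TC-02": ["REQ-001", "REQ-003"]}
matrix, gaps = build_traceability_matrix(reqs, tests)
print(gaps)  # REQ-002 has no covering test case
```

In practice the matrix would be exported to the validation package as audit evidence rather than printed.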

Posted 2 days ago

Apply

2.0 - 6.0 years

0 Lacs

karnataka

On-site

As a Techno-Functional BA with expertise in CSV, SQL, and Python, your role at NTT DATA in Bangalore, Karnataka (IN-KA), India will involve the following responsibilities:
- Expertise in requirement gathering, system configuration, and verification.
- Skilled in system testing and evidence collection for compliance.
- Familiarity with SQL and Python programming is a strong advantage.
- Exposure to CSV processes and validation documentation.
- Experience in preparing traceability matrices and test scripts.
- Ability to work closely with QA teams to ensure audit readiness.
- Strong troubleshooting and post-deployment support capabilities.

NTT DATA is a $30 billion trusted global innovator of business and technology services, committed to helping clients innovate, optimize, and transform for long-term success. With experts in more than 50 countries and a partner ecosystem of established and start-up companies, our services include business and technology consulting, data and artificial intelligence, industry solutions, and the development, implementation, and management of applications, infrastructure, and connectivity. As one of the leading providers of digital and AI infrastructure in the world, NTT DATA, part of the NTT Group, invests over $3.6 billion each year in R&D to support organizations and society in confidently moving into the digital future. Visit us at us.nttdata.com.

Posted 3 days ago

Apply

8.0 - 13.0 years

9 - 14 Lacs

gurugram

Work from Office

Shift: 7 pm IST to 4 am IST

Responsibilities:
- Architect, engineer, implement, and administer Splunk solutions in highly available, redundant, distributed computing environments.
- Lead the design and deployment of new Splunk environments, including clustered, multi-site, and large-scale configurations.
- Perform Splunk forwarder deployment, configuration, and troubleshooting across diverse platforms.
- Integrate, curate, and normalize diverse log sources into Splunk, ensuring CIM compliance and high data fidelity.
- Configure and maintain Splunk dashboards, searches, and alerts to meet PCI DSS logging requirements, and deliver evidentiary reports to auditors to support compliance verification.
- Develop advanced content for SIEM correlation, including custom correlation searches, dashboards, and alerts.
- Administer, maintain, and tune Splunk components (Indexers, Search Heads, Forwarders, Cluster Masters, Deployer, Deployment Server, and License Master).
- Proactively monitor platform health using internal logs, KPIs, and custom monitoring solutions to identify and address performance bottlenecks.
- Lead capacity planning, storage forecasting, and continuity of operations for large Splunk deployments.
- Optimize Splunk performance through configuration tuning, search optimization, and data model acceleration strategies.
- Troubleshoot complex ingestion, performance, and search-related issues, identifying root causes and implementing sustainable fixes or workarounds.
- Reproduce customer or internal issues, document findings, and work with Splunk Support or vendor engineers on resolution.
- Create, maintain, and enforce Splunk engineering documentation, including SOPs, design diagrams, architecture runbooks, and KB articles.
- Develop custom scripts and automation tools (e.g., Python, Bash, PowerShell) to improve Splunk administration, onboarding, and operational workflows.
- Utilize Splunk APIs for integration with enterprise tools and automation frameworks.
- Serve as a technical escalation point for Splunk Engineer I/II and Splunk Admin roles.
- Administer, tune, and troubleshoot Splunk Enterprise Security, maintaining data models, correlation searches, and the notable events pipeline.
- Configure and manage HEC (HTTP Event Collector) connections and onboard new data sources.
- Manage Splunk RBAC (Role-Based Access Control), including SAML and AD group integrations for search heads and API endpoints.
- Collaborate with security, infrastructure, application, and DevOps teams to ensure Splunk aligns with enterprise monitoring, compliance, and operational goals.
- Design and implement Splunk solutions supporting compliance frameworks (e.g., PCI DSS, HIPAA, SOX), including dashboard/report development and audit evidence.
- Research, evaluate, and implement new Splunk apps, add-ons, and integrations to enhance platform capabilities.
- Mentor junior Splunk engineers and guide cross-functional teams on Splunk best practices, search optimization, and data onboarding.

Requirements:
- 8+ years of IT experience in technical engineering, security operations, or infrastructure roles.
- 5+ years of direct, hands-on Splunk engineering and administration experience in large-scale, distributed environments.
- Expert-level knowledge of Splunk Enterprise and Splunk Enterprise Security, including architecture, clustering, and scaling strategies.
- Proficiency in Linux/Unix administration and shell scripting.
- Strong knowledge of Splunk APIs, including their use for automation and tool integrations.
- Expertise in regex, field extractions, and key-value parsing.
- Strong programming/scripting skills in one or more languages (Python, Bash, PowerShell, Perl, JavaScript).
- Experience with storage systems (DAS, SAN, object storage) and an understanding of their performance implications for Splunk indexing.
- Solid understanding of networking (switches, routers, firewalls, load balancers, DNS, SSL/TLS) and how it impacts Splunk architecture.
- Familiarity with enterprise management and automation tools.
- Experience with Splunk ITSI (preferred) and other premium Splunk apps.
- Strong knowledge of data formats including JSON, XML, and CSV.
- Demonstrated experience delivering Splunk-based compliance reporting and audit support.
- Strong communication skills for interacting with technical and non-technical stakeholders.
- Proven ability to lead projects, mentor team members, and provide architectural guidance.

Education & Certifications:
- Bachelor's degree in Computer Science, Information Systems, or a related technical field (or equivalent experience).
- Splunk Certified Architect and/or Splunk Certified Consultant preferred.
- Additional certifications in security, cloud, or automation tools are a plus.
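The regex field extraction and key-value parsing this role calls for can be illustrated with a short Python sketch; the log format below is an assumed example, not a specific Splunk source type:

```python
import re

def extract_fields(log_line):
    """Pull key=value pairs out of a raw log line.

    Values may be bare tokens or double-quoted strings; quotes
    are stripped, mirroring the kind of field extraction done
    when onboarding log sources.
    """
    pattern = re.compile(r'(\w+)=("[^"]*"|\S+)')
    fields = {}
    for key, value in pattern.findall(log_line):
        fields[key] = value.strip('"')
    return fields

line = 'action=login user="jane doe" src=10.0.0.5 status=success'
print(extract_fields(line))
```

In Splunk itself this logic would live in a field extraction (props/transforms) rather than a script, but the regex idea is the same.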

Posted 3 days ago

Apply

5.0 - 8.0 years

9 - 13 Lacs

hyderabad

Work from Office

Join Amgen's Mission of Serving Patients

At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission, to serve patients living with serious illnesses, drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas (Oncology, Inflammation, General Medicine, and Rare Disease) we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lie within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

Senior Manager, Manufacturing Systems Engineering

What You Will Do

Let's do this. Let's change the world. This strategic leadership role is accountable for building, managing, and evolving the global MES COE, overseeing MES and adjacent systems deployments, lifecycle management, and governance for all manufacturing sites across Amgen's network. As the Sr. Manager, you will direct a high-performing team of MES engineers and technical specialists, ensuring the effective delivery and continuous improvement of MES solutions. You will partner with cross-functional leaders in Digital, Technology & Innovation, Operations, Quality, IT, and Automation to drive Amgen's NextGen MES strategy, from vision through execution, supporting seamless integration with SAP, automation, cloud, and data platforms. Your leadership will ensure standardization, compliance, and technical excellence, enabling Amgen's digital transformation and operational agility on a global scale.

Roles & Responsibilities:
- Lead and manage the global MES Center of Excellence (COE), including strategy, team building, performance management, and talent development.
- Oversee and execute the global MES roadmap and lifecycle strategy, ensuring alignment with Amgen's business objectives and digital transformation initiatives.
- Establish and enforce best-in-class standards, methodologies, and governance for MES architecture, integration, cybersecurity, data integrity, and regulatory compliance (e.g., GxP, CFR Part 11).
- Drive harmonization of MES business processes and technical implementations across global sites, leveraging ISA-95 and other relevant manufacturing standards.
- Guide the evaluation and adoption of emerging technologies (e.g., IIoT, data lakes, AI/ML, containerization) to enhance MES capabilities and support advanced manufacturing use cases.
- Oversee the solution architecture, design, deployment, and lifecycle management of NextGen MES platforms, with a focus on PAS-X, SAP integration, automation interfaces, and cloud adoption, across all manufacturing sites.
- Serve as a steering member and escalation point for all major MES projects, system incidents, and strategic vendor relationships.
- Develop and manage COE resource plans, budgets, and external partnerships to meet organizational objectives.
- Foster a culture of innovation, knowledge sharing, and continuous improvement within the COE and the wider global MES community.
- Ensure comprehensive documentation, knowledge management, and training programs to support global system adoption and sustainability.
- Champion change management strategies to enable successful MES transformation and adoption at all levels of the organization.
- Represent Amgen MES strategy and practices to senior leadership, regulatory agencies, and external partners as required.

What We Expect Of You

We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Doctorate, Master's, or Bachelor's degree and 12 to 17 years of experience in Engineering, Computer Science, Information Systems, or a related field.
- Minimum 8 years of hands-on experience with MES solution architecture, deployment, and lifecycle management in a GMP-regulated pharmaceutical, biotechnology, or manufacturing environment.
- Extensive experience with PAS-X MES (including v3.3 or higher), SAP integration, and Level 2/3 automation systems.
- Proven track record leading global, multi-site MES transformation programs, including direct management and mentorship of technical teams.
- Deep understanding of GxP regulations, CSV/validation, and SDLC standard methodologies.
- Demonstrated expertise with IT/OT cybersecurity, data integrity, cloud-based architectures, and emerging digital manufacturing platforms (e.g., IIoT, AI/ML, data lakes).
- Strong financial and vendor management experience.
- Excellent communication, relationship-building, and stakeholder engagement skills at all levels, including executive leadership.

Preferred Qualifications:
- Expertise with Korber PAS-X MES, including eBR, Equipment Management, Weigh & Dispense, etc.
- Familiarity with international MES standards (ISA-95, GAMP, etc.).
- Experience representing MES programs with regulatory agencies and external partners.
- Experience with SAP MES integration and automation platforms (Rockwell, DeltaV, OPC UA, PI).
- Proficiency with modern data platforms (PostgreSQL, data lakes, data fabric) and containerized application deployment (e.g., Kubernetes).
- Professional certifications (MES, SAP, Project Management/PMP, or equivalent).

Soft Skills:
- Visionary leadership with a passion for nurturing technical talent and high-performing teams.
- Strategic thinking, adaptability, and a growth mindset.
- Exceptional organizational, analytical, and problem-solving abilities.
- Resilience in a dynamic, fast-paced, and global environment.
- Commitment to diversity, inclusion, and fostering a collaborative culture.

What You Can Expect Of Us

As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now and make a lasting impact with the Amgen team. As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed, and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 3 days ago

Apply

8.0 - 13.0 years

2 - 2 Lacs

hyderabad

Work from Office

SUMMARY
Job Title: Power Automate Senior Developer
Location: Hyderabad (In Office/Hybrid)
Experience: 5+ years overall in RPA; 2-3 years in Power Automate

Job Description:

Automation Development & Design
- Design, develop, and implement end-to-end automation solutions using Power Automate Desktop and Power Automate Cloud Flows.
- Minimum 2 to 3 years of hands-on relevant experience in developing solutions using Power Automate.
- Build reusable components and follow best practices for scalable, maintainable automation.
- Develop and maintain enterprise-grade automations and frameworks.

Technical Proficiency
- Strong RPA background with a solid understanding of UI automation and background automation techniques.
- Integrate automations with APIs, databases, and legacy systems.
- Implement robust logging and error-handling mechanisms to ensure reliability and traceability.
- Work with work queues and implement multi-bot architecture for optimized process execution.
- Handle data manipulation using Excel, Google Sheets, CSV, JSON, and XML.
- Hands-on experience with Google Workspace automation (e.g., Google Sheets, Google Drive, Gmail).
- Leverage the Microsoft ecosystem effectively, including Office 365, Azure Cloud, and Power Platform tools.
- Good experience with premium connectors and custom connectors.

Emerging Technologies
- Awareness of agentic automation trends (autonomous agents capable of dynamic decision-making).

Soft Skills & Collaboration
- Strong communication and documentation skills for effective team collaboration and reporting.
- Ability to work independently and deliver high-quality results with minimal supervision.
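Power Automate performs JSON-to-CSV data manipulation with built-in actions; purely as an illustration of the transformation this listing refers to, the same step can be sketched in Python (field names are hypothetical):

```python
import csv
import io
import json

def json_to_csv(json_text, fields):
    """Flatten a JSON array of objects into CSV rows.

    Only the listed fields are kept, in order; extra keys in
    the source records are ignored.
    """
    records = json.loads(json_text)
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fields, extrasaction="ignore")
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

payload = '[{"id": 1, "status": "done"}, {"id": 2, "status": "queued"}]'
print(json_to_csv(payload, ["id", "status"]))
```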

Posted 3 days ago

Apply

6.0 - 10.0 years

0 Lacs

noida, uttar pradesh

On-site

As a Technical Solution Architect, your role involves proposing technical options and solutions with thorough comparative analysis. You will guide the team in design and implementation, interact with clients to create end-to-end specifications for PIM solutions, and define implementation processes, quality gates, and standards. Additionally, you will perform data analysis and troubleshooting to resolve data quality, data integrity, and system performance issues. Your support will be crucial in assisting development and test teams with the installation and configuration of the Stibo STEP platform.

Key Responsibilities:
- Propose technical options and solutions with thorough comparative analysis.
- Guide the team in design and implementation.
- Interact with clients to create end-to-end specifications for PIM solutions.
- Define implementation processes, quality gates, and standards.
- Perform data analysis and troubleshooting to resolve data quality, data integrity, and system performance issues.
- Support development and test teams in the installation and configuration of the Stibo STEP platform.

Qualifications Required:
- 5-8 years of hands-on experience with the Stibo STEP Master Data Management (MDM) platform.
- Proficiency in JavaScript or Java/J2EE.
- Experience configuring and customizing Stibo STEP MDM across domains such as Product, Customer, and Supplier.
- Strong understanding of data modeling concepts and experience designing data models within Stibo STEP.
- Strong knowledge of data integration tools/techniques: ETL, REST APIs, third-party integrations using web services.
- Database and SQL knowledge.
- Proficiency with IDEs and debugging code.
- Understanding of the ER model.
- Familiarity with XML, XSD, JSON, CSV, and other data formats.
- Stibo STEP certification (preferred).
- Informatica PIM knowledge (a plus).

Location: Pune/Bengaluru

Posted 4 days ago

Apply

3.0 - 7.0 years

0 Lacs

maharashtra

On-site

Role Overview:
Fictiv is looking for an Integrations Engineering Manager to lead the integration engineering team while also being involved in MuleSoft integration solutions. This role requires a combination of technical leadership and people management skills to ensure the smooth operation of integration solutions and the development of a high-performing engineering team. As the Integrations Engineering Manager, you will be responsible for overseeing the technical architecture, team development, and seamless data flow between different systems and applications.

Key Responsibilities:
- Hire, onboard, and develop integration engineers, establishing clear career progression paths
- Conduct regular 1:1s and performance reviews, and provide ongoing coaching and mentorship
- Set team goals, manage capacity planning, and align objectives with business priorities
- Foster a collaborative culture that promotes technical excellence and innovation
- Define and drive integration architecture strategy and the technical roadmap
- Lead development of integrations between enterprise systems such as CRM, ERP, PLM, and finance systems
- Provide technical guidance, code reviews, and architectural decisions for complex integration challenges
- Oversee MuleSoft integration performance, health, and stability using monitoring tools
- Lead the response to integration-related incidents and ensure timely resolution
- Coordinate deployment of MuleSoft applications and integrations across different environments
- Work with business stakeholders to understand requirements and translate them into technical solutions
- Communicate technical concepts, project status, and strategic recommendations to technical and business audiences

Qualifications Required:
- 3+ years of engineering management experience
- Experience managing technical teams of 3-8 engineers
- 8+ years of hands-on experience with the MuleSoft Anypoint Platform
- MuleSoft Certified Platform Architect - Level 1 certification
- Proficiency in integration patterns, ESB concepts, API-led connectivity, and various data formats and protocols
- Familiarity with cloud platforms, monitoring tools, DevOps practices, and automation tools
- Strong technical and problem-solving skills
- Excellent communication and interpersonal skills
- Ability to diagnose and resolve complex technical issues
- Experience managing and supporting enterprise-level applications and integrations
- Strong analytical and troubleshooting skills
- Ability to prioritize tasks, manage incidents, and work under pressure
- Strong collaboration skills and the ability to work effectively in a team

Additional Company Details:
Fictiv's Digital Manufacturing Ecosystem is revolutionizing the design, development, and delivery of cutting-edge hardware products worldwide. The company values diversity, inclusion, respect, honesty, collaboration, and growth in creating a strong, empathetic team. Fictiv encourages applications from underrepresented groups to foster a diverse and inclusive work environment.

Posted 4 days ago

Apply

2.0 - 6.0 years

0 Lacs

ahmedabad, gujarat

On-site

As an entrepreneurial, passionate, and driven Data Engineer at Startup Gala Intelligence, backed by Navneet Tech Venture, you will play a crucial role in shaping the technology vision, architecture, and engineering culture of the company right from the beginning. Your contributions will be foundational in developing best practices and establishing the engineering team.

**Key Responsibilities:**
- **Web Scraping & Crawling:** Build and maintain automated scrapers to extract structured and unstructured data from websites, APIs, and public datasets.
- **Scalable Scraping Systems:** Develop multi-threaded, distributed crawlers capable of handling high-volume data collection without interruptions.
- **Data Parsing & Cleaning:** Normalize scraped data, remove noise, and ensure consistency before passing it to data pipelines.
- **Anti-bot & Evasion Tactics:** Implement proxy rotation, captcha solving, and request throttling techniques to handle scraping restrictions.
- **Integration with Pipelines:** Deliver clean, structured datasets into NoSQL stores and ETL pipelines for further enrichment and graph-based storage.
- **Data Quality & Validation:** Ensure data accuracy, deduplicate records, and maintain a trust scoring system for data confidence.
- **Documentation & Maintenance:** Keep scrapers updated when websites change, and document scraping logic for reproducibility.

**Qualifications Required:**
- 2+ years of experience in web scraping, crawling, or data collection.
- Strong proficiency in Python (libraries like BeautifulSoup, Scrapy, Selenium, Playwright, Requests).
- Familiarity with NoSQL databases (MongoDB, DynamoDB) and data serialization formats (JSON, CSV, Parquet).
- Experience in handling large-scale scraping with proxy management and rate limiting.
- Basic knowledge of ETL processes and integration with data pipelines.
- Exposure to graph databases (Neo4j) is a plus.
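The parsing and deduplication steps described above can be sketched with the Python standard library alone (a production scraper would use the libraries the listing names, such as BeautifulSoup or Scrapy):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href targets from scraped HTML."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    # deduplicate while preserving order, as a simple cleaning step
    return list(dict.fromkeys(parser.links))

page = '<a href="/jobs/1">A</a><a href="/jobs/2">B</a><a href="/jobs/1">A</a>'
print(extract_links(page))
```

The fetched URLs would then feed the crawler's queue; proxy rotation and throttling sit around the fetch step, not the parse step shown here.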
As part of Gala Intelligence, you will work in a tech-driven startup dedicated to solving fraud detection and prevention challenges. The company values transparency, collaboration, and individual ownership, creating an environment where talented individuals can thrive and contribute to impactful solutions. If you enjoy early-stage challenges, thrive on owning the entire tech stack, and are passionate about building innovative, scalable solutions, we encourage you to apply. Join us in leveraging technology to combat fraud and make a meaningful impact from day one.

Posted 4 days ago

Apply

5.0 - 10.0 years

6 - 12 Lacs

bengaluru

Hybrid

Notice Period: Immediate joiner or max 10 days (do not share long-notice-period profiles)
Permanent Payroll: Anlage Infotech
Client: NTT DATA
Location: Bengaluru (Hybrid: 2-3 days WFO per week)
Experience: 5+ years (relevant)
Shift Timings: General

Mandatory Skills: GxP, Compliance, CSV knowledge, Qualification and Maintenance; infrastructure support
Optional Skills (Good to Have): Tools experience with TIMS, ADO, ServiceNow

POSITION OVERVIEW: Industry Consulting Consultant

POSITION GENERAL DUTIES AND TASKS:
Skill Area: CSV, GxP compliance, infrastructure support
Main Technology: Compliance

Key Responsibilities:
- Support IT infrastructure qualification activities, e.g., creation and maintenance of qualification documentation.
- Ensure that IT services follow relevant internal and external requirements.
- Work in an agile framework as part of a product team and work towards the team's goals, based on the Product Owner's and Business Owner's requirements.

Posted 4 days ago

Apply

4.0 - 8.0 years

0 Lacs

chennai, tamil nadu

On-site

As a Test/Sr. Test Data Analyst at our company in Chennai, you will be responsible for the following:
- Data formats: You will work with various data formats such as XML, FIX, JSON, and CSV.
- Test data management tools: Utilize any test data management tool for your tasks.
- Data transformation tools: Use any data transformation tools as required.
- Technology: Your primary goal will involve extracting data from existing production sets, masking/subsetting it, and transforming it into data loader format (CSV). Currently, subject matter experts are reviewing all test data requirements from different streams; the current approach of addressing everyone's test data needs does not scale across the SIT/E2E/CDE/ITE phases.

Additionally, you will need the following qualifications:
- Able to discuss various test data requirements with multiple teams.
- Collaborate with Business Analysts, Product teams, and management to understand requirements and timelines.
- Quick learner who can grasp complex workflows related to the trading cycle and their dependencies.
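The core technology task above, masking sensitive production fields and emitting loader-ready CSV, can be sketched as follows. The column names, sample rows, and hashing mask are illustrative assumptions, not from the posting:

```python
import csv
import hashlib
import io

# Hypothetical production rows; the column names are illustrative only.
production_rows = [
    {"account_id": "A100", "client_name": "Jane Doe", "notional": "250000"},
    {"account_id": "A200", "client_name": "John Roe", "notional": "90000"},
]

def mask(value):
    # Deterministic one-way mask, so related records stay linkable across files.
    return hashlib.sha256(value.encode()).hexdigest()[:10]

def to_loader_csv(rows, sensitive=("client_name",)):
    """Mask the sensitive columns and serialize the rows as loader-format CSV."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    for row in rows:
        writer.writerow({k: mask(v) if k in sensitive else v
                         for k, v in row.items()})
    return buf.getvalue()

loader_csv = to_loader_csv(production_rows)
```

A deterministic hash (rather than random replacement) keeps masked values consistent when the same entity appears in multiple extracted files.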

Posted 5 days ago

Apply

4.0 - 9.0 years

2 - 6 Lacs

hyderabad

Work from Office

Job Purpose
The Property Data Engineer is responsible for developing and maintaining data conversion programs that transform raw property assessment data into standardized formats based on specifications from Property Data Analysts and Senior Analysts. This role requires not only advanced programming and ETL skills but also a deep understanding of the structure, nuances, and business context of assessment data. Even with clear and well-documented conversion instructions, engineers without prior exposure to this domain often face significant challenges in interpreting and transforming the data accurately. The Data Engineer plays a critical role in ensuring the accuracy, efficiency, and scalability of data processing pipelines that support the Assessor Operations.

Responsibilities
Depending on the specific team and role, the Property Data Engineer may be responsible for some or all of the following tasks:
- Develop and maintain data conversion programs using C#, Python, JavaScript, and SQL.
- Implement ETL workflows using tools such as Pentaho Kettle, SSIS, and internal applications.
- Collaborate with Analysts and Senior Analysts to interpret conversion instructions and translate them into executable code.
- Troubleshoot and resolve issues identified during quality control reviews.
- Recommend and implement automation strategies to improve data processing efficiency.
- Perform quality checks on converted data and ensure alignment with business rules and standards.
- Contribute to the development of internal tools and utilities to support data transformation tasks.
- Maintain documentation for code, workflows, and processes to support team knowledge sharing.

Programming (Skill Level: Advanced to Expert)
- Create and maintain conversion programs in SQL and Visual Studio using C#, Python, or JavaScript.
- Use JavaScript within Pentaho Kettle workflows and SSIS for data transformation.
- Build and enhance in-house tools to support custom data processing needs.
- Ensure code is modular, maintainable, and aligned with internal development standards.
- Ensure code quality through peer reviews, testing, and adherence to development standards.

ETL Execution (Skill Level: Advanced to Expert)
- Execute and troubleshoot ETL processes using tools like Kettle, SSIS, and proprietary tools.
- Input parameters, execute jobs, and perform quality checks on output files.
- Troubleshoot ETL failures and optimize performance.
- Recommend and implement automation strategies to improve data processing efficiency and accuracy.

Data File Manipulation (Skill Level: Advanced to Expert)
- Work with a wide variety of file formats (CSV, Excel, TXT, XML, etc.) to prepare data for conversion.
- Apply advanced techniques to clean, merge, and structure data.
- Develop scripts and tools to automate repetitive data preparation tasks.
- Ensure data is optimized for downstream ETL and analytical workflows.

Data Analysis (Skill Level: Supportive/Applied)
- Leverage prior experience in data analysis to independently review and interpret source data when developing or refining conversion programs.
- Analyze data structures, field patterns, and anomalies to improve the accuracy and efficiency of conversion logic.
- Use SQL queries, Excel tools, and internal utilities to validate assumptions and enhance the clarity of analyst-provided instructions.
- Collaborate with Analysts and Senior Analysts to clarify ambiguous requirements and suggest improvements based on technical feasibility and data behavior.
- Conduct targeted research using public data sources (e.g., assessor websites) to resolve data inconsistencies or fill in missing context during development.

Quality Control (Skill Level: Engineer-Level)
- Perform initial quality control on converted data outputs before formal review by Associates, Analysts, or Senior Analysts.
- Validate that program output aligns with conversion instructions and meets formatting and structural expectations.
- Use standard scripts, ad-hoc SQL queries, and internal tools to identify and correct discrepancies in the data.
- Address issues identified during downstream QC reviews by updating conversion logic or collaborating with analysts to refine requirements.
- Ensure that all deliverables meet internal quality standards prior to release or further review.

Knowledge and Experience
Minimum Education: Bachelor's degree in Computer Science, Information Systems, Software Engineering, Data Engineering, or a related technical field; or equivalent practical experience in software development or data engineering.
Preferred Education: Bachelor's degree (as above) plus additional coursework or certifications in:
- Data Engineering
- ETL Development
- Cloud Data Platforms (e.g., AWS, Azure, GCP)
- SQL and Database Management
- Programming (C#, Python, JavaScript)

- 4+ years of experience in software development, data engineering, or ETL pipeline development.
- Expert-level proficiency in programming languages such as SQL, C# (Visual Studio), Python, and JavaScript.
- Experience with ETL tools such as Pentaho Kettle, SSIS, or similar platforms.
- Strong understanding of data structures, file formats (CSV, Excel, TXT, XML), and data transformation techniques.
- Familiarity with relational databases and SQL for data querying and validation.
- Ability to read and interpret technical documentation and conversion instructions.
- Strong problem-solving skills and attention to detail.
- Ability to work independently and collaboratively in a fast-paced environment.
- Familiarity with property assessment, GIS, tax, or public property records data.

Preferred Skills
- Experience developing and maintaining data conversion programs in Visual Studio.
- Experience with property assessment, GIS, tax, or public records data.
- Experience building internal tools or utilities to support data transformation workflows.
- Knowledge of version control and issue-tracking systems (e.g., Git, Jira) and agile development practices.
- Exposure to cloud-based data platforms or services (e.g., Azure Data Factory, AWS Glue).
- Ability to troubleshoot and optimize ETL performance and data quality.
- Strong written and verbal communication skills for cross-functional collaboration.
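The conversion-program idea described above, mapping raw source columns to standardized target fields via per-field transforms, can be sketched roughly like this. The spec, column names, and sample row are invented for illustration, not taken from any real assessment feed:

```python
import csv
import io

# Illustrative conversion spec: source column -> (target column, transform).
SPEC = {
    "PARCEL_NO": ("parcel_id", str.strip),
    "ASSD_VAL": ("assessed_value", lambda v: int(v.replace(",", ""))),
    "OWNER_NM": ("owner_name", str.title),
}

# A made-up raw extract, with the messiness conversion programs must absorb:
# padded fields, quoted thousands separators, all-caps names.
RAW = 'PARCEL_NO,ASSD_VAL,OWNER_NM\n 12-345 ,"1,250,000",JANE DOE\n'

def convert(raw_csv, spec):
    """Apply each spec entry's transform and emit standardized records."""
    out = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        out.append({target: fn(row[src]) for src, (target, fn) in spec.items()})
    return out

converted = convert(RAW, SPEC)
```

Keeping the field mapping in a declarative spec (rather than inline code) makes analyst-provided conversion instructions easier to review and update.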

Posted 5 days ago

Apply

6.0 - 8.0 years

9 - 13 Lacs

bengaluru

Work from Office

6-8 years of experience
- Deep hands-on experience in PL/SQL, SQL, APIs (standard & custom)
- Oracle Reports, BI Publisher
- Oracle Forms, Oracle Workflow
- OAF (Oracle Application Framework), ADF (preferred)
- Strong understanding of EBS data models across: SCM: Order to Cash, Procure to Pay, Shipping, Inventory, BOM, WMS; Finance: AR, AP, GL, Fixed Assets
- Experience in debugging interface failures using logs, tables, and SOA trace (if applicable)
- Experience with file-based interfaces (CSV/XML/flat files), FTP/SFTP, SOAP/REST APIs
- Familiarity with middleware (Oracle SOA Suite, OSB, WebLogic) for integration troubleshooting
- Skilled in Concurrent Manager and debugging issues with concurrent programs
- Strong understanding of EBS data models and transaction flows in: Order to Cash, Procure to Pay, Inventory & WMS; Accounts Receivable, Accounts Payable, General Ledger, Assets
- Functional awareness to interact effectively with business/functional teams
- Knowledge of multi-org, localization, and multi-currency setups

Other Requirements:
- Strong analytical and debugging skills, especially in RICEW support scenarios
- Ability to work independently and in coordination with functional and cross-technical teams
- Experience in ServiceNow or other ITSM tools for defect and incident tracking

Preferred Qualifications:
- Oracle Certified Professional (OCP) Developer or Technical Track
- Prior experience in large-scale global Oracle EBS support environments
- Familiarity with SOX-compliant change management processes

Posted 5 days ago

Apply

6.0 - 15.0 years

0 Lacs

karnataka

On-site

The role involves designing and developing scalable BI and Data Warehouse (DWH) solutions, leveraging tools like Power BI, Tableau, and Azure Databricks. Responsibilities include overseeing ETL processes using SSIS, creating efficient data models, and writing complex SQL queries for data transformation. You will design interactive dashboards and reports, working closely with stakeholders to translate requirements into actionable insights. The role requires expertise in performance optimization, data quality, and governance. It includes mentoring junior developers and leveraging Python for data analysis (Pandas, NumPy, PySpark) and scripting ETL workflows with tools like Airflow. Experience with cloud platforms (AWS S3, Azure SDK) and managing databases such as Snowflake, Postgres, Redshift, and MongoDB is essential. Qualifications include 6-15 years of BI architecture and development experience, a strong background in ETL (SSIS), advanced SQL skills, and familiarity with the CRISP-DM model. You should also possess skills in web scraping, REST API interaction, and data serialization (JSON, CSV, Parquet). Strong programming foundations with Python and experience in version control for code quality and collaboration are required for managing end-to-end BI projects.
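As a small illustration of the data serialization skills mentioned above (JSON and CSV are shown here; Parquet would typically go through pandas/pyarrow, which is omitted to keep the sketch dependency-free):

```python
import csv
import io
import json

# A small illustrative record set, as it might come out of an ETL step.
records = [
    {"region": "south", "revenue": 120.5},
    {"region": "north", "revenue": 98.0},
]

def to_json(rows):
    # sort_keys gives stable output, useful for diffing pipeline artifacts.
    return json.dumps(rows, sort_keys=True)

def to_csv(rows):
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["region", "revenue"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

json_out = to_json(records)
csv_out = to_csv(records)
round_trip = json.loads(json_out)  # JSON preserves types; CSV flattens to text
```

The comment in the last line is the practical difference between the two formats: the JSON round trip returns the original floats, while CSV consumers must re-parse numeric fields.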

Posted 6 days ago

Apply

3.0 - 8.0 years

10 - 12 Lacs

dadra & nagar haveli

Work from Office

Job Description
Role: Computer System Validation Consultant
Location: Dadra
Responsibilities:
1. Expertise in computer system validation, QMS, and IT compliance.
2. Good understanding of GAMP 5, 21 CFR Part 11, Annex 11, and ICH Q9.
3. Validation experience and expertise in manufacturing, QC, and enterprise systems.
4. QMS: initiation/review of change management, investigations, CAPA, deviations, etc.
5. Review of IT activities: backup reports, user access management; support in internal audits and IT compliance management.
6. Experience in SOP preparation and review.
7. Project and stakeholder management.
8. Periodic review and IT schedule activity management.
9. Excellent verbal and written communication skills.
Note: This is a 6-day working week; the first Saturday of the month is off. Bus transportation services will be provided.

Posted 6 days ago

Apply

6.0 - 8.0 years

9 - 13 Lacs

kochi

Work from Office

6-8 years of experience
- Deep hands-on experience in PL/SQL, SQL, APIs (standard & custom)
- Oracle Reports, BI Publisher
- Oracle Forms, Oracle Workflow
- OAF (Oracle Application Framework), ADF (preferred)
- Strong understanding of EBS data models across: SCM: Order to Cash, Procure to Pay, Shipping, Inventory, BOM, WMS; Finance: AR, AP, GL, Fixed Assets
- Experience in debugging interface failures using logs, tables, and SOA trace (if applicable)
- Experience with file-based interfaces (CSV/XML/flat files), FTP/SFTP, SOAP/REST APIs
- Familiarity with middleware (Oracle SOA Suite, OSB, WebLogic) for integration troubleshooting
- Skilled in Concurrent Manager and debugging issues with concurrent programs
- Strong understanding of EBS data models and transaction flows in: Order to Cash, Procure to Pay, Inventory & WMS; Accounts Receivable, Accounts Payable, General Ledger, Assets
- Functional awareness to interact effectively with business/functional teams
- Knowledge of multi-org, localization, and multi-currency setups

Other Requirements:
- Strong analytical and debugging skills, especially in RICEW support scenarios
- Ability to work independently and in coordination with functional and cross-technical teams
- Experience in ServiceNow or other ITSM tools for defect and incident tracking

Preferred Qualifications:
- Oracle Certified Professional (OCP) Developer or Technical Track
- Prior experience in large-scale global Oracle EBS support environments
- Familiarity with SOX-compliant change management processes

Posted 6 days ago

Apply

4.0 - 9.0 years

5 - 15 Lacs

hyderabad

Work from Office

Position: Automation & CSV Engineer
Location: Hyderabad
Responsibilities:
- 5+ years of experience in a Process Automation/Computer System Validation role in a pharma or biopharmaceutical manufacturing environment
- Facilities, Utilities, and Equipment (FUE) qualification
- Unit operations automation qualification with Honeywell, DeltaV, and PLC-based systems
- Computerized systems validation
- Support for change controls, investigations, deviations, and CAPAs
- Technical understanding and experience of automation platforms such as DeltaV (Emerson), Honeywell, Rockwell PLC, Siemens XFP
- Ability to effectively lead validation projects, coordinate contractors and junior-level personnel, and drive results
- Working closely with systems integrators, automation leads, and process leads
- Completing FATs, SATs, IQ, and OQ; creating and executing the protocols and documentation
- Knowledge of E2500 and V-Model concepts, along with GAMP 5, S88, and S95
- Knowledge of industry guidelines (ISPE, PDA) and US and international regulations (FDA, ICH, ISO, EU) for GMP-regulated environments
- Must be able to solve routine problems with assistance
- Strong organizational skills, excellent writing and communication skills
- Proficiency with Microsoft Office, including Word, Excel, and PowerPoint; Microsoft Project and Visio a plus
- Ability to travel up to 50% or more within and outside India
- Knowledge of DeltaV DCS systems and Rockwell qualification
- Knowledge of PI Historian
- Large-scale project experience
Education: Engineering degree in Chemical, Automation, Biotechnology, or EEE.

Posted 6 days ago

Apply

2.0 - 6.0 years

0 Lacs

ahmedabad, gujarat

On-site

As a Senior Python Developer specializing in web scraping and automation at Actowiz Solutions in Ahmedabad, you will be a key member of our dynamic team. Actowiz Solutions is a prominent provider of data extraction, web scraping, and automation solutions, enabling businesses to leverage clean, structured, and scalable data for informed decision-making. By utilizing cutting-edge technology, we strive to deliver actionable insights that drive the future of data intelligence. Your primary responsibility will be to design, develop, and optimize large-scale web scraping solutions using Scrapy, a mandatory requirement for this role. You will work with a variety of additional libraries and tools such as BeautifulSoup, Selenium, Playwright, and Requests to enhance the efficiency and effectiveness of our scraping frameworks. Implementing robust error handling, data parsing, and storage mechanisms (JSON, CSV, SQL/NoSQL databases) will be crucial in ensuring the reliability and scalability of our solutions. Collaboration with product managers, QA, and DevOps teams is essential to ensure timely project delivery. You will also be expected to research and adopt new scraping technologies that can further improve performance, scalability, and efficiency of our data extraction processes. To excel in this role, you should have at least 2 years of experience in Python development with a strong expertise in Scrapy. Proficiency in automation libraries such as Playwright or Selenium, experience with REST APIs, asynchronous programming, and concurrency are also required. Familiarity with databases (SQL/NoSQL) and cloud-based data pipelines will be advantageous, along with strong problem-solving skills and the ability to deliver within Agile methodologies. Preferred qualifications include knowledge of DevOps tools like Docker, GitHub Actions, or CI/CD pipelines. 
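In the stack described above, Scrapy handles crawling and selection. As a dependency-free sketch of the parse-and-store step (extract fields from HTML, emit JSON) using only the standard library, with invented "price" markup:

```python
import json
from html.parser import HTMLParser

class PriceParser(HTMLParser):
    """Collect text content of elements whose class attribute is 'price'."""

    def __init__(self):
        super().__init__()
        self.prices = []
        self._in_price = False

    def handle_starttag(self, tag, attrs):
        if ("class", "price") in attrs:
            self._in_price = True

    def handle_endtag(self, tag):
        self._in_price = False

    def handle_data(self, data):
        if self._in_price:
            self.prices.append(data.strip())

# Invented sample markup standing in for a fetched page.
HTML = '<div><span class="price">19.99</span><span class="price">4.50</span></div>'

parser = PriceParser()
parser.feed(HTML)
# Structured storage step: serialize the scraped records as JSON.
records = json.dumps([{"price": p} for p in parser.prices])
```

In a Scrapy project the same extraction would be a one-line CSS selector (`response.css("span.price::text")`) inside a spider's `parse` callback, with items routed to a pipeline instead of a JSON string.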
In return, Actowiz Solutions offers a competitive salary, a 5-day work week (Monday to Friday), a flexible and collaborative work environment, and ample opportunities for career growth and skill development. Join us in shaping the future of data intelligence and drive impactful decision-making with our innovative solutions.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

ahmedabad, gujarat

On-site

As a developer, you will be responsible for designing, building, and maintaining scalable APIs & microservices using Node.js and TypeScript. Your role will involve developing responsive and performant UIs with Next.js, Tailwind CSS, and Bootstrap. You will also integrate third-party & open-source APIs, particularly focusing on the AI & automation domain. Collaboration with Flask-based Python APIs to interface with open-source tools will be a part of your responsibilities. You will implement and manage LLM APIs such as OpenAI to deliver AI-powered features and build automation workflows using tools like N8N. Additionally, you will utilize AI-based developer tools like Cursor and contribute to creative coding sessions. Your tasks will include working with diverse data sources including MySQL, PostgreSQL, SQLite, and flat files like CSV/Excel. Troubleshooting and optimizing integrations with AI tools, third-party services, and internal systems will be crucial. You are expected to write clean, maintainable, and testable code following best practices and team standards. We are looking for candidates with 3+ years of experience in fullstack web development, proficiency in Node.js, TypeScript, and Next.js, as well as experience with RESTful APIs & third-party integrations. Familiarity with Python, Flask, LLM APIs, and workflow automation is desirable. Hands-on experience with MySQL, PostgreSQL, SQLite, and CSV/Excel is required, and knowledge of prompt engineering, vector DBs, Docker, CI/CD, and AWS is a bonus.
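The flat-file-to-database task mentioned above (CSV into SQLite) can be sketched as follows. It is shown in Python for brevity, though the role's primary stack is Node.js/TypeScript; the table name and sample data are invented:

```python
import csv
import io
import sqlite3

# Invented CSV extract standing in for an uploaded flat file.
CSV_DATA = "sku,qty\nA1,3\nB2,7\n"

# In-memory database for illustration; a real app would use a file or server DB.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stock (sku TEXT, qty INTEGER)")

# Parse the CSV, coerce types, and load via a parameterized bulk insert.
rows = [(r["sku"], int(r["qty"])) for r in csv.DictReader(io.StringIO(CSV_DATA))]
conn.executemany("INSERT INTO stock VALUES (?, ?)", rows)
conn.commit()

total = conn.execute("SELECT SUM(qty) FROM stock").fetchone()[0]
```

Parameterized `executemany` avoids both SQL injection and per-row statement parsing, which matters once the flat files grow beyond toy size.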

Posted 1 week ago

Apply

5.0 - 10.0 years

25 - 30 Lacs

bengaluru

Work from Office

About The Role
Entity: Accenture Strategy & Consulting
Team: Strategy & Consulting Global Network
Practice: Marketing Analytics
Job location: Gurgaon

About S&C - Global Network:
Accenture's Applied Intelligence practice helps our clients grow their business in entirely new ways. Analytics enables our clients to achieve high performance through insights from data - insights that inform better decisions and strengthen customer relationships. From strategy to execution, Accenture works with organizations to develop analytic capabilities - from accessing and reporting on data to predictive modelling - to outperform the competition.

WHAT'S IN IT FOR YOU?
As part of our Analytics practice, you will join a worldwide network of over 20,000 smart and driven colleagues experienced in leading statistical tools, methods, and applications. From data to analytics and insights to actions, our forward-thinking consultants provide analytically informed, issue-based insights at scale to help our clients improve outcomes and achieve high performance. Accenture will continually invest in your learning and growth. You'll work with MMM experts, and Accenture will support you in growing your own tech stack and certifications. In Applied Intelligence you will understand the importance of sound analytical decision-making and the relationship of tasks to the overall project, and execute projects in the context of a business performance improvement initiative.

What you would do in this role:
- Work through the phases of the project
- Define data requirements for creating a model and understand the business problem
- Clean, aggregate, analyze, and interpret data, and carry out quality analysis of it
- 5+ years of advanced experience of Market Mix Modeling and related concepts of optimizing promotional channels and budget allocation
- Experience in working with non-linear optimization techniques.
- Proficiency in statistical and probabilistic methods such as SVM, decision trees, bagging and boosting techniques, and clustering
- Hands-on experience in Python data-science and math packages such as NumPy, Pandas, Sklearn, Seaborn, PyCaret, Matplotlib
- Development of AI/ML models
- Develop and manage data pipelines
- Develop and manage data within different layers of Azure/Snowflake
- Aware of common design patterns for scalable machine learning architectures, as well as tools for deploying and maintaining machine learning models in production
- Knowledge of cloud platforms and their usage for pipelining, deploying, and scaling marketing mix models
- Working knowledge of an MMM optimizer and its intricacies
- Awareness of MMM application development and backend engine integration will be preferred
- Working along with the team and consultant/manager
- Well versed with creating insights presentations and client-ready decks
- Should be able to mentor and guide a team of 10-15 people
- Manage client relationships and expectations, and communicate insights and recommendations effectively
- Capability building and thought leadership

Logical Thinking: Able to think analytically, using a systematic and logical approach to analyze data, problems, and situations. Notices discrepancies and inconsistencies in information and materials.

Task Management: Advanced level of task management knowledge and experience. Should be able to plan own tasks, discuss and work on priorities, and track and report progress.

Qualification
Who we are looking for?
- 5+ years of work experience in consulting/analytics with a reputed organization is desirable.
- Master's degree in Statistics/Econometrics/Economics, B.Tech/M.Tech in Computer Science, or M.Phil/Ph.D. in statistics/econometrics or a related field from a reputed college
- Must have knowledge of SQL and Python and at least one cloud-based technology (Azure, AWS, GCP)
- Must have good knowledge of market mix modeling techniques and optimization algorithms and their applicability to industry data
- Must have data migration experience from cloud to Snowflake (Azure, GCP, AWS)
- Managing sets of XML, JSON, and CSV files from disparate sources
- Manage documentation of data models, architecture, and maintenance processes
- Have an understanding of econometric/statistical modeling and analysis techniques such as regression analysis, hypothesis testing, multivariate statistical analysis, time series techniques, and optimization techniques, and statistical packages such as R, Python, Java, SQL, Spark, etc.
- Working knowledge of machine learning algorithms like Random Forest, Gradient Boosting, Neural Networks, etc.
- Proficient in Excel, MS Word, PowerPoint, etc.
- Strong client and team management and planning of large-scale projects with risk assessment

Accenture is an equal opportunities employer and welcomes applications from all sections of society and does not discriminate on grounds of race, religion or belief, ethnic or national origin, disability, age, citizenship, marital, domestic or civil partnership status, sexual orientation, gender identity, or any other basis as protected by applicable law.
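At its simplest, market mix modeling regresses sales on per-channel media spend. Below is a toy sketch with invented numbers and a plain ordinary-least-squares solver; real MMM work adds adstock, saturation curves, and constraints, all omitted here:

```python
# Toy weekly data: media spend per channel and observed sales (invented numbers,
# generated from a known ground truth so the fit can be checked).
tv = [10, 20, 30, 40]
search = [5, 5, 10, 10]
sales = [100 + 2 * t + 4 * s for t, s in zip(tv, search)]

def ols(X, y):
    """Solve the normal equations (X'X)b = X'y by Gaussian elimination."""
    n, k = len(X), len(X[0])
    A = [[sum(X[i][p] * X[i][q] for i in range(n)) for q in range(k)]
         for p in range(k)]
    rhs = [sum(X[i][p] * y[i] for i in range(n)) for p in range(k)]
    # Forward elimination with partial pivoting.
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            rhs[r] -= f * rhs[col]
    # Back substitution.
    coef = [0.0] * k
    for r in reversed(range(k)):
        coef[r] = (rhs[r] - sum(A[r][c] * coef[c]
                                for c in range(r + 1, k))) / A[r][r]
    return coef

X = [[1.0, t, s] for t, s in zip(tv, search)]  # intercept + two channels
intercept, beta_tv, beta_search = ols(X, sales)
```

The fitted coefficients recover the generating values (baseline 100, TV effect 2, search effect 4), which is the per-unit contribution reading that budget-allocation optimizers consume.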

Posted 1 week ago

Apply

2.0 - 5.0 years

5 - 9 Lacs

bengaluru

Work from Office

About The Role
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: OpenText ECM Tools
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and enhancements to existing applications while staying updated with the latest technologies and best practices in application development. We are seeking a skilled OpenText Exstream Developer with hands-on experience in Communication Builder to join our team. In this role, you will design, develop, and maintain customer communications using the OpenText Exstream platform, with a focus on leveraging Communication Builder to create omnichannel, personalized, and data-driven correspondence.

Roles & Responsibilities:
- Design, develop, and customize solutions using the OpenText Exstream platform (Communication Builder)
- Configure Content Server, Extended ECM Business Workspaces, and metadata models
- Work with business analysts, designers, and other developers to translate requirements into efficient document templates
- Create and manage interactive, on-demand, and batch document solutions for various output channels (print, email, web, SMS, etc.)
- Integrate external data sources (XML, JSON, CSV, etc.) into document templates and configure complex logic for personalization
- Maintain and enhance existing Exstream solutions to meet evolving business needs
- Conduct testing and debugging of communication templates to ensure accuracy, performance, and compliance
- Collaborate in Agile/Scrum environments, participating in sprint planning, code reviews, and continuous improvement efforts
- Support deployment and configuration in various environments (Dev, Test, Prod)
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Facilitate knowledge sharing sessions to enhance team capabilities
- Monitor project progress and ensure timely delivery of application features

Professional & Technical Skills:
- Must Have Skills: Proficiency in OpenText Exstream tools
- Strong understanding of the OpenText Extended ECM (xECM) platform - Content Server and Archive Center
- Experience with integration of OpenText solutions with other enterprise systems
- Familiarity with user interface design principles and best practices
- Ability to troubleshoot and resolve application issues efficiently

Additional Information:
- The candidate should have a minimum of 5 years of experience in OpenText Exstream tools
- This position is based at our Bengaluru office
- A 15 years full time education is required

Qualification: 15 years full time education

Posted 1 week ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

gurugram

Work from Office

The Team: The OSTTRA Technology team is composed of Capital Markets Technology professionals who build, support and protect the applications that operate our network. The technology landscape includes high-performance, high-volume applications as well as compute-intensive applications, leveraging contemporary microservices and cloud-based architectures.

The Impact: Together, we build, support, protect and manage high-performance, resilient platforms that process more than 100 million messages a day. Our services are vital to automated trade processing around the globe, managing peak volumes and working with our customers and regulators to ensure the efficient settlement of trades and effective operation of global capital markets.

What's in it for you: The role will primarily focus on delivering implementations & integrations. This position may additionally be required to produce cross-training materials in the agreed, standardised formats; take on primary & secondary responsibilities when delivering implementations & integrations with other team members; and engage in product UAT cycles. Specialists - Professional Services at all levels are expected to collaborate with other members of professional services, and other internal teams, in order to deliver implementations & integrations. The expected working hours in Gurgaon are 12 - 9pm. Some tasks, such as deployment of changes, are required on Sundays as part of the role. This is an excellent opportunity to be part of a team based out of Gurgaon and to work with colleagues across multiple regions globally. The role is being opened to work on new initiatives within OSTTRA.

Responsibilities:
Implementation & Integration
- Deliver implementations & integrations for multiple project types across the services (currently limited to ex-Traiana services) offered within the FX&S pillar at OSTTRA
- Hand over to the operations teams once live
- Day-one check-in with the customer
- Finalising readiness to migrate to production, and liaising with the relevant counterparties (as required)
- Undergoing the UAT phase with the customer directly, unilaterally identifying issues, investigating those issues, and resolving those issues with the relevant internal or external team
- Gathering & setting up all required static data in UAT & production (as required)
- System configuration in UAT and production environments
- Connectivity & integration set up in the product
- Connectivity & integration set up in IC and/or Adapters
- Coordinate the development of the transformer based on the spec provided by Solution Design
- Create any required routing in IC
- Ensure that all integration changes & set ups undergo the required 4-eye checks prior to deployment in production
- Ensure all integrations follow the integration standards outlined
- Work effectively as part of a professional services project team on each implementation and/or integration, alongside a project manager and solution design manager
- Work effectively with key internal stakeholders outside of professional services during the implementation and/or integration, such as the connectivity team, product or development teams
- Demonstrate a positive customer experience during implementations & integrations, regardless of whether the Technical Project Manager leads discussions or is working behind the scenes on items
- Update the PSA system (e.g. Monday.com) on a daily basis so that the project manager has the correct information on project status, risks, issues and dependencies
- Creating and tracking UAT plans
- Ensure all required implementation & integration documentation is produced in the standard formats defined, and is made available prior to the point of go-live, including the operations handover material
- Effectively manage time so that tasks are completed by the expected due date

Cross-Training
- Create cross-training materials in the pre-defined standardised formats on implementation & integration processes for project types
- Lead implementations & integrations as a primary resource, while developing a secondary resource
- Develop new core skills, and take on new project types
- Assist a primary resource during implementations & integrations, while acting as a secondary resource
- Where necessary during the professional services restructure, assist other teams in their cross-training priorities and needs

Teamwork
- Responsive, collaborative and engaged with the internal project management team assigned to each implementation and/or integration
- Engage, be open and be objective in post-project retrospectives to develop the team further

Product UAT
- Executing the required UAT runbook

Operations Escalations
- Act as an escalation point for certain project types and services from a technical project management perspective

What We're Looking For:
- Knowledge of message formats such as FIX, XML, JSON or CSV
- Work effectively as part of a team
- Ability to define and document detailed workflow processes
- Process-oriented with excellent organisational skills
- Ability to fulfil required project tasks in a timely manner
- Customer facing skills
- Creative problem solver
- Excellent verbal and written communication skills
- Understanding of the services offered by the OSTTRA FX & S pillar
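Of the message formats listed, FIX is the least self-describing: fields are numeric tag=value pairs separated by the SOH control character. A minimal parsing sketch (the message content is invented, though the tag numbers used are standard FIX fields):

```python
SOH = "\x01"  # FIX field delimiter (ASCII 0x01)

# An illustrative, abbreviated FIX new-order message (invented content).
RAW_FIX = SOH.join(["8=FIX.4.4", "35=D", "55=EUR/USD", "38=1000000", "54=1"]) + SOH

# Standard FIX tag numbers mapped to readable names (subset only).
FIELD_NAMES = {
    "8": "BeginString",
    "35": "MsgType",
    "55": "Symbol",
    "38": "OrderQty",
    "54": "Side",
}

def parse_fix(raw):
    """Split a raw FIX string into a dict keyed by human-readable field names."""
    pairs = (field.split("=", 1) for field in raw.split(SOH) if field)
    return {FIELD_NAMES.get(tag, tag): value for tag, value in pairs}

msg = parse_fix(RAW_FIX)
```

A production parser would also validate the BodyLength (tag 9) and CheckSum (tag 10) fields and handle repeating groups, which are skipped in this sketch.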

Posted 1 week ago

Apply

4.0 - 8.0 years

4 - 6 Lacs

pune

Work from Office

Hi, please send your updated CV to Rashmi.kulkarni1@Fresenius-kabi.com. Details as below:

1. Position: Senior Technical Expert - Quality Assurance (Validation Profile)
Key Responsibilities:
Validation Planning & Execution: Develop, review, and approve Validation Master Plans, Validation Protocols (IQ/OQ/PQ), and Reports for equipment, utilities, and manufacturing processes. Lead validation activities for injectable dosage forms, including aseptic processing, lyophilization, and sterile filtration. Ensure validation activities comply with cGMP, FDA, EU, and other applicable regulatory standards.
Documentation & Compliance: Maintain and update the Site Master File, Annual Product Reviews, and Validation Summary Reports. Review and approve SOPs, Work Instructions, and Change Control documentation. Ensure traceability and archival of all validation documents.
Cross-functional Collaboration: Coordinate with Production, Engineering, QC, and Regulatory Affairs to align validation efforts with project timelines. Support audits and inspections by providing validation documentation and responding to observations.
Deviation & CAPA Management: Investigate validation-related deviations and implement Corrective and Preventive Actions (CAPA). Conduct risk assessments and impact analyses for changes to validated systems.
Training & Support: Train operators and analysts on validation protocols and regulatory expectations. Provide technical support during equipment qualification and process validation.

2. Position: Technical Lead - Quality Assurance (Batch Release)
Key Responsibilities:
Batch Documentation Review: Review Batch Manufacturing Records (BMR) and Batch Packaging Records (BPR) for completeness, accuracy, and compliance with cGMP. Ensure all documentation supports batch release and adheres to data integrity standards.
Product Release Authorization: Authorize release of raw materials, intermediates, and finished injectable products. Ensure specifications are met and documented evidence supports product quality.
Deviation & CAPA Management: Escalate discrepancies and deviations promptly. Participate in investigations and ensure Corrective and Preventive Actions (CAPA) are implemented and closed.
Compliance & Regulatory Support: Ensure batch release activities comply with FDA, EU, and WHO GMP guidelines. Support internal and external audits by providing batch documentation and responding to queries.
Cross-functional Collaboration: Coordinate with Production, QC, Regulatory Affairs, and Warehouse teams to ensure timely and compliant product disposition. Support continuous improvement initiatives in batch release processes.
Documentation Control: Manage issuance, filing, and archival of executed batch records. Maintain the batch documentation library and ensure traceability.

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
