
4894 Data Processing Jobs - Page 9

Set up a Job Alert
JobPe aggregates results for easy access, but you apply directly on the original job portal.

1.0 - 3.0 years

3 - 4 Lacs

noida

Work from Office

Responsibilities: * Accurately enter data into computer systems using typing skills * Maintain confidentiality of sensitive information * Process data with high speed and accuracy * Maintain and update sales databases, CRM systems, and customer records. Benefits: Provident fund

Posted 4 days ago

Apply

1.0 - 6.0 years

3 - 8 Lacs

pune, bengaluru

Work from Office

Builds reports and dashboards independently. Completes independent quality checks of reports and dashboards. Identifies insights in dashboards and communicates insights to the internal team and to clients. Prioritises enhancements and bug fixes to dashboards and manages the implementation. Develops underlying data structures necessary to support dashboard development. Facilitates discovery with internal and client stakeholders to document requirements. Manages dashboard build teams for smaller engagements. Manages permissions and security of reporting environments.

Posted 4 days ago

Apply

15.0 - 20.0 years

15 - 20 Lacs

bengaluru

Work from Office

Lead the technical architecture and set the technical vision for the team. Oversee the development and implementation of highly complex applications, tools, systems and integrations. Lead the exploration of new trends, technologies and information, and evaluate these trends to pitch applicable projects that impact Celonis through innovation. Receive work in the form of objectives that regularly require innovation around original ideas. Translate targeted solutions into end-to-end architectural designs. Independently own multiple large problem spaces with significant ambiguity. Consistently demonstrate ability to go deep into a variety of domains. Communicate well to all levels of product development and across the company. Navigate open-ended technical and workflow discussions, helping reach conclusions or constructive next steps. Proactively engage with internal and external peers and management to develop unprecedented solutions.

Posted 4 days ago

Apply

0.0 - 5.0 years

2 - 7 Lacs

neemrana

Work from Office

ERP Manager. Experience: 6+ years. Type: Full Time. Location: Gugalkota, Neemrana. Department: IT/ERP Implementation. Experience: 6+ years in ERP implementation support - Oracle SCM, Fusion, or Oracle Cloud only. Fusion - Implementation, Rollouts, Inventory Management, Procurement - Order Management, Lifecycle Management, Manufacturing - Supply Chain Planning, Supplier Portal, Supplier Lifecycle Management - Costing, Logistics, Warehouse Management

Posted 4 days ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

hyderabad

Work from Office

As the lead data scientist, this individual will be responsible for the development and deployment of data science modeling frameworks using statistical modeling (and machine learning as appropriate) for sDNA and cDNA for BENELUX markets, e.g., prediction of store potential, store segmentation, assortment optimization, asset deployment, audience profiling, etc. The position exists to unlock competitive advantage in go-to-market and execution through cutting-edge data science, advanced analytics, and AI techniques, with a focus on OT, TT and AFH for BENELUX. The focus of this role is ensuring the successful implementation of data science solutions that deliver. The role holder will also leverage best practice to help establish a reusable data science ecosystem. Qualifications: Bachelor's or advanced degree in a quantitative field, for instance a Master's degree or PhD in data science or math (e.g., Computer Science, Mathematics, Statistics, Data Science), or equivalent experience. 7+ years of relevant advanced analytics experience in Marketing or Commercial in either Retail or CPG industries; other B2C domains can be considered. Proven experience with medium to large data science projects. Advanced knowledge of key data science techniques: combining data from multiple sources through APIs, Semantic Web, etc.; data preparation and feature engineering; supervised/unsupervised learning; collaborative filtering; location analytics and intelligence. Proficiency in programming languages such as Python, R, or SQL. Good to have: experience with data processing frameworks (e.g., Hadoop, Spark) and an understanding of data engineering principles, data integration, and ETL processes. Excellent problem-solving skills and the ability to translate business needs into data-driven solutions. Strong communication and interpersonal skills, with the ability to effectively convey complex concepts to both technical and non-technical stakeholders.
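Store segmentation of the kind this listing describes is often prototyped with a simple clustering pass. The sketch below is a minimal, hedged illustration using scikit-learn on made-up store features; the feature names, data, and cluster count are assumptions for illustration, not the employer's actual model.

```python
# Illustrative store segmentation: cluster stores on a few synthetic features.
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical store-level features (weekly sales, footfall, assortment size).
rng = np.random.default_rng(0)
stores = pd.DataFrame({
    "weekly_sales": rng.gamma(2.0, 5000.0, size=200),
    "footfall": rng.poisson(800, size=200),
    "assortment_size": rng.integers(50, 500, size=200),
})

# Standardise features so no single scale dominates the distance metric.
X = StandardScaler().fit_transform(stores)

# Fit a small k-means model and attach the segment label to each store.
stores["segment"] = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
print(stores.groupby("segment").mean().round(1))
```

In practice the cluster count and feature set would be chosen from business context and validated (e.g., silhouette scores), but the mechanics are the same.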

Posted 4 days ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

hyderabad

Work from Office

Design, build, and maintain robust and scalable ETL/ELT pipelines to ingest data from various sources (APIs, databases, files, streaming). Cleanse, transform, and integrate raw data into usable formats for downstream analytics and machine learning applications. Develop and manage data storage solutions such as Snowflake, BigQuery, Redshift, or Azure Synapse. Optimize data processing workflows and database performance to ensure minimal latency and high availability. Implement data validation, logging, lineage, and monitoring to ensure data integrity, consistency, and compliance. Use orchestration tools like Apache Airflow, Luigi, or cloud-native schedulers to automate data workflows and processes. Work closely with data scientists, analysts, and stakeholders to understand data requirements and deliver appropriate solutions. Maintain thorough documentation and follow engineering best practices, including version control, CI/CD, and testing. Stay updated on emerging tools, trends, and technologies in the data engineering landscape to improve system performance and team productivity.
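The orchestration requirement above is the kind of thing Apache Airflow expresses as a DAG. Below is a minimal sketch for Airflow 2.x; the task callables, schedule, and placeholder data are hypothetical and only illustrate the extract-transform-load wiring, not this employer's actual pipeline.

```python
# Minimal Airflow 2.x DAG sketch: extract -> transform -> load, run daily.
# The callables and their placeholder data are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Pull raw records from a source API or database (placeholder).
    return [{"id": 1, "amount": 42.0}]


def transform(ti, **context):
    # Cleanse and reshape the extracted rows (placeholder logic).
    rows = ti.xcom_pull(task_ids="extract")
    return [{**r, "amount_usd": round(r["amount"], 2)} for r in rows]


def load(ti, **context):
    # Write the transformed rows to the warehouse (placeholder).
    print(ti.xcom_pull(task_ids="transform"))


with DAG(
    dag_id="example_elt_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3
```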

Posted 4 days ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

hyderabad

Work from Office

Design, build, and maintain robust and scalable ETL/ELT pipelines to ingest data from various sources (APIs, databases, files, streaming). Cleanse, transform, and integrate raw data into usable formats for downstream analytics and machine learning applications. Develop and manage data storage solutions such as Snowflake, BigQuery, Redshift, or Azure Synapse. Optimize data processing workflows and database performance to ensure minimal latency and high availability. Implement data validation, logging, lineage, and monitoring to ensure data integrity, consistency, and compliance. Use orchestration tools like Apache Airflow, Luigi, or cloud-native schedulers to automate data workflows and processes. Work closely with data scientists, analysts, and stakeholders to understand data requirements and deliver appropriate solutions. Maintain thorough documentation and follow engineering best practices, including version control, CI/CD, and testing. Stay updated on emerging tools, trends, and technologies in the data engineering landscape to improve system performance and team productivity.

Posted 4 days ago

Apply

0.0 years

2 - 3 Lacs

mumbai, ahmedabad

Work from Office

We are seeking a dedicated Data Processing Associate to handle claims processing and join our dynamic team. As a Data Entry Associate, you will be responsible for processing healthcare data, ensuring accuracy, and supporting the RCM process.

Posted 4 days ago

Apply

2.0 - 7.0 years

4 - 9 Lacs

chennai

Work from Office

Hiring for Dell Boomi - Chennai. Integration Development: Build end-to-end integration solutions using Boomi AtomSphere, including cloud and on-premise systems, APIs, connectors, mapping, process orchestration, and transformations. ETL and Data Processing: Implement ETL workflows using Boomi for extracting, transforming, and loading data, including batch, real-time, or hybrid processing architectures. EDI Handling (when applicable): Manage EDI data exchanges (X12, EDIFACT, ANSI, etc.), mapping, partner onboarding, and compliance. API Integration: Design and integrate RESTful and SOAP APIs, manage API authentication and security (OAuth, JWT), and support API-led connectivity within Boomi.
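Boomi itself is a low-code platform, but the API-security pattern named above (OAuth-protected REST calls) is easy to illustrate outside it. A minimal sketch of an OAuth2 client-credentials flow followed by a REST call; the URLs, credentials, and response shape are hypothetical placeholders, not a Boomi-specific API.

```python
# OAuth2 client-credentials flow followed by a REST call -- the pattern an
# API-led integration typically wraps. URLs and credentials are placeholders.
import requests

TOKEN_URL = "https://auth.example.com/oauth2/token"
API_URL = "https://api.example.com/v1/orders"


def get_token(client_id: str, client_secret: str) -> str:
    # Exchange client credentials for a bearer token.
    resp = requests.post(
        TOKEN_URL,
        data={"grant_type": "client_credentials"},
        auth=(client_id, client_secret),
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]


def fetch_orders(token: str) -> list:
    # Call a protected resource with the bearer token.
    resp = requests.get(API_URL, headers={"Authorization": f"Bearer {token}"}, timeout=10)
    resp.raise_for_status()
    return resp.json().get("orders", [])


if __name__ == "__main__":
    token = get_token("demo-client", "demo-secret")  # placeholder credentials
    print(len(fetch_orders(token)))
```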

Posted 4 days ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

mumbai, bengaluru

Work from Office

While technology is the heart of our business, a global and diverse culture is the heart of our success. We love our people and we take pride in offering them a culture built on transparency, diversity, integrity, learning, and growth. If working in an environment that encourages you to innovate and excel, not just in your professional but also your personal life, interests you, you would enjoy your career with Quantiphi! Consults with various stakeholders throughout the organization to analyse their information needs and determine analytical requirements for system application and data processing solutions, including selection of appropriate data sources, tabulations, statistical methods and analysis, and communication of results and insights. Builds Business Intelligence (BI) and Data Visualization solutions to meet business and systems needs through thorough requirements gathering, software component design, coding, testing, and effective review. Uses DW/BI toolkits to collect, record, and provide access to data and assist the enterprise in making better business decisions. Provides support to existing DW/BI solutions using architecture and development best practices, including data design, data transport, data access, data visualization, and data quality/metadata techniques. Designs and develops services for transformation, integration, and reporting solutions. Provides feedback and consultation on analytical capabilities and integrated data assets to business partners in order to mature the vision and direction for BI. Applies industry best practices and promotes standards for execution and delivery approach. If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!

Posted 4 days ago

Apply

5.0 - 9.0 years

7 - 11 Lacs

chennai

Work from Office

ROLE SUMMARY As part of the Clinical Data Sciences (CDS) group, an integral delivery unit within the Clinical Development & Operations (CD&O) organization, the Clinical Data Scientist (CDS) is responsible for timely and high-quality data management deliverables supporting the Pfizer portfolio. The CDS delivers asset-level information strategies and services for optimal use and reuse of internal and external information that will advance research, development, and commercialization of the Pfizer portfolio and further precision medicine. The CDS designs, develops, and maintains key data management deliverables used to collect, review, monitor, and ensure the integrity of clinical data, applies standards, data review and query management, and is accountable for quality study data set release and consistency in asset/submission data. ROLE RESPONSIBILITIES Serve as Clinical Data Scientist and Trial Lead for one or more clinical trials, assuming responsibility for all CDS activities including selection and application of data acquisition standards, the Data Management Plan, selection of quality risk indicators, and third-party study data due diligence. Serve as a technical resource to the study teams for DM and RBM standards, tools, data provisioning, and reporting. Partner with Research/Business Units, external DM service providers and internal CDS staff to deliver high-quality data management for all studies as assigned. Proactively drive quality and efficiency to meet timelines and milestones for data management, ensuring scientific and operational excellence in support of strategic imperatives and in collaboration with the cross-functional study team(s). Ensure work carried out by or on behalf of CDS is in accordance with applicable SOPs and working practices. Participate in and ensure quality database design including documentation, testing and implementation of clinical data collection tools, both CRF and non-CRF, using an electronic data capture (EDC) system and/or other data collection systems. Ensure the required study-specific CDS documents in the Trial Master File (TMF) are of high quality and are filed contemporaneously. Ensure operational excellence in collaboration with partners for application of standards, data acquisition, proactive data review and data integrity monitoring, data cleaning, e-data processing, data access and visualization, and database release. Work Location Assignment: Hybrid Medical

Posted 4 days ago

Apply

0.0 - 3.0 years

2 - 5 Lacs

hyderabad

Work from Office

Our MMO platform is an in-house initiative designed to empower clients with data-driven decision-making in marketing strategy. By applying Bayesian and frequentist approaches to media mix modeling, we are able to quantify channel-level ROI, measure incrementality, and simulate outcomes under varying spend scenarios. Key components of the project include: Data Integration: Combining client first-party, third-party, and campaign-level data across digital, offline, and emerging channels into a unified modeling framework. Model Development: Building and validating media mix models (MMM) using advanced statistical and machine learning techniques such as hierarchical Bayesian regression, regularized regression (Ridge/Lasso), and time-series modeling. Scenario Simulation: Enabling stakeholders to forecast outcomes under different budget allocations through simulation and optimization algorithms. Deployment & Visualization: Using Streamlit to build interactive, client-facing dashboards for model exploration, scenario planning, and actionable recommendation delivery. Scalability: Engineering the system to support multiple clients across industries with varying data volumes, refresh cycles, and modeling complexities. Responsibilities: Develop, validate, and maintain media mix models to evaluate cross-channel marketing effectiveness and return on investment. Engineer and optimize end-to-end data pipelines for ingesting, cleaning, and structuring large, heterogeneous datasets from multiple marketing and business sources. Design, build, and deploy Streamlit-based interactive dashboards and applications for scenario testing, optimization, and reporting. Conduct exploratory data analysis (EDA) and advanced feature engineering to identify drivers of performance. Apply Bayesian methods, regularization, and time-series analysis to improve model accuracy, stability, and interpretability. Implement optimization and scenario-planning algorithms to recommend budget allocation strategies that maximize business outcomes. Collaborate closely with product, engineering, and client teams to align technical solutions with business objectives. Present insights and recommendations to senior stakeholders in both technical and non-technical language. Stay current with emerging tools, techniques, and best practices in media mix modeling, causal inference, and marketing science. Bachelor's or Master's degree in Data Science, Statistics, Computer Science, Applied Mathematics, or a related field. Proven hands-on experience in media mix modeling, marketing analytics, or econometrics.
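A minimal sketch of the regularized-regression flavour of MMM mentioned above, using scikit-learn Ridge on synthetic spend data with a simple geometric adstock transform. The channel names, decay rate, synthetic sales equation, and ROI read-out are all illustrative assumptions, not the platform's actual model.

```python
# Toy media mix model: adstock-transform channel spend, fit Ridge, read off ROI.
import numpy as np
import pandas as pd
from sklearn.linear_model import Ridge


def adstock(spend, decay=0.5):
    # Simple geometric adstock: carry over a share of last period's effect.
    s = np.asarray(spend, dtype=float)
    out = np.zeros_like(s)
    for t in range(len(s)):
        out[t] = s[t] + (decay * out[t - 1] if t > 0 else 0.0)
    return out


rng = np.random.default_rng(42)
weeks = 104
spend = pd.DataFrame({
    "tv": rng.uniform(0, 100, weeks),
    "search": rng.uniform(0, 60, weeks),
    "social": rng.uniform(0, 40, weeks),
})

# Synthetic sales driven by adstocked spend plus a baseline and noise.
X = spend.apply(adstock)
sales = 200 + 1.8 * X["tv"] + 2.5 * X["search"] + 1.2 * X["social"] + rng.normal(0, 20, weeks)

model = Ridge(alpha=1.0).fit(X, sales)
roi = pd.Series(model.coef_, index=X.columns)
print(roi.round(2))  # incremental sales per unit of adstocked spend (toy numbers)
```

Scenario simulation then amounts to re-running the fitted model under alternative spend vectors, which is what a Streamlit front end would expose as sliders.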

Posted 4 days ago

Apply

3.0 - 10.0 years

5 - 12 Lacs

bengaluru

Work from Office

Azure Data Factory - Hyderabad. Data Orchestration: Create and schedule complex data processing tasks, ensuring efficient and automated data movement and transformation. Data Integration & Transformation: Connect to various data sources (on-premises and cloud) and perform transformations to prepare data for analytical purposes. Develop ETL/ELT Pipelines: Design, build, and implement data integration workflows using Azure Data Factory's pipelines and dataflows.

Posted 4 days ago

Apply

4.0 - 10.0 years

6 - 12 Lacs

bengaluru

Work from Office

Building off our Cloud momentum, Oracle has formed a new organization - Health Data Intelligence. This team will focus on product development and product strategy for Oracle Health, while building out a complete platform supporting modernized, automated healthcare. This is a net new line of business, constructed with an entrepreneurial spirit that promotes an energetic and creative environment. We are unencumbered and will need your contribution to make it a world class engineering center with the focus on excellence. Oracle Health Data Analytics has a rare opportunity to play a critical role in how Oracle Health products impact and disrupt the healthcare industry by transforming how healthcare and technology intersect. Career Level - IC3 As a member of the software engineering division, you will take an active role in the definition and evolution of standard practices and procedures. Define specifications for significant new projects and specify, design and develop software according to those specifications. You will perform professional software development tasks associated with the developing, designing and debugging of software applications or operating systems. Design and build distributed, scalable, and fault-tolerant software systems. Build cloud services on top of the modern OCI infrastructure. Participate in the entire software lifecycle, from design to development, to quality assurance, and to production. Invest in the best engineering and operational practices upfront to ensure our software quality bar is high. Optimize data processing pipelines for orders of magnitude higher throughput and faster latencies. Leverage a plethora of internal tooling at OCI to develop, build, deploy, and troubleshoot software.

Posted 4 days ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

hyderabad

Work from Office

Application Development Lead Analyst. Position Overview: We are looking for a top-notch developer to work on production data pipelines and analytics systems. Your primary responsibility will be the development of data pipelines supporting digital experiences and analytics systems. You'll collaborate on architecture decisions and develop solutions using AWS and Databricks. Interested candidates must be self-motivated, willing to learn, and willing to share new ideas to improve our team and process. Responsibilities: Develop data pipelines supporting Cigna digital experiences and analytics systems. Collaborate on architecture decisions and develop solutions using tools in AWS and Databricks. Production data pipeline development in Spark using Python and SQL on Databricks. Assemble large, complex data sets that meet functional and analytics business requirements. Develop both batch data processing and real-time streaming technologies. Identify and address bottlenecks in data pipelines to improve performance. Improve data reliability, efficiency, and quality of data. Work with stakeholders including Analysts, Product, and Engineering teams to assist with data-related technical issues and support their data infrastructure needs. Qualifications - Required Skills & Experience: 5-8 years of experience in data systems and analytics development. Expert in advanced SQL, Spark and Python. 3+ years of experience with Databricks Unity Catalog, Workflows and Autoloader. Experience developing Big Data pipelines in AWS cloud. Bachelor's Degree in Computer Science, Information Systems, Data Analytics or related. Experience with Git repositories and CI/CD pipelines. Desired Experience: Understanding of web and mobile analytics. Experience with Data Security and managing PII/PHI in production environments. Understanding of common Big Data file formats: Databricks Delta, CSV, JSON, Parquet, etc. Understanding of Adobe Analytics or Customer Journey Analytics (CJA). Understanding of the medical insurance industry. Experience with Terraform code for Databricks job deployment. Location & Hours of Work: Full-time position, working 40 hours per week. Expected overlap with US hours as appropriate. Primarily based in the Innovation Hub in Hyderabad, India in a hybrid working model (3 days WFO and 2 days WAH). Equal Opportunity Statement: Evernorth is an Equal Opportunity Employer actively encouraging and supporting organization-wide involvement of staff in diversity, equity, and inclusion efforts to educate, inform and advance both internal practices and external work with diverse client populations. About Evernorth Health Services: Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.
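Auto Loader ingestion of the kind this listing names is typically a readStream/writeStream pair on Databricks. A minimal PySpark sketch follows; the S3 paths, schema location, table name, and trigger are hypothetical placeholders, not this team's actual pipeline.

```python
# Minimal Databricks Auto Loader sketch: stream JSON files from a landing
# zone into a Delta table. All paths and names below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

raw = (
    spark.readStream.format("cloudFiles")                 # Auto Loader source
    .option("cloudFiles.format", "json")                  # incoming file format
    .option("cloudFiles.schemaLocation", "s3://example-bucket/_schemas/events")
    .load("s3://example-bucket/landing/events/")
)

# Light transformation: stamp each record with its ingestion time.
cleaned = raw.withColumn("ingested_at", F.current_timestamp())

(
    cleaned.writeStream.format("delta")
    .option("checkpointLocation", "s3://example-bucket/_checkpoints/events")
    .trigger(availableNow=True)                           # batch-style incremental run
    .toTable("analytics.events_bronze")
)
```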

Posted 4 days ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

bengaluru

Work from Office

Position Overview: We are looking for an experienced Full Stack Developer who has strong experience in web application development, focusing on integrating Generative AI and Agentic Application models. Proficiency in React.js (front-end), Python (back-end), Databricks, and Azure platforms is required. Knowledge of Node.js, CI/CD pipelines, and Terraform is beneficial. The role will work with AI engineers, data scientists, and software engineers to embed and deploy AI models, ensuring seamless integration and performance. Key Responsibilities - Front-End Development: - Create responsive UIs using React.js and modern JavaScript (ES6+). - Use state management libraries like Redux or Zustand. - Ensure cross-browser compatibility and optimize front-end performance. Back-End Development: - Develop APIs and server-side logic using Python (Flask, FastAPI, Django). - Integrate AI models into the back-end. - Implement secure authentication and data processing pipelines. AI Model Integration: - Embed pre-trained Generative AI models and Agentic frameworks. - Optimize model inference pipelines using tools like ONNX and TensorRT. - Ensure efficient communication between front-end, back-end, and AI endpoints. Cloud and DevOps: - Deploy applications on Azure using services like AKS and Azure Functions. - Utilize Databricks for data processing and AI workflows. - Implement infrastructure as code using Terraform. - Set up CI/CD pipelines with Azure DevOps, GitHub Actions, or Jenkins. Collaboration and Best Practices: - Work with cross-functional teams to deliver high-quality solutions. - Conduct code reviews and testing to ensure reliability and security. Monitoring and Optimization: - Implement monitoring using tools like Azure Monitor and Grafana. - Optimize performance and troubleshoot issues related to AI integration and user experience. Required Skills - Front-End: React.js, JavaScript/TypeScript, HTML5, CSS3, Tailwind CSS or Material-UI. Back-End: Python (Flask, FastAPI, Django), RESTful/GraphQL APIs, SQL/NoSQL databases. AI Integration: Familiarity with frameworks like ONNX and TensorFlow Serving; A2A & MCP. Cloud Platforms: Azure, Databricks. DevOps: CI/CD pipelines, Terraform, Docker, Kubernetes. Version Control: Git (GitHub). Good-to-Have Skills: Node.js, real-time back-end services, React.js and Python WebSockets, serverless architectures, exposure to Generative AI or Agentic Applications. About 5+ years of hands-on work experience in a Full Stack Developer role. Knowledge of MLOps practices. Experience with GraphQL. Understanding of security best practices for AI applications. Certifications in Azure or Databricks. Azure, JavaScript, Python, React, Team Handling, Team Management
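The back-end half of that stack is commonly a small FastAPI service that fronts a model endpoint for the React client. A minimal sketch; the route, request schema, and stubbed generate() call are placeholders for illustration, not the team's actual integration.

```python
# Minimal FastAPI sketch: a POST endpoint that forwards a prompt to a
# (stubbed) generative model and returns its reply. Names are placeholders.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class Prompt(BaseModel):
    text: str
    max_tokens: int = 256


def generate(text: str, max_tokens: int) -> str:
    # Stand-in for a real model call (e.g. an Azure-hosted inference endpoint).
    return f"echo({max_tokens}): {text}"


@app.post("/api/generate")
def generate_endpoint(prompt: Prompt) -> dict:
    # Validate the request via pydantic, call the model, return JSON.
    reply = generate(prompt.text, prompt.max_tokens)
    return {"reply": reply}

# Run locally with: uvicorn app:app --reload   (assuming this file is app.py)
```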

Posted 4 days ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

noida

Work from Office

Design and implement scalable data processing solutions using Apache Spark and Java. Develop and maintain high-performance backend services and APIs. Collaborate with data scientists, analysts, and other engineers to understand data requirements. Optimize Spark jobs for performance and cost-efficiency. Ensure code quality through unit testing, integration testing, and code reviews. Work with large-scale datasets in distributed environments (e.g., Hadoop, AWS EMR, Databricks). Monitor and troubleshoot production systems and pipelines. Experience in Agile development processes. Experience in leading a 3-5 member team on the technology front. Excellent communication, problem-solving, debugging, and troubleshooting skills. Mandatory Competencies: Programming Language - Java - Core Java (Java 8+); Architecture - Architectural Patterns - Microservices; Data Science and Machine Learning - Apache Spark; Tech - Unit Testing; Data Science and Machine Learning - Databricks; Big Data - Hadoop; Cloud - AWS - TensorFlow on AWS, AWS Glue, AWS EMR, Amazon Data Pipeline, AWS Redshift; Agile - Extreme Programming; Big Data - Spark; Beh - Communication and collaboration. Perks and Benefits for Irisians: Iris provides world-class benefits for a personalized employee experience. These benefits are designed to support the financial, health, and well-being needs of Irisians for holistic professional and personal growth. Click here to view the benefits.
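The "optimize Spark jobs" responsibility usually comes down to partition pruning, early column pruning, and controlling shuffles and output file counts. The listing itself is Java-centric; the sketch below uses PySpark purely for brevity, and its paths and column names are hypothetical.

```python
# PySpark sketch: read partitioned data, prune early, aggregate, and control
# output partitioning. Paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_events_rollup").getOrCreate()

events = (
    spark.read.parquet("s3://example-bucket/events/")    # partitioned by event_date
    .where(F.col("event_date") == "2024-06-01")           # partition pruning
    .select("user_id", "event_type")                      # column pruning
)

rollup = (
    events.groupBy("user_id")
    .agg(F.count(F.when(F.col("event_type") == "click", 1)).alias("clicks"))
)

# Fewer, larger output files: coalesce instead of a full repartition shuffle.
rollup.coalesce(8).write.mode("overwrite").parquet("s3://example-bucket/rollups/2024-06-01/")
```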

Posted 4 days ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

noida

Work from Office

Design and implement scalable data processing solutions using Apache Spark and Java. Develop and maintain high-performance backend services and APIs. Collaborate with data scientists, analysts, and other engineers to understand data requirements. Optimize Spark jobs for performance and cost-efficiency. Ensure code quality through unit testing, integration testing, and code reviews. Work with large-scale datasets in distributed environments (e.g., Hadoop, AWS EMR, Databricks). Monitor and troubleshoot production systems and pipelines. Experience in Agile development processes. Excellent communication, problem-solving, debugging, and troubleshooting skills. Mandatory Competencies: Architecture - Architectural Patterns - Microservices; Programming Language - Java - Core Java (Java 8+); Data Science and Machine Learning - Apache Spark; Tech - Unit Testing; Big Data - Hadoop; Cloud - AWS - TensorFlow on AWS, AWS Glue, AWS EMR, Amazon Data Pipeline, AWS Redshift; Data Science and Machine Learning - Databricks; Agile - Extreme Programming; Beh - Communication and collaboration. Perks and Benefits for Irisians: Iris provides world-class benefits for a personalized employee experience. These benefits are designed to support the financial, health, and well-being needs of Irisians for holistic professional and personal growth. Click here to view the benefits.

Posted 4 days ago

Apply

3.0 - 5.0 years

9 - 12 Lacs

chennai

Work from Office

This is a remote position. Position Overview: We are seeking a skilled QA Engineer to ensure the highest level of quality across our suite of applications - from our React-based admin dashboard and Flutter mobile app to our robust Node.js backend. The ideal candidate will possess a blend of manual and automation testing expertise, a deep understanding of financial and investment domains, and a commitment to security and compliance standards. Key Responsibilities: Test Planning & Execution: -Develop comprehensive test plans, test cases, and test scripts for manual and automated testing. -Execute functional, integration, end-to-end, and regression tests to ensure optimal performance across web, mobile, and backend systems. -Collaborate with cross-functional teams to define testing requirements and ensure alignment with business objectives. Automation & API Testing: -Design and implement automated test frameworks using industry-standard tools (e.g., Selenium, Cypress, Appium, or Flutter's testing tools). -Validate RESTful API integrations and conduct thorough API testing using tools such as Postman or equivalent. Quality & Compliance Assurance: -Validate financial data processing, multi-currency calculations, and secure payment operations. -Ensure compliance with security standards and regulatory requirements, including KYC/AML guidelines and secure coding practices. -Identify, document, and report defects with clear reproducibility steps, collaborating with developers to ensure timely resolution. Continuous Improvement & Collaboration: -Integrate testing processes within CI/CD pipelines to support rapid, high-quality releases. -Participate in code reviews, sprint planning, and technical discussions to advocate for quality across all development stages. -Mentor junior QA team members and contribute to process enhancements and best practices. Requirements - Required Skills & Experience: Technical Expertise: -Proven experience (3+ years) in both manual and automated testing in a complex software environment. -Proficiency in testing modern web applications (e.g., React.js) and mobile applications (preferably Flutter). -Hands-on experience with API testing and test automation frameworks. -Familiarity with CI/CD processes and integrating automated tests into deployment pipelines. Domain & Compliance Knowledge: -Solid understanding of financial data processing, multi-currency systems, and investment management concepts. -Experience with security testing and ensuring compliance with financial regulations and data protection standards. Tools & Methodologies: -Knowledge of testing tools such as Selenium, Cypress, Postman, Appium, or equivalent. -Familiarity with agile methodologies and collaborative development workflows. -Experience in performance testing, end-to-end testing, and troubleshooting complex technical issues. Soft Skills: -Strong analytical and problem-solving abilities with acute attention to detail. -Excellent communication skills to articulate test findings and collaborate effectively with multidisciplinary teams. -A proactive mindset with the ability to work independently and as part of a fast-paced, innovative team. Preferred Qualifications: -Experience working in a fintech or financial services environment. -Familiarity with additional testing frameworks or scripting languages. -Prior exposure to internationalization and localization testing (e.g., RTL support for multilingual applications).
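Automated API checks of the kind listed above are often just pytest plus requests. A minimal sketch against a hypothetical /api/portfolio endpoint; the base URL, token, and expected response fields are illustrative assumptions, not this product's actual API.

```python
# Minimal API test sketch with pytest + requests. The base URL, auth token,
# and response fields are hypothetical placeholders.
import os

import pytest
import requests

BASE_URL = os.environ.get("QA_BASE_URL", "https://staging.example.com")
TOKEN = os.environ.get("QA_API_TOKEN", "test-token")


@pytest.fixture
def auth_headers():
    # Bearer-token header reused by every test in this module.
    return {"Authorization": f"Bearer {TOKEN}"}


def test_portfolio_endpoint_returns_expected_shape(auth_headers):
    resp = requests.get(f"{BASE_URL}/api/portfolio", headers=auth_headers, timeout=10)
    assert resp.status_code == 200
    body = resp.json()
    # Multi-currency amounts should carry an explicit currency code.
    assert {"total_value", "currency"} <= body.keys()
    assert body["currency"] in {"USD", "EUR", "INR"}
```

Tests like this slot straight into a CI/CD pipeline step once the environment variables point at the staging deployment.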

Posted 5 days ago

Apply

7.0 - 12.0 years

10 - 12 Lacs

ahmedabad

Work from Office

Experience: 7+ years. Qualification: B.E/B.Tech. System Architecture - Core PHP & Laravel Developer. Male candidate.

Posted 5 days ago

Apply

0.0 - 5.0 years

1 - 1 Lacs

ambala

Work from Office

Looking for someone with basic internet and Excel skills to work from home with international clients. What You Will Do: • Log in & perform actions on different accounts • Download and format reports • Search for info online & organize it neatly. Benefits: Flexi working, work from home

Posted 5 days ago

Apply

6.0 - 12.0 years

0 Lacs

bengaluru, karnataka, india

Remote

Job Description - Data Engineer. Location: Bangalore. Experience: 6 to 12 years. Choosing Capgemini means choosing a place where you'll be empowered to shape your career, supported by a collaborative global community, and inspired to reimagine what's possible. Join us in helping leading organizations drive scalable, sustainable growth. Your Role: DevOps, IT operations, Java, Microsoft Azure, PySpark, Python, cloud computing, cloud providers, data analysis, data management, data processing, data science, information technology, multi-paradigm programming, programming. Your Profile: Experience in public cloud, Python, data science, software development, system administration, and technology; Data Engineer. What You'll Love About Working Here: We value flexibility and support our employees with remote work options and adaptable schedules to maintain a healthy work-life balance. Our inclusive culture brings together diverse professionals committed to growth, innovation, and excellence. You'll have access to continuous learning opportunities and certifications in emerging technologies like cloud and AI. About Us: Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transformation to address the evolving needs of customers and citizens. With a strong 55-year heritage and deep industry expertise, Capgemini is trusted by its clients to address the entire breadth of their business needs, from strategy and design to operations. To achieve this, Capgemini draws on the capabilities of its 360,000 team members in more than 50 countries, all driven by the purpose of unleashing human energy through technology for an inclusive and sustainable future. It is a responsible and diverse organization with market-leading capabilities in digital, cloud, and data.

Posted 5 days ago

Apply

0.0 - 1.0 years

1 - 1 Lacs

chennai

Work from Office

Responsibilities: * Extract data, process it, and enter it into the computer * Maintain accuracy and confidentiality * Meet deadlines and quality standards * Collaborate with the team on projects as needed. Benefits: Provident fund, gratuity, referral bonus, performance bonus, annual bonus, employee state insurance, maternity benefits in mediclaim policy, overtime allowance, shift allowance, joining bonus

Posted 5 days ago

Apply

0.0 - 1.0 years

1 - 1 Lacs

bangalore rural, bengaluru

Work from Office

Job Title: Data Entry Operator - Banking Sector. Location: Koramangala, Bangalore. We are hiring Data Entry Operators for the banking field. Candidates must have excellent typing speed and good knowledge of MS Excel. Both graduates and undergraduates are welcome to apply. Freshers with strong attention to detail and accuracy will be preferred. Salary: 15,000 (in-hand). Work Days: 5 days working. Location: Koramangala, Bangalore. If you are eager to start your career in the banking sector with data entry and Excel operations, this role is for you! Interested and experienced candidates can apply by: Call/WhatsApp: 9205488912 (Pragati) Email: pragti.saxena@cielhr.com

Posted 5 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
