
4894 Data Processing Jobs - Page 40

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

15.0 - 20.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: PySpark
Good-to-have skills: Python (Programming Language)
Minimum experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team through the development process while ensuring alignment with organizational goals. You will also engage in strategic planning and decision-making to enhance application performance and user experience, fostering a culture of innovation and continuous improvement within your team.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior professionals to enhance their skills and knowledge.
- Facilitate workshops and meetings to drive project objectives and gather feedback.

Professional & Technical Skills:
- Must-have: Proficiency in PySpark.
- Good to have: Experience with Python (Programming Language).
- Strong understanding of data processing frameworks and distributed computing.
- Experience with cloud platforms and services related to application development.
- Familiarity with Agile methodologies and project management tools.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have a minimum of 5 years of experience in PySpark.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Qualification: 15 years of full-time education
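The core skill this listing asks for is PySpark-style distributed data processing. As a rough, hedged illustration of the pattern a PySpark `groupBy` aggregation implements, here is the map/shuffle/reduce flow in plain Python on a single machine (the records and grouping key are invented example data, not anything from the listing):

```python
from collections import defaultdict

# Toy records standing in for a distributed dataset (invented example data).
records = [
    ("bengaluru", 120), ("chennai", 80),
    ("bengaluru", 60), ("chennai", 40), ("pune", 30),
]

# Map phase: emit (key, value) pairs.
pairs = [(city, amount) for city, amount in records]

# Shuffle phase: group values by key, as Spark does across partitions.
groups = defaultdict(list)
for key, value in pairs:
    groups[key].append(value)

# Reduce phase: aggregate each group, analogous to df.groupBy("city").sum().
totals = {key: sum(values) for key, values in groups.items()}
print(totals)  # {'bengaluru': 180, 'chennai': 120, 'pune': 30}
```

In PySpark the same logic would be expressed declaratively and executed across a cluster; this sketch only shows the shape of the computation.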

Posted 3 weeks ago

Apply

3.0 - 5.0 years

20 - 25 Lacs

Chennai, Bengaluru

Work from Office

Position Purpose

The ISPL RE APS Automation & Tooling team is being built to set up a Real Estate APS Center of Excellence in synergy with teams in France and other international sites of Real Estate APS, leveraging the expertise of BNP Paribas Paris teams and ISPL IT skills to deliver automated and standardized solutions. The APS Automation & Tooling team provides a set of value-added, industrialized solutions and services contributing to major IT transformations, including infrastructure obsolescence management, modernization of tools and technologies, automation of operations, and definition of referential guidelines.

The Production Engineer (Automation & Tooling) will be primarily responsible for the setup and maintenance of automation and reporting models, while contributing to projects and programs linked to RE APS. This team will actively coordinate and lead, together with the France team, different transformation levers of Real Estate Production. To meet these challenges, the APS team will use BNP Paribas procedures and processes defined by the Paris teams or jointly with ISPL teams. A strong relationship is also to be built between the RE APS team and dedicated business-line teams. The team's scope will expand in the future beyond the first set of described activities, depending on business application needs and the overall BNP Paribas IT organisation.

Responsibilities

Direct Responsibilities (the Production Engineer's duties, as a whole or in part):
- Ansible development
- Continuous deployment tooling and support
- Certificate renewal
- ServiceNow automation
- Monitoring and alerting setup
- Developing and maintaining dashboards and reporting
- Ensuring descriptive modelling
- Ensuring knowledge sharing
- Ensuring quality and security

Contributing Responsibilities:
- Contribute to knowledge transfer with Paris APS teams.
- Contribute to the definition of procedures and processes necessary for the team.
- Help build team spirit and integrate into BNP Paribas culture.
- Contribute to regular activity reporting and KPI calculation.
- Contribute to continuous improvement actions.
- Contribute to the ISPL team's acquisition of new skills and knowledge to expand its scope.
- Contribute to continuous process improvement.
- Contribute to user training and information.
- Contribute to projects and programs.

Technical & Behavioral Competencies
- Strong knowledge of ITIL
- Strong knowledge of IT infrastructure
- Knowledge of Ansible
- Knowledge of production tools such as Dynatrace, Elasticsearch, and AutoSys
- Strong knowledge of data visualization tools (Tableau)
- Knowledge of scripting and automation workflows
- Strong knowledge of data processing (Dataiku)
- Knowledge of Data Lab environments: test-and-learn methodologies, A/B testing, and experience with control methodology
- Good written and spoken English
- Ability to measure and identify areas for improving quality and overall delivery
- Able to communicate efficiently
- Good team player
- Knowledge of ServiceNow development

Specific Qualifications (if required)

Skills Referential
Behavioural Skills (up to 4):
- Ability to collaborate / teamwork
- Ability to deliver / results driven
- Communication skills, oral and written
- Attention to detail / rigor

Transversal Skills (up to 5):
- Ability to develop and adapt a process
- Analytical ability
- Ability to set up relevant performance indicators
- Ability to understand, explain and support change
- Ability to manage / facilitate a meeting, seminar, committee, or training

Education Level: Master's degree or equivalent
Experience Level: At least 3 years

Posted 3 weeks ago

Apply

2.0 - 3.0 years

1 - 2 Lacs

Coimbatore

Work from Office

Data entry and management: manage data using Excel and Tally ERP, assist with ERP setup, organize product records and images, maintain documentation, handle accounts support (invoices, billing, reports), and support daily operations in a growing company.

Required candidate profile: Be ready to help set up systems and processes, assist with ERP, improve daily operations with Excel and AI tools, capture and edit product photos, and maintain updated price lists and records.

Posted 3 weeks ago

Apply

10.0 - 15.0 years

40 - 45 Lacs

Chennai

Work from Office

Position Overview

We are seeking an exceptional Staff Data Engineer to join our team as a senior technical leader responsible for architecting and implementing enterprise-scale data solutions that directly impact our cybersecurity platform's strategic objectives. This role combines deep technical expertise with significant organizational influence, requiring someone who can design complex data architectures, mentor engineering teams, and drive the technical vision for our multi-petabyte data infrastructure serving millions of users.

Essential Qualifications (all required)
- Bachelor's degree in Computer Science, Data Engineering, or a related technical field (Master's degree preferred)
- Minimum 10-15 years of hands-on data engineering experience in large-scale, high-volume environments
- Expert-level proficiency in SQL optimization for multi-petabyte datasets and advanced Python programming
- Extensive experience with Snowflake architecture, optimization, and cost management at enterprise scale
- Demonstrated expertise building Kimball dimensional data warehouses and star schema architectures
- Advanced experience with AWS cloud infrastructure and either dbt or SQLMesh for data transformations
- Hands-on experience with CI/CD pipelines, GitFlow, and data governance frameworks ensuring GDPR/PCI-DSS compliance
- Proven ability to architect and own multiple entire areas of the codebase or services independently

Advanced Technical Requirements
- Expertise with Apache Kafka, real-time data processing, and event-driven architectures
- Experience with Apache Iceberg, data lakehouse architectures, and modern table formats
- Demonstrated expertise in cloud cost optimization and data warehouse performance tuning

Desired Qualifications
- Experience with MLOps frameworks, LLM integration, vector databases, and AI-driven analytics (we know it's new, but some familiarity is a plus)
- Proven track record with subscription business models, including cohort analysis, LTV calculations, and marketing attribution
- Direct industry experience in cybersecurity, ad-tech, or high-volume data processing environments (billions of events per day)

Key Responsibilities
- Design and architect enterprise-scale data solutions that directly impact Gen's data strategy execution; this might mean designing and implementing an incremental identity graph, integrating a marketing attribution vendor, or automating LTV calculation.
- Lead evaluation and implementation of cutting-edge technologies, including the modern data stack and advanced analytics.
- Collaborate with executives and senior leadership to align data architecture with business strategy.
- Mentor IC9 and IC10 engineers, focusing on architectural decision-making and advanced technical skills.
- Drive complex, multi-quarter technical initiatives requiring coordination across multiple engineering teams.

This position requires a unique combination of deep technical expertise, strategic thinking, and leadership capabilities. The successful candidate will be a recognized expert in their domains within the broader data engineering community and capable of driving significant technical and business impact at enterprise scale.

Gen is proud to be an equal-opportunity employer, committed to diversity and inclusivity. We base employment decisions on merit, experience, and business needs, without considering race, color, national origin, age, religion, sex, pregnancy, genetic information, disability, medical condition, marital status, sexual orientation, gender identity or expression, military or veteran status, or other unlawful factors. Gen prohibits discrimination based on these protected characteristics and recruits talented candidates from diverse backgrounds. We consider individuals with arrest and conviction records and do not discriminate against employees for discussing their own pay or that of other employees or applicants. Learn more about pay transparency. To conform to U.S. export control regulations, applicants should be eligible for any required authorizations from the U.S. Government.

Posted 3 weeks ago

Apply

12.0 - 17.0 years

45 - 50 Lacs

Bengaluru

Work from Office

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 12+ years of experience in the software industry, with 3+ years in engineering management leading agile teams in the design, development, and delivery of highly scalable products.
- Strong, demonstrable track record of hiring and retaining top-tier talent.
- Outstanding verbal and written communication skills, with the ability to clearly communicate a vision and get people invested in success.
- Exposure to the process and nuances of operationalising and supporting products and services used by thousands of customers simultaneously.
- Ability to create, drive, and evangelize cross-team processes and achieve org-wide impact.
- 5+ years of senior management experience with a solid track record of building and leading engineering teams.
- Experience with JavaScript (ReactJS preferred) and web technologies.
- Experience developing RESTful services.
- Familiarity with technologies and design concepts around big data processing and relational databases, such as ETL, the Hadoop ecosystem, structured data, SQL schemas and queries, etc.
- Knowledge of DevOps technologies such as Jenkins, Kubernetes, and Spinnaker.
- Knowledge of and experience with a cloud platform such as OCI, AWS, or Azure.
- Knowledge of standard software engineering processes across the development lifecycle, including agile methodologies, coding standards, code reviews, source management, build processes, testing, and operations.
- Demonstrated knowledge of and experience with major cloud providers (Oracle Cloud, AWS, Microsoft Azure, or Google Cloud).
- Ability to interact directly with clients and a passion for helping them succeed.

Career Level: M3

We are seeking an experienced leader who enjoys building and maintaining complex, highly technical products and services. As a member of the Care Coordination team, you will help us add value for customers by delivering defect-free solutions. Responsibilities include:
- Providing technical leadership, direction, and guidance for the existing team.
- Establishing and developing the team capacity needed to execute on strategic projects.
- Mentoring and developing junior engineers to senior levels.
- Successfully leading a local team as part of a larger, geographically distributed organization.
- Delivering projects on time and with high quality.
- Working across multiple technologies and switching gears based on business needs.
- Overseeing operational reviews and taking appropriate actions to keep the system available and meeting standards.
- Getting involved in sprint planning and ensuring the sprint commitment is met.
- Working with leadership, senior engineers, program managers, and product managers to develop compelling products and services that meet customer needs.
- Providing constructive feedback to team members and other stakeholders.
- Conducting performance appraisals for direct reports.
- Raising the bar for product quality and customer experience.

Posted 3 weeks ago

Apply

7.0 - 8.0 years

9 - 10 Lacs

Bengaluru

Work from Office

Location: Bengaluru
Designation: Assistant Manager, Audit & Assurance (A&A), RBI's IDPMS and EDPMS

The team: Assurance is about much more than just the numbers. It's about attesting to accomplishments and challenges, and helping to assure strong foundations for future aspirations. Deloitte exemplifies the what, how, and why of change, so you're always ready to act ahead. Learn more about the Audit & Assurance practice.

Your work profile: In our Assurance (A&A) team, you'll build and nurture positive working relationships with teams and clients, with the intention of exceeding client expectations.

Job summary: The Compliance Analyst will be responsible for analyzing data from the Reserve Bank of India's (RBI) Import Data Processing and Monitoring System (IDPMS) and Export Data Processing and Monitoring System (EDPMS) portals. The role involves identifying compliance and non-compliance cases related to import and export transactions, preparing detailed reports for management, and collaborating with relevant stakeholders to resolve issues per actionable guidelines.

Key Responsibilities:
- Data analysis: Extract and analyze data from RBI's IDPMS and EDPMS portals to identify compliance and non-compliance cases. Monitor the status of Bills of Entry (BoE) and Shipping Bills (SB) to ensure timely and accurate reporting.
- Compliance monitoring: Track and document instances of non-compliance, including delayed submissions, discrepancies, and other irregularities. Ensure all import and export transactions adhere to RBI guidelines and regulations.
- Actions: Work closely with internal departments, including finance and operations, to address compliance issues. Implement actionable guidelines to address non-compliance cases. Provide recommendations for process improvements to enhance compliance monitoring and reporting.

Everyone's welcome: entrust your happiness to us. Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here's a glimpse of what's in store for you.

Interview tips: We want job seekers exploring opportunities at Deloitte to feel prepared, confident, and comfortable. To help with your interview, we suggest you do your research and know some background about the organisation and the business area you're applying to. Check out recruiting tips from Deloitte professionals.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Pune

Work from Office

The candidate must possess in-depth functional knowledge of the process area and apply it to operational scenarios to provide effective solutions. He/she must be able to identify discrepancies and propose optimal solutions using a logical, systematic, and sequential methodology. It is vital to be open-minded towards inputs and views from team members and to effectively lead, control, and motivate groups towards company objectives. Additionally, he/she must be self-directed, proactive, and seize every opportunity to meet internal and external customer needs and achieve customer satisfaction by effectively auditing processes, implementing best practices and process improvements, and utilizing the frameworks and tools available. Goals and thoughts must be clearly and concisely articulated and conveyed, verbally and in writing, to clients, colleagues, subordinates, and supervisors.

Process Manager roles and responsibilities:
- Exposure to APIs and data integrations using REST.
- Write efficient, production-grade Python scripts for data extraction, transformation, and automation tasks.
- Design and optimize SQL queries to analyze large, complex datasets across multiple sources (Presto, Hive, Spark, etc.).
- Develop and maintain reusable ETL pipelines using Python libraries such as pandas, SQLAlchemy, or Airflow, or any tools at the client's disposal.
- Willingness to learn any new ETL tool per client requirements.
- Client-facing ownership: lead weekly or bi-weekly calls with clients, understand their analytical needs, and translate them into actionable data tasks.
- Present insights and dashboards in a structured, business-friendly manner using tools like Tableau, Power BI, or Google Data Studio.
- Manage project timelines, expectations, and delivery quality for 13 active clients simultaneously.
- Contribute to standardizing and documenting reusable scripts and SQL templates.
- Collaborate effectively with internal teams and client SPOCs to deliver end-to-end solutions.
- Ensure data accuracy, integrity, and compliance during data handling and reporting.

Technical & functional skills:
- 5+ years of experience in data roles involving Python and SQL in production environments.
- 2-3 years of direct client interaction, ideally in a services or consulting setup.
- Strong problem-solving skills with measurable outcomes from past projects (e.g., improved reporting efficiency by X%, reduced data processing time by Y%).
- Excellent communication, both verbal and written, with the ability to translate complex data into business insights.
- Experience working with cloud data platforms (e.g., AWS, GCP).
- Familiarity with project management tools (e.g., JIRA, Asana).
- Knowledge of data modeling and performance tuning.
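A minimal sketch of the kind of reusable extract-transform-load step this listing describes, using only the standard library (an in-memory CSV string and SQLite stand in for the real sources and warehouse; the schema and client names are invented for illustration):

```python
import csv
import io
import sqlite3

# Extract: parse CSV input (an in-memory string stands in for a real source file).
raw = "client,revenue\nacme,1200\nacme,800\nglobex,500\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: cast string fields to the types the target table expects.
for row in rows:
    row["revenue"] = int(row["revenue"])

# Load: write into a database table (in-memory SQLite stands in for the warehouse).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE revenue (client TEXT, revenue INTEGER)")
conn.executemany("INSERT INTO revenue VALUES (:client, :revenue)", rows)

# An analyst-facing aggregate query, the kind of SQL the role would optimize.
totals = dict(conn.execute(
    "SELECT client, SUM(revenue) FROM revenue GROUP BY client ORDER BY client"
))
print(totals)  # {'acme': 2000, 'globex': 500}
```

A production pipeline would swap these stand-ins for pandas/SQLAlchemy readers and an orchestrator such as Airflow, but the extract-transform-load shape is the same.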

Posted 3 weeks ago

Apply

2.0 - 4.0 years

15 - 22 Lacs

Noida

Work from Office

Role Description: This is a full-time role for a Software Developer (C++/Golang) based in Noida. You will develop and optimize in-house low-latency platforms that support our high-frequency trading (HFT) and arbitrage strategies. Collaborating closely with cross-functional teams, you will enhance system performance and ensure reliable data processing.

Key Responsibilities:
- Create and maintain high-performance trading applications using C++/Golang.
- Design and implement low-latency platforms to support trading strategies.
- Optimize existing systems for improved performance and reduced latency.
- Work with real-time data processing and apply latency reduction techniques.
- Collaborate with teams to enhance HFT and arbitrage strategies.
- Analyze system performance metrics and identify areas for improvement.
- Resolve bottlenecks to enhance code speed and efficiency.
- Implement and optimize high-performance, high-availability distributed systems.
- Document software design and implementation processes clearly.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Minimum of 2 years of experience with low-latency platforms and high-performance distributed systems.
- Familiarity with Linux/Unix operating systems.
- Experience with Git or other version control systems.
- Strong analytical skills with a systematic approach to problem-solving.

Preferred Skills:
- Experience in developing low-latency trading systems.
- Understanding of financial markets and trading concepts.
- Good knowledge of parallel programming paradigms.
- A good grasp of memory management.
- Experience with DevOps workflows involving, but not limited to, Ansible and Jenkins.
- Excellent problem-solving and critical thinking abilities.
- Strong communication and collaboration skills within fast-paced, multidisciplinary teams.
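"Analyze system performance metrics" in a low-latency role usually means measuring per-operation latency and watching the tail, since HFT systems are judged on p99 rather than the mean. A toy sketch of that measurement (in Python for brevity rather than the C++/Golang the team uses; the "operation" being timed is an invented stand-in):

```python
import statistics
import time

def measure_latency_ns(op, runs=1000):
    """Time one operation many times and return per-run latencies in nanoseconds."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter_ns()
        op()
        samples.append(time.perf_counter_ns() - start)
    return samples

# A stand-in "order book update" workload (invented for illustration).
book = []
samples = measure_latency_ns(lambda: book.append(0))

p50 = statistics.median(samples)
p99 = statistics.quantiles(samples, n=100)[98]  # 99th percentile cut point
print(f"p50={p50}ns p99={p99}ns")
assert p99 >= p50  # the tail is never faster than the median
```

Real latency work would use hardware timestamps and histograms rather than Python timers, but the p50-versus-p99 framing carries over.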

Posted 3 weeks ago

Apply

4.0 - 7.0 years

12 - 13 Lacs

Hyderabad

Work from Office

As a Junior/Senior Data Engineer, you'll take the lead in designing and maintaining complex data ecosystems. Your experience will be instrumental in optimizing data processes, ensuring data quality, and driving data-driven decision-making within the organization. Responsibilities include:
- Architecting and designing complex data systems and pipelines.
- Leading and mentoring junior data engineers and team members.
- Collaborating with cross-functional teams to define data requirements.
- Implementing advanced data quality checks and ensuring data integrity.
- Optimizing data processes for efficiency and scalability.
- Overseeing data security and compliance measures.
- Evaluating and recommending new technologies to enhance data infrastructure.
- Providing technical expertise and guidance for critical data projects.

Required skills & experience:
- Proficiency in designing and building complex data pipelines and data processing systems.
- Leadership and mentorship capabilities to guide junior data engineers and foster skill development.
- Strong expertise in data modeling and database design for optimal performance.
- Skill in optimizing data processes and infrastructure for efficiency, scalability, and cost-effectiveness.
- Knowledge of data governance principles, ensuring data quality, security, and compliance.
- Familiarity with big data technologies like Hadoop, Spark, or NoSQL.
- Expertise in implementing robust data security measures and access controls.
- Effective communication and collaboration skills for cross-functional teamwork and defining data requirements.

Skills:
- Cloud: Azure/GCP/AWS
- DE technologies: ADF, BigQuery, AWS Glue, etc.
- Data lake: Snowflake, Databricks, etc.
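The "data quality checks and data integrity" responsibility above can be sketched as a set of reusable row-level validators that a pipeline runs before loading data. This is a hedged illustration only; the rule names, schema, and records are invented, not from the listing:

```python
# Reusable row-level quality checks of the kind a pipeline runs before load
# (field names and thresholds are invented for illustration).
def check_not_null(row, field):
    return row.get(field) is not None

def check_in_range(row, field, lo, hi):
    value = row.get(field)
    return value is not None and lo <= value <= hi

records = [
    {"id": 1, "amount": 250},
    {"id": 2, "amount": None},    # fails the not-null check
    {"id": 3, "amount": 99_999},  # fails the range check
]

# Collect the ids of rows that fail any check, for quarantine or alerting.
failures = [
    row["id"]
    for row in records
    if not (check_not_null(row, "amount")
            and check_in_range(row, "amount", 0, 10_000))
]
print(failures)  # [2, 3]
```

Frameworks like Great Expectations or dbt tests package the same idea declaratively; the underlying logic is row predicates plus a report of violations.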

Posted 3 weeks ago

Apply

0.0 - 2.0 years

5 - 8 Lacs

Bengaluru

Work from Office

The Cloud Developer builds from the ground up to meet the needs of mission-critical applications and is always looking for innovative approaches to deliver end-to-end technical solutions that solve customer problems. Brings technical thinking to break down complex data and to engineer new ideas and methods for solving, prototyping, designing, and implementing cloud-based solutions. Collaborates with project managers and development partners to ensure effective and efficient delivery, deployment, operation, monitoring, and support of cloud engagements. The Cloud Developer provides business-value expertise to drive the development of innovative service offerings that enrich HPE's Cloud Services portfolio across multiple systems, platforms, and applications.

Management Level Definition: Contributes to assignments of limited scope by applying technical concepts and theoretical knowledge acquired through specialized training, education, or previous experience. Acts as a team member by providing information, analysis, and recommendations in support of team efforts. Exercises independent judgment within defined parameters.

Responsibilities:
- Develops and maintains cloud application modules per feature specifications, adhering to security policies.
- Designs test plans, and executes and automates test cases for assigned portions of the application.
- Deploys code and debugs issues.
- Shares and reviews innovative technical ideas with peers, high-level technical contributors, technical writers, and managers.
- Analyzes science, engineering, business, and other data processing problems to develop and implement solutions to complex application problems, system administration issues, or network concerns.

Education and Experience Required: Bachelor's degree in computer science, engineering, information systems, or a closely related quantitative discipline; Master's desirable. Typically 0-2 years of experience.

Knowledge and Skills:
- Programming skills in Python, Java, Golang, or JavaScript.
- Understanding of basic testing, coding, and debugging procedures.
- Ability to quickly learn new skills and technologies and work well with other team members.
- Good written and verbal communication skills.
- Understanding of DevOps practices like continuous integration/continuous deployment (CI/CD).

Posted 3 weeks ago

Apply

0.0 - 2.0 years

13 - 14 Lacs

Bengaluru

Work from Office

The Cloud Developer builds from the ground up to meet the needs of mission-critical applications and is always looking for innovative approaches to deliver end-to-end technical solutions that solve customer problems. Brings technical thinking to break down complex data and to engineer new ideas and methods for solving, prototyping, designing, and implementing cloud-based solutions. Collaborates with project managers and development partners to ensure effective and efficient delivery, deployment, operation, monitoring, and support of cloud engagements. The Cloud Developer provides business-value expertise to drive the development of innovative service offerings that enrich HPE's Cloud Services portfolio across multiple systems, platforms, and applications.

Management Level Definition: Contributes to assignments of limited scope by applying technical concepts and theoretical knowledge acquired through specialized training, education, or previous experience. Acts as a team member by providing information, analysis, and recommendations in support of team efforts. Exercises independent judgment within defined parameters.

Responsibilities:
- Develops and maintains cloud application modules per feature specifications, adhering to security policies.
- Designs test plans, and executes and automates test cases for assigned portions of the application.
- Deploys code and debugs issues.
- Shares and reviews innovative technical ideas with peers, high-level technical contributors, technical writers, and managers.
- Analyzes science, engineering, business, and other data processing problems to develop and implement solutions to complex application problems, system administration issues, or network concerns.

Education and Experience Required: Bachelor's degree in computer science, engineering, information systems, or a closely related quantitative discipline; Master's desirable. Typically 0-2 years of experience.

Knowledge and Skills:
- Programming skills in Python, Java, Golang, or JavaScript.
- Understanding of basic testing, coding, and debugging procedures.
- Ability to quickly learn new skills and technologies and work well with other team members.
- Good written and verbal communication skills.
- Understanding of DevOps practices like continuous integration/continuous deployment (CI/CD).

Additional Skills: Cloud Architectures, Cross-Domain Knowledge, Design Thinking, Development Fundamentals, DevOps, Distributed Computing, Microservices Fluency, Full Stack Development, Release Management, Security-First Mindset, User Experience (UX)

Posted 3 weeks ago

Apply

8.0 - 13.0 years

7 - 10 Lacs

Bengaluru

Work from Office

This role will lead the design, development, and implementation of advanced monitoring and RCA (root cause analysis) functions for machine learning models and strategies to solve complex problems. You will work closely with data scientists, software engineers, and product teams to enhance services through backend applications and AI/ML solutions. Your role will involve building scalable monitoring backend applications, enabling RCA capabilities for detected anomalies, and fixing issues in production environments to drive business insights and improve customer experiences.

Job Description

Essential Responsibilities:
- Lead the development and optimization of advanced machine learning models.
- Oversee the preprocessing and analysis of large datasets.
- Deploy and maintain ML solutions in production environments.
- Collaborate with cross-functional teams to integrate ML models into products and services.
- Monitor and evaluate the performance of deployed models, making necessary adjustments.

Minimum Qualifications:
- Minimum of 8 years of relevant work experience and a Bachelor's degree or equivalent experience.
- Extensive experience with ML frameworks like TensorFlow, PyTorch, or scikit-learn.
- Expertise in cloud platforms (AWS, Azure, GCP) and tools for data processing and model deployment.

Preferred Qualifications:
- Lead the development, optimization, and maintenance of monitoring and RCA applications, enabling self-service for business partners to establish capabilities in fraud/incident detection and investigation.
- Build and optimize self-service monitoring & RCA systems using Java/Python frameworks and technologies.
- Maintain and improve existing applications, ensuring high availability and fault tolerance.
- Collaborate with engineers from other sites, data scientists, and business stakeholders to understand data requirements and deliver appropriate solutions.
- Proficiency in Java with the Spring Boot framework and Spring ecosystem, or Python with knowledge of pip package management.
- Experience with Maven for dependency management and build automation.
- Solid understanding of RESTful API development and HTTP client libraries.
- Experience with cloud-based data platforms (e.g., Google BigQuery).
- Experience with relational databases (e.g., PostgreSQL, MySQL) and/or NoSQL databases (e.g., MongoDB).
- Experience with time-series databases (InfluxDB).
- Experience with DevOps best practices, containerization (Docker/Kubernetes), version control systems (Git), and CI/CD.
- Familiarity with Linux environments; able to troubleshoot scripts (Shell/Python).
- Good documentation skills, with the flexibility to sync up remotely with teams across different locations and time zones.
- Experience with message queues (RabbitMQ, Apache Kafka).
- Experience building GenAI-based solutions.
- Strong problem-solving skills and attention to detail.
- Experience working in agile development environments.
- Excellent communication and collaboration skills.
- Experience in concentration analysis and root cause analysis of detected anomalies.
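The monitoring-and-RCA work this listing describes starts with flagging metric anomalies that are then traced to a cause. A minimal sketch of the detection half, a rolling z-score detector in pure Python (the metric series, window size, and threshold are invented for illustration, not anything the employer specifies):

```python
import statistics

def detect_anomalies(series, window=5, z_thresh=3.0):
    """Flag points more than z_thresh standard deviations away from the
    mean of the preceding `window` points (a simple rolling z-score)."""
    anomalies = []
    for i in range(window, len(series)):
        baseline = series[i - window:i]
        mu = statistics.mean(baseline)
        sigma = statistics.pstdev(baseline) or 1e-9  # avoid divide-by-zero
        if abs(series[i] - mu) / sigma > z_thresh:
            anomalies.append(i)
    return anomalies

# A flat metric with one spike at index 7 (invented monitoring data).
metric = [10, 11, 10, 12, 11, 10, 11, 95, 11, 10]
print(detect_anomalies(metric))  # [7]
```

A production system would persist these flags to a time-series store and kick off the RCA workflow (drilling into dimensions to find which segment drove the spike); this sketch covers only the detection step.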

Posted 3 weeks ago

Apply

5.0 - 10.0 years

5 - 8 Lacs

Bengaluru

Work from Office

This role will design, develop, and implement machine learning models and algorithms to solve complex problems. You will work closely with data scientists, software engineers, and product teams to enhance services through innovative AI/ML solutions. Your role will involve building scalable ML pipelines, ensuring data quality, and deploying models into production environments to drive business insights and improve customer experiences.

Essential Responsibilities:
- Develop and optimize machine learning models for various applications.
- Preprocess and analyze large datasets to extract meaningful insights.
- Deploy ML solutions into production environments using appropriate tools and frameworks.
- Collaborate with cross-functional teams to integrate ML models into products and services.
- Monitor and evaluate the performance of deployed models.

Minimum Qualifications:
- Minimum of 5 years of relevant work experience and a Bachelor's degree or equivalent experience.
- Experience with ML frameworks like TensorFlow, PyTorch, or scikit-learn.
- Familiarity with cloud platforms (AWS, Azure, GCP) and tools for data processing and model deployment.
- Several years of experience designing, implementing, and deploying machine learning models.

Preferred Qualifications:
- Advanced proficiency in multiple programming and scripting languages, including Python, Java, Scala, and Unix/Linux shell scripting.
- Demonstrated expertise in designing, implementing, and deploying end-to-end AI/ML solutions in production environments, on-prem and in the cloud (GCP, AWS, or Azure).
- Hands-on experience with big data technologies such as Hadoop, Spark, HBase, and Kafka.
- Expertise in data modeling and feature engineering, and a strong grasp of traditional machine learning algorithms (e.g., neural networks, linear regression, logistic regression, random forests).
- Experience with Large Language Models (LLMs), particularly in areas like RAG, MCP, and agentic systems.
- Experience with containerization tools like Docker and Kubernetes; GCP experience is a distinct advantage.
- Strong proficiency in SQL, ETL processes, and database design, with practical knowledge of NoSQL systems like HBase, Redis, and Aerospike.
- Solid understanding of distributed systems, real-time data streaming, and complex event processing architectures.
- Knowledge of front-end and back-end development; full-stack engineering experience is a plus.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

bengaluru

Work from Office

Develop, implement, and maintain an advanced machine-learning-based monitoring platform with cutting-edge big data capability. Your role will involve building scalable monitoring pipelines, ensuring data quality, and fixing issues in production environments to drive business insights and improve customer experiences.

Essential Responsibilities
- Develop and optimize machine learning models for various applications.
- Preprocess and analyze large datasets to extract meaningful insights.
- Deploy ML solutions into production environments using appropriate tools and frameworks.
- Collaborate with cross-functional teams to integrate ML models into products and services.
- Monitor and evaluate the performance of deployed models.

Minimum Qualifications
- Minimum of 5 years of relevant work experience and a Bachelor's degree or equivalent experience.
- Experience with ML frameworks like TensorFlow, PyTorch, or scikit-learn.
- Familiarity with cloud platforms (AWS, Azure, GCP) and tools for data processing and model deployment.
- Several years of experience in designing, implementing, and deploying machine learning models.

Preferred Qualifications
- Strong programming skills in big data processing (Pig/Scala + Java/Python) and SQL.
- Strong data-issue investigation and problem-solving skills; ability to synthesize information and generalize patterns.
- Expertise in big data platforms and infrastructure.
- Develop, optimize, and maintain ETL pipelines to handle large volumes of data from multiple sources for advanced machine learning models.
- Build and optimize distributed data processing systems using big data frameworks and technologies.
- Maintain and improve existing data infrastructure, ensuring high availability and fault tolerance.
- Collaborate with engineers from other sites, data scientists, and business stakeholders to understand data requirements and deliver appropriate solutions.
- Strong proficiency in Python, Java, or Scala.
- Extensive experience with Apache Spark (Spark SQL, Spark Streaming, PySpark).
- Hands-on experience with the Hadoop ecosystem (HDFS, YARN, Hive, HBase).
- Experience with cloud-based data platforms (Google BigQuery).
- Experience with relational databases (e.g., PostgreSQL, MySQL) and/or NoSQL databases (e.g., MongoDB).
- Experience with version control systems (Git) and CI/CD practices.
- Familiarity with Linux environments; able to troubleshoot and write automation scripts (Shell/Python).
- Good documentation habits; able to sync up remotely with teams across different locations.
- Good understanding of security principles and data protection.
- Experience with time-series databases (InfluxDB).
- Knowledge of RESTful API development and HTTP client libraries.
- Experience building GenAI-based solutions.
- Strong problem-solving skills and attention to detail.
- Experience working in agile development environments.
- Excellent communication and collaboration skills.
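The monitoring pipelines described above are, at their core, windowed aggregations over a metric stream. As a minimal sketch (pure Python, no Spark cluster assumed; the event data and window size are invented for illustration), here is a tumbling-window average of the kind a Spark Streaming job would express with `window()` and `avg()`:

```python
from collections import defaultdict

def tumbling_window_avg(events, window_secs):
    """Group (timestamp, value) events into fixed windows and average each.

    A toy stand-in for a streaming aggregation: each event lands in the
    window starting at (ts // window_secs) * window_secs.
    """
    buckets = defaultdict(list)
    for ts, value in events:
        buckets[(ts // window_secs) * window_secs].append(value)
    return {start: sum(vals) / len(vals) for start, vals in sorted(buckets.items())}

# Hypothetical latency samples: (epoch seconds, milliseconds)
events = [(0, 120), (5, 80), (12, 200), (14, 100), (21, 90)]
print(tumbling_window_avg(events, 10))
# windows: [0,10) -> 100.0, [10,20) -> 150.0, [20,30) -> 90.0
```

In a real deployment the same grouping would run incrementally over an unbounded stream, with watermarking to close windows despite late data.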

Posted 3 weeks ago

Apply

5.0 - 10.0 years

7 - 11 Lacs

chennai

Work from Office

This role will lead the DevOps work for advanced monitoring and RCA (root cause analysis) functions for machine learning models and strategies to solve complex problems. You will work closely with software engineers across different sites to enhance service stability, scalability, and continuity through DevOps support. Your role will involve building stable CI/CD pipelines and containerized environments, setting up SQL/NoSQL databases with high availability, enabling portability with distributed systems and Dockerized solutions, and ensuring continuity with system upgrades and patching, supporting both software installation and hardware maintenance.

Job Description

Essential Responsibilities
- Develop and optimize machine learning models for various applications.
- Preprocess and analyze large datasets to extract meaningful insights.
- Deploy ML solutions into production environments using appropriate tools and frameworks.
- Collaborate with cross-functional teams to integrate ML models into products and services.
- Monitor and evaluate the performance of deployed models.

Minimum Qualifications
- Minimum of 5 years of relevant work experience and a Bachelor's degree or equivalent experience.
- Experience with ML frameworks like TensorFlow, PyTorch, or scikit-learn.
- Familiarity with cloud platforms (AWS, Azure, GCP) and tools for data processing and model deployment.
- Several years of experience in designing, implementing, and deploying machine learning models.

Preferred Qualifications
- Proficiency with Docker, Kubernetes, Unix/Linux, CI/CD, SQL, cloud/network security, and Python/Java.
- Build stable CI/CD pipelines and containerized environments to support AI/ML solutions.
- Set up databases (SQL, NoSQL, and TSDB) with high availability.
- Enable scalability and portability with distributed systems and Dockerized solutions.
- Ensure continuity with system upgrades and patching, supporting both software installation and hardware maintenance.
- Collaborate with engineers from other sites, data scientists, and business stakeholders to understand DevOps requirements and deliver appropriate solutions.
- Proficiency in establishing and maintaining containerized systems (Docker, Kubernetes).
- Experience installing and managing relational databases (e.g., PostgreSQL, MySQL) and/or NoSQL databases (e.g., MongoDB).
- Experience installing and managing time-series databases (InfluxDB).
- Experience with DevOps best practices, version control systems (Git), and CI/CD.
- Familiarity with Linux environments; able to troubleshoot in Linux and write shell/Python scripts.
- Experience with DevOps on cloud-based platforms (e.g., Google Cloud).
- Good documentation skills and the flexibility to sync up remotely with teams across different locations/time zones.
- Experience with message queues (RabbitMQ, Apache Kafka).
- Strong problem-solving skills and attention to detail.
- Experience working in agile development environments.
- Excellent communication and collaboration skills.

Posted 3 weeks ago

Apply

4.0 - 9.0 years

4 - 7 Lacs

mumbai

Work from Office

We are seeking a skilled Data Engineer with strong experience in PySpark, Python, Databricks, and SQL. The ideal candidate will be responsible for designing and developing scalable data pipelines and processing frameworks using Spark technologies.

Key Responsibilities:
- Develop and optimize data pipelines using PySpark and Databricks
- Implement batch and streaming data processing solutions
- Collaborate with data scientists, analysts, and business stakeholders to gather data requirements
- Work with large datasets to perform data transformations
- Write efficient, maintainable, and well-documented PySpark code
- Use SQL for data extraction, transformation, and reporting tasks
- Monitor data workflows and troubleshoot performance issues on Spark platforms
- Ensure data quality, integrity, and security across systems

Ideal Candidate Profile:
- 3+ years of hands-on experience with Databricks
- 4+ years of experience with PySpark and Python
- Strong knowledge of the Apache Spark ecosystem and its architecture
- Proficiency in writing complex SQL queries (3+ years)
- Experience in handling large-scale data processing and distributed systems
- Good understanding of data warehousing concepts and ETL pipelines

Why Join Exponentia.ai
- Innovate with Purpose: Opportunity to create pioneering AI solutions in partnership with leading cloud and data platforms
- Shape the Practice: Build a marquee capability from the ground up with full ownership
- Work with the Best: Collaborate with top-tier talent and learn from industry leaders in AI
- Global Exposure: Be part of a high-growth firm operating across the US, UK, UAE, India, and Singapore
- Continuous Growth: Access to certifications, tech events, and partner-led innovation labs
- Inclusive Culture: A supportive and diverse workplace that values learning, initiative, and ownership
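The pipeline work this role describes boils down to filter/normalize/aggregate transformations. As a minimal sketch (pure Python, no Spark runtime assumed; the field names and records are invented), here is the shape of logic a PySpark job would express with `df.filter(...).withColumn(...).groupBy(...).agg(...)`:

```python
def transform(rows):
    """Toy batch transformation: drop bad records, normalize keys, aggregate."""
    cleaned = [
        {"region": r["region"].strip().lower(), "amount": float(r["amount"])}
        for r in rows
        if r.get("region") and r.get("amount") not in (None, "")
    ]
    totals = {}
    for r in cleaned:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

rows = [
    {"region": " South ", "amount": "10.5"},
    {"region": "south", "amount": "4.5"},
    {"region": "North", "amount": "7"},
    {"region": "", "amount": "3"},  # dropped: missing region
]
print(transform(rows))  # {'south': 15.0, 'north': 7.0}
```

The Spark version distributes the same computation across executors, but the data-quality rules (what counts as a bad record, how keys are normalized) are the part the engineer actually owns.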

Posted 3 weeks ago

Apply

2.0 - 7.0 years

4 - 7 Lacs

kolkata, mumbai, new delhi

Work from Office

Develop, test, and maintain robust, scalable Java-based applications and products. Understand requirements and existing designs; build and deliver solutions with minimal supervision. Work with databases and message queues for efficient integration and processing. Apply data structures and algorithms to build performant, optimized solutions. Debug, troubleshoot, and resolve complex issues with high attention to detail. Use Git for version control and collaborative development. Suggest improvements, explore new technologies, and propose innovative solutions. Collaborate with cross-functional teams to deliver software in fast-paced environments.

What You Will Need to Accomplish the Job
- Bachelor's degree in Computer Science or a related field.
- 2+ years of proven hands-on experience as a Java Developer in application/product development.
- Strong understanding of databases and message queues.
- Solid foundation in data structures, algorithms, and problem-solving.
- Strong understanding of Object-Oriented Programming (OOP) principles and design patterns.
- Expertise in debugging complex systems.
- Strong knowledge of Git and version control best practices.
- Experience designing and integrating REST/SOAP/HTTP APIs.
- Proficiency in SQL scripting and experience with MS SQL databases.
- Self-motivated, with a passion for continuous learning and coding excellence.
- Strong communication skills; able to work independently and in teams.

Posted 3 weeks ago

Apply

10.0 - 15.0 years

10 - 14 Lacs

hyderabad

Work from Office

Define Architectural Vision: Define and drive the technical vision and long-term strategy for New Relic's core streaming data pipelines, ensuring they are scalable, reliable, and cost-effective.
Technical Leadership & Mentorship: Serve as a technical leader and mentor for multiple engineering teams working on the streaming platform. You will guide design, promote best practices in stream processing, and elevate the technical bar for the entire organization.
Hands-On Prototyping & Development: Engage in hands-on development for critical-path projects, building prototypes to de-risk new technologies and optimizing existing systems for performance or cost.
Solve Hard Problems: Tackle our most complex technical challenges related to data consistency, fault tolerance, and performance at extreme scale.
Cross-Functional Collaboration: Partner with product managers, engineering leaders, and other principal engineers to align the platform roadmap with business objectives and the needs of product engineering teams.
Evangelize and Educate: As a distributed organization, clear documentation and communication are paramount. You will create architectural documents, tech talks, and best-practice guides to share knowledge across the company.

What Your Playground Will Look Like
- One of the largest Apache Kafka deployments in the world, serving as the central nervous system for all New Relic data.
- A sophisticated stream processing environment utilizing Apache Flink and other frameworks to perform real-time data enrichment, aggregation, and analysis.
- A multi-cloud architecture (primarily AWS) leveraging services like Kubernetes (EKS), S3, and other cloud-native technologies.
- A polyglot environment with hundreds of services written predominantly in Java and Go.

Posted 3 weeks ago

Apply

1.0 - 6.0 years

3 - 6 Lacs

gurugram

Work from Office

Design, develop, and maintain scalable and efficient microservices using Java and Spring Boot. Design and integrate RESTful APIs. Work with Spring Batch to process large-scale data efficiently. Write unit tests to ensure robust and well-tested services, following best practices for test-driven development (TDD). Set up monitoring and alerting to identify and resolve production issues proactively. Collaborate with cross-functional teams to implement new features and optimize backend performance. Ensure high availability, reliability, and scalability of backend services. Take on new challenges and quickly adapt to emerging technologies with a problem-solving mindset. Get hands-on with new technologies and be open to learning and experimentation.

Required Skills & Qualifications:
- Bachelor's in Technology from a Tier 1 university.
- 1+ year of relevant experience in backend development, with hands-on knowledge of industry standards and best practices.
- Proficiency in an object-oriented language and a strong understanding of microservice architecture.
- Unit testing expertise, with experience writing test cases using JUnit, Mockito, or similar frameworks.
- Experience with cloud services (AWS).
- Strong understanding of relational database fundamentals and message queues (Kafka, SQS).
- Knowledge of monitoring and logging tools for production systems.
- Familiarity with version control systems (e.g., Git) and CI/CD pipelines.
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration skills.
- Bonus: Experience with Spring Batch, Storm, Elasticsearch, and Spark for real-time and batch data processing.

Posted 3 weeks ago

Apply

12.0 - 17.0 years

30 - 35 Lacs

mumbai

Work from Office

Techno-Functional Solution Design: Interpret business requirements and design holistic supply chain planning solutions (Demand, Supply, IBP, S&OP/S&OE, STD, TLB) using o9 platform capabilities.
Integration Architecture Leadership: Lead design workshops; define the end-to-end data integration strategy linking ERP and other sources (SAP, TPM, etc.) through to o9 integration layers.
Data Mapping & Architecture Governance: Oversee source-to-target mappings, platform architecture alignment, and governance of solution artifacts (Solution Design Document, Technical Design Document, Data Mapping).
Cross-Functional Stakeholder Facilitation: Run alignment sessions across regions, stakeholders, partners (o9), and implementation teams; influence decisions, define priorities, and resolve ambiguity.
Team & Vendor Coordination: Partner with the Scrum Master, Product Owner, development teams, and o9 consultants to define user stories, manage sprints, and onboard vendor expertise.
System Performance Optimization: Identify process improvement opportunities: batch cycle reduction, UI performance tuning, automation, and sustained performance enhancements.
Change Management & Knowledge Transfer: Drive adoption of new processes and tools with the change management team; lead training and hand over documentation to support teams during and after go-live.

Posted 3 weeks ago

Apply

9.0 - 10.0 years

40 - 45 Lacs

pune

Work from Office

Design and implement scalable and secure cloud integration solutions using AWS services such as AWS Lambda, API Gateway, AWS Glue, and Amazon S3, alongside Confluent Kafka for data streaming and Databricks for data processing and analytics.
Develop and maintain integration architectures, data flows, and APIs to connect various cloud and on-premises systems, leveraging Confluent Kafka for real-time data processing or other integration tools for batch processing.
Design and implement scalable data solutions on AWS, ensuring optimal performance, reliability, and cost-effectiveness while adhering to best practices in cloud data management.
Develop comprehensive cloud integration architectures using AWS services like API Gateway, Lambda, SQS, SNS, Kinesis, AppFlow, and other relevant components to meet specific integration requirements.
Implement DevOps practices and tools to streamline development and deployment processes, ensuring efficient collaboration and continuous integration/continuous deployment (CI/CD) pipelines.
Develop and optimize ETL pipelines and data models using the Medallion Architecture (Bronze, Silver, Gold layers) for improved data processing and to facilitate data transformation.
Collaborate with business analysts, data scientists, and stakeholders to understand their data requirements and deliver tailored solutions that leverage the capabilities of the Databricks Lakehouse platform.
Stay updated on AWS services and industry trends, recommending improvements and innovations to enhance integration processes.
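The Medallion Architecture mentioned above layers data as Bronze (raw), Silver (validated/typed), and Gold (business aggregates). A toy pure-Python sketch of the flow (no Databricks runtime assumed; the record fields are invented for illustration):

```python
# Minimal Bronze -> Silver -> Gold sketch of the Medallion Architecture.
bronze = [  # Bronze: raw ingested records, kept as-is
    {"order_id": "1", "qty": "2", "price": "9.99", "sku": "A"},
    {"order_id": "2", "qty": "x", "price": "5.00", "sku": "B"},  # bad qty
    {"order_id": "3", "qty": "1", "price": "5.00", "sku": "B"},
]

def to_silver(records):
    """Silver: validate and type-cast, dropping records that fail."""
    out = []
    for r in records:
        try:
            out.append({"sku": r["sku"], "qty": int(r["qty"]), "price": float(r["price"])})
        except ValueError:
            continue  # a real pipeline would quarantine the record instead
    return out

def to_gold(records):
    """Gold: business-level aggregate (revenue per SKU)."""
    revenue = {}
    for r in records:
        revenue[r["sku"]] = revenue.get(r["sku"], 0.0) + r["qty"] * r["price"]
    return revenue

silver = to_silver(bronze)  # record "2" is dropped at this layer
gold = to_gold(silver)
print(gold)  # {'A': 19.98, 'B': 5.0}
```

In Databricks each layer would be a Delta table, so consumers can query Silver directly while Gold feeds dashboards; the point of the layering is that raw data is never mutated and each refinement step is replayable.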

Posted 3 weeks ago

Apply

0.0 - 2.0 years

1 - 3 Lacs

noida, ghaziabad, new delhi

Work from Office

Scanzer Outsourcing is looking for DATA ENTRY OPERATORS & COMPUTER OPERATORS to join our dynamic team and embark on a rewarding career journey. Input and update data into computer systems.

Posted 3 weeks ago

Apply

0.0 - 2.0 years

1 - 3 Lacs

bengaluru

Work from Office

Role: Urgent Opening: Data Entry / Back Office Coordinator Industry Type: BPM / BPO Department: Customer Success, Service & Operations Employment Type: Full Time, Permanent Role Category: Back Office Education UG: Graduation Not Required

Posted 3 weeks ago

Apply

0.0 - 2.0 years

2 - 6 Lacs

mumbai

Work from Office

Roles & Responsibilities
The Analyst will work on back-office and middle-office processes for financial institutions, handling various stages of the client/product lifecycle across KYC, reference data management, legal docs, loans, portfolio reconciliation, document capture, system reconciliation, pre- and post-settlements, brokerage functions, drafting, trade support, corporate actions, tax operations, and more. Responsibilities also include data capture, cataloging, data processing, system inputs and updates, reconciliations, settlements, and fund transfers. The role involves preparing reports using MS Excel and may require external interaction with agents/counterparties/clients to resolve process-related queries and discrepancies via phone or email.

Key responsibilities include:
- Identifying and escalating risks, promptly reporting outstanding issues to clients.
- Performing various trade support activities across the trade lifecycle, such as trade confirmation matching, trade pre-settlements support, front-office to back-office reconciliation of trade positions, report generation, and settlement of cash flows from trading events (e.g., interest or premium).
- Handling operations of syndicated loans and corporate action setup and operations.
- Managing other capital market operational tasks beyond trade lifecycle support, including reference data support and regulatory reporting (Swaps Data Repository (SDR), Know Your Customer (KYC), and various front-office and back-office reconciliations).
- Learning and mastering various financial products, including equity securities and derivatives, interest rate swaps, FX spot, options, futures, credit derivatives, commodity derivatives, and fixed income products (e.g., corporate and Treasury bonds).

Qualification and Skills
- Bachelor's degree (B.Com, BBA, BBM, BCA) / Master's degree (M.Com, MBA, PGDM).
- 0 to 2 years of experience in investment banking operations involving projects, people, process, and client management.
- Basic knowledge of finance, the trade lifecycle, investment banking, and derivatives.
- Strong logical and quantitative abilities to derive insights from data.
- Excellent time management skills and the ability to resolve issues promptly.
- Proficiency in planning, organizing, and time management.
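The front-office to back-office reconciliation of trade positions mentioned in this role can be sketched in a few lines. A toy pure-Python example (trade IDs and quantities are invented; a real reconciliation would also compare price, currency, and settlement date):

```python
def reconcile(front_office, back_office):
    """Compare trade positions from two systems and report breaks.

    Inputs map trade_id -> quantity. A 'break' is any trade whose
    quantity differs between systems or that is missing from one side.
    """
    breaks = {}
    for trade_id in front_office.keys() | back_office.keys():
        fo = front_office.get(trade_id)
        bo = back_office.get(trade_id)
        if fo != bo:
            breaks[trade_id] = {"front_office": fo, "back_office": bo}
    return breaks

fo = {"T1": 100, "T2": 250, "T3": 75}
bo = {"T1": 100, "T2": 200}  # T2 quantity mismatch, T3 missing in back office
print(reconcile(fo, bo))
```

Matched trades (like T1) drop out; the analyst's work is then chasing down each break with the counterparty or the booking system.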

Posted 3 weeks ago

Apply

1.0 - 3.0 years

1 - 3 Lacs

dhule

Work from Office

The Computer Operator is responsible for overseeing the daily operation of computer systems used in construction project management. This includes managing project data, assisting with design and scheduling software, troubleshooting hardware and software issues, and ensuring that all construction technology runs smoothly to support ongoing and upcoming projects.

Key Responsibilities:
System Management:
- Monitor and maintain construction management software systems, including project management, scheduling, and budgeting tools.
- Operate and oversee software for design and drafting (e.g., AutoCAD, Revit) to support engineers, architects, and project managers.
- Ensure all construction-related data is accurately stored, organized, and backed up in the system.
Data Entry & Processing:
- Input and update project information, including material costs, labor hours, progress reports, and other relevant construction data.
- Generate daily, weekly, and monthly reports for project managers and stakeholders.
- Maintain accurate records of construction timelines, budgets, and resource allocation in digital systems.
Collaboration:
- Work closely with engineers, project managers, architects, and field teams to ensure that technological systems meet project needs.
- Communicate any system issues to senior management and IT personnel to ensure quick resolution.
Compliance & Security:
- Ensure that all construction data complies with industry regulations and company policies.
- Assist in maintaining cybersecurity protocols for sensitive construction data.
- Perform regular data backups and ensure the security of digital project information.

Skills & Qualifications:
Technical Skills:
- Proficiency in construction management software (e.g., Procore, Buildertrend, Microsoft Project).
- Familiarity with design and drafting software (e.g., AutoCAD, Revit).
- Basic knowledge of networking and troubleshooting hardware and software issues.
Experience:
- Previous experience in a computer operator or similar role, preferably within the construction industry.
- Understanding of construction terminology, processes, and industry standards.
Education:
- High school diploma or equivalent; an associate's degree in computer science, information technology, or a related field is preferred.

Mandatory Key Skills: troubleshooting, compliance, cybersecurity protocols, data entry, data processing, system management, construction project management, AutoCAD, Revit

Posted 3 weeks ago

Apply