
1649 Data Processing Jobs - Page 22

JobPe aggregates job listings for convenient access, but applications are submitted directly on the original job portal.

7.0 - 12.0 years

6 - 10 Lacs

Bengaluru

Work from Office


We're Celonis, the global leader in Process Mining technology and one of the world's fastest-growing SaaS firms. We believe there is a massive opportunity to unlock productivity by placing data and intelligence at the core of business processes - and for that, we need you to join us.

The Team: Our team is responsible for building the Celonis end-to-end Task Mining solution. Task Mining is the technology that allows businesses to capture user interaction (desktop) data, so they can analyze how people get work done and how they can do it even better. We own all the related components, e.g. the desktop client, the related backend services, the data processing capabilities, and the Studio frontend applications.

The Role: Celonis is looking for a Senior Software Engineer to build new features and increase the reliability of our Task Mining solution. You would contribute to the development of our Task Mining Client, so expertise in C# and the .NET Framework is required; knowledge of Java and Spring Boot is a plus.

The work you'll do:
- Implement highly performant and scalable desktop components to improve our existing Task Mining software
- Own the implementation of end-to-end solutions: leading the design, implementation, build, and delivery to customers
- Increase the maintainability, reliability, and robustness of our software
- Continuously improve and automate our development processes
- Document procedures and concepts, and share knowledge within and across teams
- Manage complex requests from support, finding the right technical solution and managing the communication with stakeholders
- Occasionally work directly with customers, including getting to know their systems in detail and helping them debug and improve their setup

The qualifications you need:
- 7+ years of professional experience building .NET applications
- Passion for writing clean code that follows SOLID principles
- Hands-on experience in C# and the .NET Framework
- Experience in user interface development using WPF and MVVM
- Familiarity with Java and the Spring framework is a plus
- Familiarity with containerization technologies (e.g. Docker)
- Experience with REST APIs and/or distributed microservice architectures
- Experience with monitoring and log analysis tools (e.g. Datadog)
- Experience in writing and setting up unit and integration tests
- Experience in refactoring legacy components
- Ability to supervise and coach junior colleagues
- Experience interacting with customers is a plus
- Strong communication skills

What Celonis Can Offer You:
- Pioneer Innovation: Work with the leading, award-winning process mining technology, shaping the future of business.
- Accelerate Your Growth: Benefit from clear career paths, internal mobility, a dedicated learning program, and mentorship opportunities.
- Receive Exceptional Benefits: Including generous PTO, hybrid working options, company equity (RSUs), comprehensive benefits, extensive parental leave, dedicated volunteer days, and much more.
- Prioritize Your Well-being: Access to resources such as gym subsidies, counseling, and well-being programs.
- Connect and Belong: Find community and support through dedicated inclusion and belonging programs.
- Make Meaningful Impact: Be part of a company driven by strong values that guide everything we do: Live for Customer Value, The Best Team Wins, We Own It, and Earth Is Our Future.
- Collaborate Globally: Join a dynamic, international team of talented individuals.
- Empowered Environment: Contribute your ideas in an open culture with autonomous teams.

About Us: Celonis makes processes work for people, companies, and the planet. The Celonis Process Intelligence Platform uses industry-leading process mining and AI technology and augments it with business context to give customers a living digital twin of their business operation. It's system-agnostic and without bias, and provides everyone with a common language for understanding and improving businesses.
Celonis enables its customers to continuously realize significant value across the top, bottom, and green line. Celonis is headquartered in Munich, Germany, and New York City, USA, with more than 20 offices worldwide. Get familiar with the Celonis Process Intelligence Platform by watching this video.

Celonis Inclusion Statement: At Celonis, we believe our people make us who we are and that The Best Team Wins. We know that the best teams are made up of people who bring different perspectives to the table. And when everyone feels included, able to speak up, and knows their voice is heard - that's when creativity and innovation happen.

Your Privacy: Any information you submit to Celonis as part of your application will be processed in accordance with Celonis' Accessibility and Candidate Notices. By submitting this application, you confirm that you agree to the storing and processing of your personal data by Celonis as described in our Privacy Notice for the Application and Hiring Process. Please be aware of common job offer scams, impersonators, and frauds. Learn more here.

Posted 2 weeks ago

Apply

15.0 - 20.0 years

7 - 11 Lacs

Bengaluru

Work from Office


We're Celonis, the global leader in Process Mining technology and one of the world's fastest-growing SaaS firms. We believe there is a massive opportunity to unlock productivity by placing data and intelligence at the core of business processes - and for that, we need you to join us.

Team: We're building a new engineering team to drive the productization of an exciting and innovative new project. The purpose of this project is to extend Celonis process observability beyond company boundaries, enabling our customers to share knowledge across organizations. This will allow companies to share their data with their business partners and augment their own process data, opening up new opportunities for process improvement across value chains involving multiple companies.

Role: Ready to architect the future? As a Senior Staff Engineer at Celonis, you'll lead the design and development of our Process Intelligence Platform's core features. Your expertise in distributed systems, microservices, and API design will be crucial in building high-performance solutions. If you're driven to tackle complex architectural challenges, you'll thrive here. Your contributions will directly empower global companies to save costs, boost efficiency, and drive sustainability.

The work you'll do:
- Lead the technical architecture and set the technical vision for the team.
- Oversee the development and implementation of highly complex applications, tools, systems, and integrations.
- Lead the exploration of new trends, technologies, and information, and evaluate these trends to pitch applicable projects that impact Celonis through innovation.
- Receive work in the form of objectives that regularly require innovation around original ideas.
- Translate targeted solutions into end-to-end architectural designs.
- Independently own multiple large problem spaces with significant ambiguity.
- Consistently demonstrate the ability to go deep into a variety of domains.
- Communicate well to all levels of product development and across the company.
- Navigate open-ended technical and workflow discussions, helping reach conclusions or constructive next steps.
- Proactively engage with internal and external peers and management to develop unprecedented solutions.

Qualifications you need:
- Bachelor's or Master's degree in Computer Science or a related field.
- 15+ years of professional software development experience.
- Expert knowledge and hands-on experience in an object-oriented programming language (ideally Java/TypeScript) and design patterns.
- Experience in the design and development of cloud-native, microservices-based distributed systems, either on Java, Spring Boot, and Spring Data, or on TypeScript, JavaScript, and Node.js.
- Experience in building integrations with different data sources, e.g. PostgreSQL.
- Strong experience with at least one of the hyperscalers, ideally AWS.
- Experience in building large-scale data processing and analytics pipelines using platforms like Apache Spark or Databricks is a plus.
- Deep understanding of containerization and Kubernetes.
- Experience with CI/CD pipelines and deployment automation tools, preferably GitHub Actions and ArgoCD.
- Ability to navigate typical enterprise challenges such as security, compliance, scalability, and data consistency.
- Excellent programming skills and a strong ability to troubleshoot and resolve issues.
- Strong background in architecting reusable large-scale software components, developing scalable and highly available microservices, and a good understanding of the interrelationships between software components and systems.
- Skilled in driving technical solutions to complex and ambiguous problems, with a focus on stability and scalability, and with cross-team collaboration.
- Experienced in measuring, managing, and resolving technical debt to ensure the long-term viability of software systems.
- Strong facilitation and communication skills, both written and oral.
- Exposure to geographically distributed work environments.
- Capable of supporting and coaching mid- to staff-level colleagues.

(The What Celonis Can Offer You, About Us, Inclusion Statement, and Privacy sections are identical to the Celonis listing above.)

Posted 2 weeks ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Bengaluru

Work from Office


The people here at Apple don't just build products - they craft the kind of wonder that has revolutionised entire industries. It's the diversity of those people and their ideas that encourages the innovation that runs through everything we do, from amazing technology to industry-leading environmental efforts. Imagine what you could do here. Join Apple, and help us leave the world better than we found it. A job at Apple is unlike any other you've had. You will be challenged. You will be inspired. And you'll be proud! At Apple, phenomenal ideas have a way of becoming phenomenal products, services, and customer experiences very quickly. Bring passion and dedication to your job, and there's no telling what you could accomplish!

The Apple Services Engineering team (ASE) is one of the most exciting examples of Apple's long-held passion for combining art and technology. These are the people who power the App Store, Apple TV, Apple Music, Apple Podcasts, and Apple Books. And they do it at an extensive scale, meeting our high expectations with dedication to deliver a huge variety of entertainment in over 35 languages to more than 150 countries. These engineers build secure, end-to-end solutions. They develop the custom software used to process all the creative work, the tools that providers use to deliver that media, all the server-side systems, and the APIs for many Apple services. Thanks to Apple's unique integration of hardware, software, and services, engineers here partner to get behind a single unified vision. That vision always includes a deep commitment to strengthening Apple's privacy policy, one of our core values. Although services are a bigger part of Apple's business than ever before, these teams remain small and multi-functional, offering greater exposure to the array of opportunities here.

Description: The Service Reliability Engineer (SRE) role in Apple Services Engineering requires a mix of strategic engineering and design along with hands-on, technical work. This SRE will configure, tune, and fix multi-tiered systems to achieve optimal application performance, stability, and availability. We manage jobs as well as applications on bare-metal and cloud computing platforms to deliver data processing for many of Apple's global products. Our teams work with exabytes of data, petabytes of memory, and tens of thousands of jobs to enable predictable and performant data analytics, enabling features in Apple Music, TV+, the App Store, and other world-class products. If you love designing and running systems that will impact millions of users, then this is the place for you!

The main responsibilities for this position include:
- Support Java-based applications and Spark/Flink jobs on bare metal, AWS, and Kubernetes
- Understand application requirements (performance, security, scalability, etc.) and assess the right services/topology on AWS, bare metal, and Kubernetes
- Build automation to enable self-healing systems
- Build tools to monitor and alert on high-performance, low-latency applications
- Troubleshoot application-specific, core network, system, and performance issues
- Involvement in challenging and fast-paced projects supporting Apple's business by delivering innovative solutions
- Monitor production, staging, test, and development environments for a myriad of applications in an agile and dynamic organisation

BS degree in computer science or an equivalent field with 5+ years of experience, or MS degree with 3+ years of experience, or equivalent.
- At least 5 years in a Site Reliability Engineering (SRE) or DevOps role
- 5+ years of running services in a large-scale *nix environment
- Understanding of SRE principles and goals, along with prior on-call experience
- Extensive experience in managing applications on AWS and Kubernetes
- Deep understanding and experience in one or more of the following: Hadoop, Spark, Flink, Kubernetes, AWS

Preferred Qualifications:
- Fast learner with excellent analytical, problem-solving, and interpersonal skills
- Experience supporting Java applications
- Experience with big data technologies
- Experience working with geographically distributed teams and implementing high-level projects and migrations
- Strong communication skills and the ability to deliver results on time with high quality
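The "self-healing systems" item above usually amounts to a health check plus bounded restarts with exponential backoff. A minimal, generic sketch (hypothetical helper names, not Apple tooling):

```python
import time

def restart_with_backoff(is_healthy, restart, max_attempts=5, base_delay=1.0):
    """Poll a health check; after each failed check, restart the service and
    wait exponentially longer (base_delay * 2**attempt) before re-checking.
    Returns True as soon as the service reports healthy, else False."""
    for attempt in range(max_attempts):
        if is_healthy():
            return True
        restart()
        time.sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...
    return is_healthy()

# Toy demo: a service that becomes healthy after two restarts
state = {"restarts": 0}
ok = restart_with_backoff(
    is_healthy=lambda: state["restarts"] >= 2,
    restart=lambda: state.update(restarts=state["restarts"] + 1),
    base_delay=0.0,  # skip real sleeping in this demo
)
print(ok)  # → True
```

Real automations layer alerting and a restart budget on top, so a crash-looping service escalates to on-call instead of restarting forever.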

Posted 2 weeks ago

Apply

2.0 - 3.0 years

3 - 7 Lacs

Hyderabad

Work from Office


It's fun to work at a company where people truly believe in what they are doing!

Job Description: Position Summary: The Litigation Analyst works as a member of the Operations team within Epiq's Electronic Discovery. In this role, the analyst is responsible both for overseeing litigation support work and for interacting with Client Services in order to maintain ECTs with respect to the processing team's work. Strong attention to detail, high-quality work product, and frequent interaction with project managers are also major functions of this role.

Essential Job Responsibilities: The Litigation Analyst is responsible for the following:
- Oversee daily tasks and workflows performed by the litigation support department as directed by management
- Ensure daily service requests are assigned to team members and executed accurately in accordance with client deadlines
- Ensure all QC procedures and protocols are followed
- Perform searching, search term formatting, and structured analytics; manage processing team priorities, ECTs, and communication with project managers whenever required
- Handle general requests and assign them to other teams as instructed, so knowledge of the overall EDRM model is also required
- Troubleshoot and resolve issues from litigation analysts and Client Services prior to escalation to managers

Requirements for the role include:
- At least 2-3 years of experience in the litigation support industry
- Intermediate knowledge of several ESI data processing platforms (e.g. NUIX)
- Intermediate knowledge of several ESI data hosting platforms (e.g. Relativity, Concordance, Summation, etc.)
- Must be flexible in working long hours, starting earlier or finishing later than the scheduled shift, to meet often last-minute and tightly compressed client deadlines
- Must possess a strong understanding of electronic discovery tools and technology, with an advanced-level understanding of eDiscovery processing and data extraction
- Possess and employ effective verbal and written communication skills, and work positively and effectively with other company departments

Education & Experience: Bachelor's degree or an equivalent combination of education and experience; a degree in Computer Science, Business Management, or a closely related field of study is preferred.

Knowledge, Skills, and Abilities:
- Experience working under tight deadlines in a fast-paced technical environment is strongly preferred
- Ability to perform troubleshooting and learn customized proprietary software
- Excellent communication skills (written and verbal)
- Strong organizational skills and extreme attention to detail

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!

Posted 2 weeks ago

Apply

6.0 - 7.0 years

4 - 8 Lacs

Mumbai

Work from Office


Tasks & Responsibilities:
- Commercial responsibility within his/her relation, reporting to the team leader
- Daily handling of business cases in the area of sea cargo import shipments and delivery of shipments in accordance with the procedures for dispatching and delivering
- Contacts clients, agents, shipping companies, other freight forwarding companies, and customs bodies in connection with dispatching and delivering shipments
- Issues and monitors transportation documents; collects documents for dispatching and delivering shipments
- Electronic data processing
- Coverage of insurance (temporary or permanent)
- Composes records of damages to and deficits of shipments, and complaint orders
- Issues invoices for accumulated expenses, or transfers them to the person in charge of invoicing
- Filing of business cases
- Makes offers to customers and partners
- Enters the data of new customers, partners, and service providers, and updates existing ones
- Generates sales leads
- Customer service; keeps contact with agents
- Compiles the monthly bordero report for her/his relation
- Knowledge of the standard operating procedures/guidelines and systems such as AS 400, S.P.O.T., and LogSpace

Qualifications and skills:
- Level of Education: commercial education or special education in freight forwarding
- Working Experience: at least 6-7 years in sea cargo
- Special Knowledge: basic computer knowledge, MS Office, English language
- Personal Qualification: team player, dynamic, commercial thinking, initiative, responsible

Company Introduction: For over 40 years, cargo-partner has flourished in the logistics industry, delivering unparalleled service to our clients worldwide. We have now embarked on another journey, and to continue our commitment to excellence, we have joined the Nippon Express Group, which will underpin all the values we constantly aspire to achieve, making us a top-5 global player.
As an end-to-end info-logistics provider, we pride ourselves on offering a comprehensive portfolio of air, sea, land transport, and warehousing services. With a unique focus on information technology and supply chain optimization, we empower businesses to thrive in today's fast-paced world. Join our dynamic team, where innovation meets passion and every voice is valued. Embark on a journey where your skills are nurtured, creativity is celebrated, and together, we take pride in making a difference. Discover more about our Mission & Vision. Dive into a world of endless opportunities and embark on the cargo-partner journey with us.

cargo-partner is an equal opportunity employer. We celebrate diversity and are committed to creating an environment where all employees feel valued and respected. We do not discriminate on the basis of race, color, religion, gender, sexual orientation, gender identity, national origin, age, disability, or any other legally protected characteristics. We welcome and encourage applications from all individuals, regardless of background. Explore endless opportunities and leave your mark with us.

#JoinUs #Logistics #workingdigital #Teamwork #cargopartner #wow

Ready to get things moving? Join our team! Learn about Life at cargo-partner here. View our Privacy Policy.

Posted 2 weeks ago

Apply

1.0 - 6.0 years

3 - 8 Lacs

Mumbai

Work from Office


We are seeking an experienced Azure IoT Systems Developer to develop a comprehensive industrial IoT monitoring and control system. This is a hands-on technical role focused exclusively on Azure cloud services configuration, backend development, and IoT solution architecture.

Work Model: Remote with occasional on-site collaboration
No. of positions: 3

Key Functions & Roles of the Candidate:

Azure IoT Platform Development
- Design and implement Azure IoT Hub configuration for device connectivity and telemetry ingestion
- Develop device provisioning services for secure device onboarding
- Create Azure Digital Twins models (DTDL) representing industrial equipment and production sites
- Implement real-time data synchronization between physical devices and digital twins

Backend Services & Integration
- Develop Azure Functions for data processing and business logic implementation
- Implement Stream Analytics jobs for real-time telemetry processing
- Create batch processing services with complex business rule implementation

Data Management & Analytics
- Configure hot and cold data storage solutions (Azure SQL, Cosmos DB, Data Lake)
- Implement data pipelines for real-time and historical analytics
- Develop notification services for system alerts and monitoring
- Create data archival and retention policies
- Implement CI/CD pipelines for automated deployment
- Configure security policies for IoT devices and cloud services
- Set up monitoring and logging solutions

Required Technical Skills
- Azure IoT Hub & IoT Edge - device connectivity, telemetry ingestion, and edge computing
- Azure Digital Twins - DTDL modeling, twin relationships, and queries
- Azure Service Bus - message queuing, sessions, and dead-letter handling
- Azure Functions - serverless computing and event-driven processing
- Azure Stream Analytics - real-time data processing and analytics
- Azure API Management - API gateway and security implementation

Deliverables:
- Complete Azure infrastructure setup and configuration
- Fully functional IoT data ingestion and processing pipeline
- Digital twin implementation with real-time synchronization
- Task processing system with business rules engine
- Backend APIs for system integration and monitoring
- Comprehensive documentation and deployment guides
- Unit tests and integration test suites

Additional requirements:
- At least 1 year of IoT systems development experience
- Proven experience building end-to-end solutions
- Design and develop RESTful APIs using Azure API Management
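To make the "real-time telemetry processing" concrete: a Stream Analytics job typically groups telemetry into fixed time windows and aggregates per device. A pure-Python sketch of that tumbling-window aggregation (illustrative only; field names are hypothetical, and the real job would be written in Stream Analytics' SQL-like query language):

```python
from collections import defaultdict

def tumbling_window_avg(events, window_seconds):
    """Group telemetry events into fixed, non-overlapping time windows and
    average each device's readings per window (tumbling-window semantics).
    Each event is a (timestamp_seconds, device_id, value) tuple."""
    windows = defaultdict(list)
    for ts, device, value in events:
        bucket = int(ts // window_seconds)  # index of this event's window
        windows[(bucket, device)].append(value)
    return {key: sum(vals) / len(vals) for key, vals in windows.items()}

# Hypothetical telemetry: (timestamp, device_id, temperature)
events = [(0, "press-1", 20.0), (30, "press-1", 22.0), (65, "press-1", 30.0)]
print(tumbling_window_avg(events, 60))
# window 0 averages 20.0 and 22.0 → 21.0; window 1 holds only 30.0
```

Because the windows never overlap, each event contributes to exactly one result row, which is what makes this aggregation cheap at IoT scale.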

Posted 2 weeks ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Bengaluru

Work from Office


About GalaxEye Space: GalaxEye is pioneering the next generation of Earth Observation (EO) by leveraging advanced SAR and MSI sensors to provide high-resolution satellite imagery and actionable insights. Our mission is to revolutionize geospatial intelligence for industries like agriculture, defense, and disaster management.

Role Overview: We're looking for a Senior DevOps Engineer to lead the development and operations of our scalable, cloud-native infrastructure that supports real-time satellite data delivery. You'll work closely with engineering, data, and product teams to build robust, automated, and secure systems that can handle large volumes of EO data with minimal downtime. This is a high-impact role in a fast-paced environment where reliability and performance are mission-critical.

Key Responsibilities:
● Architect and manage highly available, fault-tolerant infrastructure on cloud platforms like AWS, GCP, or Azure.
● Design and maintain CI/CD pipelines to enable smooth, automated deployments and rollbacks.
● Build and maintain infrastructure as code using Terraform, and manage containerized environments with Kubernetes and Docker.
● Implement system observability, including centralized logging, performance monitoring, and alerting.
● Ensure system security, compliance, and identity management (IAM, encryption, secure networking).
● Lead incident response, root cause analysis, and ongoing system hardening.
● Partner with backend engineers to ensure performance, scalability, and seamless integration with microservices.
● Optimize cloud spend and resource allocation based on system load and business goals.
● Contribute to and maintain internal DevOps best practices and documentation.

Requirements: Required Skills & Qualifications:
● 3-5 years of experience in cloud infrastructure, DevOps, or SRE roles.
● Strong knowledge of Python for automation and scripting.
● Familiarity with monitoring and logging frameworks (e.g., Prometheus, Grafana, ELK, Datadog).
● Hands-on experience with Kubernetes, Docker, Terraform, and CI/CD pipelines.
● Experience working with cloud providers like AWS, GCP, or Azure.
● Proficiency in Linux system administration and networking concepts.
● Strong problem-solving skills and the ability to work in a fast-paced environment.
● Knowledge of satellite technology and geospatial data processing is a plus.

Benefits:
● Fair compensation as per market standards
● Rapid growth and start-up culture
● Flexible working hours
● Openness to exploring, discussing, and implementing new ideas and processes
● Opportunity to work closely with the Founding Team at GalaxEye
● A chance to work with advisors holding senior positions and decades of experience

Posted 2 weeks ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Bengaluru

Work from Office


About GalaxEye Space: GalaxEye is pioneering the next generation of Earth Observation (EO) by leveraging advanced SAR and MSI sensors to provide high-resolution satellite imagery and actionable insights. Our mission is to revolutionize geospatial intelligence for industries like agriculture, defense, and disaster management.

Role Overview: We are looking for a Full Stack Developer (Backend-Focused) to develop and optimize our satellite data platform. While full-stack knowledge is required, your primary expertise should be in backend development: designing scalable APIs, managing data pipelines, and integrating cloud-based services.

Key Responsibilities:
● Develop and optimize backend systems for handling satellite data processing and analytics.
● Design RESTful and GraphQL APIs for seamless integration with frontend and third-party services.
● Work with databases (SQL & NoSQL) to store, manage, and retrieve large geospatial datasets.
● Implement microservices architecture and optimize backend performance.
● Collaborate with frontend developers to ensure seamless API consumption.
● Optimize cloud-based data processing pipelines for scalability and cost-efficiency.
● Write clean, maintainable, and well-documented code.

Requirements:
● 3-5 years of experience in backend development.
● Proficiency in Python, Node.js, or Go.
● Strong experience with Django, Flask, or FastAPI.
● Hands-on experience with PostgreSQL, MongoDB, or Redis.
● Expertise in AWS Lambda, ECS, or Kubernetes for scalable backend deployment.
● Strong knowledge of data structures, algorithms, and system design.
● Experience with frontend technologies (React, Vue.js) is a plus.

Benefits:
● Fair compensation as per market standards
● Rapid growth and start-up culture
● Flexible working hours
● Openness to exploring, discussing, and implementing new ideas and processes
● Opportunity to work closely with the Founding Team at GalaxEye
● A chance to work with advisors holding senior positions and decades of experience
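For flavor on the "scalable APIs over large geospatial datasets" requirement: listing endpoints over big catalogs are normally paginated rather than returned whole. A minimal, framework-agnostic sketch (hypothetical names, not GalaxEye code):

```python
def paginate(items, page, page_size):
    """Return one page of results plus metadata, as a REST listing endpoint
    might. Pages are 1-indexed; out-of-range pages yield an empty item list."""
    total = len(items)
    start = (page - 1) * page_size
    return {
        "items": items[start:start + page_size],
        "page": page,
        "page_size": page_size,
        "total": total,
        "has_next": start + page_size < total,
    }

scenes = [f"scene-{i}" for i in range(95)]  # hypothetical satellite scenes
last = paginate(scenes, page=10, page_size=10)
# the last page holds the remaining 5 scenes, and has_next is False
```

At real scale, offset pagination is often replaced by cursor (keyset) pagination so deep pages stay cheap for the database, but the response shape stays similar.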

Posted 2 weeks ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Bengaluru

Work from Office


Job Title: DevOps Engineer (3-8+ Years Experience)
Location: Bengaluru, India
Job Type: Full-time
Experience: 3-8+ years
Industry: Financial Technology / Software Development

About Us: We are a cutting-edge software development company specializing in ultra-low latency trading applications for brokers, proprietary trading firms, and institutional investors. Our solutions are designed for high-performance, real-time trading environments, and we are looking for a DevOps Engineer to enhance our deployment pipelines, infrastructure automation, and system reliability. For more info, please visit: https://tradelab.in/

Responsibilities:
1. CI/CD & Infrastructure Automation
- Design, implement, and manage CI/CD pipelines for rapid and reliable software releases.
- Automate deployments using Terraform, Helm, and Kubernetes.
- Optimize build and release processes to support high-frequency, low-latency trading applications.
- Good knowledge of Linux/Unix.
2. Cloud & On-Prem Infrastructure Management
- Deploy and manage cloud-based (AWS, GCP) and on-premises infrastructure.
- Ensure high availability and fault tolerance of critical trading systems.
- Implement infrastructure as code (IaC) to standardize deployments.
3. Performance Optimization & Monitoring
- Monitor system performance, network latency, and infrastructure health using tools like Prometheus, Grafana, and ELK.
- Implement automated alerting and anomaly detection for real-time issue resolution.
4. Security & Compliance
- Implement DevSecOps best practices to ensure secure deployments.
- Maintain compliance with financial industry regulations (SEBI).
- Conduct vulnerability scanning, access control, and log monitoring.
5. Collaboration & Troubleshooting
- Work closely with development, QA, and trading teams to ensure smooth deployments.
- Troubleshoot server, network, and application issues under tight SLAs.

Required Skills & Qualifications:
- 5+ years of experience as a DevOps Engineer in a software development or trading environment.
Strong expertise in CI/CD tools (Jenkins, GitLab CI/CD, ArgoCD). Proficiency in cloud platforms (AWS, GCP) and containerization (Docker, Kubernetes). Experience with Infrastructure as Code (IaC) using Terraform or CloudFormation. Deep understanding of Linux system administration and networking (TCP/IP, DNS, firewalls). Knowledge of monitoring & logging tools (Prometheus, Grafana, ELK). Experience in scripting and automation using Python, Bash, or Go. Understanding of security best practices (IAM, firewalls, encryption).

Good-to-Have (Not Mandatory) Skills: Experience with low-latency trading infrastructure and market data feeds. Knowledge of high-frequency trading (HFT) environments. Exposure to the FIX protocol, FPGAs, and network optimizations. Experience with Redis and Nginx for real-time data processing.

Perks & Benefits: Competitive salary & performance bonuses. Opportunity to work in the high-frequency trading and fintech industry. Flexible work environment with hybrid work options. Cutting-edge tech stack and infrastructure. Health insurance & wellness programs. Continuous learning & certification support.
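The monitoring responsibilities above (latency tracking, automated alerting) can be sketched in plain Python. This is a minimal illustration only; the p99 budget, the nearest-rank percentile method, and the function names are assumptions, not part of the posting.

```python
def latency_percentile(samples_ms, pct):
    """Return the given percentile (0-100) of latency samples, nearest-rank method."""
    ordered = sorted(samples_ms)
    rank = max(1, round(pct / 100 * len(ordered)))  # index of the covering sample
    return ordered[rank - 1]

def check_slo(samples_ms, p99_budget_ms=5.0):
    """Flag an SLO breach if p99 latency exceeds the (assumed) budget in ms."""
    p99 = latency_percentile(samples_ms, 99)
    return {"p99_ms": p99, "breached": p99 > p99_budget_ms}

# Illustrative latency samples for one service, in milliseconds.
samples = [0.8, 1.1, 0.9, 1.3, 12.0, 0.7, 1.0, 0.9, 1.2, 1.1]
result = check_slo(samples)
```

In production this logic would typically live behind a Prometheus exporter rather than ad-hoc scripts, but the percentile-plus-threshold pattern is the same.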

Posted 2 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Mumbai, Navi Mumbai

Work from Office


Technical and people leadership of sample analysis activities carried out at the Innovation Centre. Supervise and provide support to the Analytical Scientist team. Responsibility for the reporting and dissemination of sample analysis results to internal and external stakeholders as necessary. Act as a proactive liaison with the R&D and Innovation Centre faculty team to ensure planned work can proceed to schedule. Coordinate with key stakeholders to ensure successful project deliveries. Develop enhanced analytical methods used for sample characterization. Contribute to the progression of CCSL's IP portfolio. Develop and present various standard and ad-hoc reports for a variety of audiences. Ensure the smooth operation of the analytical suite. This role will be based on site 5 days a week in the office. Master's degree in Chemical Engineering, Chemistry, Analytical Chemistry or Materials Science (or related); PhD preferable. 5+ years of analytical laboratory research experience necessary. Strong demo

Posted 2 weeks ago

Apply

7.0 - 10.0 years

9 - 12 Lacs

Kolkata, Mumbai, New Delhi

Work from Office


Contract Job Summary: We are seeking an experienced SAP Consultant - DTC (Direct-to-Consumer) with strong expertise in SAP Customer Activity Repository (CAR) and Retail (S/4HANA Fashion) solutions. The consultant will play a critical role in integrating core merchandising (SAP S/4 Fashion) with multi-channel POS systems, covering areas such as POS data capture via CAR, retail price and markdown management, and sales audit processes. This role involves close collaboration with global retail business stakeholders and offers exposure to the latest SAP cloud technologies, providing a unique opportunity to enhance your experience in a mature SAP ecosystem.

Experience & Skills Required: 7-10 years of SAP consulting experience with SAP CAR, Retail, AFS, or S/4HANA Fashion. Proven experience in POS data integration, retail pricing, and sales audit processes. Full lifecycle implementation experience in ERP transformation programs (minimum 1-2 end-to-end cycles). In-depth knowledge of POS data processing (sales, inventory, receipts, tenders, financial transactions) within SAP ERP. Familiarity with S/4HANA Fiori apps, Launchpad, Personas, and retail-specific transactions. Ability to work independently and collaboratively across global teams. Excellent problem-solving, communication, and stakeholder management skills.

Nice to Have: Experience in Agile project environments. Knowledge of integration components across functional SAP modules.

Posted 2 weeks ago

Apply

8.0 - 10.0 years

18 - 22 Lacs

Pune, Hinjewadi

Work from Office


job requisition id: JR1027361

Job Summary: Synechron seeks a highly skilled AI/ML Engineer specializing in Natural Language Processing (NLP), Large Language Models (LLMs), Foundation Models (FMs), and Generative AI (GenAI). The successful candidate will design, develop, and deploy advanced AI solutions, contributing to innovative projects that transform monolithic systems into scalable microservices integrated with leading cloud platforms such as Azure, Amazon Bedrock, and Google Gemini. This role plays a critical part in advancing Synechron's capabilities in cutting-edge AI technologies, enabling impactful business insights and product innovations.

Software Required Proficiency: Python (core libraries: TensorFlow, PyTorch, Hugging Face Transformers, etc.). Cloud platforms: Azure, AWS, Google Cloud (familiarity with AI/ML services). Containerization: Docker, Kubernetes. Version control: Git. Data management tools: SQL, NoSQL databases (e.g., MongoDB). Model deployment and MLOps tools: MLflow, CI/CD pipelines, monitoring tools.

Preferred Skills: Experience with cloud-native AI frameworks and SDKs. Familiarity with AutoML tools. Additional programming languages (e.g., Java, Scala).

Overall Responsibilities: Design, develop, and optimize NLP models, including advanced LLMs and Foundation Models, for diverse business use cases. Lead the development of large data pipelines for training, fine-tuning, and deploying models on big data platforms. Architect, implement, and maintain scalable AI solutions in line with MLOps best practices. Transition legacy monolithic AI systems into modular, microservices-based architectures for scalability and maintainability. Build end-to-end AI applications from scratch, including data ingestion, model training, deployment, and integration. Implement retrieval-augmented generation techniques for enhanced context understanding and response accuracy. Conduct thorough testing, validation, and debugging of AI/ML models and pipelines.
Collaborate with cross-functional teams to embed AI capabilities into customer-facing and enterprise products. Support ongoing maintenance, monitoring, and scaling of deployed AI systems. Document system designs, workflows, and deployment procedures for compliance and knowledge sharing.

Performance Outcomes: Production-ready AI solutions delivering high accuracy and efficiency. Robust data pipelines supporting training and inference at scale. Seamless integration of AI models with cloud infrastructure. Effective collaboration leading to innovative AI product deployment.

Technical Skills (By Category): Programming Languages - Essential: Python (TensorFlow, PyTorch, Hugging Face, etc.); Preferred: Java, Scala. Databases/Data Management: SQL (PostgreSQL, MySQL), NoSQL (MongoDB, DynamoDB). Cloud Technologies: Azure AI, AWS SageMaker, Bedrock, Google Cloud Vertex AI, Gemini. Frameworks and Libraries: Transformers, Keras, scikit-learn, XGBoost, Hugging Face libraries. Development Tools & Methodologies: Docker, Kubernetes, Git, CI/CD pipelines (Jenkins, Azure DevOps). Security & Compliance: Knowledge of data security standards and privacy policies (GDPR, HIPAA as applicable).

Experience: 8 to 10 years of hands-on experience in AI/ML development, especially NLP and Generative AI. Demonstrated expertise in designing, fine-tuning, and deploying LLMs, FMs, and GenAI solutions. Proven ability to develop end-to-end AI applications within cloud environments. Experience transforming monolithic architectures into scalable microservices. Strong background with big data processing pipelines. Prior experience working with cloud-native AI tools and frameworks. Industry experience in the finance, healthcare, or technology sectors is advantageous.

Alternative Experience: Candidates with extensive research or academic experience in AI/ML, especially in NLP and large-scale data processing, are eligible if they have practical deployment experience.
Day-to-Day Activities: Develop and optimize sophisticated NLP/GenAI models fulfilling business requirements. Lead data pipeline construction for training and inference workflows. Collaborate with data engineers, architects, and product teams to ensure scalable deployment. Conduct model testing, validation, and performance tuning. Implement and monitor model deployment pipelines, troubleshoot issues, and improve system robustness. Document models, pipelines, and deployment procedures for audit and knowledge sharing. Stay updated with emerging AI/ML trends, integrating best practices into projects. Present findings, progress updates, and technical guidance to stakeholders.

Qualifications: Bachelor's degree in Computer Science, Data Science, or a related field; Master's or PhD preferred. Certifications in AI/ML, cloud (e.g., AWS, Azure, Google Cloud), or data engineering are a plus. Proven professional experience with advanced NLP and Generative AI solutions. Commitment to continuous learning to keep pace with rapidly evolving AI technologies.

Professional Competencies: Strong analytical and problem-solving capabilities. Excellent communication skills, capable of translating complex technical concepts. Collaborative team player with experience working across global teams. Adaptability to rapidly changing project scopes and emerging AI trends. Innovation-driven mindset with a focus on delivering impactful solutions. Time management skills to prioritize and manage multiple projects effectively.
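The training-pipeline responsibilities described in this listing (ingest, clean, and batch data for model training) can be sketched with a minimal, dependency-free example. The cleaning rule and function names are illustrative assumptions; real pipelines would use the platform tooling the listing names.

```python
def clean_record(text):
    """Normalize whitespace and lowercase -- a stand-in for real preprocessing."""
    return " ".join(text.split()).lower()

def batches(records, batch_size):
    """Yield fixed-size batches of cleaned records for training or bulk inference."""
    batch = []
    for rec in records:
        cleaned = clean_record(rec)
        if cleaned:                  # drop records that are empty after cleaning
            batch.append(cleaned)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:                        # flush the final partial batch
        yield batch

docs = ["  Hello   World ", "", "Foo\tBar", "Baz"]
prepared = list(batches(docs, 2))
```

The generator shape is what matters: it streams data instead of materializing the whole corpus, which is the usual design choice when pipelines must serve both batch and near-real-time workloads.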

Posted 2 weeks ago

Apply

5.0 - 10.0 years

11 - 15 Lacs

Pune, Hinjewadi

Work from Office


job requisition id: JR1027356

Job Summary: Synechron is seeking an experienced Asset Control Developer to support, maintain, and enhance asset control applications within our organization. This role involves understanding complex system architectures, delivering high-quality solutions, and collaborating with global teams to ensure efficient and compliant operations. The ideal candidate will contribute significantly to the stability, performance, and evolution of asset control systems, supporting the organization's operational and risk management objectives.

Software Required Proficiency: Linux shell scripting (for automation and support tasks). Java (for application development and customization). Oracle Database (SQL and PL/SQL for data management and processing). DevOps/Agile tools and practices for continuous delivery and collaboration (e.g., Jenkins, JIRA). Version control systems (e.g., Git).

Preferred Skills: Experience with additional scripting languages (Python, Perl). Configuration management tools (Ansible, Chef). Cloud platform (AWS, Azure) integration experience.

Overall Responsibilities: Comprehend the architecture of asset control applications and ensure their operational integrity. Deliver solutions to meet business requirements, including system development, modifications, and support. Conduct system and user acceptance tests; support deployment and post-implementation reviews. Create and maintain comprehensive documentation, including system designs, procedures, and change requests. Collaborate with global teams on joint development and maintenance efforts. Facilitate effective communication and coordination with stakeholders to align project goals. Support continuous improvement initiatives for asset control processes and system configurations. Prepare for audits and compliance checks by providing relevant documentation and support. Assist in setting up and optimizing DevOps toolchains for deployment and release management.
Performance Outcomes: Stable, reliable asset control systems that meet business needs. Accurate and timely documentation supporting operational and compliance requirements. Successful implementation of change requests and system enhancements. Efficient collaboration across global teams ensuring continuous service improvement.

Technical Skills (By Category): Programming Languages - Essential: Linux shell scripting, Java; Preferred: Python, Perl. Databases/Data Management: Oracle; static and timeseries data processing. Cloud Technologies: Basic understanding of cloud solutions (preferred). Frameworks & Libraries: Custom in-house or specialized data processing frameworks. Development Tools & Methodologies: Version control (Git); Agile/Scrum and DevOps practices. Security Protocols: Best practices in data security, access controls, and audit readiness.

Experience: 5 to 10 years of professional experience in asset control, application development, or related financial systems domains. Proven experience developing, supporting, and maintaining asset control systems such as IECL, Interfaces, Formula Engines, or Price Rules. Demonstrated ability to work with financial data and static and timeseries data processing. Previous experience working in global or multi-team environments, supporting compliance and audit processes. Industry experience in banking, trading, or financial services is strongly preferred.

Alternative Experience: Candidates with extensive technical experience in audit, compliance, or institutional asset management systems are also considered.

Day-to-Day Activities: Analyze system architecture and troubleshoot asset control application issues. Develop, test, and deploy system enhancements and fixes. Participate in change management activities, including raising change requests and documenting procedures. Support audit and compliance activities by generating reports and documentation. Collaborate with global teams for ongoing development and support.
Monitor system health, perform troubleshooting, and apply updates as needed. Keep documentation current for system configurations, procedures, and project deliverables.

Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related discipline. Relevant certifications in Java, Oracle, or asset control tools are a plus. Training in ITIL, Agile, or DevOps practices is advantageous. Continuous learning mindset, staying updated on industry standards and best practices.

Professional Competencies: Strong analytical and problem-solving skills. Excellent written and verbal communication abilities. Ability to work effectively within cross-functional and global teams. Adaptability to changing priorities and emerging technologies. Attention to detail, with a focus on accuracy and quality. Ability to manage multiple tasks with deadlines.
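The listing's emphasis on static and timeseries data processing maps to a common stewardship check: scanning a price series for duplicate dates and missing business days. This is a toy sketch; the weekend-only holiday rule and the tuple layout are simplifying assumptions.

```python
from datetime import date, timedelta

def validate_timeseries(points):
    """Check a list of (date, price) points for duplicate dates and for
    missing business days between observations. Weekends are the only
    non-trading days assumed here (no holiday calendar)."""
    seen, dupes, gaps = set(), [], []
    for d, _ in points:
        if d in seen:
            dupes.append(d)
        seen.add(d)
    dates = sorted(seen)
    for prev, nxt in zip(dates, dates[1:]):
        day = prev + timedelta(days=1)
        while day < nxt:
            if day.weekday() < 5:        # Monday-Friday only
                gaps.append(day)
            day += timedelta(days=1)
    return dupes, gaps

# Thursday 2024-01-04 is missing from this illustrative series.
points = [(date(2024, 1, 2), 100.0), (date(2024, 1, 3), 101.5), (date(2024, 1, 5), 99.8)]
dupes, gaps = validate_timeseries(points)
```

A production check against Oracle would push the gap detection into SQL, but the invariant (no duplicates, no missing business days) is the same.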

Posted 2 weeks ago

Apply

8.0 - 13.0 years

8 - 15 Lacs

Gurugram

Work from Office


Job Summary: We are seeking a highly skilled and detail-oriented Assistant Manager - MIS & Workflow to oversee the management information systems (MIS) and streamline workflows across various departments. The ideal candidate will play a pivotal role in improving operational efficiency, ensuring the accuracy of reports, and fostering seamless communication and data flow within the organization.

Key Responsibilities:

MIS Management: Oversee the development, implementation, and maintenance of Management Information Systems (MIS) across the organization. Ensure timely and accurate reporting of key business metrics, KPIs, and performance indicators to senior management. Generate ad-hoc and scheduled reports, analyze trends, and provide actionable insights to management. Collaborate with different departments to ensure data accuracy and consistency across systems.

Workflow Optimization: Design, implement, and optimize workflows to improve operational efficiency and reduce bottlenecks. Identify areas for process improvements and recommend solutions to streamline daily operations. Work closely with department heads to assess current workflows and propose automation solutions where applicable. Create and maintain standard operating procedures (SOPs) for business processes.

Data Analysis & Reporting: Analyze data patterns and trends to derive actionable insights for business decisions. Regularly review system outputs to ensure accuracy and consistency. Prepare and present periodic reports for senior management, highlighting key operational metrics and performance.

Technology Integration & Improvement: Stay updated with the latest trends and technologies in MIS and workflow management. Identify opportunities for system upgrades, integrations, and new software solutions to improve performance and efficiency. Oversee the integration of new tools or systems to improve workflow and data management.
Qualifications & Requirements: Bachelor's degree in Business Administration, Computer Science, Engineering, or a related field. A Master's degree is a plus. years of experience in MIS management, workflow optimization, or a similar role. Strong proficiency in MIS software and tools (e.g., Excel, Power BI, Tableau, ERP systems). Experience with workflow automation tools and process management. Strong analytical and problem-solving skills, with a focus on data integrity and business intelligence. Excellent communication and interpersonal skills, with the ability to work effectively with cross-functional teams.

Posted 2 weeks ago

Apply

0.0 - 3.0 years

0 - 2 Lacs

Ghaziabad

Work from Office


Dear Professional, we are hiring a backend executive for our organization. The job description is given below:

Roles and Responsibilities: Well versed with MS Office packages like MS Excel and MS Word. Good knowledge of Outlook mail; able to communicate via email. Any backend/MIS experience would be an added advantage. Preference will be given to candidates who have previous experience in the U.S. market.

Desired Candidate Profile: Experience as a backend executive or in a similar role would be an added advantage. Working knowledge of MS Office and databases. Excellent communication skills (written and oral). Problem-solving and critical-thinking skills. Form filling; knowledge of LinkedIn and other job portals.

Working Days: 5 days/night shifts (the rest depends on company requirements; it can be 6 days, with extra pay for each 6th day). Shift Timings: 8:30 pm to 5:30 am (summers) and 9:30 pm to 6:30 am (winters), fixed timings. Location: RDC, Raj Nagar, Ghaziabad. Education: Graduates/postgraduates can apply. Salary: 20k in-hand, fixed (no deductions). During the initial 10-day training period, 50% of salary will be paid; salary resumes at 100% after completion of the training period. A performance bonus of up to 5,000 is available (T&C apply). Benefits: 3k food allowance + 1k travel allowance (after completion of 1 month). Company Website: www.synergisticit.com Company LinkedIn Profile: https://www.linkedin.com/redir/redirect?url=https%3A%2F%2Fsynergisticit%2Ecom%2F&urlhash=rKyX&trk=about_website

Posted 2 weeks ago

Apply

0.0 - 3.0 years

0 - 2 Lacs

Ghaziabad

Work from Office


Roles and Responsibilities: Well versed with MS Office packages like MS Excel and MS Word. Good knowledge of Outlook mail; able to communicate via email properly. Any backend/MIS experience would be an added advantage. Preference will be given to candidates who have previous experience in the U.S. market.

Desired Candidate Profile: Experience as a backend executive or in a similar role would be an added advantage. Working knowledge of MS Office and databases. Good communication skills (written and oral). Problem-solving and critical-thinking skills.

Working Days: 5 days/night shifts. Shift Timings: 8:30 pm to 5:30 am, fixed. Location: RDC, Raj Nagar, Ghaziabad. Education: Graduates/postgraduates can apply. Job Type: Full-time. Salary: 20,000.00 per month. During the 10-day training period, 50% of the monthly salary will be paid (50% of 20,000). Benefits: 3k food allowance (after completion of 1 month), 1k travel allowance (after completion of 1 month), and performance-based incentives of up to 5k per month. We are strictly looking for male candidates only, from the Ghaziabad location.

Posted 2 weeks ago

Apply

13.0 - 18.0 years

18 - 22 Lacs

Bengaluru

Work from Office


Job Area: Engineering Group, Engineering Group > Software Test Engineering

General Summary: We are seeking an experienced AI Architect to design, develop, and deploy Retrieval-Augmented Generation (RAG) solutions for Qualcomm Cloud AI Platforms.

Roles and Responsibilities: Lead the design and development of applications for RAG AI models and provide APIs for frontend consumption. Manage the interaction between retrieval-augmented techniques and generative models. Build services that connect AI models (e.g., transformers, embeddings, and vector search) to handle tasks such as query retrieval, model inference, and generating responses. Leverage frameworks like Flask, FastAPI, or Django for API development. Design pipelines to preprocess, clean, and prepare data for AI model training, as well as for serving the models in production environments. Optimize these pipelines to support both batch and real-time data processing. Implement RESTful APIs or GraphQL endpoints for seamless frontend-backend interaction. Implement cloud solutions to host Python-based services, ensuring that AI models are scalable and that the infrastructure can handle high traffic. Leverage containerization (Docker) and orchestration (Kubernetes) for model deployment and management. Set up monitoring, logging, and alerting for Python backend services, ensuring smooth operation of AI features. Use tools like Prometheus, Grafana, and the ELK stack for real-time performance tracking. Continuously optimize model performance by fine-tuning and adapting Python-based AI models for real-time use cases. Manage trade-offs between computation load, response time, and quality of generated content. Partner with data scientists, machine learning engineers, and mobile/web developers to ensure tight integration between AI models, mobile/web front-ends, and backend infrastructure.
Experience: 13+ years of overall software development experience. 10+ years of strong experience working with technologies such as React, React Native, Flutter, Django, Flask, and FastAPI. 5+ years of experience in building AI applications with a focus on NLP, machine learning, generative models, and retrieval-augmented systems. Proven experience in designing and deploying AI systems that integrate retrieval-based techniques (e.g., FAISS, Weaviate) and generative models (e.g., GPT, BERT). Expertise in cloud platforms (e.g., AWS, GCP, Azure) and deployment of Python-based microservices. Building RESTful APIs or GraphQL services (using frameworks like Flask, FastAPI, or Django). Handling AI model inference and data processing (using libraries like NumPy, Pandas, TensorFlow, PyTorch, and Hugging Face Transformers). Integrating vector search solutions (e.g., FAISS, Pinecone, Weaviate) with the AI models for efficient retrieval-augmented generation. Experience with containerization (Docker) and Kubernetes for deploying scalable Python-based services. Proficient in cloud infrastructure management, with a focus on managing Python services in the cloud. Experience in end-to-end product development and the software lifecycle.

Key Skills: Advanced proficiency in Python for building backend services and data processing pipelines. Familiarity with frameworks like Flask, Django, and FastAPI. Experience with AI libraries and frameworks (TensorFlow, PyTorch, Hugging Face Transformers). Familiarity with vector databases (e.g., Pinecone, FAISS, Weaviate) and integration with retrieval-augmented systems. Strong knowledge of RESTful API design, GraphQL, and API security best practices (e.g., OAuth, JWT). Excellent problem-solving abilities and a strong focus on creating highly scalable and performant solutions. Strong communication skills, with the ability to collaborate across different teams and geographies. Ability to mentor junior team members and lead technical discussions.
Minimum Qualifications: Bachelor's degree in Engineering, Information Systems, Computer Science, or a related field and 6+ years of Software Test Engineering or related work experience; OR Master's degree in Engineering, Information Systems, Computer Science, or a related field and 5+ years of Software Test Engineering or related work experience; OR PhD in Engineering, Information Systems, Computer Science, or a related field and 4+ years of Software Test Engineering or related work experience. 2+ years of work experience with Software Test or System Test, developing and automating test plans, and/or tools (e.g., source code control systems, continuous integration tools, and bug tracking tools).
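The retrieval step at the heart of the RAG responsibilities above — embed the query, rank documents by similarity, return the top-k for prompt assembly — can be illustrated without a vector store. The corpus, embeddings, and document IDs below are invented for illustration; a real system would delegate this to FAISS, Pinecone, or Weaviate.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query_vec, corpus, k=2):
    """Rank (doc_id, embedding) pairs by cosine similarity to the query and
    return the top-k document ids -- the retrieval half of RAG."""
    scored = sorted(corpus, key=lambda item: cosine(query_vec, item[1]), reverse=True)
    return [doc_id for doc_id, _ in scored[:k]]

# Tiny illustrative corpus with made-up 3-dimensional embeddings.
corpus = [
    ("settlement-faq", [0.9, 0.1, 0.0]),
    ("pricing-guide", [0.1, 0.9, 0.1]),
    ("api-reference", [0.0, 0.2, 0.9]),
]
top = retrieve([1.0, 0.0, 0.1], corpus, k=1)
```

The retrieved ids would then be resolved to document text and concatenated into the generative model's prompt; dedicated vector databases replace the linear scan here with approximate nearest-neighbor indexes.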

Posted 2 weeks ago

Apply

1.0 - 9.0 years

4 - 7 Lacs

Hyderabad

Work from Office


Career Category: Information Systems

Job Description: Join Amgen's Mission of Serving Patients. At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission to serve patients living with serious illnesses drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas - Oncology, Inflammation, General Medicine, and Rare Disease - we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

Sr Associate Software Engineer - R&D Omics

What you will do: Let's do this. Let's change the world. In this vital role you will be responsible for Research Informatics and for the development and maintenance of software in support of target/biomarker discovery at Amgen. Develop software to transform and visualize omics (genomics, proteomics, transcriptomics) data using programming languages such as Python, Java, and R. Develop data processing pipelines for large datasets in the cloud (e.g., Nextflow); integrate with other data sources where applicable. Collaborate with the other engineering team members to ensure all services are reliable, maintainable, and well integrated into existing platforms. Adhere to best practices for testing and designing reusable code.

What we expect of you: We are all different, yet we all use our unique contributions to serve patients. This role requires proficiency in code development (e.g., Python, R, etc.) and some knowledge of CI/CD processes and cloud computing technologies (e.g., AWS, Google Cloud, etc.). Additionally, the ability to work with cross-functional teams and experience in agile practices is desired.

Basic Qualifications: Master's degree and 1 to 3 years of experience in Software Development, IT, or a related field; OR Bachelor's degree and 3 to 5 years of experience in Software Development, IT, or a related field; OR Diploma and 7 to 9 years of experience in Software Development, IT, or a related field.

Preferred Qualifications: 2+ years of experience in biopharma or life sciences. Experience in RESTful API development (e.g., Flask, MuleSoft). Experience in pipeline development using one or more of the following programming languages (Python, Nextflow, etc.). Experience with Databricks. Experience with cloud computing platforms and infrastructure. Experience with application development (Django, R Shiny, Plotly Dash, etc.). Work experience in the biotechnology or pharmaceutical industry. Experience using and adopting an Agile framework.

Soft Skills: Strong learning agility; ability to pick up new technologies used to support early drug discovery data analysis needs. Collaborative with good communication skills.

What you can expect of us: As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team: careers.amgen.com. As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients.
Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
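A typical first transform in the omics pipelines this role describes is per-sample normalization of raw read counts. The sketch below shows counts-per-million (CPM) scaling; the sample and gene names are illustrative, and real pipelines would run this inside Nextflow or Databricks rather than plain Python.

```python
def counts_per_million(raw_counts):
    """Scale each sample's raw gene read counts to counts-per-million (CPM),
    a common first normalization step so samples of different sequencing
    depth become comparable."""
    normalized = {}
    for sample, gene_counts in raw_counts.items():
        total = sum(gene_counts.values())
        normalized[sample] = {
            gene: (count / total) * 1_000_000 if total else 0.0
            for gene, count in gene_counts.items()
        }
    return normalized

# Illustrative single-sample count matrix (gene -> raw read count).
raw = {"sample_a": {"TP53": 500, "BRCA1": 1500}}
cpm = counts_per_million(raw)
```

CPM only corrects for library size; length-aware measures such as TPM add a gene-length term, but the per-sample scaling pattern is the same.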

Posted 2 weeks ago

Apply

9.0 - 12.0 years

16 - 18 Lacs

Hyderabad

Work from Office


Job Description: Essential Job Functions: Participate in data engineering tasks, including data processing and integration activities. Assist in the development and maintenance of data pipelines. Collaborate with team members to collect, process, and store data. Contribute to data quality assurance efforts and adherence to data standards. Use data engineering tools and techniques to analyze and generate insights from data. Collaborate with data engineers and other analysts on data-related projects. Seek out opportunities to enhance data engineering skills and domain knowledge. Stay informed about data engineering trends and best practices.

Basic Qualifications: Bachelor's degree in a relevant field or an equivalent combination of education and experience. Typically, 5+ years of relevant work experience in industry, with a minimum of 2 years in a similar role. Proven experience in data engineering. Proficiency in data engineering tools and technologies. A continuous learner who stays abreast of industry knowledge and technology.

Other Qualifications: Advanced degree in a relevant field a plus. Relevant certifications, such as Oracle Certified Professional or MySQL Database Administrator, a plus.

Recruitment fraud is a scheme in which fictitious job opportunities are offered to job seekers, typically through online services, such as false websites, or through unsolicited emails claiming to be from the company. These emails may request recipients to provide personal information or to make payments as part of their illegitimate recruiting process. DXC does not make offers of employment via social media networks, and DXC never asks for any money or payments from applicants at any point in the recruitment process, nor asks a job seeker to purchase IT or other equipment on our behalf. More information on employment scams is available here.

Posted 2 weeks ago

Apply

2.0 - 11.0 years

16 - 18 Lacs

Pune

Work from Office


Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Senior Software Engineer.

In this role, you will: Ensure data quality, data governance, and compliance with regulatory requirements. Monitor and optimize data pipeline performance. Troubleshoot and resolve data-related issues promptly. Implement monitoring and alerting systems for data processes. Troubleshoot and resolve technical issues, optimizing system performance and ensuring reliability. Create and maintain technical documentation for new and existing systems, ensuring that information is accessible to the team. Implement and monitor solutions that identify both system bottlenecks and production issues.

Requirements: To be successful in this role, you should meet the following requirements: Good communication skills, as the candidate needs to work with globally dispersed and diversified teams. Flexible attitude - open to learning new technologies based on project requirements. Proficiency in Python/Scala/Bash for data pipeline development and automation. Familiarity with CI/CD pipelines for deploying and managing data pipelines.
Proven experience building and maintaining scalable data movement pipelines. Good understanding of Hadoop and GCP environments for data storage and data processing. Familiarity with ETL tools and distributed data processing frameworks such as Spark. Good understanding of scheduling and orchestration tools such as Airflow or Control-M. Good understanding of data principles, data integrity, and data best practices.
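Orchestration tools like Airflow and Control-M model a pipeline as a directed acyclic graph of tasks, running each task only after its upstream dependencies finish. The core idea can be sketched with the standard library alone (task names are hypothetical):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the set of tasks it depends on.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
}

# static_order() yields a valid execution order respecting every dependency,
# which is essentially what a scheduler computes before dispatching tasks.
order = list(TopologicalSorter(dag).static_order())
```

Real schedulers add retries, backfills, and parallel execution of independent branches on top of this ordering.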

Posted 2 weeks ago


0.0 - 3.0 years

2 - 5 Lacs

Hyderabad

Work from Office


ABOUT THE ROLE Role Description We are seeking an MDM Associate Data Steward who will be responsible for ensuring the accuracy, completeness, and reliability of master data across critical business domains such as Customer, Product, Affiliations, and Payer. This role involves actively managing and curating master data through robust data stewardship processes, comprehensive data cataloging, and data governance frameworks utilizing Informatica or Reltio MDM platforms. Additionally, the incumbent will perform advanced data analysis, data validation, and data transformation tasks through SQL queries and Python scripts to enable informed, data-driven business decisions. The role emphasizes cross-functional collaboration with various teams, including Data Engineering, Commercial, Medical, Compliance, and IT, to align data management activities with organizational goals and compliance standards. Roles & Responsibilities Responsible for master data stewardship, ensuring data accuracy and integrity across key master data domains (e.g., Customer, Product, Affiliations). Conduct advanced data profiling, cataloging, and reconciliation activities using Informatica or Reltio MDM platforms. Manage the reconciliation of potential matches, ensuring accurate resolution of data discrepancies and preventing duplicate data entries. Effectively manage Data Change Request (DCR) processes, including reviewing, approving, and documenting data updates in compliance with established procedures and SLAs. Execute and optimize SQL queries for validation and analysis of master data. Use basic Python scripting for data transformation, quality checks, and automation. Collaborate effectively with cross-functional teams including Data Engineering, Commercial, Medical, Compliance, and IT to fulfill data requirements. Support user acceptance testing (UAT) and system integration tests for MDM-related system updates.
Implement data governance processes ensuring compliance with enterprise standards, policies, and frameworks. Document and maintain accurate SOPs, Data Catalogs, Playbooks, and SLAs. Identify and implement process improvements to enhance data stewardship and analytic capabilities. Perform regular audits and monitoring to maintain high data quality and integrity. Basic Qualifications and Experience Master's degree with 1 - 3 years of experience in Business, Engineering, IT or related field, OR Bachelor's degree with 2 - 5 years of experience in Business, Engineering, IT or related field, OR Diploma with 6 - 8 years of experience in Business, Engineering, IT or related field. Functional Skills: Must-Have Skills: Direct experience in data stewardship, data profiling, and master data management. Hands-on experience with Informatica or Reltio MDM platforms. Proficiency in SQL for data analysis and querying. Knowledge of data cataloging techniques and tools. Basic proficiency in Python scripting for data processing. Good-to-Have Skills: Experience with PySpark and Databricks for large-scale data processing. Background in the pharmaceutical, healthcare, or life sciences industries. Familiarity with AWS or other cloud-based data solutions. Strong project management and agile workflow familiarity (e.g., using Jira, Confluence). Understanding of regulatory compliance related to data protection (GDPR, CCPA). Professional Certifications: Any ETL certification (e.g., Informatica) Any Data Analysis certification (SQL) Any cloud certification (AWS or Azure) Soft Skills: Strong analytical abilities to assess and improve master data processes and solutions. Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders. Effective problem-solving skills to address data-related issues and implement scalable solutions. Ability to work effectively with global, virtual teams.
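The match-and-merge reconciliation work described above often starts with a grouping query over a normalized match key to surface potential duplicates for stewardship review. A self-contained sketch using SQLite (the table, columns, and match key are hypothetical, not from the listing):

```python
import sqlite3

# Hypothetical customer master table standing in for an MDM base object.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER, name TEXT, email TEXT)")
conn.executemany(
    "INSERT INTO customer VALUES (?, ?, ?)",
    [(1, "Acme Corp", "ops@acme.test"),
     (2, "ACME Corp", "ops@acme.test"),
     (3, "Globex", "info@globex.test")],
)

# Group on a normalized match key (lowercased email) so records that differ
# only in name casing still surface as candidate duplicates.
dupes = conn.execute(
    """SELECT lower(email) AS match_key, COUNT(*) AS n
       FROM customer
       GROUP BY lower(email)
       HAVING COUNT(*) > 1"""
).fetchall()
```

Production MDM platforms use fuzzy matching and survivorship rules on top of this, but the candidate-generation step is the same grouping idea.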

Posted 2 weeks ago


9.0 - 12.0 years

7 - 12 Lacs

Hyderabad

Work from Office


Role Description: We are looking for a highly motivated, expert Senior Data Engineer who can own the design & development of complex data pipelines, solutions and frameworks. The ideal candidate will be responsible for designing, developing, and optimizing data pipelines, data integration frameworks, and metadata-driven architectures that enable seamless data access and analytics. This role calls for deep expertise in big data processing, distributed computing, data modeling, and governance frameworks to support self-service analytics, AI-driven insights, and enterprise-wide data management. Roles & Responsibilities: Design, develop, and maintain scalable ETL/ELT pipelines to support structured, semi-structured, and unstructured data processing across the Enterprise Data Fabric. Implement real-time and batch data processing solutions, integrating data from multiple sources into a unified, governed data fabric architecture. Optimize big data processing frameworks using Apache Spark, Hadoop, or similar distributed computing technologies to ensure high availability and cost efficiency. Work with metadata management and data lineage tracking tools to enable enterprise-wide data discovery and governance. Ensure data security, compliance, and role-based access control (RBAC) across data environments. Optimize query performance, indexing strategies, partitioning, and caching for large-scale data sets. Develop CI/CD pipelines for automated data pipeline deployments, version control, and monitoring. Implement data virtualization techniques to provide seamless access to data across multiple storage systems. Collaborate with cross-functional teams, including data architects, business analysts, and DevOps teams, to align data engineering strategies with enterprise goals. Stay up to date with emerging data technologies and best practices, ensuring continuous improvement of Enterprise Data Fabric architectures.
Must-Have Skills: Hands-on experience in data engineering technologies such as Databricks, PySpark, Spark SQL, Apache Spark, AWS, Python, SQL, and Scaled Agile methodologies. Proficiency in workflow orchestration and performance tuning of big data processing. Strong understanding of AWS services Experience with Data Fabric, Data Mesh, or similar enterprise-wide data architectures. Ability to quickly learn, adapt and apply new technologies Strong problem-solving and analytical skills Excellent communication and teamwork skills Experience with Scaled Agile Framework (SAFe), Agile delivery practices, and DevOps practices. Good-to-Have Skills: Deep expertise in the Biotech & Pharma industries Experience in writing APIs to make data available to consumers Experience with SQL/NoSQL databases and vector databases for large language models Experience with data modeling and performance tuning for both OLAP and OLTP databases Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps Education and Professional Certifications 9 to 12 years of Computer Science, IT or related field experience AWS Certified Data Engineer preferred Databricks Certificate preferred Scaled Agile SAFe certification preferred Soft Skills: Excellent analytical and troubleshooting skills. Strong verbal and written communication skills Ability to work effectively with global, virtual teams High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented, with a focus on achieving team goals. Ability to learn quickly, be organized and detail oriented. Strong presentation and public speaking skills.
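The partitioning strategies mentioned in the responsibilities amount to bucketing records by a key (commonly the event date) so queries can prune whole partitions. A minimal stdlib sketch of the idea (the event records are made up for illustration):

```python
from collections import defaultdict

# Hypothetical event records; large-scale systems would hold these in
# Parquet files or Delta tables rather than Python dicts.
events = [
    {"id": 1, "ts": "2024-05-01T10:00:00", "value": 10},
    {"id": 2, "ts": "2024-05-01T12:30:00", "value": 7},
    {"id": 3, "ts": "2024-05-02T09:15:00", "value": 4},
]

def partition_by_date(rows):
    """Bucket rows by calendar date, mirroring a file-per-partition layout."""
    parts = defaultdict(list)
    for row in rows:
        parts[row["ts"][:10]].append(row["id"])  # date prefix of ISO timestamp
    return dict(parts)

parts = partition_by_date(events)
```

In Spark the same layout comes from `df.write.partitionBy("event_date")`, and date-filtered queries then read only the matching directories.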

Posted 2 weeks ago


1.0 - 3.0 years

25 - 30 Lacs

Bengaluru

Remote


We only hire from Top Tier Universities including IITs, BITS, DCE/NSIT, ISI, Top NITs etc. This role is only suitable for candidates with 1 to 4 years of experience. People with more than 4 years of experience need not apply. This is an URGENT requirement. We are hiring for a UK based Fintech company (name is kept confidential). The company is seeking an early stage Data Scientist to join the team and support the design and development of Machine Learning and AI use cases. Your work will directly support strategic initiatives and improve business outcomes. Job Summary : We are seeking a motivated Data Scientist to join our Data and AI team. The ideal candidate will assist in analysing complex datasets, building predictive models, and contributing to data-driven decision-making. This entry-level role is perfect for someone eager to apply their technical skills and grow in a collaborative, innovative environment. Key Responsibilities : Collect, clean, and preprocess structured and unstructured data from various sources Perform exploratory data analysis (EDA) to identify trends, patterns, and insights Develop, test, and deploy basic machine learning models under senior team guidance Create visualisations and dashboards to communicate findings to stakeholders Collaborate with cross-functional teams (e.g., product, engineering) to support business objectives Assist in maintaining data pipelines and ensuring data quality Stay updated on industry trends and emerging tools in data science Qualifications : Bachelor's degree in Data Science, Computer Science, Statistics, Mathematics, or a related field 1-3 years of experience in data science or a related role (internships or academic projects count) Proficiency in Python for data analysis (e.g., pandas, NumPy, scikit-learn) Strong analytical and problem-solving skills Excellent communication skills to explain technical concepts to non-technical audiences.
Understanding of machine learning algorithms (e.g., regression, classification, clustering). Nice to Have: Internship or project experience in data science or analytics Exposure to Gen AI architectures (RAG, MCP, etc.) Experience working with cloud platforms (e.g., AWS, GCP, Azure) Experience with SQL for data querying Knowledge of version control (e.g., Git)

Posted 2 weeks ago


3.0 - 8.0 years

10 - 13 Lacs

Chennai

Work from Office


Role & responsibilities Photogrammetric Data Processing Process stereo imagery for Planimetric and DTM (Digital Terrain Model) extraction using industry-standard methodologies. Apply concepts such as Aerial Triangulation (AT) and Block File Setup for photogrammetric project setup. Software Usage Utilize software tools such as MicroStation V8i, Erdas Imagine, DTMaster, DSM, and Datum for: Data processing Feature extraction Ortho image generation Ortho Image Production Conduct Ortho Image Processing to ensure spatial accuracy and clarity in the final deliverables. Quality Control (QA/QC) Perform rigorous QA/QC checks on photogrammetric outputs to ensure accuracy and adherence to client/project standards. Data Management Manage large volumes of image and geospatial datasets efficiently. Maintain organized project documentation and file structures. Team Collaboration Collaborate with project managers, GIS analysts, and QA teams to ensure deliverables meet technical expectations and deadlines. Preferred candidate profile Experience: 3 to 8 years of relevant hands-on experience in photogrammetry and geospatial data processing. Technical Skills: Proficient in tools like MicroStation V8i, Erdas, DTMaster, DSM, and Datum. Strong understanding of Planimetric mapping and DTM generation techniques. Educational Qualification: Diploma or Bachelor's Degree in Civil Engineering, Geomatics, Geoinformatics, or any related field. Domain Knowledge: In-depth understanding of aerial triangulation, block file setups, and orthophoto creation. Knowledge of geospatial industry standards and deliverable requirements. Soft Skills: Detail-oriented with a commitment to quality and accuracy. Good organizational and communication skills. Ability to work independently as well as part of a team in a deadline-driven environment. Work Mode: Must be willing to work from office at the Chennai location.
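One quantity that underpins the spatial-accuracy checks in photogrammetric work like this is the ground sample distance (GSD): the ground footprint of one image pixel, from the standard pinhole-camera scale relation GSD = pixel size × flying height / focal length. A small sketch with illustrative numbers (the values are assumptions, not from the listing):

```python
def ground_sample_distance(pixel_size_m, focal_length_m, flying_height_m):
    """Ground sample distance in metres per pixel, from the pinhole-camera
    scale relation: GSD = pixel_size * flying_height / focal_length."""
    return pixel_size_m * flying_height_m / focal_length_m

# Illustrative survey setup: 5-micron sensor pixels, 100 mm lens,
# flying 1,000 m above ground -> 5 cm per pixel on the ground.
gsd = ground_sample_distance(5e-6, 0.100, 1000.0)
```

Deliverable accuracy specs for orthophotos and DTMs are commonly stated as multiples of this pixel footprint.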

Posted 2 weeks ago


5.0 - 10.0 years

8 - 12 Lacs

Hyderabad, Gurugram

Work from Office


About the Role: Grade Level (for internal use): 10 Job Summary: We are seeking a talented Java Developer to join our dynamic team. The ideal candidate will have strong proficiency in Java, experience working with public cloud platforms such as AWS or Microsoft Azure, and a solid foundation in computer science principles. What You'll Do: Design, develop, test, document, deploy, maintain, and enhance software applications for a quantitative product that conducts complex mathematical calculations to accurately derive and analyze the various S&P indices. Manage individual project priorities, deadlines, and deliverables. Collaborate with key stakeholders to develop system architectures, API specifications, and implementation requirements. Engage in code reviews, knowledge sharing, and mentorship to promote ongoing technical development within the team. Analyze system performance and optimize applications for maximum speed and scalability. What You'll Need: 5+ years of proven experience as a Senior Developer with a strong command of Java and Spring Boot. Experience developing RESTful APIs using a variety of tools Hands-on experience with public cloud platforms (AWS, Microsoft Azure). Solid understanding of algorithms, data structures, and software architecture. Experience with distributed computing frameworks like Apache Spark. Familiarity with data lake architectures and data processing. Ability to translate abstract business requirements into concrete technical solutions. Strong analytical skills to assess the behavior and performance of loosely coupled systems, ensuring they operate efficiently and effectively in a distributed environment. Educational Qualifications: Bachelor's/Master's degree in Computer Science, Information Technology, or a related field.
Technologies & Tools We Use: Programming Languages: Java, Python. Frameworks: Spring Boot, Apache Spark. Cloud Platforms: AWS, Microsoft Azure. Development Tools: Git, Docker, Jenkins. About S&P Global Dow Jones Indices At S&P Dow Jones Indices, we provide iconic and innovative index solutions backed by unparalleled expertise across the asset-class spectrum. By bringing transparency to the global capital markets, we empower investors everywhere to make decisions with conviction. We're the largest global resource for index-based concepts, data and research, and home to iconic financial market indicators, such as the S&P 500 and the Dow Jones Industrial Average. More assets are invested in products based upon our indices than any other index provider in the world. With over USD 7.4 trillion in passively managed assets linked to our indices and over USD 11.3 trillion benchmarked to our indices, our solutions are widely considered indispensable in tracking market performance, evaluating portfolios and developing investment strategies. S&P Dow Jones Indices is a division of S&P Global (NYSE: SPGI). S&P Global is the world's foremost provider of credit ratings, benchmarks, analytics and workflow solutions in the global capital, commodity and automotive markets. With every one of our offerings, we help many of the world's leading organizations navigate the economic landscape so they can plan for tomorrow, today. For more information, visit www.spglobal.com/spdji . What's In It For You Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow.
At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you, and your career, need to thrive at S&P Global. Health & Wellness: Health care coverage designed for the mind and body. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference. For more information on benefits by country visit https://spgbenefits.com/benefit-summaries

Posted 2 weeks ago
