4.0 - 6.0 years
6 - 8 Lacs
Bengaluru
Work from Office
Backend Engineer - Python

We love technology, we love design, and we love quality. Our diversity makes us unique and creates an inclusive and welcoming workplace where each individual is highly valued. We are looking for someone who can join immediately and wants to grow with us! With us, you have great opportunities to take real steps in your career and to take on real responsibility.

Job Title: Backend Engineer - Python
Location: Bangalore / Hybrid
Experience: 4-6 years
Employment Type: Full-time

About the Role
We are looking for a talented Backend Engineer with strong expertise in Python, SQL, and modern backend frameworks. The role involves building scalable microservices, working with big data processing, and deploying solutions on cloud platforms.

Key Responsibilities
- Design, develop, and maintain backend services using Python (FastAPI).
- Work with PySpark for large-scale data processing.
- Build and optimize microservices-based architectures.
- Write efficient SQL queries and ensure data reliability.
- Deploy and manage applications on GCP or Azure.
- Implement containerization using Docker.
- Collaborate with cross-functional teams to deliver high-quality solutions.

Requirements
- 4-6 years of backend development experience.
- Strong proficiency in Python and SQL.
- Hands-on experience with FastAPI and PySpark.
- Solid understanding of microservices frameworks.
- Experience with GCP or Azure cloud platforms.
- Strong knowledge of Docker.
- Good problem-solving and communication skills.

Nice to Have
- Experience with Kubernetes and CI/CD pipelines.
- Exposure to large-scale distributed systems or data engineering.

Location: Bangalore
Start Date: Immediate
Work Mode: Hybrid
Language Requirement: English (excellent written and verbal skills)
Form of Employment: Full-time until further notice; we apply a 6-month probationary period.
We interview candidates on an ongoing basis, so do not wait to submit your application.
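A minimal sketch of the kind of SQL efficiency work this posting describes (the table, columns, and data are hypothetical, and sqlite3 stands in for the production database):

```python
import sqlite3

# In-memory database standing in for a real backend store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders (customer, amount) VALUES (?, ?)",
    [("alice", 120.0), ("bob", 80.0), ("alice", 50.0)],
)
# An index on the filtered/grouped column keeps lookups efficient as data grows.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer)")

# Aggregate spend per customer in SQL rather than in application code.
rows = conn.execute(
    "SELECT customer, SUM(amount) FROM orders GROUP BY customer ORDER BY customer"
).fetchall()
print(rows)  # [('alice', 170.0), ('bob', 80.0)]
```

Pushing the aggregation into SQL, with an index on the grouped column, is usually far cheaper than fetching all rows and summing them in the application layer.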
Posted 3 weeks ago
4.0 - 6.0 years
6 - 8 Lacs
Gurugram
Work from Office
The mission of the Finance Platform Strategies (FPS) team is to ensure optimized operating models within Finance, consistent across our global functional teams as appropriate. To promote this objective, we connect key Finance resources with relevant functional specialists from across BlackRock, identify and document business requirements, and manage the implementation and subsequent maintenance of platforms and processes designed to address those requirements. FPS team members serve as internal change management consultants who apply knowledge of BlackRock's resources - people, processes, and technology - to ensure concepts become reality in support of both new business initiatives and ongoing business process enhancement.

Team Overview
The FPS team has a global footprint. Team members are responsible for managing projects that drive Finance objectives forward through initiatives with either a regional or a global focus. Responsibilities include coordinating project team activity; supporting effective and timely communication among project team members and subject matter experts from across the firm; drafting clear and comprehensive business requirements and project management documentation; designing sound processes and workflow models; partnering with internal or third-party resources to drive design specification sign-off; overseeing technical development progress and coordinating quality unit testing; providing ongoing project updates to all relevant stakeholders; facilitating user training; and ensuring timely delivery of effective, well-designed solutions. In addition to project oversight, the team manages, on an ongoing basis, the platforms supporting Finance's day-to-day operating model to ensure our technologies remain optimized and our operational processes are sustainable, efficient, and support a robust control environment. The team provides level-one support for Finance operational platforms and, as required, will assess, troubleshoot, and identify actionable opportunities to address complex issues related to the suite of technology solutions Finance employs, including the Oracle Cloud SaaS platform, Financial Consolidations and Close (FCCS), IBM Cognos Planning Analytics (TM1), IBM Cognos Analytics (BI), and our proprietary BlackRock Aladdin platform.

Role Responsibilities
- Development and Maintenance: Develop and maintain TM1 models and applications, including budgeting, forecasting, and actuals reports. Provide ongoing maintenance and enhancements to existing models.
- Data Processing: Manage daily, weekly, monthly, and quarterly data processing to support actuals and forecast/budget close processes for the TM1 platform.
- Requirements Gathering: Lead development and enhancement work for various TM1 models through all phases, from requirements gathering to build, user testing, go-live, and support.
- Cloud Migration: Play an active role in cloud migration and model transformation initiatives.
- Report Conversion: Lead report conversion from TM1 perspectives to Planning Analytics for Excel (PAfE) and Planning Analytics Workspace (PAW).
- User Support: Provide day-to-day user support for the TM1 application, ensuring the accuracy and integrity of data and reports.
- Documentation: Create and maintain process documentation to ensure it stays current as the process evolves.
- Stakeholder Collaboration: Collaborate with internal and external stakeholders to understand business needs.

Experience
Required:
- 4-6 years of relevant experience as an IBM Planning Analytics (TM1) developer.
- Proficiency in TM1 rules, feeders, TurboIntegrator processes, and system configuration.
- Experience with SQL queries and stored procedures.
- A bachelor's degree in finance, IT, or a similar field is preferred.
- Knowledge of financial instruments and markets is beneficial.
Desired:
- Good understanding of the finance / asset management industry.
- Strong written and verbal communication skills are crucial for this role.
- Natural curiosity and interest in finance data and technologies to optimize finance processes end to end.
- Master's degree in Finance or IT.

Personal Qualities
- Strong work ethic and ownership of accountabilities.
- Self-starter able to drive positive progress proactively with limited manager direction.
- Solutions- and service-oriented.
- Focused attention to detail; high standards for quality and accuracy in work product.
- Professional, positive, collegial demeanor; collaborative relationship builder.
- Comfortable interacting with all levels of management and able to thrive in a fast-paced, innovative environment.

This mission would not be possible without our smartest investment - the one we make in our employees. It's why we're dedicated to creating an environment where our colleagues feel welcomed, valued, and supported with networks, benefits, and development opportunities to help them thrive. For additional information on BlackRock, please visit Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock.

BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation, and other attributes protected at law.
Posted 3 weeks ago
4.0 - 7.0 years
6 - 9 Lacs
Hyderabad
Work from Office
Experience Required
Moderate experience: minimum 24 months overall, with at least 12 months in one or more of the following:
- Software Testing
- Bug Triaging
- Audits / Quality Checks
- Technical Issue Resolution
- Subject Matter Expertise

Key Responsibilities
- Take end-to-end ownership of assigned responsibilities to maintain high program health.
- Manage work allocation and ensure defined targets are met or exceeded (productivity, quality, SLA, efficiency, utilization).
- Ensure process adherence and identify process gaps for improvement.
- Conduct regular quality audits.
- Handle policy, training, reporting, and quality management where no separate POCs exist.
- Perform root cause analysis (Fishbone, RCA, 5 Whys, etc.) to resolve issues effectively.
- Identify and escalate high-impact issues quickly with minimal downtime.
- Manage multiple responsibilities while ensuring core duties are completed.

Skills & Competencies
- Strong proficiency in MS Office / Google Suite.
- Basic knowledge of SQL and experience with JIRA or similar ticketing tools.
- Proficiency in Excel/Google Sheets (pivot tables, VLOOKUP, data processing).
- Good knowledge of data analysis techniques.
- Excellent logical reasoning, problem-solving, and attention to detail.
- Strong English reading comprehension and writing skills (concise and accurate).
- Ability to read and interpret complex SOPs.
- High capability to perform repetitive tasks with accuracy.
- Ability to memorize technical/engineering terminology and project details.
- Familiarity with smartphones, test platforms, and navigation tools.
Posted 3 weeks ago
4.0 - 7.0 years
6 - 9 Lacs
Bengaluru
Work from Office
Position Overview
Job Title: Software Development Engineer 2
Department: Technology
Location: Bangalore, India
Reporting To: Senior Research Manager - Data

Position Purpose
The Research Engineer - Data will play a pivotal role in advancing Tookitaki's AI-driven compliance and financial crime prevention platforms through applied research, experimentation, and data innovation. This role is ideal for professionals who thrive at the intersection of research and engineering, turning cutting-edge data science concepts into production-ready capabilities that enhance Tookitaki's competitive edge in fraud prevention, AML compliance, and data intelligence. The role exists to bridge research and engineering by:
- Designing and executing experiments on large, complex datasets.
- Prototyping new data-driven algorithms for financial crime detection and compliance automation.
- Collaborating across product, data science, and engineering teams to transition research outcomes into scalable, real-world solutions.
- Ensuring the robustness, fairness, and explainability of AI models within Tookitaki's compliance platform.

Key Responsibilities
Applied Research & Prototyping
- Conduct literature reviews and competitive analysis to identify innovative approaches for data processing, analytics, and model development.
- Build experimental frameworks to test hypotheses using real-world financial datasets.
- Prototype algorithms in areas such as anomaly detection, graph-based analytics, and natural language processing for compliance workflows.
Data Engineering for Research
- Develop data ingestion, transformation, and exploration pipelines to support experimentation.
- Work with structured, semi-structured, and unstructured datasets at scale.
- Ensure reproducibility and traceability of experiments.
Algorithm Evaluation & Optimization
- Evaluate research prototypes using statistical, ML, and domain-specific metrics.
- Optimize algorithms for accuracy, latency, and scalability.
- Conduct robustness, fairness, and bias evaluations on models.
Collaboration & Integration
- Partner with data scientists to transition validated research outcomes into production-ready code.
- Work closely with product managers to align research priorities with business goals.
- Collaborate with cloud engineering teams to deploy research pipelines in hybrid environments.
Documentation & Knowledge Sharing
- Document experimental designs, results, and lessons learned.
- Share best practices across engineering and data science teams to accelerate innovation.

Qualifications and Skills
Education
- Required: Bachelor's degree in Computer Science, Data Science, Applied Mathematics, or a related field.
- Preferred: Master's or PhD in Machine Learning, Data Engineering, or a related research-intensive field.
Experience
- 4-7 years in data-centric engineering or applied research roles.
- Proven track record of developing and validating algorithms for large-scale data processing or machine learning applications.
- Experience in financial services, compliance, or fraud detection is a strong plus.
Technical Expertise
- Programming: proficiency in Scala, Java, or Python.
- Data processing: experience with Spark, Hadoop, and Flink.
- ML/research frameworks: hands-on with TensorFlow, PyTorch, or scikit-learn.
- Databases: experience with both relational (PostgreSQL, MySQL) and NoSQL databases (MongoDB, Cassandra, Elasticsearch).
- Cloud platforms: experience with AWS (preferred) or GCP for research and data pipelines.
- Tools: familiarity with experiment-tracking tools such as MLflow or Weights & Biases.
- Application deployment: strong experience with CI/CD practices and containerized deployments via Kubernetes, Docker, etc.
- Streaming frameworks: strong experience creating highly performant, scalable real-time streaming applications with Kafka at the core.
- Data lakehouse: experience with a modern data lakehouse platform/format such as Apache Hudi, Iceberg, or Paimon is a very strong plus.
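At its simplest, the anomaly-detection prototyping described above can be sketched as a z-score filter over transaction values (an illustrative toy only; the data, threshold, and function name are hypothetical):

```python
from statistics import mean, stdev

def zscore_anomalies(values, threshold=3.0):
    """Flag points whose distance from the mean exceeds `threshold` standard deviations."""
    mu, sigma = mean(values), stdev(values)
    return [v for v in values if abs(v - mu) / sigma > threshold]

# Eight routine transaction amounts and one outlier.
amounts = [100, 102, 98, 101, 99, 103, 97, 100, 5000]
print(zscore_anomalies(amounts, threshold=2.0))  # [5000]
```

Production systems replace this with far richer models (graph features, sequence models, learned thresholds), but the evaluation loop - score each point, compare to a threshold, flag exceedances - stays recognizably the same.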
Soft Skills
- Strong analytical and problem-solving abilities.
- Clear, concise communication skills for cross-functional collaboration.
- Adaptability in fast-paced, evolving environments.
- Curiosity-driven, with a bias towards experimentation and iteration.

Key Competencies
- Innovation Mindset: ability to explore and test novel approaches that push boundaries in data analytics.
- Collaboration: works effectively with researchers, engineers, and business stakeholders.
- Technical Depth: strong grasp of advanced algorithms and data engineering principles.
- Problem Solving: dives deep into logs, metrics, and code to identify problems and opportunities for performance tuning and optimization.
- Ownership: drives research projects from concept to prototype to production.
- Adaptability: thrives in ambiguity and rapidly changing priorities.

Preferred
- Certifications in AWS Big Data, Apache Spark, or similar technologies.
- Experience in compliance or financial services domains.

Success Metrics
- Research-to-production conversion: percentage of validated research projects integrated into Tookitaki's platform.
- Model performance gains: documented improvements in accuracy, speed, or robustness from research initiatives.
- Efficiency of research pipelines: reduced time from ideation to prototype completion.
- Collaboration impact: positive feedback from cross-functional teams on research integration.

Benefits
- Competitive salary: aligned with industry standards and experience.
- Professional development: access to training in big data, cloud computing, and data integration tools.
- Comprehensive benefits: health insurance and flexible working options.
- Growth opportunities: career progression within Tookitaki's rapidly expanding Services Delivery team.
Introducing Tookitaki
Tookitaki: The Trust Layer for Financial Services
Tookitaki is transforming financial services by building a robust trust layer that focuses on two crucial pillars: preventing fraud to build consumer trust and combating money laundering to secure institutional trust. Our trust layer leverages collaborative intelligence and a federated AI approach, delivering powerful, AI-driven solutions for real-time fraud detection and AML (Anti-Money Laundering) compliance.

How We Build Trust: Our Unique Value Propositions

AFC Ecosystem - Community-Driven Financial Crime Protection
The Anti-Financial Crime (AFC) Ecosystem is a community-driven platform that continuously updates financial crime patterns with real-time intelligence from industry experts. This enables our clients to stay ahead of the latest money laundering and fraud tactics. Leading digital banks and payment platforms rely on Tookitaki to protect them against evolving financial crime threats. By joining this ecosystem, institutions benefit from the collective intelligence of top industry players, ensuring robust protection.

FinCense - End-to-End Compliance Platform
Our FinCense platform is a comprehensive compliance solution that covers all aspects of AML and fraud prevention, from name screening and customer due diligence (CDD) to transaction monitoring and fraud detection. This ensures financial institutions not only meet regulatory requirements but also mitigate the risks of non-compliance, providing the peace of mind they need as they scale.

Industry Recognition and Global Impact
Tookitaki's innovative approach has been recognized by some of the leading financial entities in Asia. We have also earned accolades from key industry bodies such as the FATF and received prestigious awards, including World Economic Forum Technology Pioneer, Forbes Asia 100 to Watch, and Chartis RiskTech100. Serving some of the world's most prominent banks and fintech companies, Tookitaki is continuously redefining the standards of financial crime detection and prevention, creating a safer and more trustworthy financial ecosystem for everyone.
Posted 3 weeks ago
5.0 - 7.0 years
7 - 9 Lacs
Chennai
Work from Office
Job Description
We are looking for a highly skilled Lead Data Analyst with strong expertise in Data Warehousing & Analytics to join our team. The ideal candidate will have extensive experience in designing and managing data solutions, advanced SQL proficiency, and hands-on expertise in Python.

Key Responsibilities
- Design, develop, and maintain scalable data warehouse solutions.
- Write and optimize complex SQL queries for data extraction, transformation, and reporting.
- Develop and automate data pipelines using Python.
- Work with AWS cloud services for data storage, processing, and analytics.
- Collaborate with cross-functional teams to provide data-driven insights and solutions.
- Ensure data integrity, security, and performance optimization.

Qualifications
- 5-7 years of experience in Data Warehousing & Analytics.
- Strong proficiency in writing complex SQL queries, with a deep understanding of query optimization, stored procedures, and indexing.
- Hands-on …
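The pipeline automation this posting mentions can be sketched as a minimal extract-transform-load flow (the CSV payload is invented, and sqlite3 stands in for the real warehouse):

```python
import csv
import io
import sqlite3

# Extract: a hypothetical raw feed (in practice this would come from S3, an API, etc.).
raw = "date,region,sales\n2024-01-01,APAC,100\n2024-01-01,EMEA,150\n2024-01-02,APAC,120\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: cast types and drop records whose sales field is not a valid integer.
clean = [(r["date"], r["region"], int(r["sales"])) for r in rows if r["sales"].isdigit()]

# Load: write the cleaned records into a warehouse table.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sales (date TEXT, region TEXT, amount INTEGER)")
db.executemany("INSERT INTO sales VALUES (?, ?, ?)", clean)

total = db.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 370
```

Real pipelines add scheduling, retries, and incremental loads on top, but the extract / transform / load separation shown here is the structural core.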
Posted 3 weeks ago
5.0 - 8.0 years
7 - 10 Lacs
Pune
Work from Office
":" Agivant is seeking an experienced Modern Microservice Developer to join our team and contribute to the design, development, and optimization of scalable microservices and data processing workflows. The ideal candidate will have expertise in Python, containerization, and orchestration tools, along with strong skills in SQL and data integration. Key Responsibilities: Develop and optimize data processing workflows and large-scale data transformations using Python. Write and maintain complex SQL queries in Snowflake to support efficient data extraction, manipulation, and aggregation. Integrate diverse data sources and perform validation testing to ensure data accuracy and integrity. Design and deploy containerized applications using Docker, ensuring scalability and reliability. Build and maintain RESTful APIs to support microservices architecture. Implement CI/CD pipelines and manage orchestration tools such as Kubernetes or ECS for automated deployments. Monitor and log application performance, ensuring high availability and quick issue resolution. Requirements Mandatory: Bachelors degree in Computer Science, Engineering, or a related field. 5-8 years of experience in Python development, with a focus on data processing and automation. Proficiency in SQL, with hands-on experience in Snowflake. Strong experience with Docker and containerized application development. Solid understanding of RESTful APIs and microservices architecture. Familiarity with CI/CD pipelines and orchestration tools like Kubernetes or ECS. Knowledge of logging and monitoring tools to ensure system health and performance. Preferred Skills: Experience with cloud platforms (AWS, Azure, or GCP) is a plus. ","Work_Experience":"5-8 years (Senior Engineer)","Job_Type":"Full time" , "Job_Opening_Name":"Python Microservice Developer" , "State":"Maharashtra" , "Country":"India" , "Zip_Code":"411045" , "id":"86180000007532968" , "Publish":true , "Date_Opened":"2025-08-05" , "Keep_on_Career_Site":false}]);
Posted 3 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Pune
Work from Office
[{"Remote_Job":false , "Posting_Title":"Sr. Data Engineer" , "Is_Locked":false , "City":"Pune" , "Industry":"IT Services" , "Job_Opening_ID":"RRF_5698" , "Job_Description":" Who are we Fulcrum Digital is an agile and next-generation digital accelerating company providing digital transformation and technology services right from ideation to implementation. These services have applicability across a variety of industries, including banking & financial services, insurance, retail, higher education, food, health care, and manufacturing. TheRole Designing and building optimized data pipelines using cutting-edge technologies in a cloud environment to drive analytical insights. Constructing infrastructure for efficient ETL processes from various sources and storage systems. Leading the implementation of algorithms and prototypes to transform raw data into useful information. Architecting, designing, and maintaining database pipeline architectures, ensuring readiness for AI/ML transformations. Creating innovative data validation methods and data analysis tools. Ensuring compliance with data governance and security policies. Interpreting data trends and patterns to establish operational alerts. Developing analytical tools, programs, and reporting mechanisms. Conducting complex data analysis and presenting results effectively. Preparing data for prescriptive and predictive modeling. Continuously exploring opportunities to enhance data quality and reliability. Applying strong programming and problem-solving skills to develop scalable solutions. Requirements Experience in the Big Data technologies (Hadoop, Spark, Nifi,Impala) 5+ years of hands-on experience designing, building, deploying, testing, maintaining, monitoring, and owning scalable, resilient, and distributed data pipelines. High proficiency in Scala/Java and Spark for applied large-scale data processing. Expertise with big data technologies, including Spark, Data Lake, and Hive. 
Solid understanding of batch and streaming data processing techniques. Proficient knowledge of the Data Lifecycle Management process, including data collection, access, use, storage, transfer, and deletion. Expert-level ability to write complex, optimized SQL queries across extensive data volumes. Experience on HDFS, Nifi, Kafka. Experience on Apache Ozone, Delta Tables, Databricks, Axon(Kafka), Spring Batch, Oracle DB Familiarity with Agile methodologies. Obsession for service observability, instrumentation, monitoring, and alerting. Knowledge or experience in architectural best practices for building data lakes. " , "Job_Type":"Permanent" , "Job_Opening_Name":"Sr. Data Engineer" , "State":"Maharashtra" , "Country":"India" , "Zip_Code":"411001" , "id":"613047000046604967" , "Publish":true , "Keep_on_Career_Site":false , "Date_Opened":"2025-08-12"}]
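The batch-versus-streaming distinction called out in the requirements often comes down to windowed aggregation. A toy tumbling-window sketch in plain Python (the event data and function name are invented; engines like Spark Structured Streaming or Kafka Streams implement the same idea at scale):

```python
from collections import defaultdict

def tumbling_window_sums(events, window_seconds=60):
    """Aggregate (timestamp, value) events into fixed, non-overlapping time windows.
    Each event lands in the window starting at the largest multiple of
    `window_seconds` that is <= its timestamp."""
    windows = defaultdict(int)
    for ts, value in events:
        windows[ts - ts % window_seconds] += value
    return dict(windows)

events = [(0, 5), (30, 7), (65, 2), (119, 1), (120, 9)]
print(tumbling_window_sums(events))  # {0: 12, 60: 3, 120: 9}
```

In a true streaming engine the windows are emitted incrementally as events arrive (with watermarks handling late data), whereas a batch job computes the same aggregation over the full dataset at once.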
Posted 3 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Function: The Data & Analytics team is responsible for integrating new data sources, creating data models, developing data dictionaries, and building machine learning models for Wholesale Bank. The primary objective is to design and deliver data products that assist squads at Wholesale Bank in achieving business outcomes and generating valuable business insights. Within this job family, we distinguish between Data Analysts and Data Scientists. Both roles work with data, write queries, collaborate with engineering teams to source relevant data, perform data munging (transforming data into a format suitable for analysis and interpretation), and extract meaningful insights from the data. Data Analysts typically work with relatively simple, structured SQL databases or other BI tools and packages. Data Scientists, on the other hand, are expected to develop statistical models and be hands-on with machine learning and advanced programming, including Generative AI.

Requirements: We are seeking a highly skilled Data Science, Machine Learning, and Generative AI specialist with 5+ years of relevant experience in advanced analytics, statistical and ML model development, deep learning, and AI research. In this role, the candidate will be responsible for leveraging data-driven insights and machine learning techniques to solve complex business problems, optimize processes, and drive innovation. The ideal candidate will be skilled in working with large datasets to identify opportunities for product and process optimization, and in using models to assess the effectiveness of various actions. They should have substantial experience in applying diverse data mining and analysis techniques, utilizing various data tools, developing and deploying models, creating and implementing algorithms, and conducting simulations. Exposure to Generative AI, including advanced prompt engineering, chain-of-thought techniques, and AI agents, will strengthen a candidacy.
Qualifications:
- Bachelor's, Master's, or Ph.D. in Engineering, Data Science, Mathematics, Statistics, or a related field.
- 5+ years of experience in advanced analytics, machine learning, and deep learning.
- Proficiency in programming languages such as Python, and familiarity with machine learning libraries (e.g., NumPy, Pandas, TensorFlow, Keras, PyTorch, scikit-learn).
- Experience with generative models such as GANs (Generative Adversarial Networks), VAEs (Variational Autoencoders), and transformer-based models (e.g., GPT-3/4, BERT, DALL-E).
- Understanding of model fine-tuning, transfer learning, and prompt engineering in the context of large language models (LLMs).
- Strong experience with data wrangling, cleaning, and transforming raw data into structured, usable formats.
- Hands-on experience in developing, training, and deploying machine learning models for various applications (e.g., predictive analytics, recommendation systems, anomaly detection).
- Experience with cloud platforms (AWS, GCP, Azure) for model deployment and scalability.
- Proficiency in data processing and manipulation techniques.
- Hands-on experience in building data applications using Streamlit or similar tools.
- Advanced knowledge of prompt engineering, chain-of-thought processes, and AI agents.
- Excellent problem-solving skills and the ability to work effectively in a collaborative environment.
- Strong communication skills to convey complex technical concepts to non-technical stakeholders.

Good to Have:
- Experience in the banking/financial services sector.
- Familiarity with cloud-based machine learning platforms such as Azure, AWS, or GCP.
- Proven experience working with OpenAI or similar large language models (LLMs).
- Experience with deep learning, NLP, or computer vision.
- Experience with big data technologies (e.g., Hadoop, Spark) is a plus.
- Certifications in Data Science, Machine Learning, or AI.
Key Responsibilities:
- Extract and analyze data from company databases to drive the optimization and enhancement of product development and marketing strategies.
- Analyze large datasets to uncover trends, patterns, and insights that can influence business decisions.
- Leverage predictive and AI/ML modeling techniques to enhance and optimize customer experience, boost revenue generation, improve ad targeting, and more.
- Design, implement, and optimize machine learning models for a wide range of applications, such as predictive analytics, natural language processing, and recommendation systems.
- Stay up to date with the latest advancements in data science, machine learning, and artificial intelligence to bring innovative solutions to the team.
- Communicate complex findings and model results effectively to both technical and non-technical stakeholders.
- Implement advanced data augmentation, feature extraction, and data transformation techniques to optimize the training process.
- Deploy generative AI models into production environments, ensuring they are scalable, efficient, and reliable for real-time applications.
- Use cloud platforms (AWS, GCP, Azure) and containerization tools (e.g., Docker, Kubernetes) for model deployment and scaling.
- Create interactive data applications using Streamlit for various stakeholders.
- Conduct prompt engineering to optimize AI models' performance and accuracy.
- Continuously monitor, evaluate, and refine models to ensure performance and accuracy.
- Conduct in-depth research on the latest advancements in generative AI techniques and apply them to real-world business problems.
Posted 3 weeks ago
6.0 - 11.0 years
8 - 13 Lacs
Mumbai
Work from Office
Join us as an "FIF Trading - Analyst" at Barclays. The role owner will be responsible to make sure they act as an extension of the trading desk and help with daily reporting, analytics around P&L generation, book management by resolving loss making trades or any other P&L slippages, funding efficiency with complete ownership. The role holder would also have to create tactical analytical tools to help the desk make informed decisions and support their daily/monthly processes. You may be assessed on the key critical skills relevant for success in role, such as experience with FIF Trading - Analyst, as well as job-specific skillsets. To be successful as an "FIF Trading - Analyst", you should have experience with: Basic/ Essential Qualifications: Have a good basic understanding of Fixed Income Financing business flow across region and sub-asset class. Develop and maintain Python-based applications and scripts for automation, data processing, and system integration. Collaborate with IT infrastructure teams to build tools that enhance system monitoring, deployment, and performance. Design RESTful APIs and microservices for internal tools and platforms. Work with databases (SQL/NoSQL) for data storage, retrieval, and reporting. Desirable skillsets/ good to have: Implement CI/CD pipelines and automate deployment workflows using tools like Jenkins, GitLab CI, or similar. Ensure code quality through unit testing, integration testing, and code reviews. Collaborate across front office teams for trades/product to provide creative solutions via actionable analytics using technology stack / FIF data architecture for strategic solutions. Client Metrics/Financial Metrics based dashboards/reports for Top Management. 
The role requires sound understanding of trade life cycle of Securities Lending & Repo Financing, bond analysis and bond pricing concepts along with good python /sql programming skills and an ability to liaise with the trading desks to ensure the timely and accurate delivery of tasks/projects performed. Other Skills Include: This is a position which will require a high level of engagement with a variety of stakeholders across the firm, including Trading, Sales, Risk, Legal, QA, Technology and Operations. The role holder will interact with regional traders to understand the key areas of business, the P&L drivers, the risks involved and work closely to build and maintain solutions/models/reports that can help in making the business more efficient from cost / revenue perspective. Take ownership for managing risk and strengthening controls in relation to the work you do. Ensure that all activities and duties are carried out in full compliance with regulatory requirements, Enterprise-Wide Risk Management Framework and internal Barclays Policies and Policy Standards. Professional experience in Python development across Global Markets/Financing preferred (not mandatory). Required Hands on experience / knowledge on Database languages like Python, SQL with an expert understanding of databases, BI tools like Tableau, Spotfire or Power BI. Strong understanding of object-oriented programming and design patterns. Experience with frameworks like Flask, Django, or FastAPI and version control systems (Git) and agile development methodologies. Knowledge of networking concepts, system security, and IT operations is a plus. Good knowledge on Fixed Income concepts and familiarity across Stock Loan, Securities Lending, Prime Brokerage and Triparty (Finance) in general. Ability to train, upskill and empower the team to enhance team s capability and engagement to be able to attract additional / more complex projects from the regional stakeholders. 
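The bond-pricing concepts the role calls for reduce, in the simplest case, to discounting fixed cash flows. A hedged sketch (the figures are hypothetical, and real desks would add day-count conventions, accrued interest, and curve-based discounting):

```python
def bond_price(face, coupon_rate, yield_rate, periods):
    """Present value of a fixed-coupon bond: discounted coupons plus discounted face value.
    Rates are per period; `periods` is the number of coupon payments remaining."""
    coupon = face * coupon_rate
    pv_coupons = sum(coupon / (1 + yield_rate) ** t for t in range(1, periods + 1))
    pv_face = face / (1 + yield_rate) ** periods
    return pv_coupons + pv_face

# A bond whose yield equals its coupon rate trades at par.
print(round(bond_price(100, 0.05, 0.05, 10), 2))  # 100.0
```

The same function makes the basic price/yield relationship visible: raising the yield above the coupon (say to 6%) pushes the price below par, and lowering it pushes the price above par.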
Excellent problem-solving, communication, and collaboration skills. Well-organised, with good time management skills and the ability to meet tight deadlines and prioritise tasks. This role will be based out of Nirlon Knowledge Park, Mumbai. Purpose of the role The Liquid Financing Platform covers a variety of roles and is central to the delivery of best-in-class products and services, and to the provision of strategic client and risk solutions across the full spectrum of the Prime Financing businesses. Accountabilities Working with clients to optimise the Liquid Financing relationship. Provide best-in-class service and escalation oversight. Primary contact for Liquid Financing clients in areas such as trading, risk, billing and reporting. Provide expertise on industry and regulatory initiatives. Subject matter expert for our clients, with a deep understanding of each client's business mix, operational requirements and product sensitivities. Assistant Vice President Expectations To advise and influence decision making, contribute to policy development and take responsibility for operational effectiveness. Collaborate closely with other functions/business divisions. Lead a team performing complex tasks, using well-developed professional knowledge and skills to deliver work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, appraise performance relative to objectives, and determine reward outcomes. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others. 
OR, for an individual contributor, they will lead collaborative assignments and guide team members through structured assignments, identifying the need for the inclusion of other areas of specialisation to complete assignments. They will identify new directions for assignments and/or projects, identifying a combination of cross-functional methodologies or practices to meet required outcomes. Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues. Identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda. Take ownership for managing risk and strengthening controls in relation to the work done. Perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function. Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategy. Engage in complex analysis of data from multiple internal and external sources of information, such as procedures and practices (in other areas, teams, companies, etc.), to solve problems creatively and effectively. Communicate complex information. Complex information could include sensitive information or information that is difficult to communicate because of its content or its audience. Influence or convince stakeholders to achieve outcomes.
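The bond-pricing concepts this role calls for can be illustrated with a toy sketch; flat-yield discounting of an annual-coupon bond is an assumption chosen for illustration, not the desk's actual pricing model.

```python
# Toy bond pricer: present value of an annual-coupon bond at a flat yield.
# Illustrative only -- real desk models use full curves and day counts.

def bond_price(face: float, coupon_rate: float, ytm: float, years: int) -> float:
    """Price = sum of discounted coupons + discounted face value."""
    coupon = face * coupon_rate
    pv_coupons = sum(coupon / (1 + ytm) ** t for t in range(1, years + 1))
    pv_face = face / (1 + ytm) ** years
    return pv_coupons + pv_face

# A bond whose yield equals its coupon rate prices at par.
print(round(bond_price(100, 0.05, 0.05, 10), 2))  # 100.0
```

Pricing at par when the yield equals the coupon rate is a useful sanity check for any pricer; a higher yield should push the price below par.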
Posted 3 weeks ago
6.0 - 11.0 years
8 - 13 Lacs
gurugram
Work from Office
Position: MLOPs Developer (NV610FCT RM 3522) Job Description: Required Qualifications: 6+ years of experience in machine learning operations or software/platform development. Strong experience with Azure ML, Azure DevOps, Blob Storage, and containerized model deployments on Azure. Strong knowledge of programming languages commonly used in AI/ML, such as Python, R, or C++. Experience with Azure cloud platform, machine learning services, and best practices. Roles: Design, develop, and maintain complex, high-performance, and scalable MLOps systems that interact with AI models and systems. Cooperate with cross-functional teams, including data scientists, AI researchers, and AI/ML engineers, to understand requirements, define project scope, and ensure alignment with business goals. Offer technical leadership and expertise in choosing, evaluating, and implementing software technologies, tools, and frameworks in a cloud-native (Azure + AML) environment. Troubleshoot and resolve intricate software problems, ensuring optimal performance and reliability when interfacing with AI/ML systems. Participate in software development project planning and estimation, ensuring efficient resource allocation and timely solution delivery. Contribute to the development of continuous integration and continuous deployment (CI/CD) pipelines. Contribute to the development of high-performance data pipelines, storage systems, and data processing solutions. Drive integration of GenAI models (e.g., LLMs, foundation models) in production workflows, including prompt orchestration and evaluation pipelines. Support edge deployment use cases via model optimization, conversion (e.g., to ONNX, TFLite), and containerization for edge runtimes. Contribute to the creation and maintenance of technical documentation, including design specifications, API documentation, data models, data flow diagrams, and user manuals. 
Preferred Qualifications: Experience with machine learning frameworks such as TensorFlow, PyTorch, or Keras. Experience with version control systems, such as Git, and CI/CD tools, such as Jenkins, GitLab CI/CD, or Azure DevOps. Knowledge of containerization technologies like Docker and Kubernetes, and infrastructure-as-code tools such as Terraform or Azure Resource Manager (ARM) templates. Experience with Generative AI workflows, including prompt engineering, LLM fine-tuning, or retrieval-augmented generation (RAG). Exposure to GenAI frameworks: LangChain, LlamaIndex, Hugging Face Transformers, OpenAI API integration. Experience deploying optimized models on edge devices using ONNX Runtime, TensorRT, OpenVINO, or TFLite. Hands-on with monitoring LLM outputs, feedback loops, or LLMOps best practices. Familiarity with edge inference hardware like NVIDIA Jetson, Intel Movidius, or ARM Cortex-A/NPU devices. Job Category: Digital_Cloud_Web Technologies Job Type: Full Time Job Location: Gurgaon Experience: 6-10 years Notice period: 0-30 days
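The "monitoring LLM outputs" and evaluation-pipeline items above can be sketched minimally; the token-overlap F1 metric and the function names here are illustrative assumptions, not any particular LLMOps product's API.

```python
# Toy LLM-output evaluation loop: token-overlap F1 against a reference
# answer. Illustrative sketch only; real pipelines use richer metrics.

def token_f1(candidate: str, reference: str) -> float:
    cand, ref = candidate.lower().split(), reference.lower().split()
    common = sum(min(cand.count(t), ref.count(t)) for t in set(cand))
    if common == 0:
        return 0.0
    precision = common / len(cand)
    recall = common / len(ref)
    return 2 * precision * recall / (precision + recall)

def evaluate(outputs: dict, references: dict, threshold: float = 0.5) -> list:
    """Return ids whose score falls below the regression threshold."""
    return [qid for qid, ans in outputs.items()
            if token_f1(ans, references[qid]) < threshold]

flagged = evaluate({"q1": "the sky is blue", "q2": "paris"},
                   {"q1": "the sky is blue", "q2": "the capital is paris"})
print(flagged)  # ['q2']
```

Runs like this can gate a deployment in CI: if the flagged list is non-empty, the new prompt or model version is held back for review.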
Posted 3 weeks ago
12.0 - 15.0 years
45 - 50 Lacs
bengaluru
Work from Office
Role Overview As a Principal Architect at Plum, you will set the architectural vision and technical strategy for Plum's insurance and healthcare platforms. You'll be responsible for designing highly scalable, resilient, and secure platforms that power our product ecosystem. You'll work closely with engineering, product, business and cross-functional stakeholders to define the technical roadmap, design systems, and ensure our platform is robust and secure for long-term scale while enabling rapid innovation. You'll play a critical role in setting engineering best practices, mentoring teams, and solving complex architectural challenges across products. Roles and Responsibilities Define and enforce architectural standards, principles, and patterns across services and platforms. Collaborate with engineering teams to design scalable, event-driven, and secure systems. Work closely with the product and data teams to align engineering goals with business outcomes. Ensure high availability, fault tolerance, and performance of mission-critical applications. Take responsibility for the design, architecture, and delivery of a feature or component/product with the highest quality, given high-level direction. Drive innovation in the platform constantly and remain ahead of the curve. Provide functional, design, and code reviews in related areas of expertise within the team and cross-team. Mentor and coach engineers to facilitate their development and provide technical leadership to them. Evaluate and introduce new technologies, frameworks, and tools to improve developer productivity and system performance. Oversee API design principles, ensuring consistency, maintainability, and security across all services. Rise above detail to see broader issues and implications for the whole product/team. Desired Skills and Experience BE/MS in Computer Science or equivalent. 
12-15 years of strong design/development experience, including significant time in senior or architect-level roles. Proven expertise in at least two backend technology stacks such as Node.js, Java, Python, Go, or similar. Deep understanding of frameworks like Express (Node.js), Spring Boot (Java), Django (Python), or equivalent. Strong experience designing and scaling distributed systems, event-driven architectures, and microservices. Experience integrating multiple data sources and databases, with a strong grasp of data modeling and performance optimization. Strong understanding of API design, security best practices, and high-throughput data processing. Experience with cloud-native architectures (AWS, GCP, or Azure) and infrastructure-as-code tools. Experience and knowledge of open-source software, frameworks and broader cutting-edge technologies. Exposure to Gen AI technologies such as OpenAI and Anthropic APIs, frameworks like LangChain or LlamaIndex; experience with vector databases for retrieval-augmented generation; and familiarity with prompt engineering, fine-tuning, model evaluation, and scalable deployment. Superior organization, communication, interpersonal and leadership skills. Must be a proven performer and team player who enjoys challenging assignments in a high-energy, fast-growing start-up workplace. Must be a self-starter who can work well with minimal guidance and in a fluid environment. Agility and ability to adapt quickly to changing requirements, scope, and priorities.
Posted 3 weeks ago
16.0 - 21.0 years
50 - 55 Lacs
bengaluru
Work from Office
A Position Overview Position Title Data Science Department TDABSG Level/Band VP Role Summary: He or she will be responsible for advancing analytics opportunities in the BFSI sector, managing project delivery, people, and stakeholders. The ideal candidate will have a strong background in advanced analytics and experience in leading high-impact analytics solutions and programs with Indian clients. He/she should have set up and managed a large, scaled business intelligence unit. B Organizational Relationships Reports to SVP Supervises NA C Job Dimensions Geographic Area Covered Pan India Stakeholders Internal TDA-BSG and all other TATA AIA internal departments External Partners D Key Result Areas Stakeholder Management Manage stakeholder interactions through regular updates and grow stakeholder engagement with channel partners. Design, plan, and scope projects with stakeholders. Explain project methodology and project approach to required stakeholders. Project Management and Training Manage end-to-end project deliverables, adhering to timelines, project budgets, and stakeholder expectations. Ensure adherence to Standard Operating Procedures and maintain updated relevant documents. Work with junior team members on projects and coach them on the job. Conduct in-house training based on team requirements. Delivery Strong multi-scale, multi-geography experience of setting up and managing a team of 20+ data scientists. Manage the entire delivery process and be responsible for all aspects of a project to ensure high-quality standards. Understand business problems and address them to solve client problems. E Competencies Competency Proficiency Scale Proficiency Scale Description E.g. Business Acumen and Strategic Orientation Ability to align with the organization's vision, with a fair understanding of the insurance industry, regulations, financial markets and the agency business model, in order to deliver profitable and sustainable business growth. 
3 Has an in-depth understanding of the market of operations and proposes changes if required as per market dynamics. Understands how and to what extent business complexities impact one's own area of work. Is able to identify trends and analyse the performance of self and various branches in one's geography. Has expert knowledge of the BA model, BA compensation, and Agent commission and progression, and uses it to deliver superior performance. Is able to devise plans to recruit, develop and sustain distributors on a long-term basis with the organization. Strives towards achieving proportionate business contribution from all the branches of the assigned territory. G Skills Required Technical Data processing and data science libraries of Python (NumPy, Pandas, scikit-learn, etc.). Experience and understanding of working with massive datasets locally/distributed using tools such as Apache Spark, familiarity with packages like Vaex and Dask, and using these datasets to develop and train machine learning algorithms. Knowledge of analytical models such as promotion optimization, Natural Language Processing (NLP) including experience with BERT and DistilBERT models, cluster analysis, segmentation, next-best recommendation, neural network models, logistic regression (fraud models, lapsation, etc.), ANN-based models, LSTM, Transformers, attention models, bagging and boosting, and generative models. Experience with cloud-based analytics platforms, such as Azure, GCP, or AWS. Familiarity with automated training, deployment, and monitoring of models in production using MLOps pipelines. Experience working on projects end-to-end, not just POCs. Experience in machine learning and deep learning. Deep expertise in Python, SAS. 
Behavioral Essential Desired Interpersonal skills Communication skills Creative thinking skills Supervising/Leadership skills Teamwork Skills Influencing skills Relationship Building skills Decision making skills H Incumbent Characteristics Essential Desired Qualification 1. Graduate/Postgraduate degree from Tier-I/II colleges. 2. Ability to design and review new solution concepts and lead the delivery of high-impact analytics solutions and programs for global clients. 3. Strong understanding of statistical concepts. Experience 16+ years of experience in advanced analytics, running a data science function
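The logistic-regression fraud/lapsation models listed under the technical skills reduce to an idea a toy, pure-Python trainer can show; real work would use scikit-learn or Spark, and the one-feature data here is invented purely for illustration.

```python
import math

# Toy logistic-regression trainer (per-sample gradient descent) for a
# fraud-style binary label. Illustrative only; not a production model.

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

def train(xs, ys, lr=0.5, epochs=2000):
    w, b = [0.0] * len(xs[0]), 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y  # gradient of the log-loss w.r.t. the logit
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

# Separable toy data: a high "amount" feature indicates fraud.
xs = [[0.1], [0.2], [0.8], [0.9]]
ys = [0, 0, 1, 1]
w, b = train(xs, ys)
predict = lambda x: sigmoid(w[0] * x + b) > 0.5
print(predict(0.05), predict(0.95))
```

On this separable data the learned boundary sits near 0.5, so a small amount is classified as legitimate and a large one as fraud.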
Posted 3 weeks ago
8.0 - 13.0 years
10 - 15 Lacs
chennai
Work from Office
About the Team: The ZF COE team effectively communicates complex technical concepts related to AI, ML, DL, and RL to both technical and non-technical audiences. This might involve presenting research findings at conferences or writing papers for academic journals. What you can look forward to as AI Research Scientist (m/f/d): Conduct cutting-edge research to identify and develop novel AI/ML methodologies, including Deep Learning (DL) and Reinforcement Learning (RL). Design and conduct experiments to test hypotheses, validate new approaches, and compare the effectiveness of different ML algorithms. Analyze data to uncover hidden patterns and relationships that can inform the development of new AI techniques. Stay at the forefront of the field by keeping abreast of the latest advancements in algorithms, tools, and theoretical frameworks. This might involve researching areas like interpretability of machine learning models or efficient training methods for deep neural networks. Prototype and explore the potential of advanced machine learning models, including deep learning architectures like convolutional neural networks (CNNs) or recurrent neural networks (RNNs). Contribute to the development of fundamental algorithms and frameworks that can be applied to various machine learning problems. This may involve improving existing algorithms or exploring entirely new approaches. Focus on theoretical aspects of model design, such as improving model efficiency, reducing bias, or achieving explainability in complex models. Document research methodologies, experimental procedures, and theoretical contributions for future reference and knowledge sharing within the research community. Contribute to the development of the research team's long-term research goals and strategies. Your profile as AI Research Scientist (m/f/d): Master's degree in Mathematics, Computer Science or other related technical fields. A PhD is good to have. 
8+ years of experience in research and development, with a strong understanding of data structures and algorithms. 5+ years of experience building and deploying machine learning models in production. Proven experience with machine learning frameworks like TensorFlow, PyTorch, or scikit-learn. Experience with distributed computing systems and large-scale data processing. Excellent communication and collaboration skills. Contributions to the invention disclosure process and a strong publication record in top machine learning conferences or journals.
Posted 3 weeks ago
5.0 - 10.0 years
7 - 11 Lacs
pune
Work from Office
We're looking for an Azure Cloud Engineer to play a critical role in driving DevOps excellence across our organization. The role entails the design, build, and operational aspects of our Azure cloud infrastructure supporting multiple teams. This position will work with the architecture team to deploy cloud-based solutions using PaaS, IaaS, and other Azure resources through IaC, leveraging Terraform. The ideal candidate will possess extensive experience in Azure Cloud and Terraform. You will work closely with the engineering and technical support teams to proactively improve our offerings to our clients from a scalability, reliability, security and speed-to-market perspective. This position will primarily be based out of our Pune office, and the resource is expected to be in the office at least 3 to 4 days a week depending on the workload. Responsibilities: Design, build, and maintain Azure cloud infrastructure to support multiple teams. Collaborate with the architecture team to deploy cloud-based solutions using PaaS, IaaS, and other Azure resources through IaC leveraging Terraform. Develop, implement, and deploy applications to the Azure cloud. Deploy and manage applications leveraging Azure services such as App Gateway, APIM, Container Apps, Azure SQL Database, Redis, etc. Configure and deploy using various network topologies (VNET, NSG, Private DNS, Private Endpoints, ExpressRoute, etc.). Monitor Azure applications using services such as Log Analytics, Application Insights, and Metrics. Provision Azure resources using Infrastructure as Code (IaC) tools such as Terraform. Collaborate with engineering/development squads to design, implement, and optimize CI/CD pipelines. Evaluate and recommend tools, technologies, and processes related to Azure services and DevOps. Ensure deployments follow best practices for scalability, security, and reliability. Enable engineering teams by standardizing DevOps resources and knowledge sharing. 
Stay current on industry trends and advancements in Azure services and DevOps practices. Education: Bachelor's degree in Programming/Systems, Computer Science, or a related field, or equivalent work experience. Work Experience: 5+ years of experience in DevOps roles, with a focus on Azure DevOps as an engineer, technical lead, or related cloud-focused role. 5+ years of experience in software definition, analysis, testing, configuration, and/or development. Broad understanding of information systems and application architecture. Proven track record of leading and delivering complex DevOps projects. Experience working in production environments to support, operate, and maintain applications. Skills and Knowledge: Strong knowledge of Microsoft Azure cloud-native services (e.g., DNS, Application Gateway, Azure SQL Server, App Services, Blob Storage, Load Balancer, WAF). Deep understanding of DevOps technologies including IaaS, PaaS, SaaS, containerization, orchestration, and CI/CD. Proficiency with Infrastructure as Code (IaC) tools such as Terraform. Strong scripting skills (e.g., Bash, Python). Extensive experience with containerization technologies, including Docker and Kubernetes. Strong networking knowledge (IP addressing, virtual networks, security models). Familiarity with observability and site-reliability principles (SLIs, SLOs, SLAs). Good understanding of SQL and relational databases. Experience with logging and monitoring tools (e.g., Azure Monitor, Log Analytics, Dynatrace). Solid knowledge of the OSI model. Excellent problem-solving, presentation, and interpersonal skills. Ability to work effectively under pressure in fast-paced, collaborative environments. Relevant Azure certifications (e.g., Azure DevOps Engineer Expert, Azure Architect, or similar). Desirable: Knowledge of programming languages (e.g., Java, JavaScript). Understanding of data engineering infrastructure and data processing. Cyber Security Essentials or similar certification. 
Experience in vulnerability management (e.g., Checkmarx). Experience with single sign-on products (e.g., SAP Gigya/CDC).
Posted 3 weeks ago
4.0 - 9.0 years
13 - 14 Lacs
bengaluru
Work from Office
Own, manage and coordinate direct tax compliance filing for India and foreign entities with assistance from third-party consultants. Manage multiple third-party service providers to complete global compliance, VAT compliance and various global consulting engagements. Plan and coordinate financial audits for India required for the 12/31 and 3/31 year-ends. Own and manage the local and international monthly closure activities of global consolidation (journal entries, reconciliations, and analysis). Prepare and initiate foreign currency settlement of India inter-company receivables and support bank queries. STPI regulations: preparation and filing of STPI returns; monitor and liaise with India banks to close the Export Data Processing and Monitoring System (EDPMS). Manage ad-hoc projects as and when requested and create and update standard operating procedures (SOPs). Education: Inter (CA), MBA, MCOM. Skill set: 4+ years of work experience. Excellent communication skills. Independent and proactive. Ability to take ownership. Ability to communicate across multiple business functions. Flexibility in working with teams in various time zones.
Posted 3 weeks ago
2.0 - 11.0 years
30 - 35 Lacs
pune
Work from Office
HSBC Electronic Data Processing India Pvt Ltd is looking for a Senior Software Engineer to join our dynamic team and embark on a rewarding career journey. Develop high-quality software design and architecture. Identify, prioritize and execute tasks in the software development life cycle. Develop tools and applications by producing clean, efficient code. Automate tasks through appropriate tools and scripting. Review and debug code. Perform validation and verification testing. Collaborate with internal teams and vendors to fix and improve products. Document development phases and monitor systems. Ensure software is up to date with the latest technologies. Serve as a technical lead contributing to and directing the efforts of development teams, including internal and external team members. Contribute to the ongoing evolution of the existing content supply portfolio of applications and services. Design, develop, modify, implement, and support software components anywhere in the software stack. Determine root cause for the most complex software issues and develop practical, efficient, and permanent technical solutions. Remain current on new technologies and available vendor packages; evaluate and make recommendations as necessary. Assist in task planning, estimation, scheduling, and staffing. Mentor Software Engineers to allow for skill/knowledge development through advice, coaching, and training opportunities. Determine process improvements and best practices, and develop new processes. Work in close partnership with cross-functional teams and management.
Posted 3 weeks ago
7.0 - 12.0 years
10 - 14 Lacs
bengaluru
Work from Office
Your future role Take on a new challenge and apply your data engineering expertise in a cutting-edge field. You'll work alongside collaborative and innovative teammates. You'll play a key role in enabling data-driven decision-making across the organization by ensuring data availability, quality, and accessibility. Day-to-day, you'll work closely with teams across the business (e.g., Data Scientists, Analysts, and ML Engineers), mentor junior engineers, and contribute to the architecture and design of our data platforms and solutions. You'll specifically take care of designing and developing scalable data pipelines, but also of managing and optimizing object storage systems. We'll look to you for: Designing, developing, and maintaining scalable and efficient data pipelines using tools like Apache NiFi and Apache Airflow. Creating robust Python scripts for data ingestion, transformation, and validation. Managing and optimizing object storage systems such as Amazon S3, Azure Blob, or Google Cloud Storage. Collaborating with Data Scientists and Analysts to understand data requirements and deliver production-ready datasets. Implementing data quality checks, monitoring, and alerting mechanisms. Ensuring data security, governance, and compliance with industry standards. Mentoring junior engineers and promoting best practices in data engineering. All about you We value passion and attitude over experience. That's why we don't expect you to have every single skill. Instead, we've listed some that we think will help you succeed and grow in this role: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 7+ years of experience in data engineering or a similar role. Strong proficiency in Python and data processing libraries (e.g., Pandas, PySpark). Hands-on experience with Apache NiFi for data flow automation. Deep understanding of object storage systems and cloud data architectures. Proficiency in SQL and experience with both relational and NoSQL databases. 
Familiarity with cloud platforms (AWS, Azure, or GCP). Exposure to the Data Science ecosystem, including tools like Jupyter, scikit-learn, TensorFlow, or MLflow. Experience working in cross-functional teams with Data Scientists and ML Engineers. Cloud certifications or relevant technical certifications are a plus.
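The data-quality checks this role mentions can be sketched as a minimal validation gate; the column names and rules below are illustrative assumptions, not a specific platform's API.

```python
# Minimal data-quality gate for a pipeline stage: each rule maps a column
# to a predicate; rows failing any rule are quarantined instead of loaded.

def validate(rows: list, rules: dict) -> tuple:
    good, quarantined = [], []
    for row in rows:
        ok = all(col in row and check(row[col]) for col, check in rules.items())
        (good if ok else quarantined).append(row)
    return good, quarantined

rules = {
    "order_id": lambda v: isinstance(v, int) and v > 0,
    "amount": lambda v: isinstance(v, (int, float)) and v >= 0,
}
rows = [{"order_id": 1, "amount": 9.5},
        {"order_id": -3, "amount": 2.0},   # bad id -> quarantine
        {"order_id": 2, "amount": -1.0}]   # negative amount -> quarantine
good, bad = validate(rows, rules)
print(len(good), len(bad))  # 1 2
```

Quarantining rather than dropping bad rows keeps the pipeline auditable: the quarantine set feeds the monitoring and alerting mechanisms the posting describes.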
Posted 3 weeks ago
5.0 - 10.0 years
7 - 17 Lacs
bengaluru
Work from Office
About this role: Wells Fargo is seeking a Lead Software Engineer. In this role, you will: Lead complex technology initiatives, including those that are companywide with broad impact. Act as a key participant in developing standards and companywide best practices for engineering complex and large-scale technology solutions for technology engineering disciplines. Design, code, test, debug, and document for projects and programs. Review and analyze complex, large-scale technology solutions for tactical and strategic business objectives, the enterprise technological environment, and technical challenges that require in-depth evaluation of multiple factors, including intangibles or unprecedented technical factors. Make decisions in developing standard and companywide best practices for engineering and technology solutions, requiring understanding of industry best practices and new technologies, influencing and leading the technology team to meet deliverables and drive new initiatives. Collaborate and consult with key technical experts, the senior technology team, and external industry groups to resolve complex technical issues and achieve goals. Lead projects and teams, or serve as a peer mentor. Required Qualifications: 5+ years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education. Desired Qualifications: 5+ years of Software Engineering experience. 5+ years of experience with large-volume data processing and big data tools such as Apache Spark, SQL, Scala and Hadoop technologies. Hands-on Core Java experience. Experience with data processing pipelines, including Spark Streaming and Spark SQL. Advanced understanding of data modeling, SQL and database design. Advanced knowledge of database technologies: MongoDB and other SQL/NoSQL databases. Engineering and development experience on middleware messaging platforms: IBM MQ, Solace, and Confluent Kafka. 
Experience with DevOps practices in cloud-based data environments Experience with Agile Scrum (Daily Standup, Sprint Planning and Sprint Retrospective meetings) Job Expectations: Ability to work in a fast-paced environment both as an individual contributor and a tech lead Consistently demonstrates clear and concise written and verbal communication Strong understanding of distributed computing principles and experience with distributed data processing frameworks Exposure to data governance, security and compliance frameworks Familiarity with data observability tools
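The windowed aggregations that Spark Streaming provides can be illustrated with a pure-Python toy; this is a sketch of the tumbling-window pattern only, since Spark distributes the computation across executors and additionally handles late data, watermarks, and state.

```python
from collections import defaultdict

# Toy tumbling-window aggregation -- the pattern Spark Streaming's
# window operations implement at scale. Pure Python, illustration only.

def tumbling_sum(events, window_secs):
    """events: (epoch_seconds, value) pairs -> {window_start: sum(values)}."""
    windows = defaultdict(float)
    for ts, value in events:
        window_start = ts - (ts % window_secs)  # bucket by window start
        windows[window_start] += value
    return dict(windows)

events = [(0, 1.0), (5, 2.0), (12, 3.0), (19, 4.0), (21, 5.0)]
print(tumbling_sum(events, 10))  # {0: 3.0, 10: 7.0, 20: 5.0}
```

Each event lands in exactly one non-overlapping window; sliding windows differ only in that an event contributes to every window it overlaps.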
Posted 3 weeks ago
8.0 - 13.0 years
11 - 15 Lacs
bengaluru
Work from Office
We are looking for a Lead Software Engineer - Python. You'll make an impact by: Lead the design and development of Python-based software components for AI-driven systems. Define and uphold coding standards, best practices, and architectural principles (OOP, SOLID, design patterns). Collaborate with AI/ML engineers to integrate and productionize Machine Learning, Deep Learning, and Generative AI models. Architect and develop scalable RESTful APIs and backend systems using FastAPI or equivalent frameworks. Advocate for performance optimization, testability, and non-functional requirements (NFRs) across all solutions. Champion CI/CD practices and observability (logging, metrics, tracing), and maintain system reliability at scale. Mentor junior engineers and create a culture of high-quality, maintainable software development. Contribute to solution design for RAG (Retrieval-Augmented Generation) and Agentic AI workflows. Collaborate with multi-disciplinary teams to align software solutions with AI and business goals. Use your skills to move the world forward! Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 8+ years of hands-on experience in backend software development, with significant expertise in Python. Proven experience in leading and mentoring software development teams. Strong command of software architecture, design patterns, and clean coding principles. Experience in building and scaling API-based systems, preferably with FastAPI or similar frameworks. Solid understanding of integrating ML/DL/GenAI models into production applications. Familiarity with RAG, single- and multi-agent architectures, and AI solution patterns. Practical experience with AWS (SageMaker, Bedrock) and/or Azure (ML Studio, OpenAI Service). Exposure to MLOps, model versioning, and observability tools. Working knowledge of Java or Rust is a plus. Experience designing software systems for cloud-native environments. 
Prior experience working in the Power and Energy domain. Familiarity with scalable data processing, real-time systems, or event-driven architectures. Exposure to open-source tools and frameworks in the GenAI ecosystem (e.g., LangGraph, LlamaIndex, SmolAgents). Deep commitment to quality, performance, and engineering excellence.
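The RAG workflows referenced above hinge on a retrieval step that can be sketched in miniature; the bag-of-words "embedding" and cosine similarity below stand in for a real embedding model and vector store, an assumption made purely for illustration.

```python
import math
from collections import Counter

# Toy retrieval step of a RAG pipeline: bag-of-words vectors and cosine
# similarity replace a real embedding model and vector database.

def embed(text: str) -> Counter:
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def retrieve(query: str, docs: list) -> str:
    """Return the document most similar to the query."""
    return max(docs, key=lambda d: cosine(embed(query), embed(d)))

docs = ["invoices are paid net 30",
        "the api gateway handles authentication",
        "models are retrained weekly"]
print(retrieve("how does the gateway authenticate", docs))
```

The retrieved chunk is then prepended to the prompt as context for the generator, which is the "augmented" half of retrieval-augmented generation.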
Posted 3 weeks ago
1.0 - 3.0 years
3 - 5 Lacs
mohali
Remote
Role & responsibilities
1. Data Pipeline and Model Development
2. ML/AI Project Implementation
3. LLM Development and Implementation
4. Quality Assurance and Testing
5. Research and Development
6. Collaboration and Communication

Preferred candidate profile
1. Machine Learning Algorithms
2. Deep Learning Architectures
3. Mathematics & Statistics
4. Data Structures & Algorithms
5. MLOps Tools and Practices
6. Big Data Technologies
7. Cloud Computing Platforms
8. Model Optimization Techniques
9. Data Privacy & Security
10. Software Development Life Cycle
11. Transformer Architecture
12. LLM Frameworks and Tools
13. Token-based Processing
14. Context Window Management
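Of the skills listed, context window management is concrete enough to sketch: keep the most recent messages whose combined token count fits a budget. Whitespace token counting here is an assumption standing in for a real tokenizer.

```python
# Toy context-window manager: keep the most recent messages whose total
# token count fits the window. Naive whitespace "tokens" stand in for a
# real tokenizer -- illustration only.

def fit_context(messages: list, max_tokens: int) -> list:
    kept, used = [], 0
    for msg in reversed(messages):      # walk newest-first
        cost = len(msg.split())
        if used + cost > max_tokens:
            break
        kept.append(msg)
        used += cost
    return list(reversed(kept))         # restore chronological order

history = ["hello there", "tell me about repo financing",
           "sure here is a summary", "and the risks"]
print(fit_context(history, 9))
```

Production systems layer summarisation or retrieval on top of this truncation so that information from evicted turns is not simply lost.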
Posted 3 weeks ago
4.0 - 9.0 years
10 - 20 Lacs
pune, mumbai (all areas)
Work from Office
Job Title: Process Mining & Data Analytics Analyst - Celonis
Job Type: Full-Time
Experience Range: 4-6 Years
Job Location: Mumbai / Pune
Work Mode: WFO
Max Budget: Up to 20 LPA (including 10% variable)
Notice Period: 0-15 Days

Job Summary:
We are seeking a highly analytical and detail-oriented Analyst with strong expertise in process mining tools (e.g., Celonis, UiPath Process Mining, Signavio), SQL, PQL, Power BI, and VBA. The ideal candidate will support business stakeholders by uncovering insights into process performance, identifying bottlenecks and inefficiencies, and driving data-driven decision-making across the organization.

Key Responsibilities:
• Utilize process mining tools to analyze end-to-end business processes and identify improvement opportunities.
• Build and maintain SQL queries to extract, transform, and load (ETL) data from various sources.
• Develop interactive and insightful dashboards and reports using Power BI to visualize KPIs and trends.
• Automate recurring data tasks and report generation using VBA within Excel and other Microsoft Office tools.
• Collaborate with cross-functional teams to understand business processes, pain points, and requirements.
• Translate complex datasets into actionable insights for stakeholders across operations, finance, and IT.
• Support digital transformation and process improvement initiatives with data-driven evidence.
• Document data models, process flows, and reporting logic for transparency and maintainability.

Required Skills and Experience:
• Bachelor's degree in Computer Science, Information Systems, Business Analytics, Engineering, or a related field.
• 5+ years of experience in data analysis, process mining, or business intelligence roles.
• Hands-on experience with process mining tools (e.g., Celonis, UiPath Process Mining, Disco, Signavio).
• Proficient in SQL for data extraction and manipulation.
• Strong proficiency in Power BI (including DAX, data modeling, custom visuals).
• Skilled in VBA for Excel automation and custom tool development.
• Solid understanding of business processes in domains such as finance, procurement, operations, or supply chain.
• Strong analytical, problem-solving, and communication skills.
• Ability to work independently and manage multiple tasks in a fast-paced environment.

Preferred Qualifications:
• Experience with scripting languages like Python or R is a plus.
• Familiarity with ERP systems (e.g., SAP, Oracle).
• Knowledge of Lean Six Sigma or other process improvement methodologies.
• Certification in process mining tools (e.g., Celonis Analyst Certification) is a plus.

Education: B.E., B.Tech, BCA, MCA, B.Sc IT, or a similar qualification in Computer Science
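The ETL responsibility described in this posting can be sketched in miniature. This is purely a hypothetical illustration: the table names, columns, and the use of SQLite are all assumptions for the sketch, not details taken from the posting.

```python
import sqlite3

def etl_case_durations(conn):
    """Extract raw process events, transform them into per-case
    durations, and load the result into a reporting table."""
    cur = conn.cursor()
    # Extract + transform + load in one statement: compute each
    # case's start/end timestamps and persist them for reporting.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS case_durations AS
        SELECT case_id,
               MIN(event_ts) AS started_at,
               MAX(event_ts) AS finished_at,
               MAX(event_ts) - MIN(event_ts) AS duration_s
        FROM process_events
        GROUP BY case_id
    """)
    conn.commit()

# Tiny in-memory demo with fabricated events.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE process_events (case_id TEXT, event_ts INTEGER)")
conn.executemany(
    "INSERT INTO process_events VALUES (?, ?)",
    [("A", 100), ("A", 160), ("B", 200), ("B", 230)],
)
etl_case_durations(conn)
rows = dict(conn.execute(
    "SELECT case_id, duration_s FROM case_durations").fetchall())
print(rows)  # {'A': 60, 'B': 30}
```

In a process-mining context, per-case duration tables like this are a typical input for bottleneck dashboards; the same shape of query would be written in PQL or against a warehouse rather than SQLite.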
Posted 3 weeks ago
5.0 - 9.0 years
19 - 20 Lacs
bengaluru
Work from Office
We are seeking a skilled and motivated individual to join our team. The ideal candidate will have experience working with Microsoft CoPilot Studio and the Azure Communication Framework, and a passion for building scalable and reliable solutions.

Key Responsibilities:
• Design, develop, and implement custom applications using CoPilot Studio, ensuring alignment with business requirements.
• Integrate and maintain communication tools and services utilizing the Azure Communication Framework.
• Collaborate with product owners and technical leads to design effective workflows and solutions.
• Develop, test, and deploy end-to-end solutions, ensuring scalability, reliability, and maintainability.
• Troubleshoot, debug, and optimize applications for maximum performance and functionality.
• Conduct code reviews and follow best practices for code quality, version control, and testing.
• Stay updated with the latest trends in Microsoft CoPilot, Azure technologies, and other relevant tools to enhance solutions.
Posted 3 weeks ago
1.0 - 2.0 years
3 - 6 Lacs
dhule
Work from Office
Skills:
• Google Cloud Platform: GCS, DataProc, BigQuery, Dataflow
• Programming languages: Java; scripting languages such as Python, Shell Script, SQL

Requirements:
• 5+ years of experience in IT application delivery, with proven experience in agile development methodologies
• 1 to 2 years of experience in Google Cloud Platform (GCS, DataProc, BigQuery, Composer, and data processing with Dataflow)
Posted 3 weeks ago
13.0 - 17.0 years
32 - 35 Lacs
noida
Work from Office
Skills:
• Google Cloud Platform: GCS, DataProc, BigQuery, Dataflow
• Programming languages: Java; scripting languages such as Python, Shell Script, SQL

Requirements:
• 5+ years of experience in IT application delivery, with proven experience in agile development methodologies
• 1 to 2 years of experience in Google Cloud Platform (GCS, DataProc, BigQuery, Composer, and data processing with Dataflow)
Posted 3 weeks ago
0.0 - 1.0 years
1 - 1 Lacs
chennai
Work from Office
Role: Data Processing Executive
Location: Chennai

Responsibilities:
• Validate and correct OCR-extracted data
• Ensure accuracy and consistency
• Update internal systems with clean data
• Resolve data discrepancies
• Maintain SLAs and quality standards

Eligibility: UG/PG & Freshers

Benefits:
• Gratuity
• Provident fund
• Referral bonus
• Performance bonus
• Annual bonus
• Health insurance
• Maternity benefits in mediclaim policy
• Shift allowance
• Overtime allowance
Posted 3 weeks ago