10.0 years
0 Lacs
india
Remote
Title: Senior Snowflake Consultant
Position: Snowflake Data Architect
Location: Remote
Experience Required: 8–10 years (minimum 6 years in Snowflake & Matillion; BFSI domain expertise mandatory)

We are seeking a highly skilled Snowflake expert with hands-on experience in Matillion ETL and proven expertise in the banking and financial services domain. The ideal candidate will design, develop, and optimize scalable data pipelines, data models, and analytics solutions on Snowflake, ensuring compliance, security, and performance to meet regulatory and business needs.

Responsibilities: Design, implement, and optimize Snowflake data warehouses, schemas, and data-sharing solutions. Define and implement best practices for performance tuning, cost optimization, and data governance. Create scalable data models supporting BFSI use cases (e.g., regulatory reporting, fraud detection, risk analytics). Build, schedule, and monitor complex data pipelines using Matillion. Optimize ELT workflows for large-scale BFSI data volumes (transactional, customer, risk, and compliance datasets). Implement robust error handling, logging, and recovery mechanisms. Translate banking data requirements into scalable Snowflake solutions. Work with structured and unstructured data from core banking systems, payments, credit risk, AML/KYC, and collections. Ensure regulatory compliance in data management. Partner with business stakeholders, data scientists, and reporting teams to deliver analytical datasets. Collaborate with cloud engineers to ensure secure, compliant cloud deployments. Support and mentor junior engineers on Snowflake and Matillion best practices.

Requirements: Minimum 8 years of experience, with over 6 years of hands-on experience with Snowflake (data modeling, performance tuning, secure data sharing, resource monitors, streams & tasks). Advanced Matillion ETL development (or similar cloud-native ETL tools). Proficiency in SQL, Python, and cloud services (AWS/GCP/Azure).
Experience with CI/CD pipelines, Git, and DevOps integration for data workflows. Deep understanding of financial data models (customer 360, payments, credit/debit, claims, risk, fraud). Exposure to BFSI regulations and compliance in data handling and reporting. Experience delivering analytics for fraud detection, credit risk scoring, and AML/KYC. Strong analytical and problem-solving skills. Excellent communication and stakeholder management. Ability to lead data discussions with both business and technical teams.

Preferred Qualifications: Snowflake SnowPro Advanced Architect / Data Engineer certification. Experience with Tableau/Power BI for reporting-layer integration. BFSI-specific project experience in core banking, digital payments, or wealth management analytics. Familiarity with data security, masking, tokenization, and encryption in a BFSI context.
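The masking and tokenization skills the posting asks for can be illustrated with a short sketch. This is a hypothetical Python example of deterministic surrogate generation for a card number, keeping the last four digits for reconciliation; it is not any vendor's actual scheme. Real BFSI deployments use a token vault or format-preserving encryption (e.g., FF1), with keys held in a KMS/HSM rather than hard-coded.

```python
import hmac
import hashlib

def tokenize_pan(pan: str, key: bytes) -> str:
    """Replace a card number with a deterministic surrogate, keeping
    the last four digits. Illustrative sketch only."""
    digest = hmac.new(key, pan.encode(), hashlib.sha256).hexdigest()
    # Map hex characters onto digits so the token stays format-compatible.
    surrogate = "".join(str(int(c, 16) % 10) for c in digest[:12])
    return surrogate + pan[-4:]

key = b"demo-key"  # hypothetical key; real keys come from a KMS/HSM
token = tokenize_pan("4111111111111111", key)
print(len(token), token.endswith("1111"), token != "4111111111111111")
```

Because the mapping is keyed and deterministic, the same PAN always yields the same token (useful for joins across datasets) while the raw value never leaves the tokenization boundary.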
Posted 6 days ago
0 years
0 Lacs
mumbai, maharashtra, india
On-site
MAIN DUTIES/RESPONSIBILITIES:

Technical & Delivery Leadership: Own and manage the entire lifecycle of wallet implementation projects, from initiation and planning to certification and go-live. Lead architectural and technical specification reviews, ensuring alignment with digital wallet provider and Visa standards. Oversee API integration efforts, including ISO and API formats for card payment processing. Ensure secure and compliant implementation of Visa and Mastercard tokenization, including token provisioning, ID&V flows, and fraud mitigation strategies. Drive operational readiness assessments, provisioning workflows, and performance validation.

Stakeholder & Project Management: Act as the primary liaison between internal teams, clients, wallet partners, and other external partners. Coordinate cross-functional efforts to ensure timely, compliant, and high-quality delivery. Facilitate collaboration across business, technology, and compliance teams, resolving roadblocks and aligning priorities. Lead project planning, resource allocation, milestone tracking, and risk management. Communicate project status, risks, and dependencies to senior leadership and client stakeholders.

Compliance & Certification: Guide clients through the digital wallet technical specification and certification process, including lab testing. Ensure full compliance with digital wallet functional/operational guidelines and Visa's functional and security requirements. Maintain detailed documentation, audit trails, and readiness for internal and external reviews. Monitor and report on key performance indicators such as provisioning success rates, fraud rates, and user adoption metrics.

SKILLS & EXPERIENCE

Qualifications & Expertise: Strong knowledge of card payment processing standards (ISO, APIs). Proven experience in API implementation and integration projects. Deep understanding of Visa Token Service (VTS), issuer APIs, and mobile app provisioning.
Familiarity with digital wallet specifications, operational guidelines, and certification processes. Demonstrated ability to manage multiple stakeholders including clients, technical teams, and external partners. Excellent project management skills with a track record of successful end-to-end delivery. Bachelor’s or Master’s degree in Computer Science, Engineering, Business, or related field. Certification in Project Management or Agile methodologies. Prior experience with Visa, MasterCard, or other payment networks is a strong advantage.
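The card payment processing standards referenced above all rest on basic PAN validation; as a small illustration, the Luhn checksum that card numbers must pass (ISO/IEC 7812) can be sketched in a few lines. This is a generic textbook implementation, not Visa's or any network's code.

```python
def luhn_valid(pan: str) -> bool:
    """Return True if the digit string passes the Luhn checksum
    used by payment card numbers (ISO/IEC 7812)."""
    digits = [int(d) for d in pan]
    checksum = 0
    # Double every second digit from the right; subtract 9 if the result > 9.
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

print(luhn_valid("4111111111111111"))  # a well-known valid test PAN
```

Provisioning flows typically reject a PAN that fails this check before any token request is made, since the checksum catches most single-digit typos and transpositions.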
Posted 6 days ago
3.0 - 4.0 years
5 - 8 Lacs
bengaluru, karnataka, india
On-site
Key Responsibilities: Globalize, localize, and test features using native-language expertise. Collaborate with internal teams to address technical and operational challenges. Problem-solving/troubleshooting: identify and troubleshoot technical issues in a timely manner. Communication: effectively communicate with team members, stakeholders, and clients. Learning: stay updated with the latest industry trends and technologies. Continuously evaluate and improve processes to enhance the partner experience and drive business growth.

Language Proficiency: Native read/write/spoken proficiency in one or more of the following languages is essential:

Language: Priority
Punjabi and Hindi: P0
Tamil: P0
Bengali and Hindi: P1
Gujarati and Hindi: P1
Malayalam: P1
Urdu and Hindi: P2
Telugu: P2

Technical Requirements:

Must-Have Skills: Strong understanding of software internationalization, localization processes, and cross-site communication challenges. Knowledge of localization of search features and how cultural nuances impact search results. Advanced proficiency in Google Sheets (formulas, data manipulation, analysis). Ability to create data visualizations and dashboards to monitor localization progress and efficiency. Ability to handle dynamic changes and evolving priorities in an agile environment.

Good-to-Have Skills: Understanding of basic NLP text preprocessing (tokenization, stopword removal, etc.). Part-of-speech tagging: assigning parts of speech to words in a sentence (e.g., noun, verb). Basic understanding of metrics like precision, recall, and accuracy. Basic analytics skills.
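The good-to-have NLP items above (tokenization, stopword removal, precision/recall) amount to only a few lines of code. A minimal sketch, with a deliberately tiny stopword list chosen for illustration:

```python
import re

def tokenize(text: str) -> list[str]:
    """Lowercase and split on anything that is not a letter or digit."""
    return re.findall(r"[a-z0-9]+", text.lower())

STOPWORDS = {"the", "is", "a", "of"}  # tiny illustrative list

def preprocess(text: str) -> list[str]:
    """Tokenize, then drop stopwords."""
    return [t for t in tokenize(text) if t not in STOPWORDS]

def precision_recall(retrieved: set, relevant: set) -> tuple[float, float]:
    """Precision = hits/retrieved, recall = hits/relevant."""
    hits = len(retrieved & relevant)
    return hits / len(retrieved), hits / len(relevant)

print(preprocess("The quality of search is the product"))
p, r = precision_recall({"doc1", "doc2", "doc3"}, {"doc1", "doc4"})
print(p, r)
```

For localized search, the tokenizer and stopword list are exactly where cultural and script differences enter: a regex like the one above only suits Latin-script text, and each target language needs its own rules.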
Posted 6 days ago
10.0 years
0 Lacs
gurugram, haryana, india
Remote
Our story

At Alight, we believe a company's success starts with its people. At our core, we Champion People, help our colleagues Grow with Purpose and, true to our name, we encourage colleagues to "Be Alight." Our Values: Champion People: be empathetic and help create a place where everyone belongs. Grow with Purpose: be inspired by our higher calling of improving lives. Be Alight: act with integrity, be real and empower others. It's why we're so driven to connect passion with purpose. Alight helps clients gain a benefits advantage while building a healthy and financially secure workforce by unifying the benefits ecosystem across health, wealth, wellbeing, absence management and navigation. With a comprehensive total rewards package, continuing education and training, and tremendous potential with a growing global organization, Alight is the perfect place to put your passion to work. Join our team if you Champion People, want to Grow with Purpose through acting with integrity and if you embody the meaning of Be Alight. Learn more at careers.alight.com

About the Role – Senior Data Architect

As a Senior Data Architect on our Global Data & Advanced Analytics team, you will be the visionary and designer of our data ecosystem. You'll leverage your expertise in AWS cloud-native technologies and big data platforms to build scalable data solutions that empower advanced analytics and AI-driven products. This is a senior role for a hands-on leader who can both strategize at the enterprise level and dive into technical design. You will collaborate with cross-functional teams to ensure our data architecture is robust, secure, and aligned with business needs, enabling Alight's mission to provide insightful, real-time solutions in health, wealth, and human capital.

Responsibilities:

Design and Strategy: Work with the data architecture team to define data architecture blueprints for our products, including data flow diagrams, system integrations, and storage solutions.
Continuously refine the architecture to meet evolving business requirements and to incorporate new AWS capabilities and industry best practices.

Cloud Data Platform Development: Lead the development of our cloud-based data platform on AWS. Implement data pipelines and warehouses using AWS services, e.g., AWS Glue for ETL, AWS Lambda for serverless processing, Amazon Redshift for data warehousing, and S3 for data storage.

Big Data & Legacy Integration: Oversee the ingestion of large-scale datasets from various sources (transactional systems, APIs, external files). Optimize processing of big data using Spark and integrate legacy Hadoop-based data into our AWS environment.

Data Modeling: Develop and maintain data models (conceptual, logical, physical) for our databases and data lakes. Design relational schemas and dimensional models that cater to both operational applications and analytical workloads. Ensure data is organized for easy access and high performance (for example, optimizing Redshift schema design and using partitioning or sort keys appropriately).

Advanced Analytics Enablement: Work closely with Data Science and Analytics teams to enable AI and advanced analytics. Provide well-structured data sets and create pipelines that feed machine learning models (e.g., customer personalization models, predictive analytics). Implement mechanisms to handle real-time streaming data (using tools like Kinesis or Kafka if needed) and ensure data quality and freshness for AI use cases.

Efficiency and Scalability: Design efficient, scalable processes for data handling. This includes optimizing ETL jobs (monitoring and tuning Glue/Spark jobs), implementing incremental data loading strategies instead of full loads where possible, and ensuring our data infrastructure can scale to growing data volumes. You will continually seek opportunities to automate manual data management tasks and improve pipeline reliability (CI/CD for data pipelines).
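The incremental-loading strategy mentioned above (loading only changed rows instead of full reloads) usually follows a watermark pattern. A minimal sketch in plain Python, with hypothetical row and column names; real pipelines persist the watermark in a control table and also handle late-arriving data:

```python
from datetime import datetime

def incremental_load(rows, last_watermark):
    """Select only rows newer than the stored watermark and advance it.
    Returns (batch, new_watermark). Sketch of the pattern only."""
    batch = [r for r in rows if r["updated_at"] > last_watermark]
    new_watermark = max((r["updated_at"] for r in batch), default=last_watermark)
    return batch, new_watermark

rows = [
    {"id": 1, "updated_at": datetime(2024, 1, 1)},
    {"id": 2, "updated_at": datetime(2024, 1, 3)},
]
batch, wm = incremental_load(rows, datetime(2024, 1, 2))
print([r["id"] for r in batch], wm)
```

In a Glue or Spark job the filter becomes a pushed-down predicate on the source query, so only the delta ever leaves the source system.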
Data Governance & Security: Embed data governance into the architecture: implement data cataloging, lineage tracking, and governance policies. Ensure compliance with data privacy and security standards: implement access controls, encryption (at rest and in transit), and data retention policies aligned with Alight and client requirements. Work with the InfoSec team to perform regular audits of data access and to support features like data masking or tokenization for sensitive information.

Collaboration and Leadership: Collaborate with other technology leadership and architects, product managers, business analysts, and engineering leads to understand data needs and translate them into technical solutions. Provide technical leadership to data engineers: set development standards, guide them in choosing the right tools/approaches, and conduct design/code reviews. Lead architecture review sessions and be the go-to expert for any questions on data strategy and implementation.

Innovation and Thought Leadership: Stay abreast of emerging trends in data architecture, big data, and AI. Evaluate and recommend new technologies or approaches (for example, evaluate the use of data lakehouses, graph databases, or new AWS analytics services). Provide thought leadership on how Alight can leverage data for competitive advantage, and pilot proof-of-concepts for new ideas.

Required Qualifications:

Experience: 10+ years (preferred 15+ years) of experience in data architecture, data engineering, or related fields, with a track record of designing and implementing large-scale data solutions. Demonstrated experience leading data-centric projects from concept to production. Hands-on experience in: AWS cloud and big data; data modeling and warehousing.

Programming & Scripting: Proficiency in programming for data engineering: Python (or Scala/Java) for ETL/ELT scripting, and solid SQL skills for data manipulation and analysis.
Experience with infrastructure-as-code (Terraform/CloudFormation) and CI/CD pipelines for deploying data infrastructure is a plus. Analytics and AI orientation. Leadership and soft skills.

Education: Bachelor's degree in Computer Science, Information Systems, or a related field required. (Master's degree in a relevant field is a plus.)

Certifications: (Preferred) AWS Certified Solutions Architect or AWS Certified Data Analytics certification. Any big data or database certifications (Cloudera Data Platform, Oracle/SQL Server certs, etc.) will be a plus and reinforce your expertise in the field.

Location: Noida/Gurgaon/Chennai

Alight requires all virtual interviews to be conducted on video. Alight has been a leader in flexible workplaces and has been named a "Top 100 Company for Remote Jobs" five years in a row.

Benefits: We offer programs and plans for a healthy mind, body, wallet and life because it's important our benefits care for the whole person. Options include a variety of health coverage options, wellbeing and support programs, retirement, vacation and sick leave, maternity, paternity & adoption leave, continuing education and training, as well as several voluntary benefit options.

By applying for a position with Alight, you understand that, should you be made an offer, it will be contingent on your undergoing and successfully completing a background check consistent with Alight's employment policies. Background checks may include some or all of the following based on the nature of the position: SSN/SIN validation, education verification, employment verification, criminal check, a search against global sanctions and government watch lists, credit check, and/or drug test. You will be notified during the hiring process which checks are required by the position.

Our commitment to Inclusion: We celebrate differences and believe in fostering an environment where everyone feels valued, respected, and supported.
We know that diverse teams are stronger, more innovative, and more successful. At Alight, we welcome and embrace all individuals, regardless of their background, and are dedicated to creating a culture that enables every employee to thrive. Join us in building a brighter, more inclusive future. As part of this commitment, Alight will ensure that persons with disabilities are provided reasonable accommodations for the hiring process. If reasonable accommodation is needed, please contact alightcareers@alight.com.

Equal Opportunity Policy Statement: Alight is an Equal Employment Opportunity employer and does not discriminate against anyone based on sex, race, color, religion, creed, national origin, ancestry, age, physical or mental disability, medical condition, pregnancy, marital or domestic partner status, citizenship, military or veteran status, sexual orientation, gender, gender identity or expression, genetic information, or any other legally protected characteristics or conduct covered by federal, state, or local law. In addition, we take affirmative action to employ disabled persons, disabled veterans, and other covered veterans.

Authorization to work in the Employing Country: Applicants for employment in the country in which they are applying (Employing Country) must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the Employing Country and with Alight.

Note: this job description does not restrict management's right to assign or reassign duties and responsibilities of this job to other entities, including but not limited to subsidiaries, partners, or purchasers of Alight business units. We offer you a competitive total rewards package, continuing education & training, and tremendous potential with a growing worldwide organization.
Posted 1 week ago
7.0 years
0 Lacs
bengaluru east, karnataka, india
On-site
Visa is a world leader in payments and technology, with over 259 billion payment transactions flowing safely between consumers, merchants, financial institutions, and government entities in more than 200 countries and territories each year. Our mission is to connect the world through the most innovative, convenient, reliable, and secure payments network, enabling individuals, businesses, and economies to thrive while driven by a common purpose: to uplift everyone, everywhere by being the best way to pay and be paid. Make an impact with a purpose-driven industry leader. Join us today and experience Life at Visa.

Job Description

Client Services provides a world-class service experience to Visa's clients that begins with pre-sales and continues through onboarding, implementation of new products and services, issue resolution and service optimization. The Services Digitization organization within Client Services is dedicated to automating business processes and making client interactions with Visa straightforward, intuitive and rewarding through the design, development, and delivery of enhanced digital experiences. As a member of the Onboarding team within Services Digitization, you will play a pivotal role in driving process automation and digitization of solutions supporting the Licensing, Implementation and Testing functions.

About the Role

We are seeking a visionary Senior Product Manager to lead the end-to-end strategy, delivery, and scaling of an in-house Centralized Payment Testing Platform. You will define the product vision, partner with Platform/Payments Engineering to build core capabilities from the ground up, sunset fragmented internal tools, and launch a secure, scalable, white-labelled testing experience for internal stakeholders and enterprise clients.
You will collaborate with Product, Operations, SMEs, Engineering, SRE, Security, Data, Sales, and Client Success to deliver a seamless developer and client experience across payment authorization, clearing, settlement, and dispute flows.

Key Responsibilities

· Product strategy and vision: Define the long-term vision, strategy, and roadmap for an internally built testing platform aligned to company objectives and payment domain needs. Own build scope, MVP definition, and iterative releases with clear success criteria.

· Requirements and experience design: Engage internal users and external client stakeholders to capture pain points and desired capabilities; translate insights into actionable PRDs, user stories, and discovery artifacts. Define personas and end-to-end journeys (developer, QA, payment ops, client engineer).

· Platform architecture partnership: Partner with Platform/Payments Engineering to shape platform architecture, APIs and SDKs, data models, and environment strategy (multi-tenant, staging, sandbox, simulators). Prioritize foundational capabilities: protocol simulators (ISO 8583/ISO 20022), EMV, tokenization, digital wallets, 3-DS, network/issuer/acquirer emulation, data replay, and automation.

· Migration and deprecation: Lead the transition from legacy in-house tools to the new platform; define parity thresholds, dual-run pilots, data integrity checks, and a staged deprecation plan. Deliver self-serve migration tooling, training, and change communications.

· White-label enablement: Define branding, theming, tenant isolation, usage metering, audit trails, and customization to support client-facing, white-label deployments.

· Delivery leadership: Run a disciplined backlog, quarterly planning, and a hybrid delivery model (Agile with milestone gates) across multiple squads; manage dependencies with Infra, Security, and Data. Establish operating cadences, decision logs, and RACI across product/engineering/design.
· Observability, quality, and operations: Specify telemetry, analytics, and BI for usage, performance, and test outcomes. Collaborate with the Product Operations team and SRE to maintain and support SLOs/SLIs, capacity plans, incident response expectations, and runbooks.

· Compliance, privacy, and security by design: Embed PCI DSS requirements, organizational controls, encryption, key management, data retention, and data residency.

· Business case and TCO: Build the business case for in-house development vs. alternatives; track TCO, unit economics, and ROI; partner with Finance on budget, resourcing, and cost controls.

· Go-to-market and adoption: Partner with Marketing, Sales, and Client Success to define value propositions and packaging for client access; enable internal stakeholder onboarding and client activation.

· Performance measurement and continuous improvement: Define and track product KPIs (adoption, efficiency, reliability, migration progress, NPS/CSAT) and drive a continuous improvement roadmap.

Qualifications

Basic Qualifications (Must have): Bachelor's degree in Computer Science, Engineering, Business, or a related field; MBA preferred. 7+ years of product management experience in payments, fintech, or enterprise SaaS/platforms, including platform- or developer-focused products. Deep understanding of payment processing and testing across auth, clearing, settlement, chargebacks, and related regulatory/industry standards (PCI DSS, ISO 8583, ISO 20022, EMV, tokenization, digital wallets, 3DS). Proven success building platform capabilities in-house: defining architectures with engineering, establishing APIs/SDKs, and launching multi-tenant services at scale. Demonstrated experience sunsetting legacy tools and migrating diverse user groups to new platforms with measurable outcomes. Strong analytical and decision-making skills: adept at defining KPIs, instrumentation, and experimentation.
Excellent communication and stakeholder management skills across technical and non‑technical audiences. Expertise with Agile and hybrid delivery models: comfortable with complex, multi‑squad programs. Familiarity with security, privacy, and compliance controls for payments and SaaS. Preferred Qualifications (Good to have) Experience building protocol simulators, payment network emulators, or high‑fidelity test harnesses. Background in SRE/DevOps‑adjacent product areas (observability, environments, CI/CD). Experience delivering white‑label or OEM‑style client experiences, including theming and tenant governance. Knowledge of data pipelines for test artifacts, lineage, and auditability Additional Information Visa is an EEO Employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability or protected veteran status. Visa will also consider for employment qualified applicants with criminal histories in a manner consistent with EEOC guidelines and applicable local law.
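The protocol simulators this role prioritizes (ISO 8583/ISO 20022) start from a message-header parse. A toy sketch of reading an ISO 8583 MTI and primary bitmap, with nearly all real-world complexity (secondary bitmaps, variable-length fields, binary encodings) deliberately omitted:

```python
def parse_iso8583_header(message: str):
    """Split an ASCII-hex ISO 8583 message into its MTI and the field
    numbers flagged in the primary bitmap. Simplified illustration only."""
    mti = message[:4]                 # message type indicator, e.g. 0200
    bitmap = int(message[4:20], 16)   # 64-bit primary bitmap as hex
    # Bit 1 is the most significant bit; a set bit means the field is present.
    fields = [i for i in range(1, 65) if bitmap >> (64 - i) & 1]
    return mti, fields

# 0200 = financial request; bitmap 0x7000... has bits 2, 3, 4 set
mti, fields = parse_iso8583_header("0200" + "7000000000000000")
print(mti, fields)
```

A test harness built on this idea generates messages field by field and replays them against an emulated issuer or acquirer, which is essentially what the posting's "network/issuer/acquirer emulation" capability describes.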
Posted 1 week ago
4.0 years
0 Lacs
bengaluru, karnataka, india
On-site
Company Description

Global Technology Partners is a premier partner for digital transformation, with a diverse team of software engineering experts in the US and India. We combine strategic thinking, innovative design, and robust engineering to deliver exceptional results for our clients. GTP Labs is our internal R&D and client-facing innovation group. You'll prototype, evaluate, and productionize AI features across our products, then help customers adopt them. We're looking for a strong junior who can grow into leading our AI/ML team.

Experience: 1–4 years (high upside, lead potential)

Responsibilities

Build & Experiment: Design rapid POCs for LLM-powered features (RAG, agents, tool use) and graduate the best ideas into roadmap items. Stand up MCP servers over OpenAPI specs to enable "ask & test the API" experiences. Create evaluation harnesses (offline and human-in-the-loop) for prompts, models, and pipelines.

Productize: Ship ML/LLM services (APIs, microservices) with clean interfaces, telemetry, tests, and CI/CD. Implement guardrails (PII handling, redaction, safety filters) and quality gates.

Data & MLOps: Build datasets (label/curate), retrieval indexes, vector stores, and monitoring dashboards. Own experiment tracking and model/artifact versioning; help with cost/performance tuning.

Partner with stakeholders: Work closely with Product, Backend (Java/Spring), Frontend (Angular), and Customer teams; write clear docs and present findings.

Ideal candidate(s) must have: Solid Python (data/ML) and at least one of PyTorch or TensorFlow. Working knowledge of LLMs: tokenization, embeddings, RAG, basic fine-tuning/LoRA, prompt design, evaluation. API engineering: build/consume REST; JSON; pagination; auth; rate limits. Retrieval & storage: one vector DB (pgvector, Pinecone, Weaviate, or similar) plus SQL basics. Packaging & delivery: Docker, Git, testing, simple CI.
Excellent written and spoken communication; strong product sense; ownership mindset.

Nice to have: LangChain/LlamaIndex; agents (CrewAI/AutoGen); MCP (Model Context Protocol). Cloud: AWS/Azure/GCP (S3, Lambda, Bedrock/Azure OpenAI, basic IAM). Document AI (OCR, layout/structure), OpenAPI tooling, Postman/Newman, Spectral, Dredd/Schemathesis. Frontend familiarity (Angular/TypeScript) or Java/Spring Boot integration experience. Experiment tracking (Weights & Biases/MLflow); Kubernetes; basic Kafka/stream processing.

How we work: Velocity + rigor: short POC cycles with measurable evals; promote what wins. Security by design: PII handling, encryption, role-based access, audit trails. Mentorship: direct access to senior architects and product leaders; learning budget and GPU time.

Growth path: 0–6 months: lead POCs end-to-end; own an evaluation framework; ship 1–2 features into production. 6–12 months: tech lead for a product work-stream. 12–18 months: mentor juniors; co-own roadmap; drive customer pilots; path to AI/ML Lead.

Qualifications: Bachelor's or Master's degree in Computer Science or a related field with 1–4 years of relevant experience.
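The retrieval step at the heart of the RAG work this role describes reduces to nearest-neighbour search over embeddings. A minimal in-memory sketch; the 3-d "embeddings" and document names here are made up for illustration, and a real system would use an embedding model plus a vector DB such as pgvector or Pinecone:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def retrieve(query_vec, index, k=1):
    """Rank documents by cosine similarity to the query embedding.
    Toy stand-in for a vector-DB similarity search."""
    ranked = sorted(index.items(), key=lambda kv: cosine(query_vec, kv[1]), reverse=True)
    return [doc for doc, _ in ranked[:k]]

index = {"refund policy": [0.9, 0.1, 0.0], "api limits": [0.0, 0.2, 0.9]}
print(retrieve([1.0, 0.0, 0.1], index))
```

The evaluation harnesses mentioned under Build & Experiment then score this retrieval step with exactly the precision/recall-style metrics the posting lists, before any LLM generation is measured.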
Posted 1 week ago
9.0 - 12.0 years
0 Lacs
noida, uttar pradesh, india
On-site
Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

Your role: We are looking for experienced professionals in data tokenization and data protection with expertise in AWS and knowledge of Protegrity, for pan-India locations, with 9 to 12 years of experience. This role is ideal for individuals who are passionate about securing critical infrastructure and industrial environments. Candidates should be capable of leading technical discussions, engaging with stakeholders, and delivering secure, scalable data security solutions.

Data Tokenization & Security: Design and manage tokenization strategies using Protegrity and AWS to protect sensitive data, ensuring compliance with industry standards.

AWS Expertise: Proficiency in Amazon Redshift, Athena, Glue, and S3 for data storage, processing, and analysis; experience in cloud infrastructure and automation.

Protegrity & Data Protection: Hands-on experience with Protegrity's tokenization, encryption, and masking solutions; familiarity with other data protection technologies.

Technical Skills: Strong command of SQL and scripting languages (e.g., Python); adept at troubleshooting and optimizing performance and scalability.

Collaboration & Governance: Work with cross-functional teams to implement security solutions; knowledge of data governance, privacy regulations, and data analytics concepts.

Your Profile: Data security and tokenization; AWS services proficiency; Protegrity platform knowledge; technical and scripting skills; cloud infrastructure and troubleshooting.

What You'll Love About Working Here: You can shape your career with us.
We offer a range of career paths and internal opportunities within the Capgemini group. You will also get personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage or new parent support via flexible work. At Capgemini, you can work on cutting-edge projects in tech and engineering with industry leaders or create solutions to overcome societal and environmental challenges. Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong 55-year-plus heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud and data, combined with its deep industry expertise and partner ecosystem.
Posted 1 week ago
7.0 years
0 Lacs
hyderabad, telangana, india
Remote
Are you ready to make an impact at DTCC? Do you want to work on innovative projects, collaborate with a dynamic and supportive team, and receive investment in your professional development? At DTCC, we are at the forefront of innovation in the financial markets. We are committed to helping our employees grow and succeed. We believe that you have the skills and drive to make a real impact. We foster a thriving internal community and are committed to creating a workplace that looks like the world that we serve.

DTCC Digital Assets is at the forefront of driving institutional adoption of digital assets technology with a steadfast commitment to innovation anchored in security and stability. As the financial services industry's trusted technology partner, we pride ourselves on empowering a globally interconnected and efficient ecosystem. Our mission is to provide secure and compliant infrastructure for digital assets, enabling financial institutions to unlock the full potential of blockchain technology.

Pay and Benefits: Competitive compensation, including base pay and annual incentive. Comprehensive health and life insurance and well-being benefits, based on location. Pension/retirement benefits. Paid time off and personal/family care, and other leaves of absence when needed to support your physical, financial, and emotional well-being. DTCC offers a flexible/hybrid model of 3 days onsite and 2 days remote (onsite Tuesdays, Wednesdays, and a third day unique to each team or employee).

The impact you will have in this role: We are currently seeking an experienced Technical Product Manager to join our team. As a Technical Product Manager at DTCC Digital, you will play a key role in defining and driving the development of our blockchain-based solutions.
You will work closely with cross-functional teams including software engineers, data scientists, and business analysts to translate customer requirements into technical specifications and deliver innovative products that meet market needs. Your Primary Responsibilities: Collaborate with stakeholders to define product requirements and roadmap Translate customer needs into clear and actionable technical specifications Write and refine business requirements and acceptance criteria Work closely with product managers to prioritize the backlog in JIRA Work closely with engineering teams to ensure successful and timely product development and delivery Act as the primary liaison between business stakeholders, clients, and technical teams Run cross-functional meetings and facilitate decision-making Provide transparency on product status, risks, and dependencies Review and test deliverables to ensure they meet business requirements Manage the entire product lifecycle, from concept to launch and beyond Monitor market trends and competition to identify new opportunities and inform product strategy Ensure products meet quality standards and follow regulatory requirements Act as a subject matter expert and provide technical guidance to internal teams and external stakeholders In a constantly changing market landscape, stay current to maintain a strong understanding of digital asset infrastructure and tokenization frameworks NOTE: The Primary Responsibilities of this role are not limited to the details above.
Qualifications: Bachelor's degree in Computer Science, Engineering, or related field (Master's degree preferred) 3–7+ years of experience in product management, business analysis, or product ownership in financial services or fintech Strong technical background with knowledge of blockchain technology and distributed systems Experience with Agile/Scrum methodologies and product management tools such as JIRA Excellent problem-solving and analytical skills Strong communication and leadership abilities Strong organizational and facilitation skills Ability to work effectively in a fast-paced, dynamic environment Collaborative approach with the ability to influence without authority Passion for innovation in financial markets and digital assets Talents Needed for Success: Experience in tokenization of securities, digital custody, or cryptocurrency Exposure to regulatory and compliance considerations in financial markets Experience in QA or UAT coordination Knowledge of financial markets and financial services, including: Collateral Management Custody Fund Accounting / Investor Services Central Clearing Margin Management OTC trading Payments Investment Management Operations Broker-Dealer Operations Exchange Operations Actual salary is determined based on the role, location, individual experience, skills, and other considerations. We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 1 week ago
4.0 - 6.0 years
30 - 35 Lacs
noida, uttar pradesh, india
On-site
Experience: 4-6 Years Roles And Responsibilities We’re looking for a talented and driven Senior NLP Research Engineer to join our AI/ML team. In this role, you’ll design and implement advanced natural language processing models that interpret metadata and user queries across complex video datasets. You’ll also contribute to the development of Generative AI-based assistants, integrated into our video management ecosystem, enhancing user experience through natural interaction and smart search. Design, develop, train, fine-tune, and deploy LLM-based NLP models for various enterprise use cases. Experience designing networks from scratch according to a given problem statement. Experience with training methodologies for handling large datasets. Collaborate with product, engineering, and data teams to integrate NLP solutions into production-grade systems. Build and optimize custom pipelines for document intelligence, question answering, summarization, and conversational AI. Experience and understanding of state-of-the-art open-source LLMs (e.g., GPT, LLaMA, Claude, Falcon). Implement prompt engineering, retrieval-augmented generation (RAG), and fine-tuning strategies. Ensure ethical, safe, and scalable deployment of LLM-based systems. Ensure model reliability, safety, performance, and compliance using evaluation and monitoring frameworks. Stay updated with the latest research in GenAI and NLP, and apply relevant findings to improve current systems. Job Requirements 4–6 years of experience in NLP, Machine Learning, or Deep Learning roles. Proven experience in working with VLMs and LLMs in real-world projects. Proficiency with tools such as Hugging Face, LangChain, llama.cpp, and TensorFlow/PyTorch. Hands-on experience with vector databases (e.g., FAISS, Pinecone, Weaviate) and RAG frameworks (not required but appreciated).
Strong programming skills in Python and some experience with MLOps or cloud platforms (AWS/GCP/Azure). Deep understanding of NLP techniques including basics such as NER, text classification, embeddings, tokenization, Transformers, loss functions, etc. Knowledge of distributed training and serving of large models. Strong problem-solving and communication skills Skills: LLM, AI, deep learning, Hugging Face, LangChain, TensorFlow, research, ML, NLP, machine learning, PyTorch
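Prompt engineering of the kind this role calls for often starts with simple templating. A minimal illustrative sketch of a few-shot prompt builder (the task, example reviews, and function name below are invented placeholders, not taken from the posting):

```python
def few_shot_prompt(task, examples, query):
    """Assemble a few-shot prompt: task instruction, worked examples, then the query."""
    lines = [task, ""]
    for inp, out in examples:
        lines += [f"Input: {inp}", f"Output: {out}", ""]
    # Leave the final Output: blank for the model to complete.
    lines += [f"Input: {query}", "Output:"]
    return "\n".join(lines)

prompt = few_shot_prompt(
    "Classify the sentiment of each review as positive or negative.",
    [("Great picture quality!", "positive"),
     ("Battery died within a week.", "negative")],
    "The camera autofocus is superb.",
)
print(prompt)
```

The resulting string would be passed to whichever LLM API the project uses; real pipelines typically layer retrieval and output parsing on top of this basic structure.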
Posted 1 week ago
15.0 years
6 - 9 Lacs
hyderābād
On-site
Project Role : AI / ML Engineer Project Role Description : Develops applications and systems that utilize AI tools, Cloud AI services, with proper cloud or on-prem application pipeline with production ready quality. Be able to apply GenAI models as part of the solution. Could also include but not limited to deep learning, neural networks, chatbots, image processing. Must have skills : Machine Learning Good to have skills : Microsoft Azure Machine Learning Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary ML Engineer : Build features across Agentic, LLM and ML workflows including suggestion/rules components, search & retrieval, document extraction, and basic image/OCR processing. Translate problem statements into production-ready code, write clear documentation, and partner closely with MLOps for reliable releases. Should be aware of Model Drift & Data Drift practices. Roles and responsibilities: • Write queries and extract data from structured/unstructured sources; implement parsing and normalization pipelines. • Develop web/document extraction (Playwright/Selenium, Trafilatura; pypdf/pdfplumber/ocrmypdf) and convert to validated schemas. • Implement prompts, tools/functions, and agent steps using LangChain; contribute to retrieval (BM25 + embeddings) and RAG modules. • Add basic image processing with OpenCV and OCR using pytesseract where needed. • Write clean, tested Python; add unit-style LLM tests with DeepEval; maintain experiment logs and evaluation datasets. • Collaborate in Agile ceremonies; produce concise design notes and experiment reports. Technical experience & Professional attributes: • Python with hands-on PyTorch; familiarity with deep-learning packages and the Hugging Face stack (transformers, datasets, SBERT). • Web automation/scraping using Selenium or Playwright; robust HTML/text processing. • Search basics and RAG patterns; vector stores and embeddings at a practical level. 
• Image processing fundamentals (OpenCV) and OCR integration (pytesseract). • Evaluation mindset: DeepEval for LLM outputs; Optuna/SHAP exposure is a plus. Preferred Skills • spaCy, scikit-learn; LightGBM/Flair where relevant. • Experience with schema validation (pydantic/JSON Schema) and tokenization (tiktoken). • Streamlit for internal demos (local-only). Education qualifications: • Experience shipping ML/LLM features or strong applied projects demonstrating end-to-end ownership. • Clear written/spoken communication and collaborative ways of working. • You will be working with a Trusted Tax Technology Leader, committed to delivering reliable and innovative solutions.
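The "retrieval (BM25 + embeddings)" responsibility above refers to the Okapi BM25 ranking function. A self-contained toy scorer, purely for illustration (the documents are invented and k1/b use common default values):

```python
import math

def bm25_scores(query_terms, docs, k1=1.5, b=0.75):
    """Score each tokenized document against the query with Okapi BM25."""
    N = len(docs)
    avgdl = sum(len(d) for d in docs) / N
    # Document frequency of each query term across the corpus
    df = {t: sum(1 for d in docs if t in d) for t in query_terms}
    scores = []
    for d in docs:
        score = 0.0
        for t in query_terms:
            if df[t] == 0:
                continue
            idf = math.log(1 + (N - df[t] + 0.5) / (df[t] + 0.5))
            tf = d.count(t)
            score += idf * tf * (k1 + 1) / (tf + k1 * (1 - b + b * len(d) / avgdl))
        scores.append(score)
    return scores

docs = [
    "the cat sat on the mat".split(),
    "dogs chase cats in the park".split(),
    "stock prices rose sharply today".split(),
]
print(bm25_scores(["cat", "mat"], docs))
```

In practice this lexical score is combined with embedding similarity (hybrid retrieval) before results feed a RAG module.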
Posted 1 week ago
6.0 years
20 - 30 Lacs
hyderābād
On-site
Role: Gen AI Engineer Location: Hyderabad Mode of Work : Hybrid Notice Period : 0-25 Days Job Description: Key Responsibilities: Designing and Developing AI Models: This includes creating architectures, algorithms, and frameworks for generative AI applications. Implementing AI models: This involves building and integrating AI models into existing systems and applications. Working with LLMs and other AI technologies: This includes using tools and techniques like LangChain, Haystack, and prompt engineering. Data preprocessing and analysis: This involves preparing data for use in AI models. Collaborating with other teams: This includes working with data scientists, product managers, and other stakeholders. Testing and deploying AI models: This involves evaluating model performance and deploying them to production environments. Monitoring and optimizing AI models: This involves tracking model performance, identifying issues, and optimizing models for better results. Staying up to date with the latest advancements in Gen AI: This includes learning about new techniques, models, and frameworks. Required Skills: Strong programming skills in Python: Python is the preferred language for AI development. Knowledge of Generative AI, NLP, and LLMs: This includes understanding the principles behind these technologies and how to use them effectively. Experience with RAG pipelines and vector databases: This includes understanding how to build and use retrieval-augmented generation pipelines. Familiarity with AI frameworks and libraries: This includes knowledge of frameworks like LangChain, Haystack, and open-source libraries. Understanding of prompt engineering and tokenization: This includes understanding how to optimize prompts and manage tokenization. Experience in integrating and fine-tuning AI models: This includes knowledge of deploying and maintaining AI models in production environments. 
Excellent communication and problem-solving skills: This includes the ability to communicate complex technical concepts to non-technical stakeholders. Optional Skills: Experience with cloud computing platforms (GCP, AWS, Azure): This can help deploy and manage AI models. Familiarity with MLOps practices: This can help with building and deploying AI models in a scalable and reliable manner. Experience with DevOps practices: This can help with automating the development and deployment of AI models. Job Type: Permanent Pay: ₹2,000,000.00 - ₹3,000,000.00 per year Experience: Total: 6 years (Required) GenAI: 3 years (Required) Python: 4 years (Required) LLM: 3 years (Required) OpenAI, Claude, Gemini: 4 years (Required) Azure: 3 years (Required) LangChain and LangGraph: 1 year (Preferred) Work Location: In person
Posted 1 week ago
3.0 years
0 Lacs
andhra pradesh, india
On-site
Key Responsibilities Develop, deploy, and maintain Python-based applications powered by LLMs and other Gen AI models (e.g., GPT-4, Claude, Gemini, Mistral). Implement RAG (Retrieval-Augmented Generation) pipelines using vector databases (e.g., Pinecone, FAISS, Weaviate). Build prompt templates, manage system prompts, and experiment with prompt engineering techniques for performance tuning. Integrate LLMs with APIs, microservices, databases, and user-facing apps. Optimize latency, throughput, and cost-efficiency for inference workloads, including caching, batching, or quantization where applicable. Collaborate with ML and DevOps teams to enable CI/CD, model versioning, and monitoring for AI applications. Work with evaluation frameworks (e.g., LangChain, Weights & Biases, TruLens) to measure output quality (e.g., relevance, coherence, safety). Troubleshoot and enhance LLM behavior, including addressing hallucinations, bias, and task completion accuracy. Required Qualifications Bachelor's or Master's degree in Computer Science, AI/ML, Engineering, or a related field. 3+ years of experience in Python software development, preferably with AI/ML projects. Strong knowledge of LLMs, transformers, and modern NLP tools. Experience working with LLM APIs (OpenAI, Anthropic, Google Gemini, Cohere, etc.). Familiarity with vector databases (e.g., FAISS, Pinecone, Chroma) and semantic search principles. Hands-on experience with prompt engineering and LLM orchestration tools (LangChain, LlamaIndex, Haystack, etc.). Solid understanding of REST APIs, containers (Docker), and cloud platforms (AWS/GCP/Azure). Preferred Qualifications Experience with chatbot frameworks, agent-based LLM systems, or multi-agent orchestration. Understanding of tokenization, context window management, and cost optimization strategies for LLMs. Exposure to multimodal AI (e.g., text-to-image, speech-to-text, OCR) and computer vision libraries.
Familiarity with evaluation techniques for AI output (BLEU, ROUGE, semantic similarity, etc.). Contributions to open-source LLM or Gen AI frameworks are a strong plus.
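For illustration, the RAG pattern these qualifications describe — retrieve relevant documents, then augment the prompt before calling an LLM — can be sketched with a toy bag-of-words retriever. No real vector database or LLM call is involved; the corpus, function names, and similarity measure are invented for the sketch:

```python
from collections import Counter
import math

def embed(text):
    # Toy "embedding": a bag-of-words term-frequency vector.
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, corpus, top_k=2):
    # Rank documents by similarity to the query; a vector DB does this at scale.
    q = embed(query)
    ranked = sorted(corpus, key=lambda doc: cosine(q, embed(doc)), reverse=True)
    return ranked[:top_k]

def build_prompt(query, corpus):
    # Augment the prompt with retrieved context before calling an LLM.
    context = "\n".join(retrieve(query, corpus))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

corpus = [
    "Tokenization splits text into smaller units called tokens.",
    "Vector databases index embeddings for fast similarity search.",
    "Paris is the capital of France.",
]
print(build_prompt("What does tokenization do?", corpus))
```

Production pipelines swap the toy pieces for learned embeddings, an ANN index (FAISS, Pinecone, etc.), and an actual LLM completion call, but the control flow is the same.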
Posted 1 week ago
3.0 years
0 Lacs
chennai, tamil nadu, india
On-site
Job Title: Pre-Sales Consultant - Data Security & Privacy Location: Chennai Department: Pre-Sales Reporting To: Technical Director Employment Type: Full-Time About The Role We are seeking a dynamic and knowledgeable Pre-Sales Consultant (Mid-Level) to join our growing team. This individual will play a critical role in supporting the sales cycle by providing expert-level guidance on data security and data privacy solutions. The ideal candidate will bridge the gap between technical capabilities and client business needs, articulating the value and fit of our offerings to prospects and existing clients. Key Responsibilities Engage with customers to understand their data protection, compliance, and privacy requirements. Recommend tailored solutions aligned with business objectives and regulatory standards (GDPR, HIPAA, CCPA, etc.). Provide in-depth technical support on data security products such as Data Classification, DLP (Data Loss Prevention), Encryption, Tokenization, and privacy-enhancing technologies. Deliver compelling product presentations, demos, and proofs of concept (PoCs) to both technical and business stakeholders. Respond to technical sections of RFPs, RFIs, and other procurement documentation. Stay up-to-date on data security/privacy trends, competitor offerings, and emerging technologies. Required Qualifications 3 to 6 years of experience in pre-sales, solutions engineering, or a technical consulting role, preferably within the data security or cybersecurity domain. Strong understanding of data security and data privacy regulations (e.g., GDPR, CCPA, HIPAA) and their technical implications. Familiarity with security standards and frameworks (ISO 27001, NIST, SOC 2). Experience with leading security platforms or solutions such as Fortra, Forcepoint, Microsoft Purview, BigID, OneTrust, Varonis, or similar. Excellent communication, presentation, and interpersonal skills. Ability to translate complex technical concepts into clear business language.
Preferred Skills Certifications Relevant certifications such as CIPT, CIPP/E, CIPM, CISSP, CCSP, etc. Knowledge of data privacy regulations (e.g., GDPR, CCPA, HIPAA). Experience in PoC execution and solution architecture. Strong documentation and proposal-writing skills. What We Offer Competitive salary and performance incentives Comprehensive benefits package Learning and development opportunities Exposure to high-impact enterprise security projects Collaborative and inclusive work environment Contact Us: +91 96772 47808 / amozajobs@gmail.com This job is provided by Shine.com
Posted 1 week ago
2.0 years
0 Lacs
gurugram, haryana, india
On-site
Key Responsibilities: Develop and fine-tune LLMs for classification, NLP, and generative AI use cases Build RAG pipelines with vector databases and implement prompt engineering strategies Conduct data preprocessing for LLM training including tokenization and embedding generation Design custom chatbots, text generation, and document analysis solutions Implement LLM evaluation metrics (BLEU, ROUGE, perplexity) and human feedback loops Deploy conversational AI and content generation models in production Technical Requirements: 2 years hands-on LLM development (GPT, BERT, T5, Llama, Claude) Proficient in Transformers, Hugging Face, LangChain, OpenAI API Experience with vector databases (Pinecone, Weaviate, ChromaDB) and semantic search Strong Python, SQL, and deep learning (PyTorch, TensorFlow) Knowledge of fine-tuning techniques (LoRA, QLoRA, PEFT) and model quantization Understanding of prompt engineering, chain-of-thought, and few-shot learning Familiarity with MLOps for LLMs (model versioning, A/B testing, monitoring) -- GOOD TO HAVE
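Of the LLM evaluation metrics listed above (BLEU, ROUGE, perplexity), ROUGE-1 is the simplest to sketch: it measures unigram overlap between a generated text and a reference. A minimal illustrative implementation (the sample sentences are invented):

```python
from collections import Counter

def rouge1_f1(candidate, reference):
    """Unigram-overlap ROUGE-1 F1 between a generated and a reference text."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

print(rouge1_f1("the cat sat on the mat", "the cat is on the mat"))
```

Real evaluation stacks use library implementations (with stemming and ROUGE-2/ROUGE-L variants) alongside human feedback, but the underlying precision/recall/F1 calculation is as above.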
Posted 1 week ago
8.0 years
0 Lacs
trivandrum, kerala, india
On-site
Job Title: Cards Test Lead Way4 | Base24 | Tokenisation Company: Closing Gap Location: Trivandrum, India About the Role Are you an experienced QA professional with deep expertise in Way4 CMS, Base24 Switch, and Tokenization (Apple Pay, Google Pay)? This role offers the opportunity to lead testing for debit/credit card programs, ensuring comprehensive coverage across the card lifecycle, transaction processing, and digital wallet/tokenization flows. Key Responsibilities: Define and drive SIT strategy for card lifecycle, limits, fees, and tokenization. Validate end-to-end flows across ATM, POS, and E-Commerce via Base24 (ISO 8583). Test tokenization lifecycle: provision, suspend/resume, delete, multi-device DPAN. Oversee integration flows: Way4 ↔ Base24 ↔ Finacle. Provide test evidence, dashboards, and UAT sign-off. Skills & Experience: Hands-on experience in Way4 CMS and Base24 Switch. Strong knowledge of Tokenization (Apple Pay, Google Pay). Proven experience across debit/credit card flows: issuance, hotlisting, PIN management, reversals, refunds, chargebacks. Understanding of PCI DSS, HSM, AML/Fraud screening. Familiarity with Azure DevOps and card scheme simulators. Qualifications: 4–8 years of QA experience, including hands-on testing across payments systems. Strong documentation, reporting, and communication skills. Ability to work effectively with vendors and stakeholders under tight timelines. 📩 Apply Now If you have the required expertise and leadership skills, please send your CV to hr@theclosinggap.net
Posted 1 week ago
9.0 - 12.0 years
7 - 11 Lacs
noida
Work from Office
Your role: We are looking for experienced professionals in Data Tokenization and Data Protection with expertise in AWS and knowledge of Protegrity for Pan-India locations, with 9 to 12 years of experience. This role is ideal for individuals who are passionate about securing critical infrastructure and industrial environments. Candidates should be capable of leading technical discussions, engaging with stakeholders, and delivering secure, scalable data security solutions. Data Tokenization & Security: Design and manage tokenization strategies using Protegrity and AWS to protect sensitive data, ensuring compliance with industry standards. AWS Expertise: Proficient in Amazon Redshift, Athena, Glue, and S3 for data storage, processing, and analysis; experience in cloud infrastructure and automation. Protegrity & Data Protection: Hands-on experience with Protegrity's tokenization, encryption, and masking solutions; familiarity with other data protection technologies. Technical Skills: Strong command of SQL and scripting languages (e.g., Python); adept at troubleshooting and optimizing performance and scalability. Collaboration & Governance: Work with cross-functional teams to implement security solutions; knowledge of data governance, privacy regulations, and data analytics concepts. Your Profile Data Security & Tokenization AWS Services Proficiency Protegrity Platform Knowledge Technical & Scripting Skills Cloud Infrastructure & Troubleshooting What you'll love about working here You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group.
You will also get personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage or new parent support via flexible work. At Capgemini, you can work on cutting-edge projects in tech and engineering with industry leaders or create solutions to overcome societal and environmental challenges.
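Tokenization in the data-protection sense used by this posting replaces a sensitive value with a non-sensitive surrogate while a secure vault retains the mapping. A toy sketch of the idea follows; it is illustrative only, with an invented class and token format — real platforms such as Protegrity use hardened, policy-driven implementations (including vaultless and format-preserving schemes):

```python
import secrets

class TokenVault:
    """Toy vault-based tokenizer: swaps sensitive values for random
    surrogates and keeps the mapping for authorized detokenization."""

    def __init__(self):
        self._vault = {}    # token -> original value
        self._reverse = {}  # original value -> token (so tokens are stable)

    def tokenize(self, value: str) -> str:
        if value in self._reverse:
            return self._reverse[value]
        token = "tok_" + secrets.token_hex(8)
        self._vault[token] = value
        self._reverse[value] = token
        return token

    def detokenize(self, token: str) -> str:
        return self._vault[token]

vault = TokenVault()
t = vault.tokenize("4111-1111-1111-1111")  # sample (non-real) card number
print(t, "->", vault.detokenize(t))
```

The point of the design is that downstream systems (Redshift, Athena, analytics jobs) can operate on the surrogate token, while only an authorized service with vault access can recover the original value.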
Posted 1 week ago
5.0 years
0 Lacs
thane, maharashtra, india
On-site
Experience: 3–5 years (hands-on AI/ML) Location: Thane Role Overview We are looking for an AI/ML Developer with strong practical expertise in modern AI stacks. You will design, train, and deploy models, build LLM-powered applications, and work on scalable AI solutions. Core Technical Skills Programming: Strong Python, good knowledge of data structures & algorithms. AI/ML Frameworks: PyTorch / TensorFlow, Hugging Face Transformers, scikit-learn. LLMs & NLP: Fine-tuning, embeddings, tokenization, prompt engineering, vector databases (Pinecone, Weaviate, FAISS). RAG Architectures: Building retrieval-augmented systems with LangChain/LlamaIndex. Model Ops: MLOps tools (MLflow, Weights & Biases, Kubeflow) for experiment tracking, deployment, and monitoring. Data Handling: Pandas, NumPy, Spark (preferred), time-series modeling. Cloud & Deployment: Model serving via REST/gRPC, Docker, Kubernetes, deployment on AWS SageMaker / Azure ML / GCP Vertex AI. Multi-Agent Systems: Familiarity with MCP/ACP, agent frameworks, orchestration patterns. Database Skills: SQL + NoSQL, vector stores, knowledge graph integration. API & Integration: Ability to wrap models into APIs/microservices. Nice to Have Experience with reinforcement learning (RLHF, PPO). Exposure to optimization problems (scheduling, forecasting). Domain knowledge in energy trading / financial systems. Ideal Candidate Hands-on builder with a proven record of delivering working AI tools, not just research. Strong software engineering discipline (Git, CI/CD, testing). Able to evaluate new models and frameworks quickly and adapt them for enterprise use.
Posted 1 week ago
2.0 years
0 Lacs
noida, uttar pradesh, india
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Risk Consulting – Digital Risk Senior – Artificial Intelligence - Data Scientist We always support and enable big ideas with the ambition to keep doing more. We leverage leading practices and years of practical experience developing Artificial Intelligence (AI) capabilities for and with clients across sectors to ensure the highest level of execution and satisfaction for our clients. Come help define the future of AI while enabling safe innovation at pace. Your Key Responsibilities Do you have sharp analytic skills? Do you know how to solve problems? Do you know how to develop innovative solutions? We’re looking for someone like that who can: Turn data into an asset for clients Rapidly perform Data Science analysis to provide insights using a variety of techniques Apply experience and expertise in several tools such as Python, Machine Learning and AI libraries, etc. Perform full lifecycle of Data Scientist activities, including conceptual and operational aspects Manage end to end projects - clarify requirements and devise research approaches Collect and combine data from multiple sources, analyze it for insights and produce great visuals Stay on top of current business and industry trends At the heart of this project is the ability to systematically analyze and build statistical and machine learning models for our clients across markets. 
We are looking for talented and experienced Data Scientists to bring innovation at EY. To qualify for the role, you must have a degree in Computer Science, Mathematics or a similar quantitative field; a minimum of 2 years of relevant experience; strong knowledge and experience of data mining and machine learning - algorithms, regression techniques, decision trees, clustering, pattern recognition, probability theory, statistical techniques and dimensionality reduction; and experience in at least 3-4 of the below areas: Machine Learning / Data Science: Basic understanding of machine learning concepts and data science workflows. Familiarity with data preprocessing, feature extraction, and model evaluation. Natural Language Processing (NLP): Basic to intermediate knowledge of NLP techniques, including lemmatization, tokenization, part-of-speech tagging, and named entity recognition. Basic understanding of transformer architecture and its significance in modern NLP tasks. Experience with Generative AI Projects: Hands-on experience with generative AI projects, specifically those involving large language models (LLMs); familiarity with different LLMs and their applications. Document Q&A Projects: Experience with question-answering systems and understanding of architectures like Retrieval Augmented Generation (RAG) or Knowledge Graphs (KG). Vectorization and Vector Databases: Understanding of vectorization in the context of NLP and machine learning. Experience with vector databases and their use in managing and querying high-dimensional data. Programming and Development: Proficiency in Python, essential for most AI and data science tasks. Experience with developing web applications using Streamlit. Knowledge of API development with frameworks like Flask.
Experience with deploying applications on cloud platforms such as Azure and AWS is an added advantage. Strong analytical and problem-solving skills with a passion for research. An ideal candidate would also have the below exposure: leading business or IT transformation projects that have supported data science, business intelligence, artificial intelligence, and related areas at scale; experience developing AI systems in either a design, business, or technical capacity; experience with model risk management; experience with Agile and/or Scrum methods of delivery. What We Look For We’re interested in passionate leaders with strong vision and a desire to stay on top of trends in Artificial Intelligence. Above all else, if you have a genuine passion and commitment to building a more responsible technology future, we will greatly welcome your application. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
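Vectorization, as mentioned in the qualifications above, means turning text into numeric vectors; the classic TF-IDF weighting can be sketched as follows (toy documents and function name invented for illustration, no external libraries):

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Compute TF-IDF weight vectors for a list of tokenized documents."""
    N = len(docs)
    # Document frequency: in how many documents each term appears
    df = Counter(t for d in docs for t in set(d))
    vectors = []
    for d in docs:
        tf = Counter(d)
        # Weight = term frequency (normalized) * inverse document frequency
        vectors.append({t: (tf[t] / len(d)) * math.log(N / df[t]) for t in tf})
    return vectors

docs = [
    "data science for banking".split(),
    "data pipelines for analytics".split(),
    "science of cooking".split(),
]
vecs = tfidf_vectors(docs)
print(vecs[0])
```

Rare terms ("banking") get higher weight than common ones ("data"); dense learned embeddings used with vector databases generalize this idea beyond exact term matches.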
Posted 1 week ago
2.0 years
0 Lacs
bengaluru, karnataka, india
On-site
This role is for one of Weekday's clients Min Experience: 2 years Location: Bengaluru Job Type: Full-time We are seeking a highly motivated and skilled Senior Data Scientist to join our team and play a key role in driving innovation through data science, artificial intelligence (AI), and advanced analytics. The ideal candidate will bring strong expertise in machine learning, deep learning, and natural language processing (NLP), along with a passion for solving complex business problems through data-driven approaches. This role is ideal for professionals who enjoy working in a dynamic, fast-paced environment and who are eager to contribute to designing and deploying intelligent systems that impact real-world outcomes. Requirements Key Responsibilities: Develop, implement, and optimize machine learning and AI models to address business challenges across various domains. Apply deep learning techniques for tasks such as image recognition, text processing, predictive analytics, and recommendation systems. Lead projects in natural language processing (NLP), including sentiment analysis, entity recognition, text classification, and conversational AI. Collaborate with cross-functional teams—including product managers, engineers, and domain experts—to identify opportunities where data science and AI can create measurable value. Conduct data exploration and analysis to extract meaningful insights from structured and unstructured datasets. Design, build, and maintain end-to-end pipelines for data collection, preprocessing, model training, evaluation, and deployment. Stay updated with the latest advancements in AI, ML, NLP, and deep learning, and recommend innovative approaches for their practical implementation. Mentor junior data scientists and analysts, providing guidance on best practices in data science, machine learning, and model evaluation. Communicate findings, insights, and recommendations clearly to both technical and non-technical stakeholders.
Ensure models are scalable, explainable, and ethical, while addressing concerns of bias, fairness, and data privacy. Required Skills and Qualifications: Bachelor's or Master's degree in Computer Science, Data Science, Artificial Intelligence, Statistics, or a related field. 2-6 years of hands-on experience in data science, machine learning, and AI-driven projects. Strong proficiency in programming languages such as Python or R, with expertise in libraries like TensorFlow, PyTorch, Scikit-learn, Keras, or similar frameworks. Solid understanding of deep learning architectures (CNNs, RNNs, LSTMs, Transformers, GANs) and their applications. Proven experience working with NLP techniques including tokenization, embeddings, attention mechanisms, and transformer-based models (e.g., BERT, GPT). Experience with data preprocessing, feature engineering, and statistical analysis. Familiarity with cloud-based platforms (AWS, Azure, GCP) for model training and deployment. Strong problem-solving skills and ability to translate business requirements into data science solutions. Excellent communication skills with the ability to present technical concepts to non-technical audiences. Preferred Skills: Experience in deploying machine learning models in production environments. Familiarity with big data tools such as Spark, Hadoop, or Hive. Knowledge of MLOps practices for managing model lifecycle and performance monitoring. Exposure to reinforcement learning or other advanced AI subfields
Posted 1 week ago
6.0 years
0 Lacs
india
Remote
🤖 We’re Hiring | Gen AI Engineer 📍 Preferred Locations: Hyderabad / Bangalore / Delhi 🖥 Mode of Work: Hybrid (1 day onsite if from these cities, else fully remote) 💼 Experience: 6+ Years (Min. 1.6+ years in GenAI, 6+ months in Agentic AI) 💰 Budget: Up to ₹25 LPA (stretchable to ₹27 LPA for exceptional candidates) 🕒 Notice Period: 0–25 Days 📞 Contact: Ayushi Antil | 📧 ayushi@skyleaf.global | 📱 9899823445 🏢 Client: Confidential (Global AI-driven Enterprise) About the Role: We’re hiring 10 experienced Gen AI Engineers to build, optimize, and deploy Generative AI solutions . This is a high-impact role for engineers passionate about LLMs, RAG pipelines, AI agents, and prompt engineering — with the opportunity to work across advanced AI use cases. Key Responsibilities: ⚙️ Design & develop Generative AI architectures, algorithms & frameworks 📌 Build, integrate & fine-tune LLMs & AI models into production systems 📊 Implement RAG pipelines & manage vector databases 💡 Apply prompt engineering & tokenization for optimized outputs 🤝 Collaborate with data scientists, PMs & cross-functional teams ✅ Test, deploy & monitor models for performance & scalability 🔍 Stay up to date with Gen AI advancements, tools & frameworks Mandatory Skills: ✅ Gen AI – 1.6+ years ✅ Agentic AI – 6+ months ✅ Python – 3–5 years (with prompt engineering experience) ✅ LLMs – OpenAI, Claude, Gemini, etc. ✅ Vector databases & RAG pipelines ✅ LangChain libraries & AI agents ✅ Cloud – Azure preferred (AWS/GCP also acceptable) Good to Have: ⭐ Experience with MLOps & DevOps practices ⭐ Cloud deployment expertise (Azure, GCP, AWS) ⭐ Familiarity with scalable AI production environments 📩 Looking to design the next wave of AI solutions? Let’s connect. 📧 ayushi@skyleaf.global | 📱 9899823445
Posted 1 week ago
3.0 years
4 - 7 Lacs
chennai
On-site
DESCRIPTION About Amazon.com: Amazon.com strives to be Earth's most customer-centric company where people can find and discover virtually anything they want to buy online. By giving customers more of what they want - low prices, vast selection, and convenience - Amazon.com continues to grow and evolve as a world-class e-commerce platform. Amazon's evolution from Web site to e-commerce partner to development platform is driven by the spirit of innovation that is part of the company's DNA. The world's brightest technology minds come to Amazon.com to research and develop technology that improves the lives of shoppers and sellers around the world. Overview of the role The Business Research Analyst will be responsible for the data and machine learning portions of continuous improvement projects across the compatibility and basket-building space. This will require collaboration with local and global teams that have process and technical expertise. Therefore, the RA should be a self-starter who is passionate about discovering and solving complicated problems, learning complex systems, working with numbers, and organizing and communicating data and reports. In the compatibility program, the RA performs big data analysis to identify patterns and trains models to generate product-to-product and product-to-brand/model relationships. The RA also continuously improves the ML solution for higher accuracy, efficiency, and scalability, and writes clear, detailed functional specifications based on business requirements, as well as writing and reviewing business cases. Key job responsibilities Scoping, driving, and delivering complex projects across multiple teams. Performing root cause analysis by understanding the data need, pulling the data, and analyzing it to form a hypothesis and validate it with data. Conducting thorough analysis of large datasets to identify patterns, trends, and insights that can inform the development of NLP applications. 
Developing and implementing machine learning models and deep learning architectures to improve NLP systems. Designing and implementing core NLP tasks such as named entity recognition, classification, and part-of-speech tagging. Diving deep to drive product pilots, build and analyze large data sets, and construct problem hypotheses that help steer the product feature roadmap, using programming (e.g., Python), database tools (e.g., SQL, Spark), and ML platforms (TensorFlow, PyTorch). Conducting regular code reviews and implementing quality assurance processes to maintain high standards of code quality and performance optimization. Providing technical guidance and mentorship to junior team members and collaborating with external partners to integrate cutting-edge technologies. Finding scalable solutions to business problems by executing pilots and building deterministic and ML models (plug-and-play with ready-made ML models and Python skills). Performing supporting research, conducting analysis of the larger parts of projects, and effectively interpreting reports to identify opportunities, optimize processes, and implement changes within their part of the project. Coordinating design efforts between internal and external teams to develop optimal solutions for their part of the project for Amazon’s network. Ability to convince and interact with stakeholders at all levels, whether to gather data and information or to execute and implement according to plan. About the team Amazon.com operates in a virtual, global eCommerce environment without boundaries, and operates a diverse set of businesses in 14 countries, including Retail, third-party marketplaces, eCommerce platforms, and web services for developers. The Retail Business Services (RBS) organization is a core part of leading customer experience and selling partner experience optimization. This team is part of the RBS Customer Experience business unit. The team’s primary role is to create and enhance retail selection on the worldwide Amazon online catalog. 
The compatibility program handled by this team has a direct impact on customer buying decisions and the online user experience. The program aims to answer customer purchase questions about whether two products work together, as well as reduce returns due to incompatibility. BASIC QUALIFICATIONS Ability to analyse and then articulate business issues to a wide range of audiences using strong data, written, and verbal communication skills Good mastery of BERT and other NLP frameworks such as GPT-2, XLNet, and Transformer models Experience in NLP techniques such as tokenization, parsing, lexing, named entity recognition, sentiment analysis, and spellchecking Strong problem-solving skills, creativity, and ability to overcome challenges SQL/ETL, automation tools Relevant bachelor’s degree or higher 3+ years combined of relevant work experience in a related field (project management, customer advocate, product owner, engineering, business analysis) - diverse experience will be favored, e.g. a mix of experience across different roles Self-motivated and autonomous, with an ability to prioritize well and remain focused when working within a team located across several countries and time zones PREFERRED QUALIFICATIONS 3+ years combined of relevant work experience in a related field (project management, customer advocate, product owner, engineering, business analysis) - diverse experience will be favored, e.g. a mix of experience across different roles Understanding of machine learning concepts, including developing models and tuning hyper-parameters, as well as deploying models and building ML services Experience with computer vision algorithms and libraries such as OpenCV, TensorFlow, Caffe, or PyTorch Technical expertise and experience in data science and ML Our inclusive culture empowers Amazonians to deliver the best results for our customers. 
If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
Posted 1 week ago
3.0 years
4 - 7 Lacs
bengaluru
On-site
DESCRIPTION About Amazon.com: Amazon.com strives to be Earth's most customer-centric company where people can find and discover virtually anything they want to buy online. By giving customers more of what they want - low prices, vast selection, and convenience - Amazon.com continues to grow and evolve as a world-class e-commerce platform. Amazon's evolution from Web site to e-commerce partner to development platform is driven by the spirit of innovation that is part of the company's DNA. The world's brightest technology minds come to Amazon.com to research and develop technology that improves the lives of shoppers and sellers around the world. Overview of the role The Business Research Analyst will be responsible for the data and machine learning portions of continuous improvement projects across the compatibility and basket-building space. This will require collaboration with local and global teams that have process and technical expertise. Therefore, the RA should be a self-starter who is passionate about discovering and solving complicated problems, learning complex systems, working with numbers, and organizing and communicating data and reports. In the compatibility program, the RA performs big data analysis to identify patterns and trains models to generate product-to-product and product-to-brand/model relationships. The RA also continuously improves the ML solution for higher accuracy, efficiency, and scalability, and writes clear, detailed functional specifications based on business requirements, as well as writing and reviewing business cases. Key job responsibilities Scoping, driving, and delivering complex projects across multiple teams. Performing root cause analysis by understanding the data need, pulling the data, and analyzing it to form a hypothesis and validate it with data. Conducting thorough analysis of large datasets to identify patterns, trends, and insights that can inform the development of NLP applications. 
Developing and implementing machine learning models and deep learning architectures to improve NLP systems. Designing and implementing core NLP tasks such as named entity recognition, classification, and part-of-speech tagging. Diving deep to drive product pilots, build and analyze large data sets, and construct problem hypotheses that help steer the product feature roadmap, using programming (e.g., Python), database tools (e.g., SQL, Spark), and ML platforms (TensorFlow, PyTorch). Conducting regular code reviews and implementing quality assurance processes to maintain high standards of code quality and performance optimization. Providing technical guidance and mentorship to junior team members and collaborating with external partners to integrate cutting-edge technologies. Finding scalable solutions to business problems by executing pilots and building deterministic and ML models (plug-and-play with ready-made ML models and Python skills). Performing supporting research, conducting analysis of the larger parts of projects, and effectively interpreting reports to identify opportunities, optimize processes, and implement changes within their part of the project. Coordinating design efforts between internal and external teams to develop optimal solutions for their part of the project for Amazon’s network. Ability to convince and interact with stakeholders at all levels, whether to gather data and information or to execute and implement according to plan. About the team Amazon.com operates in a virtual, global eCommerce environment without boundaries, and operates a diverse set of businesses in 14 countries, including Retail, third-party marketplaces, eCommerce platforms, and web services for developers. The Retail Business Services (RBS) organization is a core part of leading customer experience and selling partner experience optimization. This team is part of the RBS Customer Experience business unit. The team’s primary role is to create and enhance retail selection on the worldwide Amazon online catalog. 
The compatibility program handled by this team has a direct impact on customer buying decisions and the online user experience. The program aims to answer customer purchase questions about whether two products work together, as well as reduce returns due to incompatibility. BASIC QUALIFICATIONS Ability to analyse and then articulate business issues to a wide range of audiences using strong data, written, and verbal communication skills Good mastery of BERT and other NLP frameworks such as GPT-2, XLNet, and Transformer models Experience in NLP techniques such as tokenization, parsing, lexing, named entity recognition, sentiment analysis, and spellchecking Strong problem-solving skills, creativity, and ability to overcome challenges SQL/ETL, automation tools Relevant bachelor’s degree or higher 3+ years combined of relevant work experience in a related field (project management, customer advocate, product owner, engineering, business analysis) - diverse experience will be favored, e.g. a mix of experience across different roles Self-motivated and autonomous, with an ability to prioritize well and remain focused when working within a team located across several countries and time zones PREFERRED QUALIFICATIONS 3+ years combined of relevant work experience in a related field (project management, customer advocate, product owner, engineering, business analysis) - diverse experience will be favored, e.g. a mix of experience across different roles Understanding of machine learning concepts, including developing models and tuning hyper-parameters, as well as deploying models and building ML services Experience with computer vision algorithms and libraries such as OpenCV, TensorFlow, Caffe, or PyTorch Technical expertise and experience in data science and ML Our inclusive culture empowers Amazonians to deliver the best results for our customers. 
If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
Posted 1 week ago
1.0 years
4 Lacs
india
On-site
Job Description – AI/LLM Developer We are seeking an AI Developer with deep expertise in Machine Learning (ML), Deep Learning, and Large Language Models (LLMs). The role involves designing, building, and training custom LLMs from scratch, as well as fine-tuning and optimizing models for specific tasks and deploying them into real-world applications. The candidate should have strong research and engineering skills to push the boundaries of AI while also ensuring scalable, production-ready solutions. The role involves building intelligent systems using machine learning and large language models, and embedding them seamlessly into full-stack applications built on the MERN stack. Key Responsibilities Research and design new architectures for LLMs and generative AI models. Curate, preprocess, and manage large datasets for pretraining and fine-tuning. Implement training optimization techniques (parallelism, quantization, pruning, distillation). Build tokenization and embedding strategies suited for domain-specific applications. Fine-tune and adapt pre-trained LLMs for specialized use cases. Integrate ML/LLM models into MERN applications, ensuring scalability, efficiency, and low-latency inference. Stay updated with the latest AI/ML research and apply novel methods to improve system capabilities. Skills & Qualifications Strong programming skills in JavaScript/TypeScript and Python. Hands-on experience with Machine Learning/Deep Learning frameworks: PyTorch, TensorFlow, Hugging Face Transformers. Experience in LLM integration (fine-tuning, prompt engineering, API or custom model integration). Knowledge of data engineering (data preprocessing, pipelines, ETL). Ability to bridge AI systems with MERN applications effectively. Familiarity with distributed training frameworks. Strong background in optimization and deep learning. 
Job Types: Full-time, Permanent Pay: Up to ₹400,000.00 per year Ability to commute/relocate: Sahakaranagar P.O, Bengaluru, Karnataka: Reliably commute or planning to relocate before starting work (Required) Experience: AI/ML: 1 year (Required) Work Location: In person
Posted 1 week ago
0.0 - 1.0 years
0 Lacs
sahakaranagar p.o, bengaluru, karnataka
On-site
Job Description – AI/LLM Developer We are seeking an AI Developer with deep expertise in Machine Learning (ML), Deep Learning, and Large Language Models (LLMs). The role involves designing, building, and training custom LLMs from scratch, as well as fine-tuning and optimizing models for specific tasks and deploying them into real-world applications. The candidate should have strong research and engineering skills to push the boundaries of AI while also ensuring scalable, production-ready solutions. The role involves building intelligent systems using machine learning and large language models, and embedding them seamlessly into full-stack applications built on the MERN stack. Key Responsibilities Research and design new architectures for LLMs and generative AI models. Curate, preprocess, and manage large datasets for pretraining and fine-tuning. Implement training optimization techniques (parallelism, quantization, pruning, distillation). Build tokenization and embedding strategies suited for domain-specific applications. Fine-tune and adapt pre-trained LLMs for specialized use cases. Integrate ML/LLM models into MERN applications, ensuring scalability, efficiency, and low-latency inference. Stay updated with the latest AI/ML research and apply novel methods to improve system capabilities. Skills & Qualifications Strong programming skills in JavaScript/TypeScript and Python. Hands-on experience with Machine Learning/Deep Learning frameworks: PyTorch, TensorFlow, Hugging Face Transformers. Experience in LLM integration (fine-tuning, prompt engineering, API or custom model integration). Knowledge of data engineering (data preprocessing, pipelines, ETL). Ability to bridge AI systems with MERN applications effectively. Familiarity with distributed training frameworks. Strong background in optimization and deep learning. 
Job Types: Full-time, Permanent Pay: Up to ₹400,000.00 per year Ability to commute/relocate: Sahakaranagar P.O, Bengaluru, Karnataka: Reliably commute or planning to relocate before starting work (Required) Experience: AI/ML: 1 year (Required) Work Location: In person
Posted 1 week ago
0.0 years
0 Lacs
noida, uttar pradesh, india
Remote
We're building something audacious, something global, in next tech at Mai Labs: a new digital infrastructure layer, an internet architectural rail that puts users, builders, creators, and developers first. Our mission: to distribute participatory power to billions of people in the digital economy. What this actually means: We have built our own L1 blockchain, and a backend technical structure for protocols and an ecosystem to make digital infrastructure efficient, secure, and more accessible. Our global products and tools are natively built for the Web 3.0 world. You will work with teams building tech products across blockchain and distributed systems for real-world problem solving. We're taking on established paths and conventional wisdom about how tech and the Internet should work. The underlying principle is to solve the hard problem of protecting user rights, digital intellectual property rights, and assets in an age of AI and instant replication. Cultural Expectations: Our start-up journey involves constant evolution and adaptation to market dynamics. People strategize entirely new systems with a hands-on approach, within short time frames. Resource consciousness is high, and you get the freedom to operate across products, do your best work, and stay ahead of the tech curve. You can expect: Thriving in decision making in an ambiguous, fast-paced environment Exceptional integrity and reliability in delivering on promises A collaborative and inclusive attitude Outcome-driven thinking with resource optimization If the above resonates with you, we would love to have a discussion with you. Role: Content Creator Location: Noida (Hybrid) - 3 days About the Role: We are looking for a Web3 Content Creator who can conceptualize, script, and produce engaging video content tailored for platforms like YouTube, LinkedIn, Instagram, and other social channels. 
The ideal candidate is deeply curious about blockchain, tokenization, and Web3 products and has a knack for simplifying complex topics into relatable, creative, and visually compelling stories. What will you get to do: Content Strategy Alignment: Collaborate with marketing and product teams to ensure content resonates with the target Web3 community. Content Ideation: Generate innovative video content ideas aligned with Web3, blockchain, and tokenization trends. Scripting & Storyboarding: Write engaging scripts and structure content in a way that educates, entertains, and builds brand authority. Video Production: Shoot short- and long-form videos for different social platforms, and host podcasts and interviews, ensuring content is platform-native. Community Engagement: Stay updated with blockchain trends and contribute fresh takes on industry news, product updates, and thought leadership. Profile Expectations: Proven experience in creating engaging video content for social media (portfolio in English - must have) with knowledge of platform-native formats and trends. Good grasp of Web3, blockchain, and emerging tech concepts and trends (or the ability to learn quickly and explain simply). Skilled in scriptwriting, video production, and storytelling, with comfort being on-camera or directing talent. Ability to host podcasts (remote and in-studio) with confidence and clarity. You'll have the flexibility to work any 3 days from our Noida office. Hiring Process: 1-2 rounds of interviews with the function, HR, and a senior leader
Posted 1 week ago