8.0 years
0 Lacs
Kerala, India
On-site
Job Requirements
We are seeking an experienced and visionary AI Architect / Subject Matter Expert (SME) to lead the design, development, and deployment of scalable artificial intelligence and machine learning (AI/ML) solutions. The ideal candidate will have deep expertise in AI technologies, hands-on experience in model development, and the ability to translate business challenges into technical strategies. As an AI Architect, you will serve as a strategic advisor, technology leader, and innovation driver, ensuring our AI solutions are impactful, ethical, and aligned with enterprise goals.

Roles & Responsibilities
- Define the AI vision and roadmap aligned with organizational objectives.
- Design enterprise-grade AI/ML architectures integrating with data pipelines, APIs, cloud services, and business applications.
- Evaluate and recommend AI platforms, tools, and frameworks.
- Lead the end-to-end lifecycle of ML models: from data exploration, feature engineering, and training to deployment and monitoring.
- Work on supervised, unsupervised, and reinforcement learning models, as well as generative AI (LLMs, diffusion models, etc.).
- Apply deep learning (CNNs, RNNs, Transformers) for use cases in NLP, computer vision, or forecasting.
- Collaborate with data engineers to build robust data pipelines and preprocessing workflows.
- Implement MLOps practices for continuous integration, model versioning, retraining, and deployment.
- Integrate models with cloud services (AWS SageMaker, Azure ML, GCP Vertex AI, etc.).
- Ensure AI solutions are transparent, explainable, fair, and compliant with data privacy laws (e.g., GDPR, HIPAA).
- Establish AI governance policies, including risk assessments, audits, and bias detection mechanisms.
- Act as a technical SME across AI initiatives, mentoring data scientists, engineers, and analysts.
- Lead cross-functional teams in developing POCs and MVPs for high-impact AI projects.
- Collaborate with executives, product managers, and stakeholders to prioritize use cases.
- Stay abreast of cutting-edge AI research, tools, and industry trends.
- Drive innovation through rapid prototyping, collaboration with academia/startups, and participation in AI communities.
- Publish internal white papers or contribute to patents and publications.

Work Experience / Required Skills
- Bachelor's or Master's in Computer Science, AI, Data Science, or a related field (PhD preferred).
- 8+ years in software or data engineering, with 4+ years in AI/ML roles.
- Proficient in Python and frameworks like TensorFlow, PyTorch, Scikit-learn, or Hugging Face Transformers (a minimal usage sketch follows this listing).
- Strong grasp of algorithms, statistics, and probability.
- Experience deploying models in production via REST APIs, batch scoring, or real-time inference.
- Hands-on with cloud AI services (AWS, Azure, GCP).
- Experience with data tools: Spark, Kafka, Airflow, Snowflake, etc.

Preferred Qualifications
- Certifications in AI/ML (e.g., AWS Certified Machine Learning, Google Cloud ML Engineer).
- Experience with LLMs, RAG pipelines, or enterprise chatbot systems.
- Knowledge of vector databases (e.g., Pinecone, Weaviate, FAISS).
- Contributions to open-source AI projects or academic publications.
- Experience in regulated industries (healthcare, finance, etc.) applying ethical AI principles.
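For illustration only (not part of the posting), here is a minimal sketch of the Hugging Face Transformers pipeline API named in the required skills; the checkpoint name is an assumed example, and the sketch ignores the deployment and monitoring concerns the role covers:

```python
# Minimal sketch: load a pretrained sentiment-analysis pipeline and score a sentence.
# The model checkpoint below is an assumed example, not one mandated by the role.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

print(classifier("The new AI roadmap aligns well with our enterprise goals."))
# Expected output shape: [{'label': 'POSITIVE', 'score': 0.99...}]
```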
Posted 1 week ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
We are looking for a Data Solution Architect to join the FC India IT Architecture team. In this role, you will define analytics solutions and guide engineering teams to implement big data solutions on the cloud. The work involves migrating data from legacy on-prem warehouses to the Google Cloud data platform. This role will provide architecture assistance to data engineering teams in India, with the key responsibility of supporting applications globally. It will also drive business adoption of the new platform and the sunset of legacy platforms.

Responsibilities
- Utilize Google Cloud Platform and its data services to modernize legacy applications.
- Understand technical business requirements and define architecture solutions that align with Ford Motor & Credit Companies' patterns and standards.
- Collaborate with global architecture teams to define the analytics cloud platform strategy and build cloud analytics solutions within the enterprise data factory.
- Provide architecture leadership in the design and delivery of the new unified data platform on GCP.
- Understand complex data structures in the analytics space as well as interfacing application systems.
- Develop and maintain conceptual, logical, and physical data models.
- Design and guide product teams on subject areas and data marts to deliver integrated data solutions.
- Provide architectural guidance for optimal solutions considering regional regulatory needs.
- Provide architecture assessments on technical solutions and make recommendations that meet business needs and align with architectural governance and standards.
- Guide teams through the enterprise architecture processes and advise teams on cloud-based design, development, and data mesh architecture.
- Provide advisory and technical consulting across all initiatives, including PoCs, product evaluations and recommendations, security, architecture assessments, integration considerations, etc.
- Leverage cloud AI/ML platforms to deliver business and technical requirements.

Qualifications
- Google Professional Solution Architect certification.
- 8+ years of relevant work experience in analytics application and data architecture, with a deep understanding of cloud hosting concepts and implementations.
- 5+ years' experience in Data and Solution Architecture in the analytics space.
- Solid knowledge of cloud data architecture and data modelling principles, and expertise in data modelling tools.
- Experience migrating legacy analytics applications to a cloud platform and driving business adoption of these platforms to build insights and dashboards, backed by deep knowledge of traditional and cloud Data Lake, Warehouse and Mart concepts.
- Good understanding of domain-driven design and data mesh principles.
- Experience designing, building, and deploying ML models to solve business challenges using Python/BQML/Vertex AI on GCP.
- Knowledge of enterprise frameworks and technologies.
- Strong in architecture design patterns, with experience in secure interoperability standards and methods, architecture tools and processes.
- Deep understanding of traditional and cloud data warehouse environments, with hands-on programming experience building data pipelines on the cloud in a highly distributed and fault-tolerant manner.
- Experience using Dataflow, Pub/Sub, Kafka, Cloud Run, Cloud Functions, BigQuery, Dataform, Dataplex, etc. (a minimal BigQuery sketch follows this listing).
- Strong understanding of DevOps principles and practices, including continuous integration and deployment (CI/CD), automated testing and deployment pipelines.
- Good understanding of cloud security best practices and familiarity with security tools and techniques such as Identity and Access Management (IAM), encryption, and network security.
- Strong understanding of microservices architecture.

Nice to Have
- Bachelor's degree in Computer Science/Engineering, Data Science, or a related field.
- Strong leadership, communication, interpersonal, organizing, and problem-solving skills.
- Good presentation skills with the ability to communicate architectural proposals to diverse audiences (user groups, stakeholders, and senior management).
- Experience in the Banking and Financial Regulatory Reporting space.
- Ability to work on multiple projects in a fast-paced, dynamic environment.
- Exposure to multiple, diverse technologies, platforms, and processing environments.
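Purely as an illustration of the kind of GCP data access involved (not part of the posting), a minimal sketch using the google-cloud-bigquery Python client; the project, dataset, and table names are hypothetical and an authenticated environment is assumed:

```python
# Minimal sketch: run an aggregate query with the BigQuery Python client.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")  # hypothetical project ID

query = """
    SELECT region, COUNT(*) AS contract_count
    FROM `my-analytics-project.finance_mart.contracts`  -- hypothetical table
    GROUP BY region
    ORDER BY contract_count DESC
"""

for row in client.query(query).result():
    print(row.region, row.contract_count)
```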
Posted 1 week ago
0.0 - 15.0 years
0 Lacs
Tamil Nadu
On-site
Job Information
- Job Type: Full time
- Work Experience: 10-15 years
- Industry: IT Services
- Date Opened: 06/06/2025
- City: Chennai City Corporation
- State/Province: Tamil Nadu
- Country: India
- Zip/Postal Code: 600015

Job Description
Job Title: SAP Hybris Commerce Lead / Senior Developer
Experience: 10 to 15 years
Location: Chennai
Job Type: Full-time

About the Role
We are seeking a highly experienced SAP Hybris Commerce Architect / Senior Developer with 10 to 15 years of expertise in designing, architecting, and optimizing enterprise-level B2B and B2C eCommerce solutions for Retail, Manufacturing, and CPG industries. The ideal candidate will play a key role in solution design, technical leadership, and end-to-end delivery of scalable SAP Commerce Cloud applications.

Key Responsibilities
1. Solution Architecture & Design
- Lead the architecture, design, and implementation of SAP Hybris Commerce Cloud solutions, ensuring scalability, security, and high performance.
- Define technical roadmaps, best practices, and development standards for enterprise eCommerce solutions.
- Collaborate with business stakeholders, product managers, and UX/UI teams to define solution requirements.
- Oversee multi-region, multi-country, multi-currency, and multi-language commerce implementations.
2. Hybris Development & Customization
- Develop and customize SAP Hybris Commerce Cloud (B2B/B2C/Marketplace) solutions tailored for Retail, Manufacturing, and CPG businesses.
- Optimize Product Catalog, Pricing, Promotions, Order Management, Checkout, Cart, and Customer Experience.
- Extend Hybris OOTB (Out-of-the-Box) functionalities to support complex workflows, bulk ordering, and regional pricing models.
- Lead customer-centric personalization strategies, including AI-driven recommendations, loyalty programs, and predictive analytics.
3. Integration & API Development
- Architect and oversee integrations between SAP Hybris and ERP (SAP ECC/S4HANA), CRM (SAP Customer Experience), WMS, OMS, PIM, and third-party applications.
- Develop and manage RESTful APIs, OCC (Omni-Commerce Connect) services, and microservices-based architectures.
- Ensure seamless connectivity with payment gateways, fraud detection services, and tax engines (Avalara, Vertex).
- Drive real-time inventory management, logistics automation, and intelligent order routing.
4. Performance Optimization & Security
- Optimize high-traffic, high-volume eCommerce applications for peak seasons, omnichannel commerce, and global expansion.
- Implement caching strategies, CDN optimizations (Akamai, Cloudflare), and Solr-based search enhancements.
- Ensure PCI-DSS compliance, OAuth/JWT authentication, data encryption, and enterprise security standards.
5. Agile Development & DevOps
- Lead Agile/Scrum development teams, ensuring timely delivery of high-quality SAP Commerce Cloud solutions.
- Establish CI/CD pipelines with Jenkins, Git, Docker, Kubernetes, and cloud-native deployments on AWS/Azure/GCP.
- Implement feature toggling, A/B testing, and data-driven decision-making to enhance user experiences.
6. Leadership & Stakeholder Engagement
- Mentor and lead a team of SAP Commerce Cloud developers, ensuring adherence to best practices.
- Act as a technical advisor to business leaders and project managers, aligning technology with business objectives.
- Participate in RFPs, client presentations, and strategic discussions for SAP Commerce Cloud adoption.

Required Skills & Qualifications
1. Core Technical Skills
- 10 to 15 years of experience in SAP Hybris Commerce (SAP Commerce Cloud) development and architecture.
- Expertise in Java, Spring MVC, Spring Boot, Hibernate, JSP, JavaScript, and SAP Hybris framework.
- Extensive experience in:
  - Retail: AI-driven personalization, multi-channel commerce, loyalty programs.
  - Manufacturing: B2B commerce, RFQ (Request for Quote), vendor and partner portals.
  - CPG: Direct-to-Consumer (DTC) commerce, subscription models, omnichannel analytics.
- Strong experience with Solr Search, MySQL, Apache Tomcat, cloud-based infrastructure.
- Hands-on experience with Spartacus (Headless Commerce UI - Angular) is preferred.
2. Integration & Middleware
- Proven experience integrating SAP Hybris with SAP S/4HANA, ERP, OMS, and CRM.
- Strong knowledge of SAP CPI (Cloud Platform Integration), Kafka, RabbitMQ, GraphQL, event-driven architectures.
3. Cloud, DevOps & Security
- Experience in cloud deployments (AWS, Azure, GCP) and containerized architectures (Docker, Kubernetes).
- Expertise in security best practices, including PCI-DSS compliance, OAuth 2.0, JWT authentication.
4. Leadership & Communication
- Ability to lead technical teams, mentor developers, and drive architectural decisions.
- Strong problem-solving, analytical, and stakeholder management skills.
- Excellent communication skills with experience in client-facing roles and strategic decision-making.

Educational Qualification
- Bachelor's/Master's degree in Computer Science, Information Technology, or a related field.
- SAP Commerce Cloud certification is a plus.
Posted 1 week ago
12.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
We are seeking an experienced AI Solution Architect to lead the design and implementation of AI-driven, cloud-native applications. The ideal candidate will possess deep expertise in Generative AI, Agentic AI, cloud platforms (AWS, Azure, GCP), and modern data engineering practices. This role involves collaborating with cross-functional teams to deliver scalable, secure, and intelligent solutions in a fast-paced, innovation-driven environment.

Key Responsibilities:
- Design and architect AI/ML solutions, including Generative AI, Retrieval-Augmented Generation (RAG), and fine-tuning of Large Language Models (LLMs) using frameworks like LangChain, LangGraph, and Hugging Face (a toy sketch of the RAG retrieval step follows this listing).
- Implement cloud migration strategies for moving monolithic systems to microservices/serverless architectures using AWS, Azure, and GCP.
- Lead development of document automation systems leveraging models such as BART, LayoutLM, and Agentic AI workflows.
- Architect and optimize data lakes, ETL pipelines, and analytics dashboards using Databricks, PySpark, Kibana, and MLOps tools.
- Build centralized search engines using ElasticSearch, Solr, and Neo4j for intelligent content discovery and sentiment analysis.
- Ensure application and ML pipeline security with tools like OWASP ZAP, SonarQube, WebInspect, and container security tools.
- Collaborate with InfoSec and DevOps teams to maintain CI/CD pipelines, perform vulnerability analysis, and ensure compliance.
- Guide modernization initiatives across application stacks and coordinate BCDR-compliant infrastructure for mission-critical services.
- Provide technical leadership and mentoring to engineering teams during all phases of the SDLC.

Required Skills & Qualifications
- 12+ years of total experience, with extensive tenure as a Solution Architect in AI and cloud-driven transformations.
- Hands-on experience with:
  - Generative AI, LLMs, Prompt Engineering, LangChain, AutoGen, Vertex AI, AWS Bedrock
  - Python, Java (Spring Boot, Spring AI), PyTorch
  - Vector & Graph Databases: ElasticSearch, Solr, Neo4j
  - Cloud Platforms: AWS, Azure, GCP (CAF, serverless, containerization)
  - DevSecOps: SonarQube, OWASP, OAuth2, container security
- Strong background in application modernization, cloud-native architecture, and MLOps orchestration.
- Familiarity with front-end technologies: HTML, JavaScript, Angular, jQuery.

Certifications
- Google Professional Cloud Architect
- AWS Solution Architect Associate
- Cisco Certified Design Associate (CCDA)
- Cisco Certified Network Associate (CCNA)
- Cisco Security Ninja Green Belt
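To make the RAG responsibility above concrete (purely illustrative, not part of the posting), here is a toy sketch of the retrieval step only, using scikit-learn TF-IDF in place of a real vector store or LLM; the documents and query are invented:

```python
# Toy retrieval step of a RAG pipeline: rank documents against a query,
# then (in a real system) pass the top hits to an LLM as context.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "Invoice processing workflow for enterprise customers.",
    "LayoutLM extracts key fields from scanned documents.",
    "Kubernetes deployment guide for the search service.",
]

vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(documents)

query = "How are fields extracted from scanned invoices?"
query_vector = vectorizer.transform([query])

scores = cosine_similarity(query_vector, doc_vectors).ravel()
for idx in scores.argsort()[::-1][:2]:
    print(f"{scores[idx]:.3f}  {documents[idx]}")
```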
Posted 1 week ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Working as an AI/ML Engineer at Navtech, you will:
* Design, develop, and deploy machine learning models for classification, regression, clustering, recommendations, or NLP tasks.
* Clean, preprocess, and analyze large datasets to extract meaningful insights and features.
* Work closely with data engineers to develop scalable and reliable data pipelines.
* Experiment with different algorithms and techniques to improve model performance.
* Monitor and maintain production ML models, including retraining and model drift detection.
* Collaborate with software engineers to integrate ML models into applications and services.
* Document processes, experiments, and decisions for reproducibility and transparency.
* Stay current with the latest research and trends in machine learning and AI.

Who Are We Looking for Exactly?
* 2-4 years of hands-on experience in building and deploying ML models in real-world applications.
* Strong knowledge of Python and ML libraries such as Scikit-learn, TensorFlow, PyTorch, XGBoost, or similar (a minimal training sketch follows this listing).
* Experience with data preprocessing, feature engineering, and model evaluation techniques.
* Solid understanding of ML concepts such as supervised and unsupervised learning, overfitting, regularization, etc.
* Experience working with Jupyter, pandas, NumPy, and visualization libraries like Matplotlib or Seaborn.
* Familiarity with version control (Git) and basic software engineering practices.
* Strong verbal and written communication skills, as well as strong analytical and problem-solving abilities.
* A Master's or Bachelor's (BS) degree in Computer Science, Software Engineering, IT, Technology Management, or a related field, with education throughout in English medium.

We'll REALLY love you if you:
* Have knowledge of cloud platforms (AWS, Azure, GCP) and ML services (SageMaker, Vertex AI, etc.).
* Have knowledge of GenAI prompting and hosting of LLMs.
* Have experience with NLP libraries (spaCy, Hugging Face Transformers, NLTK).
* Have familiarity with MLOps tools and practices (MLflow, DVC, Kubeflow, etc.).
* Have exposure to deep learning and neural network architectures.
* Have knowledge of REST APIs and how to serve ML models (e.g., Flask, FastAPI, Docker).

Why Navtech?
* Performance review and appraisal twice a year.
* Competitive pay package with additional bonus and benefits.
* Work with US, UK, and Europe based industry-renowned clients for exponential technical growth.
* Medical insurance cover for self and immediate family.
* Work with a culturally diverse team from different geographies.

About Us
Navtech is a premier IT software and services provider. Navtech's mission is to increase public cloud adoption and build cloud-first solutions that become trendsetting platforms of the future. We have been recognized as the Best Cloud Service Provider at GoodFirms for ensuring good results with quality services. Here, we strive to innovate and push technology and service boundaries to provide best-in-class technology solutions to our clients at scale. We deliver to our clients globally from our state-of-the-art design and development centers in the US and Hyderabad. We're a fast-growing company with clients in the United States, UK, and Europe. We are also a certified AWS partner. You will join a team of talented developers, quality engineers, and product managers whose mission is to impact over 100 million people across the world with technological services by the year 2030.

Navtech is looking for an AI/ML Engineer to join our growing data science and machine learning team. In this role, you will be responsible for building, deploying, and maintaining machine learning models and pipelines that power intelligent products and data-driven decisions.
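As a purely illustrative aside (not part of the posting), a minimal scikit-learn sketch of the train/evaluate loop this role centers on, using a bundled toy dataset:

```python
# Minimal sketch: train a classifier on a toy dataset and report test accuracy.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = LogisticRegression(max_iter=5000)  # raised max_iter so the solver converges
model.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```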
Posted 1 week ago
3.0 - 5.0 years
30 - 35 Lacs
Pune
Work from Office
Job Title: DevOps Engineer, AVP
Location: Pune, India

Role Description
We are seeking a highly skilled and experienced DevOps Engineer to join our team, with a focus on Google Cloud as we migrate and build the financial crime risk platforms on the cloud. The successful candidate will be responsible for designing, implementing, and maintaining our team's infrastructure and workflows on Google Cloud Platform. This is a unique opportunity to work at the intersection of software development and infrastructure management, and to contribute to the growth and success of our team.

The DevOps Engineer is responsible for managing or performing work across multiple areas of the bank's overall IT platform/infrastructure, including analysis, development, and administration. It may also involve taking functional oversight of engineering delivery for specific departments. Work includes:
- Planning and developing entire engineering solutions to accomplish business goals
- Building reliability and resiliency into solutions with appropriate testing and reviewing throughout the delivery lifecycle
- Ensuring maintainability and reusability of engineering solutions
- Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow
- Reviewing engineering plans and quality to drive re-use and improve engineering capability
- Participating in industry forums to drive adoption of innovative technologies, tools and solutions in the Bank

Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best in class leave policy
- Gender neutral parental leaves
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complementary health screening for 35 yrs. and above

Your key responsibilities
- Design, implement, and maintain our team's infrastructure and workflows on Google Cloud Platform, including GCP services such as Google Kubernetes Engine (GKE), Cloud Storage, Vertex AI, Anthos, Monitoring, etc.
- Design, implement, and maintain our containerization and orchestration strategy using Docker and Kubernetes (a minimal sketch follows this listing).
- Collaborate with development teams to ensure seamless integration of containerized applications into our production environment.
- Collaborate with software developers to integrate machine learning models and algorithms into our products, using PyTorch, TensorFlow or other machine learning frameworks.
- Develop and maintain CI/CD pipelines for our products, using tools such as GitHub and GitHub Actions.
- Create and maintain Infrastructure as Code templates using Terraform.
- Ensure the reliability, scalability, and security of our infrastructure and products, using monitoring and logging tools such as Anthos Service Mesh (ASM), Google Cloud's operations (GCO), etc.
- Work closely with other teams, such as software development, data science, and product management, to identify and prioritize infrastructure and machine learning requirements.
- Stay up to date with the latest developments in Google Cloud Platform and machine learning and apply this knowledge to improve our products and processes.

Your skills and experience
- Bachelor's degree in Computer Science, Engineering, or a related field.
- At least 3 years of experience in a DevOps or SRE role, with a focus on Google Cloud Platform.
- Strong experience with infrastructure as code tools such as Terraform or CloudFormation.
- Experience with containerization technologies such as Docker and container orchestration tools such as Kubernetes.
- Knowledge of machine learning frameworks such as TensorFlow or PyTorch.
- Experience with CI/CD pipelines and automated testing.
- Strong understanding of security and compliance best practices, including GCP security and compliance features.
- Excellent communication and collaboration skills, with the ability to work closely with cross-functional teams.

Preferred Qualifications
- Master's degree in Computer Science, Engineering, or a related field.
- Knowledge of cloud-native application development, including serverless computing and event-driven architecture.
- Experience with cloud cost optimization and resource management.
- Familiarity with agile software development methodologies and version control systems such as Git.

How we'll support you
- Training and development to help you excel in your career.
- Coaching and support from experts in your team.
- A culture of continuous learning to aid progression.
- A range of flexible benefits that you can tailor to suit your needs.

About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
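For illustration only (not part of the posting), a minimal sketch of querying a Kubernetes cluster from Python with the official `kubernetes` client, assuming a valid kubeconfig (for example, pointing at a GKE cluster); the namespace name is hypothetical:

```python
# Minimal sketch: list pods in a namespace and print their phase.
from kubernetes import client, config

config.load_kube_config()  # inside a cluster, use config.load_incluster_config()
v1 = client.CoreV1Api()

pods = v1.list_namespaced_pod(namespace="financial-crime-risk")  # hypothetical namespace
for pod in pods.items:
    print(pod.metadata.name, pod.status.phase)
```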
Posted 1 week ago
5.0 - 8.0 years
7 - 11 Lacs
Chennai
Work from Office
Job Information
- Job Opening ID: ZR_2200_JOB
- Date Opened: 15/04/2024
- Industry: Technology
- Job Type:
- Work Experience: 5-8 years
- Job Title: AI/ML Engineer
- City: Chennai
- Province: Tamil Nadu
- Country: India
- Postal Code: 600004
- Number of Positions: 4

Requirements
- Experience in CI/CD pipelines and scripting languages, with a deep understanding of version control systems (e.g. Git), containerization (e.g. Docker), and continuous integration/deployment tools (e.g. Jenkins); third-party integration is a plus. Experience with cloud computing platforms (e.g. AWS, GCP, Azure), Kubernetes and Kafka.
- 4+ years of experience building production-grade ML pipelines.
- Proficient in Python and frameworks like TensorFlow, Keras, or PyTorch.
- Experience with cloud build, deployment, and orchestration tools.
- Experience with MLOps tools such as MLflow, Kubeflow, Weights & Biases, AWS SageMaker, Vertex AI, DVC, Airflow, Prefect, etc. (a minimal MLflow sketch follows this listing).
- Experience in statistical modeling, machine learning, data mining, and unstructured data analytics.
- Understanding of the ML lifecycle and MLOps, with hands-on experience productionizing ML models.
- Detail-oriented, with the ability to work both independently and collaboratively.
- Ability to work successfully with multi-functional teams, principals, and architects, across organizational boundaries and geographies.
- Equal comfort driving low-level technical implementation and high-level architecture evolution.
- Experience working with data engineering pipelines.
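To illustrate the MLOps tooling mentioned above (not part of the posting), a minimal MLflow tracking sketch assuming MLflow and scikit-learn are installed; the experiment name and parameters are invented:

```python
# Minimal sketch: log parameters, a metric, and a model artifact to MLflow.
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

X, y = load_iris(return_X_y=True)

mlflow.set_experiment("demo-ml-pipeline")  # hypothetical experiment name
with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=50, random_state=0)
    model.fit(X, y)

    mlflow.log_param("n_estimators", 50)
    mlflow.log_metric("train_accuracy", model.score(X, y))
    mlflow.sklearn.log_model(model, "model")  # stored as a versioned run artifact
```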
Posted 1 week ago
10.0 - 15.0 years
12 - 17 Lacs
Hyderabad
Work from Office
- Cross-functional integration with MM-LE, FI
- Must have done at least 2 implementations
- Functional knowledge in SD, MM, HCM and FI within the Media domain
- Should have worked with a developer on a custom Fiori app
- Ideally should have AATP experience
- Should have experience with flexible workflows, Vertex and custom IDocs
- Solid integration experience (upstream/downstream)
- Understanding of BRF+
Posted 1 week ago
10.0 - 15.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Your Team Responsibilities
MSCI Inc. seeks a dynamic SAP Finance Technology Director. This role will be part of the Business Technology department and will lead the Finance Tech team located in India (Mumbai or Pune), working closely with finance and business stakeholders to ensure the systems that support MSCI's order-to-cash, record-to-report, revenue recognition, procure-to-pay, and FP&A operate seamlessly and efficiently. As the SAP Finance Technology Director, you will be responsible for driving global projects for the implementation, enhancement, and maintenance of both finance systems and finance processes to automate and streamline operations. This includes working with SAP modules like ECC, BPC, and GRC while ensuring integration with other financial and reporting tools like RevStream, Vertex, Concur, or Board, as well as other core solutions like Salesforce or Workday. You will collaborate across regions and departments to ensure that technology solutions meet business needs and regulatory requirements, driving continuous improvements while ensuring compliance with SOX controls and audit requirements. The SAP Finance Technology Director will manage a team of direct reports and contractors to support the overall operations of both financial systems and financial processes.

Your Key Responsibilities
As a Finance Technology Lead, you will be responsible for:
- Finance Systems Management: Operate, maintain, and improve the systems that support order-to-cash, record-to-report, revenue recognition, procure-to-pay, and FP&A processes.
- System Enhancements: Lead the identification, development, and implementation of system enhancements that automate and improve the accuracy and efficiency of finance operations.
- Cross-functional Collaboration: Work closely with the business (finance, sales, sales operations, product, client services, etc.) and IT teams to translate functional requirements into technical solutions. Ensure smooth integration of financial systems with other financial and operational platforms.
- Project Management: Oversee projects related to system upgrades, such as a migration to SAP S/4HANA, new functionality deployments, and process automation initiatives. Ensure that projects are delivered on time and within budget.
- Compliance and Audit Support: Ensure that the systems supporting revenue processes are compliant with internal controls and regulatory requirements, including SOX cycles, Standard Operating Procedures (SOPs), and audit protocols.
- Training and Support: Provide technical guidance and training to finance and IT teams on the systems and processes supporting revenue recognition. Act as the go-to expert for system-related inquiries and troubleshooting.
- Stakeholder Communication: Communicate system updates, enhancements, and issues to senior IT leadership and key stakeholders within the finance and business departments.

Your Skills and Experience That Will Help You Excel
- Technical Expertise in ERP Systems: Extensive experience with SAP systems (e.g., ECC, BPC, GRC). Experience in integrating with complementary business applications such as Salesforce, Concur, Vertex, or Workiva is helpful.
- Business Process Knowledge: Familiarity with finance processes: order-to-cash, record-to-report, procure-to-pay, revenue recognition, and FP&A.
- Project Management: Proven experience in managing global large-scale IT projects, particularly those involving financial systems implementation or upgrades.
- Cross-functional Collaboration: Ability to effectively collaborate with finance, business, and IT teams to deliver technology solutions that meet business requirements.
- Systems Integration: Strong understanding of systems integration principles, particularly as they relate to financial reporting and revenue management systems.
- Compliance and Security: Knowledge of SOX compliance, internal controls, and system security requirements as they pertain to financial systems.
- Problem-solving Skills: Analytical and problem-solving skills with the ability to troubleshoot complex system issues and provide efficient solutions.
- Leadership and Team Development: Experience managing a team of IT professionals, providing leadership coaching and guidance in system support and project execution.
- Communication Skills: Strong written and verbal communication skills in English, with the ability to clearly explain technical concepts to non-technical stakeholders.

Preferred Qualifications: Master's in a technology-related or finance-related domain.
Preferred Experience: 10-15 years of strong operational experience with SAP or similar ERP systems.

About MSCI
What we offer you
- Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
- Flexible working arrangements, advanced technology, and collaborative workspaces.
- A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
- A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients.
- Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development.
- Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles.
- We actively nurture an environment that builds a sense of inclusion, belonging and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.

At MSCI we are passionate about what we do, and we are inspired by our purpose: to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry.

MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process.

MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.

To all recruitment agencies: MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes.

Note on recruitment scams: We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com
Posted 1 week ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In data analysis at PwC, you will focus on utilising advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. You will leverage skills in data manipulation, visualisation, and statistical modelling to support clients in solving complex business problems.

Years of Experience: Candidates with 4+ years of hands-on experience
Position: Senior Associate
Industry: Telecom / Network Analytics / Customer Analytics

Required Skills
Successful candidates will have demonstrated the following skills and characteristics:

Must Have
- Proven experience with telco data including call detail records (CDRs), customer churn models, and network analytics
- Deep understanding of predictive modeling for customer lifetime value and usage behavior
- Experience working with telco clients or telco data platforms (like Amdocs, Ericsson, Nokia, AT&T, etc.)
- Proficiency in machine learning techniques, including classification, regression, clustering, and time-series forecasting
- Strong command of statistical techniques (e.g., logistic regression, hypothesis testing, segmentation models); a toy segmentation sketch follows this listing
- Strong programming in Python or R, and SQL with telco-focused data wrangling
- Exposure to big data technologies used in telco environments (e.g., Hadoop, Spark)
- Experience working in the telecom industry across domains such as customer churn prediction, ARPU modeling, pricing optimization, and network performance analytics
- Strong communication skills to interface with technical and business teams

Nice To Have
- Exposure to cloud platforms (Azure ML, AWS SageMaker, GCP Vertex AI)
- Experience working with telecom OSS/BSS systems or customer segmentation tools
- Familiarity with network performance analytics, anomaly detection, or real-time data processing
- Strong client communication and presentation skills

Roles and Responsibilities
- Assist analytics projects within the telecom domain, driving design, development, and delivery of data science solutions
- Develop and execute project and analysis plans under the guidance of the Project Manager
- Interact with and advise consultants/clients in the US as a subject matter expert to formalize data sources to be used, datasets to be acquired, and the data and use case clarifications needed to get a strong hold on the data and the business problem to be solved
- Drive and conduct analysis using advanced analytics tools and coach junior team members
- Implement necessary quality control measures to ensure deliverable integrity, such as data quality, model robustness, and explainability for deployments
- Validate analysis outcomes and recommendations with all stakeholders, including the client team
- Build storylines and make presentations to the client team and/or PwC project leadership team
- Contribute to knowledge and firm building activities

Professional and Educational Background
BE / B.Tech / MCA / M.Sc / M.E / M.Tech / Master's Degree / MBA from a reputed institute
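As a purely illustrative aside (not part of the posting), a toy customer-segmentation sketch of the kind used in telco analytics, clustering synthetic usage data with scikit-learn KMeans; all numbers are invented:

```python
# Toy sketch: segment subscribers on usage features with KMeans (synthetic data).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
# columns: monthly voice minutes, data volume (GB), ARPU -- all invented numbers
usage = rng.normal(loc=[300.0, 8.0, 450.0], scale=[120.0, 4.0, 150.0], size=(500, 3))

scaled = StandardScaler().fit_transform(usage)
kmeans = KMeans(n_clusters=4, n_init=10, random_state=7).fit(scaled)

for label in range(4):
    members = usage[kmeans.labels_ == label]
    print(f"segment {label}: {len(members)} subscribers, mean ARPU {members[:, 2].mean():.0f}")
```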
Posted 1 week ago
3.0 - 5.0 years
6 - 8 Lacs
India
On-site
Job Summary:
We are seeking an experienced and motivated Machine Learning Engineer with 3-5 years of experience in designing, developing, and deploying machine learning models. The ideal candidate must be well-versed in Python, SQL/MySQL, ML libraries, and cloud deployment using Google Cloud Platform (GCP). This role is ideal for candidates passionate about solving real-world problems using AI and ML technologies.

Key Responsibilities:
- Build, train, evaluate, and deploy scalable machine learning models.
- Perform data preprocessing, feature engineering, and pipeline development.
- Design end-to-end ML workflows from model development to deployment on GCP.
- Leverage GCP tools such as Vertex AI, BigQuery, Cloud Functions, AI Platform, etc.
- Work with SQL/MySQL databases for data extraction and manipulation.
- Collaborate with cross-functional teams including data engineers, software developers, and business stakeholders.
- Monitor and optimize deployed models for performance and accuracy.
- Ensure model reproducibility, versioning, and documentation.

Required Skills:
- Strong proficiency in Python and popular ML libraries (scikit-learn, TensorFlow, Keras, PyTorch, XGBoost, Pandas, NumPy).
- Experience with SQL and MySQL.
- Solid understanding of ML algorithms and data handling techniques.
- Hands-on experience with GCP services: Vertex AI, BigQuery, Cloud Storage, AI Platform, Cloud Functions.
- Experience in model deployment using Flask, FastAPI, or similar frameworks (a minimal serving sketch follows this listing).
- Proficient in data visualization using Matplotlib, Seaborn, or similar tools.
- Familiarity with Git, CI/CD, and Agile practices.

Preferred Qualifications:
- GCP Certification (e.g., Professional Machine Learning Engineer or Data Engineer).
- Experience with MLOps tools.
- Exposure to Docker and Kubernetes.
- Familiarity with Big Data tools like Apache Spark or Hadoop.

Perks & Benefits:
- 5-day working week
- Competitive salary and performance-based incentives
- Learning & development opportunities
- Dynamic and growth-driven work environment

Job Type: Full-time
Pay: ₹600,000.00 - ₹800,000.00 per month
Schedule: Day shift
Work Location: In person
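For illustration only (not part of the posting), a minimal sketch of serving a trained model behind a FastAPI endpoint; the model file name and feature schema are hypothetical:

```python
# Minimal sketch: expose a trained scikit-learn model via a /predict endpoint.
import joblib
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = joblib.load("model.joblib")  # hypothetical artifact saved at training time


class Features(BaseModel):
    tenure_months: float
    monthly_charge: float


@app.post("/predict")
def predict(features: Features):
    X = [[features.tenure_months, features.monthly_charge]]
    return {"prediction": int(model.predict(X)[0])}

# Run locally with:  uvicorn main:app --reload
```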
Posted 1 week ago
0 years
0 Lacs
Bihar
On-site
Experience
8+ years of experience in SAP SD with cross-functional integration with LE and solid S/4 experience.

Qualification
- Cross-functional integration with MM-LE, FI
- Must have done at least 2 implementations
- Functional knowledge in SD, MM, HCM and FI within the Media domain
- Should have worked with a developer on a custom Fiori app
- Ideally should have AATP experience
- Should have experience with flexible workflows, Vertex and custom IDocs
- Solid integration experience (upstream/downstream)
- Understanding of BRF+

Key Responsibility Areas
- Lead SD pricing, invoicing, billing plan
- Sales & Distribution module experience: sales process, billing process, cross catalogue mapping, product hierarchy, service catalogue, sales order process, sales GL posting
- Integration with any external OMS/ECom application for sales orders etc., if any
- Lead cross-integration topics with MM, Finance, HCM
- Experience in MM-LE with SD on S/4HANA
- Execute support and roll-out projects
- Support integration of cloud and on-premise
- Bring in industry best practices and standards
- Write technical and functional specifications as required
Posted 1 week ago
7.0 years
0 Lacs
Himachal Pradesh, India
Remote
As a global leader in cybersecurity, CrowdStrike protects the people, processes and technologies that drive modern organizations. Since 2011, our mission hasn't changed: we're here to stop breaches, and we've redefined modern security with the world's most advanced AI-native platform. We work on large scale distributed systems, processing almost 3 trillion events per day. We have 3.44 PB of RAM deployed across our fleet of C* servers, and this traffic is growing daily. Our customers span all industries, and they count on CrowdStrike to keep their businesses running, their communities safe and their lives moving forward. We're also a mission-driven company. We cultivate a culture that gives every CrowdStriker both the flexibility and autonomy to own their careers. We're always looking to add talented CrowdStrikers to the team who have limitless passion, a relentless focus on innovation and a fanatical commitment to our customers, our community and each other. Ready to join a mission that matters? The future of cybersecurity starts with you.

About The Role
The charter of the Data + ML Platform team is to harness all the data that is ingested and cataloged within the Data LakeHouse for exploration, insights, model development, ML engineering and insights activation. This team is situated within the larger Data Platform group, which serves as one of the core pillars of our company. We process data at a truly immense scale. Our processing is composed of various facets including threat events collected via telemetry data, associated metadata, along with IT asset information, contextual information about threat exposure based on additional processing, etc. These facets comprise the overall data platform, which is currently over 200 PB and maintained in a hyper-scale Data Lakehouse, built and owned by the Data Platform team. The ingestion mechanisms include both batch and near real-time streams that form the core Threat Analytics Platform used for insights, threat hunting, incident investigations and more. As an engineer in this team, you will play an integral role as we build out our ML Experimentation Platform from the ground up. You will collaborate closely with Data Platform Software Engineers, Data Scientists and Threat Analysts to design, implement, and maintain scalable ML pipelines that will be used for data preparation, cataloging, feature engineering, model training, and model serving that influence critical business decisions. You'll be a key contributor in a production-focused culture that bridges the gap between model development and operational success. Future plans include generative AI investments for use cases such as modeling attack paths for IT assets.

What You'll Do
- Help design, build, and facilitate adoption of a modern Data+ML platform
- Modularize complex ML code into standardized and repeatable components
- Establish and facilitate adoption of repeatable patterns for model development, deployment, and monitoring
- Build a platform that scales to thousands of users and offers self-service capability to build ML experimentation pipelines
- Leverage workflow orchestration tools to deploy efficient and scalable execution of complex data and ML pipelines
- Review code changes from data scientists and champion software development best practices
- Leverage cloud services like Kubernetes, blob storage, and queues in our cloud-first environment

What You'll Need
- B.S. in Computer Science, Data Science, Statistics, Applied Mathematics, or a related field and 7+ years related experience; or M.S. with 5+ years of experience; or Ph.D. with 6+ years of experience.
- 3+ years experience developing and deploying machine learning solutions to production.
- Familiarity with typical machine learning algorithms from an engineering perspective (how they are built and used, not necessarily the theory); familiarity with supervised/unsupervised approaches: how, why, and when labeled data is created and used.
- 3+ years experience with ML platform tools like Jupyter Notebooks, NVIDIA Workbench, MLflow, Ray, Vertex AI, etc.
- Experience building data platform product(s) or features with (one of) Apache Spark, Flink or comparable tools in GCP (a minimal PySpark sketch follows this listing). Experience with Iceberg is highly desirable.
- Proficiency in distributed computing and orchestration technologies (Kubernetes, Airflow, etc.)
- Production experience with infrastructure-as-code tools such as Terraform, FluxCD
- Expert level experience with Python; Java/Scala exposure is recommended.
- Ability to write Python interfaces to provide standardized and simplified interfaces for data scientists to utilize internal CrowdStrike tools
- Expert level experience with CI/CD frameworks such as GitHub Actions
- Expert level experience with containerization frameworks
- Strong analytical and problem solving skills, capable of working in a dynamic environment
- Exceptional interpersonal and communication skills; work with stakeholders across multiple teams and synthesize their needs into software interfaces and processes.

Experience With The Following Is Desirable
- Go
- Iceberg
- Pinot or other time-series/OLAP-style database
- Jenkins
- Parquet
- Protocol Buffers/gRPC

Benefits Of Working At CrowdStrike
- Remote-friendly and flexible work culture
- Market leader in compensation and equity awards
- Comprehensive physical and mental wellness programs
- Competitive vacation and holidays for recharge
- Paid parental and adoption leaves
- Professional development opportunities for all employees regardless of level or role
- Employee Resource Groups, geographic neighbourhood groups and volunteer opportunities to build connections
- Vibrant office culture with world class amenities
- Great Place to Work Certified™ across the globe

CrowdStrike is proud to be an equal opportunity employer. We are committed to fostering a culture of belonging where everyone is valued for who they are and empowered to succeed. We support veterans and individuals with disabilities through our affirmative action program.

CrowdStrike is committed to providing equal employment opportunity for all employees and applicants for employment. The Company does not discriminate in employment opportunities or practices on the basis of race, color, creed, ethnicity, religion, sex (including pregnancy or pregnancy-related medical conditions), sexual orientation, gender identity, marital or family status, veteran status, age, national origin, ancestry, physical disability (including HIV and AIDS), mental disability, medical condition, genetic information, membership or activity in a local human rights commission, status with regard to public assistance, or any other characteristic protected by law. We base all employment decisions, including recruitment, selection, training, compensation, benefits, discipline, promotions, transfers, lay-offs, return from lay-off, terminations and social/recreational programs, on valid job requirements.

If you need assistance accessing or reviewing the information on this website or need help submitting an application for employment or requesting an accommodation, please contact us at recruiting@crowdstrike.com for further assistance.
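As a purely illustrative aside (not part of the posting), a minimal PySpark sketch of the kind of batch feature preparation described above; the telemetry path and column names are hypothetical:

```python
# Minimal sketch: aggregate raw telemetry events into daily per-sensor counts.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("telemetry-feature-prep").getOrCreate()

events = spark.read.parquet("s3://example-bucket/telemetry/events/")  # hypothetical path

daily_counts = (
    events
    .withColumn("day", F.to_date("event_timestamp"))  # hypothetical column
    .groupBy("sensor_id", "day")
    .agg(F.count("*").alias("event_count"))
)

daily_counts.write.mode("overwrite").parquet("s3://example-bucket/features/daily_counts/")
```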
Posted 1 week ago
5.0 years
0 Lacs
India
Remote
Job Description
This is a permanent work-from-home position from anywhere in India.
Notice period: less than 30 days (immediate joiners preferred).

We are seeking a Generative AI Engineer with 5+ years of experience in machine learning, deep learning, and large language models (LLMs). The ideal candidate will lead the design, development, and deployment of AI-driven solutions for text, image, and speech generation using cutting-edge GenAI frameworks and cloud platforms.

Key Responsibilities:
- Develop and fine-tune Generative AI models (LLMs, GANs, Diffusion Models, VAEs).
- Implement NLP, computer vision, and speech-based AI applications.
- Optimize model performance, scalability, and efficiency for production use.
- Work with transformer architectures (GPT, BERT, T5, LLaMA, etc.).
- Deploy AI models on AWS, Azure, or GCP using MLOps and containerization.
- Design LLM-based applications using LangChain, vector databases, and prompt engineering.
- Collaborate with cross-functional teams to integrate AI solutions into enterprise applications.
- Stay ahead of AI/ML trends and advancements to drive innovation.

Required Skills:
- GenAI Frameworks: TensorFlow, PyTorch, Hugging Face, OpenAI API
- LLM: Fine-tuning, RAG (Retrieval-Augmented Generation), Prompt Engineering
- Cloud AI Services: AWS SageMaker, Azure OpenAI, Google Vertex AI
- Programming & Data Engineering: Python, PyTorch, LangChain, SQL, NoSQL
- MLOps & Deployment: Docker, Kubernetes, CI/CD, Vector Databases (FAISS, Pinecone); a minimal FAISS sketch follows this listing
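For illustration only (not part of the posting), a minimal vector-search sketch with FAISS; the embeddings are random stand-ins for vectors that would normally come from an embedding model, and the dimensionality is an assumption:

```python
# Minimal sketch: build an exact L2 index and query the 5 nearest documents.
import numpy as np
import faiss

dim = 384  # typical sentence-embedding size (assumption)
rng = np.random.default_rng(0)
doc_embeddings = rng.random((1000, dim), dtype=np.float32)

index = faiss.IndexFlatL2(dim)  # exact search, no training step required
index.add(doc_embeddings)

query = rng.random((1, dim), dtype=np.float32)
distances, ids = index.search(query, 5)
print("nearest document ids:", ids[0])
```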
Posted 1 week ago
0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
● Minimum of 4+ years of experience in AI-based application development.
● Fine-tune pre-existing models to improve performance and accuracy.
● Experience with TensorFlow or PyTorch, Scikit-learn, or similar ML frameworks, and familiarity with APIs like OpenAI or Vertex AI.
● Experience with NLP tools and libraries (e.g., NLTK, SpaCy, GPT, BERT).
● Implement frameworks like LangChain, Anthropic's Constitutional AI, OpenAI, Hugging Face, and Prompt Engineering techniques to build robust and scalable AI applications.
● Evaluate and analyze RAG solutions and utilise the best-in-class LLMs to define customer experience solutions (fine-tune Large Language Models (LLMs)).
● Architect and develop advanced generative AI solutions leveraging state-of-the-art language models (LLMs) such as GPT, LLaMA, PaLM, BLOOM, and others.
● Strong understanding of, and experience with, open-source multimodal LLM models to customize and create solutions.
● Explore and implement cutting-edge techniques like Few-Shot Learning, Reinforcement Learning, Multi-Task Learning, and Transfer Learning for AI model training and fine-tuning.
● Proficiency in data preprocessing, feature engineering, and data visualization using tools like Pandas, NumPy, and Matplotlib (a minimal preprocessing sketch follows this listing).
● Optimize model performance through experimentation, hyperparameter tuning, and advanced optimization techniques.
● Proficiency in Python with the ability to get hands-on with coding at a deep level.
● Develop and maintain APIs using Python's FastAPI, Flask, or Django for integrating AI capabilities into various systems.
● Ability to write optimized and high-performing scripts on relational databases (e.g., MySQL, PostgreSQL) or non-relational databases (e.g., MongoDB or Cassandra).
● Enthusiasm for continuous learning and professional development in AI and related technologies.
● Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
● Knowledge of cloud services like AWS, Google Cloud, or Azure.
● Proficiency with version control systems, especially Git.
● Familiarity with data pre-processing techniques and pipeline development for AI model training.
● Experience with deploying models using Docker, Kubernetes.
● Experience with AWS Bedrock and SageMaker is a plus.
● Strong problem-solving skills with the ability to translate complex business problems into AI solutions.
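As a purely illustrative aside (not part of the posting), a small pandas preprocessing and feature-engineering sketch; the DataFrame contents are invented:

```python
# Minimal sketch: impute missing values, one-hot encode, and derive a feature.
import pandas as pd

df = pd.DataFrame({
    "age": [34, None, 45, 29],
    "plan": ["basic", "pro", "pro", None],
    "monthly_spend": [20.0, 55.5, 60.0, 18.0],
})

df["age"] = df["age"].fillna(df["age"].median())
df["plan"] = df["plan"].fillna("unknown")

features = pd.get_dummies(df, columns=["plan"])
features["spend_per_year"] = features["monthly_spend"] * 12
print(features)
```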
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Role - Backend Developer
Experience - 3-7 yrs
Location - Bangalore

● Bachelor's/Master's in Computer Science from a reputed institute/university
● 3-7 years of strong experience in building Java/Golang/Python based server-side solutions
● Strong in data structures, algorithms and software design
● Experience in designing and building RESTful microservices
● Experience with server-side frameworks such as JPA (Hibernate/Spring Data), Spring, vertex, Spring Boot, Redis, Kafka, Lucene/Solr/ElasticSearch, etc.
● Experience in data modeling and design, database query tuning
● Experience in MySQL and strong understanding of relational databases
● Comfortable with agile, iterative development practices
● Excellent communication (verbal & written), interpersonal and leadership skills
● Previous experience as part of a start-up or a product company
● Experience with AWS technologies would be a plus
● Experience with reactive programming frameworks would be a plus
● Contributions to open source are a plus
● Familiarity with deployment architecture principles and prior experience with container orchestration platforms, particularly Kubernetes, would be a significant advantage
Posted 1 week ago
3.0 years
0 Lacs
Jaipur, Rajasthan, India
On-site
Job Title: Machine Learning (ML) Engineer
📍 Location: Jaipur (On-site)
🕒 Experience Required: 3 to 5 Years
🚀 Availability: Immediate Joiner
💼 Employment Type: Full-time

Job Summary:
We are seeking an experienced and motivated Machine Learning Engineer with 4+ years of experience in designing, developing, and deploying machine learning models. The ideal candidate must be well-versed in Python, SQL/MySQL, ML libraries, and cloud deployment using Google Cloud Platform (GCP). This role is ideal for candidates passionate about solving real-world problems using AI and ML technologies.

Key Responsibilities:
- Build, train, evaluate, and deploy scalable machine learning models.
- Perform data preprocessing, feature engineering, and pipeline development.
- Design end-to-end ML workflows from model development to deployment on GCP.
- Leverage GCP tools such as Vertex AI, BigQuery, Cloud Functions, AI Platform, etc.
- Work with SQL/MySQL databases for data extraction and manipulation.
- Collaborate with cross-functional teams including data engineers, software developers, and business stakeholders.
- Monitor and optimize deployed models for performance and accuracy.
- Ensure model reproducibility, versioning, and documentation.

Required Skills:
- Strong proficiency in Python and popular ML libraries (scikit-learn, TensorFlow, Keras, PyTorch, XGBoost, Pandas, NumPy).
- Experience with SQL and MySQL.
- Solid understanding of ML algorithms and data handling techniques.
- Hands-on experience with GCP services: Vertex AI, BigQuery, Cloud Storage, AI Platform, Cloud Functions.
- Experience in model deployment using Flask, FastAPI, or similar frameworks.
- Proficient in data visualization using Matplotlib, Seaborn, or similar tools.
- Familiarity with Git, CI/CD, and Agile practices.

Preferred Qualifications:
- GCP Certification (e.g., Professional Machine Learning Engineer or Data Engineer).
- Experience with MLOps tools.
- Exposure to Docker and Kubernetes.
- Familiarity with Big Data tools like Apache Spark or Hadoop.

Perks & Benefits:
- 5-day working week
- Competitive salary and performance-based incentives
- Learning & development opportunities
- Dynamic and growth-driven work environment

How to Apply:
Send your updated resume to shubham@vidhema.com
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
JD
Looking for a very strong Java expert: a hands-on resource who can lead and interact with the client directly, understand the requirements, and contribute to technical discussions in addition to development work. Should have very good communication skills.
Must-Have Skills: Java, Spring Boot, Microservices, Reactive programming, Vertex
Qualifications: Bachelor's or Master's degree in Computer Science, Computer Engineering, or a related technical discipline. Ability to work independently and to adapt to a fast-changing environment. Creative, self-disciplined, and capable of identifying and completing critical tasks independently and with a sense of urgency.
Driving Results: A strong individual contributor and a good team player. Flexible attitude towards work, as per the needs. Proactively identifies and communicates issues and risks.
Other Personal Characteristics: Dynamic, engaging, self-reliant developer. Ability to deal with ambiguity. Brings a collaborative and analytical approach. Self-confident and humble. Open to continuous learning. Intelligent, rigorous thinker who can operate successfully amongst bright people.
Role Responsibilities: [Add detailed role responsibilities here]
Skills: Java, Spring Boot, Microservices, Reactive programming, Vertex, Communication
Posted 1 week ago
0 years
0 Lacs
Bhubaneshwar, Odisha, India
On-site
JD
Looking for a very strong Java expert: a hands-on resource who can lead and interact with the client directly, understand the requirements, and contribute to technical discussions in addition to development work. Should have very good communication skills.
Must-Have Skills: Java, Spring Boot, Microservices, Reactive programming, Vertex
Qualifications: Bachelor's or Master's degree in Computer Science, Computer Engineering, or a related technical discipline. Ability to work independently and to adapt to a fast-changing environment. Creative, self-disciplined, and capable of identifying and completing critical tasks independently and with a sense of urgency.
Driving Results: A strong individual contributor and a good team player. Flexible attitude towards work, as per the needs. Proactively identifies and communicates issues and risks.
Other Personal Characteristics: Dynamic, engaging, self-reliant developer. Ability to deal with ambiguity. Brings a collaborative and analytical approach. Self-confident and humble. Open to continuous learning. Intelligent, rigorous thinker who can operate successfully amongst bright people.
Role Responsibilities: [Add detailed role responsibilities here]
Skills: Java, Spring Boot, Microservices, Reactive programming, Vertex, Communication
Posted 1 week ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
JD
Looking for a very strong Java expert: a hands-on resource who can lead and interact with the client directly, understand the requirements, and contribute to technical discussions in addition to development work. Should have very good communication skills.
Must-Have Skills: Java, Spring Boot, Microservices, Reactive programming, Vertex
Qualifications: Bachelor's or Master's degree in Computer Science, Computer Engineering, or a related technical discipline. Ability to work independently and to adapt to a fast-changing environment. Creative, self-disciplined, and capable of identifying and completing critical tasks independently and with a sense of urgency.
Driving Results: A strong individual contributor and a good team player. Flexible attitude towards work, as per the needs. Proactively identifies and communicates issues and risks.
Other Personal Characteristics: Dynamic, engaging, self-reliant developer. Ability to deal with ambiguity. Brings a collaborative and analytical approach. Self-confident and humble. Open to continuous learning. Intelligent, rigorous thinker who can operate successfully amongst bright people.
Role Responsibilities: [Add detailed role responsibilities here]
Skills: Java, Spring Boot, Microservices, Reactive programming, Vertex, Communication
Posted 1 week ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
JD
Looking for a very strong Java expert: a hands-on resource who can lead and interact with the client directly, understand the requirements, and contribute to technical discussions in addition to development work. Should have very good communication skills.
Must-Have Skills: Java, Spring Boot, Microservices, Reactive programming, Vertex
Qualifications: Bachelor's or Master's degree in Computer Science, Computer Engineering, or a related technical discipline. Ability to work independently and to adapt to a fast-changing environment. Creative, self-disciplined, and capable of identifying and completing critical tasks independently and with a sense of urgency.
Driving Results: A strong individual contributor and a good team player. Flexible attitude towards work, as per the needs. Proactively identifies and communicates issues and risks.
Other Personal Characteristics: Dynamic, engaging, self-reliant developer. Ability to deal with ambiguity. Brings a collaborative and analytical approach. Self-confident and humble. Open to continuous learning. Intelligent, rigorous thinker who can operate successfully amongst bright people.
Role Responsibilities: [Add detailed role responsibilities here]
Skills: Java, Spring Boot, Microservices, Reactive programming, Vertex, Communication
Posted 1 week ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
JD
Looking for a very strong Java expert: a hands-on resource who can lead and interact with the client directly, understand the requirements, and contribute to technical discussions in addition to development work. Should have very good communication skills.
Must-Have Skills: Java, Spring Boot, Microservices, Reactive programming, Vertex
Qualifications: Bachelor's or Master's degree in Computer Science, Computer Engineering, or a related technical discipline. Ability to work independently and to adapt to a fast-changing environment. Creative, self-disciplined, and capable of identifying and completing critical tasks independently and with a sense of urgency.
Driving Results: A strong individual contributor and a good team player. Flexible attitude towards work, as per the needs. Proactively identifies and communicates issues and risks.
Other Personal Characteristics: Dynamic, engaging, self-reliant developer. Ability to deal with ambiguity. Brings a collaborative and analytical approach. Self-confident and humble. Open to continuous learning. Intelligent, rigorous thinker who can operate successfully amongst bright people.
Role Responsibilities: [Add detailed role responsibilities here]
Skills: Java, Spring Boot, Microservices, Reactive programming, Vertex, Communication
Posted 1 week ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Details As an AI-First AI/ML Engineer, you'll be architecting and deploying intelligent systems that leverage cutting-edge AI technologies including LangChain orchestration, autonomous AI agents, and robust AWS cloud infrastructure. We are seeking expertise in modern AI/ML frameworks, agentic systems, and scalable backend development using Node.js and Python. Your AI-powered engineering approach will create sophisticated machine learning solutions that drive autonomous decision-making and solve complex business challenges at enterprise scale. About You You are an AI/ML specialist who has fully embraced AI-first development methodologies, using advanced AI tools (e.g., Copilot, ChatGPT, Claude, CodeLlama) to accelerate your machine learning workflows. You're equally comfortable building LangChain orchestration pipelines, deploying Hugging Face models, developing autonomous AI agents, and architecting scalable AWS backend systems using Node.js and Python. You move FAST - capable of shipping complete, production-ready features within 1 week cycles. You are a proactive person and a go-getter, willing to go the extra mile. You understand that modern AI engineering means creating intelligent systems that can reason, learn, and act autonomously while maintaining reliability and performance. You thrive using TDD methods, MLOps practices, and Agile methodologies while focusing on finding elegant solutions to complex AI challenges. This is a hybrid role where you'll be spending your time across 4 core functions: Internal Projects (25%) - Building and maintaining OneSeven's internal AI tools and platforms Sales Engineering (25%) - Supporting sales team with technical demos, proof-of-concepts, and client presentations AI-First Engineering and Innovation Sprints (25%) - Rapid prototyping and innovation on cutting-edge AI technologies Forward Deployed Engineering (25%) - Working directly with clients on-site or embedded in their teams to deliver solutions Qualifications Technical Requirements Core AI/ML Skills 4+ years AI/ML development experience with production deployment Fluent English required - strong written and verbal communication skills for direct client interaction Reliable workspace/internet - willing to work extra hours FAST execution mindset - must be able to ship complete features within 1 week Strong system architecture experience - designing scalable, distributed AI/ML systems Expert-level LangChain experience for AI orchestration and workflow management Hugging Face experience - transformers, model integration, and deployment Extensive AI Agent development with LangChain or Google Vertex AI Heavy AWS cloud experience, particularly with Bedrock, SageMaker, and AI/ML services Backend generalist comfortable with Node.js and Python for AI service development Agile methodologies experience, startup environment passion Independent problem-solver, team player willing to work extra hours AI Agent & LangChain Expertise (Required) LangChain framework mastery for complex AI workflow orchestration Hugging Face integration - transformers, model deployment, and API integration AI Agent architecture design with LangChain or Google Vertex AI Prompt engineering and chain-of-thought optimization Vector databases and embedding systems (Pinecone, Pgvector, Chroma) RAG pipeline development and optimization LLM integration across multiple providers (OpenAI, Anthropic, AWS Bedrock, Hugging Face) Agentic system design with memory, planning, and execution capabilities Backend & Cloud Infrastructure Heavy AWS 
Cloud services experience (Lambda, API Gateway, S3, RDS, SageMaker, Bedrock) System architecture design for high-scale, distributed AI/ML applications Microservices architecture and design patterns for AI systems at scale Node.js and Python backend development for AI service APIs RESTful API design and GraphQL for AI service integration Database design and management for AI data workflows Modern JavaScript/TypeScript and Python async programming MLOps & Integration CI/CD pipelines and GitHub Actions for ML model deployment Model versioning, monitoring, and automated retraining workflows Container orchestration (Docker, Kubernetes) for AI services Performance optimization for high-throughput AI systems Modern authentication and secure API design for AI endpoints API security implementation (XSS, CSRF protection) Bonus Qualifications Advanced Hugging Face experience (fine-tuning, custom models, optimization) Multi-modal AI experience (vision, audio, text processing) Advanced prompt engineering and fine-tuning experience DevOps and infrastructure as code (Terraform, CloudFormation) Database optimization for vector search and AI workloads Additional cloud platforms (Azure AI, Google Vertex AI) Knowledge graph integration and semantic reasoning Project Deliverables You'll be working on building a comprehensive AI-powered business intelligence system with autonomous agent capabilities. Key deliverables include: Core AI Agent Platform Multi-agent orchestration system with LangChain workflow management Autonomous reasoning agents with tool integration and decision-making capabilities Intelligent document processing pipeline with advanced OCR and classification Real-time AI analysis dashboard with predictive insights and recommendations Advanced AI Workflows RAG-powered knowledge synthesis with multi-source data integration Automated business process agents with approval workflows and notifications AI-driven anomaly detection with proactive alerting and response systems Intelligent API orchestration with dynamic routing and load balancing Comprehensive agent performance monitoring with usage analytics and optimization insights Integration & Deployment Systems Scalable AWS backend infrastructure with auto-scaling AI services Production MLOps pipeline with automated model deployment and monitoring Multi-tenant AI service architecture with usage tracking and billing integration Real-time AI API gateway with rate limiting and authentication Benefits/Compensation Fully Remote, Contract-based with U.S. company $4,000/mo - $8,000/mo depending on experience and project duration Company-paid PTO plan, international team of 15+ To Apply SEND YOUR RESUME IN ENGLISH, please. Include the URL of your LinkedIn profile. Include Website references, GitHub repositories, and any other online references that would highlight your prior work for the qualifications described in this role. ⚠️ AUTOMATIC DISQUALIFICATION: You will be automatically disqualified if your resume is not in English or you don't include your LinkedIn profile URL. About OneSeven Tech OneSeven Tech is a premier digital product studio serving both high-growth startups and established enterprises. We've partnered with startup clients who have collectively raised over $100M in Venture Capital, while our enterprise portfolio includes 2000+ person hospitality groups and publicly traded NASDAQ companies. Our passion lies in crafting exceptional AI-powered digital products that drive real business success. 
Joining OneSeven means working alongside a skillful team of consultants where you'll sharpen your AI/ML expertise, expand your capabilities, and contribute to cutting-edge solutions for industry-leading clients. OST's headquarters is in Miami, Florida, but our employees work remotely worldwide. Our 3 main locations are Miami, Mexico City, Mexico, and Buenos Aires, Argentina.
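The RAG pipeline development this posting calls for can be illustrated with a stripped-down sketch. This is not the company's stack: it substitutes TF-IDF retrieval (scikit-learn) for a vector database such as Pinecone or Chroma and stubs out the LLM call, purely to show the retrieve-augment-generate flow; the sample documents and function names are assumptions.

```python
# Minimal retrieval-augmented generation (RAG) sketch. Uses TF-IDF retrieval
# and a stubbed LLM call to show the retrieve -> augment -> generate shape.
# All document text and the generate() stub are illustrative assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# A tiny in-memory "knowledge base" standing in for a vector store.
documents = [
    "Invoices over $10,000 require two approvals before payment.",
    "Employees accrue 1.5 vacation days per month of service.",
    "Production ML models are retrained weekly from the feature store.",
]

vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(documents)

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the query."""
    query_vec = vectorizer.transform([query])
    scores = cosine_similarity(query_vec, doc_matrix)[0]
    top = scores.argsort()[::-1][:k]
    return [documents[i] for i in top]

def generate(prompt: str) -> str:
    """Stub for an LLM call (e.g. OpenAI, Bedrock, or a Hugging Face model)."""
    return f"[LLM would answer here, given a prompt of {len(prompt)} chars]"

def answer(question: str) -> str:
    """Retrieve context, build an augmented prompt, and generate an answer."""
    context = "\n".join(retrieve(question))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    return generate(prompt)

if __name__ == "__main__":
    print(answer("How often are ML models retrained?"))
```

In production the retrieval step would query an embedding index and generate() would call a hosted model, but the orchestration shape (retrieve, assemble context, prompt, respond) is the same one frameworks like LangChain manage.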
Posted 1 week ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: SAP Integration with Vertex O Series/Sabrix, SAP FI CO Finance, tax regimes including Sales & Use, VAT, GST, HST, and solid experience in corporate taxation
Good-to-have skills: No function specialty
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing project progress, coordinating with teams, and ensuring successful application development. We are seeking a Senior Tax Technology Specialist to join our team. This role requires a seasoned professional with extensive experience in tax engines, indirect tax management (particularly Vertex O Series), and a strong foundation in SAP systems. The ideal candidate will manage complex tax projects across Sales and Use Tax, helping to streamline tax processes and maintain global compliance.
Roles & Responsibilities: Implementing new requirements and maintaining the Vertex O Series system, focusing on the global indirect tax solution. Configuring and managing Vertex tax rules, rates, and jurisdictions to ensure precise and compliant tax calculations for all transactions. Supporting mapping updates to tax matrices and conducting end-to-end testing to ensure no regression impacts across jurisdictions (US and OUS). Collaborating with IT and finance teams to align tax systems with business needs and compliance requirements. Developing and maintaining detailed documentation, including SOPs and user guides, for Vertex-related processes.
Professional & Technical Skills: Must-have skills: Proficiency in SAP Integration with Vertex O Series/Sabrix and SAP FI CO Finance; strong understanding of SAP FI CO Finance; extensive experience with Vertex O Series and familiarity with SAP tax-related solutions; strong knowledge of tax regimes, including Sales & Use, VAT, GST, HST, and corporate tax; excellent analytical skills and keen attention to detail. Good-to-have skills: Experience in BRIM/FICA modules and DRC is beneficial but not mandatory; experience in SAP ABAP development, SAP PI/PO, and SAP SD/MM modules.
Additional Information: The candidate should have a minimum of 8+ years of experience in SAP Integration with Vertex O Series/Sabrix. This position is based at our Bengaluru office. 15 years of full-time education is required.
Qualifications: 15 years of full-time education
Posted 1 week ago
3.0 years
0 Lacs
Delhi, India
Remote
Job Title: Freelance AI Trainer (Corporate Training – Non-Technical Audience) Company: People Pro Consulting Location: Remote / Hybrid (India) Engagement: Freelance – Fees on a per-session basis About Us: People Pro Consulting is dedicated to empowering non-technical professionals (HR, Finance, Service, Marketing, etc.) to embrace and leverage Artificial Intelligence in their daily workflows. We believe AI can be both accessible and fun, and our goal is to spark curiosity first and drive real workplace adoption in a second phase. What You’ll Do Design & Deliver Interactive Sessions • Create engaging, hands-on workshops that demonstrate AI’s potential in everyday tasks (e.g., automating reports, improving decision-making, enhancing customer interactions). • Use real-life examples tailored to HR, Finance, Service, and Marketing professionals—no heavy coding or jargon. • Incorporate gamification, live demos, and simple “AI hacks” so participants experience quick wins. Impress & Inspire • Kick off with “wow” moments (chatbots that generate emails, basic predictive analytics in spreadsheets, AI-driven design tools) to capture attention. • Showcase how AI tools—like language models, low-code RPA, and intelligent assistants—can streamline tasks without requiring technical expertise. Facilitate Workplace Adoption • In Phase 1 (Awareness): Build excitement and confidence around AI by demonstrating practical use cases. • In Phase 2 (Integration): Help participants plan small pilot projects—e.g., automating a routine HR report or generating an AI-powered marketing draft—and guide them through executing those pilots in their respective teams. Measure & Follow Up • Provide simple frameworks to measure ROI (time saved, error reduction, improved engagement). • Offer “office hour” support calls between sessions so learners can troubleshoot as they try out AI in real scenarios. Who You Are AI Expertise with a Teaching Mindset • Proven experience running AI workshops or corporate sessions for non-technical audiences (HR, Finance, Marketing, etc.). • Hands-on familiarity with user-friendly AI tools (ChatGPT, no-code ML platforms, intelligent Excel add-ins, etc.). Engaging & Relatable Communicator • Ability to break down complex concepts into simple, relatable examples. • Comfortable using humor, live demos, and interactive exercises to keep energy high. Practical & Outcome-Oriented • Focus on actionable takeaways: participants should feel confident launching their own AI pilot by the end of the program. • Experience designing multi-phase training roadmaps that transition learners from “curious” to “competent.” Flexible & Collaborative • Willingness to tailor content to specific client needs (e.g., HR policy automation vs. marketing content generation). • Open to co-creating session outlines with our internal learning team and iterating based on feedback. Minimum Qualifications 3+ years delivering AI training or workshops, ideally in a corporate setting. Demonstrated success teaching non-technical learners (testimonials or case studies preferred). Solid knowledge of at least two mainstream AI tools/platforms (e.g., OpenAI ChatGPT, Microsoft Copilot, Google Vertex AI, UiPath Action Center). Strong presentation skills, backed by examples of engaging slide decks, demo videos, or participant feedback summaries. Compensation & Logistics Fees: Competitive, paid per completed session (negotiable based on experience and client scope). 
Session Duration: Typically 2–3 hours each (can vary from 60-minute "intro" workshops to half-day hands-on labs). Schedule: Flexible; sessions may be booked Monday through Friday between 10 AM and 6 PM IST. Materials: You are responsible for creating and sharing all slide decks, exercise files, and follow-up handouts. How to Apply: Submit Your Resume & Portfolio • Apply here at LinkedIn • Subject Line: "Freelance AI Trainer Application – [Your Name]". Provide a 2–3 minute Video Demo Link (YouTube/Google Drive, unlisted) showcasing a short AI exercise you've led for non-technical learners. Include a Brief Proposal (max 200 words) outlining: • One icebreaker or "wow" moment you'd use to introduce AI to a room of HR/Finance professionals. • One hands-on exercise you'd assign to ensure participants leave excited to implement AI. Note: Shortlisted candidates will be invited for a brief discovery call to discuss scope, timeline, and session logistics. Follow @PeopleProConsulting for more corporate training opportunities.
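One way to stage the "email-drafting chatbot" wow moment this posting mentions is a short live demo. The sketch below is only an example of such a demo: it assumes the openai Python SDK (v1+) is installed and an OPENAI_API_KEY environment variable is set, and the model name is a placeholder to substitute with whatever is available.

```python
# Illustrative "wow moment" demo: draft an HR announcement email with an LLM.
# Assumes the openai SDK (v1+) and an OPENAI_API_KEY environment variable;
# the model name is only an example.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def draft_email(topic: str, tone: str = "friendly and concise") -> str:
    """Ask the model for a short internal email on the given topic."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # example model; substitute any available model
        messages=[
            {"role": "system", "content": f"You write {tone} internal HR emails."},
            {"role": "user", "content": f"Draft a short email announcing: {topic}"},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(draft_email("the office will move to a 5-day in-person week from July"))
```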
Posted 1 week ago
India has seen a rise in demand for professionals with expertise in Vertex, a cloud-based tax technology solution. Companies across various industries are actively seeking individuals with skills in Vertex to manage their tax compliance processes efficiently. If you are a job seeker looking to explore opportunities in this field, read on to learn more about the Vertex job market in India.
The salary range for Vertex professionals in India varies based on experience levels. Entry-level professionals can expect to earn around INR 4-6 lakhs per annum, while experienced professionals with several years in the industry can earn upwards of INR 12-15 lakhs per annum.
In the Vertex domain, a typical career progression path may include roles such as Tax Analyst, Tax Consultant, Tax Manager, and Tax Director. Professionals may advance from Junior Tax Analyst to Senior Tax Analyst, and eventually take on leadership roles as Tax Managers or Directors.
Alongside expertise in Vertex, professionals in this field are often expected to have skills in tax compliance, tax regulations, accounting principles, and data analysis. Knowledge of ERP systems and experience in tax software implementation can also be beneficial.
As you explore job opportunities in the Vertex domain in India, remember to showcase your expertise, skills, and experience confidently during interviews. Prepare thoroughly for technical questions and demonstrate your understanding of tax compliance processes. With dedication and continuous learning, you can build a successful career in Vertex roles. Good luck!