About us
Discover Dollar's mission is to help enterprises create value by harnessing vast amounts of unstructured data - emails, chats, PDFs, contract copies, and more - through a data-first approach built on data science, AI, and cutting-edge technologies.

Why Discover Dollar:
1. We are one of the most innovative product companies in the finance space.
2. We have a magnificent work culture and are certified as a Great Place to Work.
3. We offer distinguished career paths in both tech and non-tech domains.

Job Description
1. The primary role of a Financial Analysis Intern is to audit standard reports to identify duplicate payments, overpayments, and underpayments for the client.
2. Analyze agreements, invoices, purchase orders, and other documentation to explore possible missed opportunities and improve processes.
3. Use audit tools effectively, with increasing proficiency across all systems and applications, including internal and client tools.
4. Identify new claim opportunities for different clients.
5. Work on new initiatives and participate in knowledge transfer (KT) sessions organized within the company.

Requirements
- Bachelor's degree in commerce - B.Com/BBA (candidates with a Master's degree are not preferred for this internship)
- 0-1 years of experience
- Good verbal and written communication
- Strong analytical skills
- Positive, can-do attitude

Note
- Stipend: 12,000/- per month, subject to 10% TDS
- Work location: Bangalore
- Work mode: Hybrid
- Duration of internship: 6 months
- Interview process: Aptitude round, group discussion, HR interview, technical interview
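The core audit task above - flagging duplicate payments in standard reports - can be sketched in a few lines of Python. This is an illustrative example only, not Discover Dollar's actual tooling; the field names and the matching rule (same vendor, invoice number, and amount) are assumptions.

```python
from collections import defaultdict

def find_duplicate_payments(payments):
    """Group payments by (vendor, invoice_no, amount) and flag any
    key that appears more than once as a potential duplicate."""
    groups = defaultdict(list)
    for p in payments:
        key = (p["vendor"], p["invoice_no"], p["amount"])
        groups[key].append(p)
    return [records for records in groups.values() if len(records) > 1]

# Hypothetical sample data for illustration.
payments = [
    {"vendor": "Acme", "invoice_no": "INV-001", "amount": 500.00},
    {"vendor": "Acme", "invoice_no": "INV-001", "amount": 500.00},  # duplicate
    {"vendor": "Beta", "invoice_no": "INV-777", "amount": 120.50},
]
duplicates = find_duplicate_payments(payments)
```

In practice the matching rule is fuzzier (near-identical amounts, transposed invoice numbers), but the group-and-count pattern is the same.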
About the Role
We are looking for a passionate AI Engineer Intern with a solid foundation in Python and Object-Oriented Programming (OOP), and an interest in Generative AI (GenAI). The ideal candidate should also have exposure to Data Science concepts and a willingness to work on real-world AI/ML projects under the mentorship of our engineering team.

Key Responsibilities
- Assist in designing, developing, and deploying AI-driven solutions.
- Work with Generative AI models on tasks such as text generation and summarization.
- Apply Data Science techniques for data preprocessing, exploratory analysis, and model evaluation.
- Collaborate with the team to write clean, maintainable, and scalable Python code.
- Research and experiment with the latest AI and GenAI advancements.

Must-Have Skills
- Strong knowledge of Python programming.
- Good understanding of Object-Oriented Programming (OOP) concepts.
- Exposure to Generative AI (GenAI) models (e.g., GPT, LLaMA, or similar).
- Understanding of prompt engineering, tool calling, and shot-based prompting (zero-shot, one-shot, few-shot).
- Basic understanding of Data Science principles (data handling, feature engineering, model building).
- Familiarity with libraries such as NumPy, Pandas, and Scikit-learn.
- Problem-solving mindset with an eagerness to learn and experiment.
- Good communication and collaboration skills.

Nice to Have (Optional)
- Familiarity with libraries such as PyTorch or TensorFlow.
- Experience fine-tuning LLMs.
- Hands-on experience with AI projects (personal, academic, or hackathons).
- Knowledge of software development best practices (version control, testing).
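Shot-based prompting, listed in the must-have skills, simply means including zero, one, or several worked examples in the prompt before the actual query. A minimal sketch of assembling such a prompt (plain string construction; the task wording and example texts are made up for illustration):

```python
def build_prompt(task, examples, query):
    """Assemble a shot-based prompt: zero-shot if `examples` is empty,
    one-shot with a single example, few-shot with several."""
    parts = [task]
    for inp, out in examples:
        parts.append(f"Input: {inp}\nOutput: {out}")
    # The final block leaves "Output:" blank for the model to complete.
    parts.append(f"Input: {query}\nOutput:")
    return "\n\n".join(parts)

# Few-shot: a task description, two worked examples, then the real query.
examples = [
    ("The invoice total is $500.", "neutral"),
    ("The payment bounced twice.", "negative"),
]
prompt = build_prompt("Classify the sentiment of each sentence.",
                      examples,
                      "The refund arrived early.")
```

Passing an empty `examples` list gives a zero-shot prompt; one pair gives one-shot. The same scaffold works regardless of which model API ultimately receives the string.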
About the Role
We are looking for a highly skilled Analytics/AI Engineer to bridge the gap between data engineering and data analysis. You will play a critical role in building scalable data pipelines, transforming raw data into actionable insights, and enabling data-driven decision-making across the organization.

Key Responsibilities
- Design and implement robust, scalable data pipelines using PySpark, Python, Polars, and GenAI.
- Develop data models and transformation logic to support analytics and business intelligence needs.
- Leverage Python and Object-Oriented Programming (OOP) principles to build reusable and maintainable data tools and workflows.
- Utilize Databricks and cloud-based platforms to process and manage large datasets efficiently.
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver clean, trusted data.
- Ensure high data quality and consistency across different systems and reports.

Must-Have Skills
- Strong knowledge of Python programming.
- Good understanding of Object-Oriented Programming (OOP) concepts.
- Advanced SQL skills for data manipulation and querying.
- Experience with PySpark and Polars for distributed and in-memory data processing.
- Ability to work independently in a fast-paced, agile environment.
- Problem-solving mindset with an eagerness to learn and experiment.
- Good communication and collaboration skills.

Nice to Have
- Understanding of data architecture, ETL/ELT processes, and data modelling.
- Knowledge of Databricks and the Azure cloud environment.
- Understanding of data structures and algorithms.
- Familiarity with coding best practices, CI/CD, and version control (Git).
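The "reusable and maintainable data tools" responsibility above is essentially about composing small, testable transformation steps. A hedged sketch of that idea in plain Python - deliberately avoiding PySpark/Polars so it runs anywhere; the class and step names (`Step`, `DropNulls`, `AddTax`) are invented for illustration:

```python
class Step:
    """Base class for a reusable pipeline step; subclasses implement apply()."""
    def apply(self, rows):
        raise NotImplementedError

class DropNulls(Step):
    """Remove rows where the given column is missing."""
    def __init__(self, column):
        self.column = column
    def apply(self, rows):
        return [r for r in rows if r.get(self.column) is not None]

class AddTax(Step):
    """Add a 'total' column: amount plus tax at the given rate."""
    def __init__(self, rate):
        self.rate = rate
    def apply(self, rows):
        return [{**r, "total": round(r["amount"] * (1 + self.rate), 2)}
                for r in rows]

class Pipeline:
    """Run a list of steps in order, feeding each step's output to the next."""
    def __init__(self, steps):
        self.steps = steps
    def run(self, rows):
        for step in self.steps:
            rows = step.apply(rows)
        return rows

pipeline = Pipeline([DropNulls("amount"), AddTax(0.18)])
result = pipeline.run([{"amount": 100.0}, {"amount": None}])
```

With PySpark or Polars the rows would be DataFrames rather than lists of dicts, but the OOP structure - small steps with a uniform interface, composed by a pipeline object - is the transferable part.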
Why this role exists
We're scaling our AI product suite - invoice, contract, and terms extraction; anomaly and overpayment detection; and analyst workflows for both internal and client users. We need a product leader who can turn ambiguous customer pain into shipped, secure, and measurable AI features that enterprise customers love and finance leaders trust.

What you'll own

Strategy & Outcomes
- Define and own the AI product strategy and measurable outcomes for recovery, accuracy, and time-to-value across enterprise accounts.
- Translate market and customer signals (Procurement, AP, Audit, InfoSec, Legal) into a prioritized roadmap with clear trade-offs.

Product Discovery
- Run deep discovery with Fortune 1000 stakeholders and Discover Dollar analysts to map jobs-to-be-done, decision points, and data realities (SAP/Oracle/Coupa/Ariba exports, data lake and data warehouse extracts, contracts, invoices, emails, credit notes).
- Validate problems and solutions with prototypes (AI tools such as Lovable, Replit, Figma, and low-code tools) and pilot metrics before committing engineering.

Product Delivery
- Lead cross-functional squads (ML/DS, Backend, Frontend, Design, Data Platform, Security) to ship high-quality increments on a predictable cadence.
- Write crisp PRDs and acceptance criteria, and instrument the product for analytics/telemetry (events, funnels, precision/recall dashboards).
- Drive integrations and partner workflows (e.g., SAP S/4HANA, Oracle, Coupa/Ariba, RMS) with clear APIs and data contracts.

AI/ML Product Craft
- Shape AI features such as document extraction (OCR + NLP), contract terms understanding, complex agentic AI workflows, LLM/RAG assistants for analysts, anomaly/risk scoring, and human-in-the-loop review.
- Partner with DS/ML on evaluation frameworks (offline/online), data labeling strategy, prompt & model versioning, and cost/performance tuning (latency, tokens, GPU/CPU budgets).
- Establish monitoring/guardrails: drift, bias, hallucination containment, red-teaming, feedback loops.

Governance, Privacy & Security
- Embed privacy-by-design and enterprise controls (role-based access, audit logs, retention, encryption).
- Coordinate with customers' InfoSec/Legal teams on DPAs, SOC 2/ISO 27001 alignment, data residency, and model risk documentation (model cards, DPIAs where needed).

Customer & GTM
- Own product pilots, success criteria, and value realization (business cases, ROI).
- Enable Sales/CS with narratives, demos, pricing/packaging inputs, and win/loss insights.
- Collect structured feedback and convert it into backlog items and experiments.

What success looks like
- 30 days: Understand the domain and data flows; publish a product health baseline; ship a small improvement (e.g., analyst UX, cost reduction).
- 60 days: Deliver a validated AI feature MVP (e.g., improved clause extraction or anomaly ranking) to a pilot customer with a signed success plan.
- 90 days: Hit agreed accuracy and adoption targets; publish a 2-quarter roadmap tied to revenue and recovery dollars; formalize AI evaluation & monitoring dashboards.

Key KPIs/metrics you'll move
- Business impact: Verified recovery dollars, time-to-detect, time-to-close.
- Model quality: Precision/recall/F1 by use case; false positive rate; drift indicators.
- Efficiency: Cost per decision (tokens/compute), latency, analyst throughput.
- Adoption: Active users, task completion rate, retention, NPS/CSAT.
- Reliability & risk: Incident rate, hallucination guardrail triggers, SLA/SLO adherence.

Must-have qualifications
- 5+ years in product management, with at least 1 year shipping AI/ML-powered features in production (LLMs/RAG/NLP/vision/ML detection).
- Track record in B2B enterprise SaaS with complex data and security needs; comfortable working with Procurement/AP/Finance stakeholders.
- Strong discovery and storytelling skills: can turn messy datasets and ops realities into a clear product narrative and roadmap.
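The model-quality KPIs listed above (precision, recall, F1) reduce to simple counts of true positives, false positives, and false negatives. A minimal sketch, with made-up labels standing in for "item flagged as anomaly" (1) versus "not flagged" (0):

```python
def precision_recall_f1(y_true, y_pred):
    """Compute precision, recall, and F1 for binary labels (1 = flagged)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Hypothetical batch: 4 true anomalies; the model flags 5 items, 3 correctly.
p, r, f1 = precision_recall_f1([1, 1, 1, 1, 0, 0, 0, 0],
                               [1, 1, 1, 0, 1, 1, 0, 0])
```

The false positive rate mentioned alongside these KPIs matters in this domain because every false positive is analyst time spent reviewing a claim that does not exist; production dashboards would compute the same counts per use case, typically with scikit-learn rather than by hand.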
- Technical depth to partner with DS/ML/Engineering: understands evaluation metrics, A/B testing, data pipelines, API design, and the basics of SQL/Python.
- Analytics-first: a habit of instrumenting funnels, defining guardrails, and making decisions from dashboards and user studies.
- Excellent written communication (PRDs, RFCs, customer docs) and stakeholder management across Sales, CS, Legal, and InfoSec.
- Bachelor's in Engineering/CS/Math/Statistics, or equivalent experience.

Nice-to-have
- Domain exposure to Source-to-Pay, AP automation, recovery audit, contract analytics, or financial controls.
- Experience with SAP/Oracle/Workday/Coupa/Ariba data models and enterprise integrations.
- Familiarity with MLOps (MLflow/W&B), vector databases (FAISS/Pinecone), and eval frameworks for LLMs.
- Understanding of model risk management, secure AI patterns, and enterprise compliance expectations.
- Prior startup or zero-to-one product experience.

Our current tech context (helpful, not required)
- AI stack: LLMs (commercial & open-source - OpenAI, Llama, Claude), RAG pipelines, OCR + NLP for documents.
- Data stack: cloud object storage, relational DBs, event pipelines; analytics stack including Databricks and Elasticsearch.
- Cloud stack: primarily Azure, but we build a cloud-agnostic product deployable on AWS, GCP, etc.
- Dev stack: Python, JavaScript, React.
- Product tooling: ClickUp (Jira equivalent), Postman, Figma.

How we work
- Bangalore-based hybrid team with collaboration across India and North America.
- Pragmatic agile: weekly planning, monthly business reviews, quarterly product reviews tied to revenue and recovery outcomes.

What you'll get to do
- Ship AI that recovers real money for global enterprises (the current product has already saved over $1B; we aim to 10x that impact).
- Build analyst-in-the-loop systems that blend human judgment with AI scale.
- Influence pricing/packaging and help define the category for AI-assisted recovery audit.
Compensation & benefits
Competitive with the Bangalore market for Product Manager talent, plus benefits and performance incentives. (We'll share details during the first call.)
Role & responsibilities
As a Financial Analyst, you will:
- Audit standard reports to identify duplicate payments, overpayments, and underpayments for the client.
- Reconcile vendor statements with the client's Accounts Payable data to identify invoices left unprocessed in the client's system.
- Analyze agreements, invoices, purchase orders, and other supporting documents to identify missed opportunities and suggest process improvements.
- Use audit tools effectively and demonstrate increasing proficiency with both internal and client systems and applications.
- Identify and develop new claim opportunities across different clients.
- Contribute to new initiatives and actively participate in knowledge transfer (KT) sessions organized within the company.

Preferred candidate profile

Must-have skills:
- Good knowledge of accounts payable & receivable data, data interpretation, and data processing.
- Good understanding of the Procure-to-Pay process and accounting principles.
- Ability to identify variances or errors in procurement and payment processes to recover revenue.

Good-to-have skills:
- Good proficiency in basic computer applications and other tools; proficiency in Microsoft Excel & PowerPoint is desirable, and familiarity with data management tools/query languages (SQL) will be helpful.
- Ability to analyze and interpret data: a rational, logical thinker with strong attention to detail.
- Ability to show initiative, work independently, and deliver against deadlines.

Years of experience: Fresher, or up to 1 year of relevant experience
Education: Master's degree in finance, preferably M.Com or MBA (Finance), with B.Com/BBA as the bachelor's degree.
Location: Bangalore
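The vendor-statement reconciliation described above amounts to finding invoices on the vendor's statement that never made it into the client's AP system. An illustrative sketch only - the record shape and field names are assumptions, and real reconciliation must also handle partial payments and amount mismatches:

```python
def find_unprocessed_invoices(vendor_statement, ap_records):
    """Return vendor-statement lines whose invoice number is absent
    from the client's Accounts Payable data."""
    processed = {rec["invoice_no"] for rec in ap_records}
    return [line for line in vendor_statement
            if line["invoice_no"] not in processed]

# Hypothetical sample data: the vendor claims two invoices,
# but the client's AP system only shows one.
statement = [{"invoice_no": "INV-10", "amount": 250.0},
             {"invoice_no": "INV-11", "amount": 75.0}]
ap_data = [{"invoice_no": "INV-10", "amount": 250.0}]
missing = find_unprocessed_invoices(statement, ap_data)
```

Each missing line is a candidate claim: either an invoice the client still owes, or (in the overpayment direction) a credit note the client never took.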
1. The primary role of a Financial Analysis Intern is to audit standard reports to identify duplicate payments, overpayments, and underpayments for the client.
2. Analyze agreements, invoices, purchase orders, and other documentation to explore possible missed opportunities and improve processes.
3. Use audit tools effectively, with increasing proficiency across all systems and applications, including internal and client tools.
4. Identify new claim opportunities for different clients.
5. Work on new initiatives and participate in knowledge transfer (KT) sessions organized within the company.

Requirements
- Master's degree in commerce or finance; ideally Commerce/Finance in both UG and PG degrees.
- 0-1 years of experience
- Good verbal and written communication
- Strong analytical skills
- Positive, can-do attitude