Job Title: Data Analytics Solution Architect – Banking Domain
Location: India
Employment Type: Full-time
Experience Level: 10+ years

**Only candidates with proven experience in the banking domain will be considered.**
**Immediate joiners or candidates with a notice period of 15 days or less are strongly preferred.**

About the Role:
We are seeking a seasoned Data Analytics Solution Architect with deep experience in the banking and financial services industry to lead the design and implementation of cutting-edge data analytics solutions. The ideal candidate will bring a strong blend of data architecture expertise, business acumen in banking, and hands-on experience with modern analytics platforms and cloud technologies.

Key Responsibilities:
- Design end-to-end data and analytics solutions tailored to banking use cases such as credit risk analysis, customer 360, fraud detection, regulatory compliance, and portfolio management.
- Work closely with business stakeholders, data engineers, data scientists, and BI developers to translate business needs into scalable and secure data architectures.
- Define the data strategy and architecture blueprint, including data ingestion, storage, processing, and consumption layers.
- Champion the adoption of modern data platforms and tools (e.g., GCP, Azure Synapse, Snowflake, Databricks, Power BI, Tableau, Looker).
- Ensure data governance, quality, lineage, and security standards are embedded into solution design.
- Lead technical discussions with enterprise architects, solution architects, and delivery teams.
- Stay updated on banking industry trends, regulatory requirements (e.g., Basel, AML, KYC), and emerging technologies.
- Develop reference architectures, design patterns, and reusable components for analytics solutions.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field.
- 10+ years of experience in data architecture, data engineering, or analytics roles.
- 5+ years of experience specifically in the banking or financial services domain.
- Deep knowledge of SQL, Spark, Python, Git, ETL/ELT tools, and data modeling tools.
- Proven track record designing large-scale data platforms (on-premises or cloud).
- Strong knowledge of data modeling, ETL/ELT, data lakes, and data warehousing concepts.
- Hands-on experience with cloud data platforms (Azure, AWS, or GCP).
- Proficiency in SQL, Python/Scala, and modern data tools (e.g., Apache Spark, Kafka, Airflow).
- Familiarity with reporting and visualization tools such as Power BI, Tableau, or Looker.
- Knowledge of data privacy, security, and compliance in financial services.

Preferred Skills:
- Certification in a cloud or data platform (e.g., GCP, Azure, AWS, Snowflake, or Databricks).
- Understanding of machine learning pipelines and AI-driven analytics in banking.
- Experience with real-time data streaming and event-driven architectures.
- Familiarity with core banking systems and APIs.
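The ETL/ELT pattern this role calls for can be illustrated with a minimal sketch: land raw records first, then transform inside the store with SQL. This is purely illustrative and uses in-memory SQLite; a real banking platform would use Snowflake, BigQuery, or Azure Synapse, and the table and column names here are hypothetical.

```python
# Minimal ELT sketch: load raw data first, then transform in-warehouse with SQL.
# Illustrative only; table/column names are hypothetical, and SQLite stands in
# for a real warehouse such as Snowflake or BigQuery.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_txns (account TEXT, amount_paise INTEGER)")

# Load: land the raw data untouched (the "L" before the "T" in ELT).
conn.executemany(
    "INSERT INTO raw_txns VALUES (?, ?)",
    [("ACC1", 150000), ("ACC1", 250000), ("ACC2", 99900)],
)

# Transform: derive an analytics-ready consumption layer inside the store.
conn.execute("""
    CREATE VIEW account_totals AS
    SELECT account, SUM(amount_paise) / 100.0 AS total_rupees
    FROM raw_txns
    GROUP BY account
""")

rows = conn.execute(
    "SELECT account, total_rupees FROM account_totals ORDER BY account"
).fetchall()
print(rows)  # [('ACC1', 4000.0), ('ACC2', 999.0)]
```

The design point is that transformation happens after loading, inside the warehouse's SQL engine, which is what distinguishes ELT from classic ETL.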
AI / Generative AI Engineer
Location: Remote (Pan India)
Job Type: Full-time

NOTE: "Only immediate joiners or candidates with a notice period of 15 days or less will be considered."

Overview:
We are seeking a highly skilled and motivated AI/Generative AI Engineer to join our innovative team. The ideal candidate will have a strong background in designing, developing, and deploying artificial intelligence and machine learning models, with a specific focus on cutting-edge Generative AI technologies. This role requires hands-on experience with one or more major cloud platforms (Google Cloud Platform - GCP, Amazon Web Services - AWS) and/or modern data platforms (Databricks, Snowflake). You will be instrumental in building and scaling AI solutions that drive business value and transform user experiences.

Key Responsibilities:

Design and Development:
- Design, build, train, and deploy scalable and robust AI/ML models, including traditional machine learning algorithms and advanced Generative AI models (e.g., Large Language Models - LLMs, diffusion models).
- Develop and implement algorithms for tasks such as natural language processing (NLP), text generation, image synthesis, speech recognition, and forecasting.
- Work extensively with LLMs, including fine-tuning, prompt engineering, retrieval-augmented generation (RAG), and evaluating their performance.
- Develop and manage data pipelines for data ingestion, preprocessing, feature engineering, and model training, ensuring data quality and integrity.

Platform Expertise:
- Leverage cloud AI/ML services on GCP (e.g., Vertex AI, AutoML, BigQuery ML, Model Garden, Gemini), AWS (e.g., SageMaker, Bedrock, S3), Databricks, and/or Snowflake to build and deploy solutions.
- Architect and implement AI solutions ensuring scalability, reliability, security, and cost-effectiveness on the chosen platform(s).
- Optimize data storage, processing, and model serving components within the cloud or data platform ecosystem.
MLOps and Productionization:
- Implement MLOps best practices for model versioning, continuous integration/continuous deployment (CI/CD), monitoring, and lifecycle management.
- Deploy models into production environments and ensure their performance, scalability, and reliability.
- Monitor and optimize the performance of AI models in production, addressing issues related to accuracy, speed, and resource utilization.

Collaboration and Innovation:
- Collaborate closely with data scientists, software engineers, product managers, and business stakeholders to understand requirements, define solutions, and integrate AI capabilities into applications and workflows.
- Stay current with the latest advancements in AI, Generative AI, machine learning, and relevant cloud/data platform technologies.
- Lead and participate in the ideation and prototyping of new AI applications and systems.
- Ensure AI solutions adhere to ethical standards, responsible AI principles, and regulatory requirements, addressing issues like data privacy, bias, and fairness.

Documentation and Communication:
- Create and maintain comprehensive technical documentation for AI models, systems, and processes.
- Effectively communicate complex AI concepts and results to both technical and non-technical audiences.

Required Qualifications:
- 8+ years of experience in software development in one or more programming languages, including data structures, algorithms, and data architecture.
- 3+ years of experience with state-of-the-art GenAI techniques (e.g., LLMs, multi-modal models, large vision models) or with GenAI-related concepts (language modeling, computer vision).
- 3+ years of experience with ML infrastructure (e.g., model deployment, model evaluation, optimization, data processing, debugging).
- Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Machine Learning, Data Science, or a related technical field.
- Proven experience as an AI Engineer, Machine Learning Engineer, or in a similar role.
- Strong programming skills in Python; familiarity with other languages like Java, Scala, or R is a plus.
- Solid understanding of machine learning algorithms (supervised, unsupervised, reinforcement learning), deep learning concepts (e.g., CNNs, RNNs, Transformers), and statistical modeling.
- Hands-on experience developing and deploying Generative AI models and techniques, including working with Large Language Models (LLMs such as GPT, BERT, LLaMA, etc.).
- Proficiency with common AI/ML frameworks and libraries such as TensorFlow, PyTorch, scikit-learn, Keras, Hugging Face Transformers, and LangChain.
- Demonstrable experience with at least one of the following cloud/data platforms:
  - GCP: Vertex AI, BigQuery ML, Google Cloud Storage, and other GCP AI/ML services.
  - AWS: SageMaker, Bedrock, S3, and other AWS AI/ML services.
  - Databricks: building and scaling AI/ML solutions on the Databricks Lakehouse Platform, including MLflow.
  - Snowflake: leveraging Snowflake for data warehousing, data engineering for AI/ML workloads, and Snowpark.
- Experience with data engineering, including data acquisition, cleaning, transformation, and building ETL/ELT pipelines.
- Knowledge of MLOps tools and practices for model deployment, monitoring, and management.
- Familiarity with containerization technologies like Docker and orchestration tools like Kubernetes.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration abilities.
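The retrieval step of retrieval-augmented generation (RAG) mentioned above can be sketched in a few lines: score documents against the query, then prepend the best matches to the prompt sent to an LLM. This is a toy illustration only; the bag-of-words scorer stands in for a real embedding model and vector store (e.g., behind a Vertex AI or SageMaker endpoint), and all names here are hypothetical.

```python
# Toy sketch of RAG retrieval: rank documents by similarity to the query,
# then build an augmented prompt. A bag-of-words cosine score stands in
# for a real embedding model; all names are illustrative.
import math
import re
from collections import Counter

def _vectorize(text: str) -> Counter:
    """Crude bag-of-words vector; a stand-in for an embedding model."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def _cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, documents: list[str], k: int = 1) -> list[str]:
    """Return the k documents most similar to the query."""
    qv = _vectorize(query)
    ranked = sorted(documents, key=lambda d: _cosine(qv, _vectorize(d)), reverse=True)
    return ranked[:k]

def build_prompt(query: str, documents: list[str]) -> str:
    """Augment the user query with retrieved context before calling an LLM."""
    context = "\n".join(retrieve(query, documents, k=2))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Vertex AI is GCP's managed machine learning platform.",
    "SageMaker is AWS's managed machine learning platform.",
    "Snowpark lets you run Python code inside Snowflake.",
]
print(retrieve("Which AWS service is for machine learning?", docs, k=1)[0])
```

In a production system the retrieval index, embedding model, and generation call are separate services; the flow (retrieve, augment, generate) stays the same.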
Outbound Sales Executive
Location: India
Job Type: Full-time, Remote

***DO NOT APPLY if you don't have experience selling Analytics Professional Services to Enterprise B2B clients.***

What You'll Bring:
- Proven Sales Expertise: 10+ years of experience in sales development, cold calling, and outbound messaging.
- Professional Services Background: Deep experience working with GCP, Snowflake, and/or Databricks, and a strong history of selling professional services for a non-product company.
- Tech-Savvy: Proficiency with sales tools such as CRMs, prospecting tools (e.g., ZoomInfo), and communication platforms (e.g., Slack, LinkedIn).
- Influential Communicator: Exceptional written and verbal communication skills with the ability to build rapport and influence decisions, including with C-level executives.
- Results-Oriented: A demonstrated ability to track and measure activity, outcomes, and goals, with a keen eye for detail.
- Strategic & Agile: Excellent analytical skills with the ability to navigate ambiguity, prioritize effectively, and collaborate across organizations and with external stakeholders.
- Customer-Centric: A strong focus on customer satisfaction and a willingness to address escalated client issues with speed and urgency.
- Education: A Bachelor's degree in a related field or equivalent work experience. An MBA is highly desirable.
- Travel: Willingness and ability to travel regionally (at least 50% expected).

What You'll Do:
As an Outbound Sales Account Executive specializing in our GCP / Snowflake & Databricks Practice, you'll be a key driver of our growth. Your mission is to build and nurture relationships with new prospects, creating a robust pipeline of opportunities for our team. You'll be the first point of contact for leaders at target organizations, introducing them to our innovative Analytics Professional Services and Solutions.
- Strategize & Prospect: Creatively identify and engage with leaders at target organizations through cold calls, LinkedIn, email, and other outbound channels.
- Build Relationships: Nurture relationships with prospects, providing relevant insights into their technology footprint and strategic goals. You'll transition these relationships into qualified opportunities.
- Collaborate & Support: Work closely with our Enterprise Account Executives and internal GTM teams to refine outreach strategies, ensure smooth handoffs, and provide ongoing support throughout the sales process.
- Drive Account Growth: Actively lead account strategy to generate and develop business growth opportunities with new customers, collaborating with our alliance partners.
- Measure & Track: Maintain detailed, up-to-date records of all activities, outcomes, and goals in our CRM, ensuring a clear view of your progress.
- Stay Ahead of the Curve: Continuously expand your knowledge of cloud technologies and market trends to offer relevant and insightful information that resonates with potential customers.
Job Title: Web Scraper & Lead Generation Specialist
Location: Remote
Job Type: Part-time

About the Role:
We are seeking a highly skilled Web Scraper & Lead Generation Specialist with proven experience in extracting, validating, and organizing data for sales and business development. The ideal candidate will have hands-on experience generating qualified leads within the Solar, Call Center, BPO, Customer Support as a Service (CSaaS), Business Process Services (BPS), and B2B Data & Analytics sectors (e.g., Snowflake, Databricks, GCP/AWS analytics partners, Tableau, Power BI, and other analytics consulting firms).

Key Responsibilities:
- Develop and maintain automated web scraping scripts/tools to extract data from websites, directories, and other public sources.
- Generate targeted lead lists in the Solar, Call Center, BPO, CSaaS, BPS, and B2B Data & Analytics sectors.
- Identify and capture company details and key decision-makers (CXOs, VPs, Directors, Managers).
- Validate and enrich data (emails, phone numbers, LinkedIn profiles, company size, industry classification).
- Collaborate with the Sales & Marketing team to align lead generation efforts with business goals.
- Maintain data hygiene (accuracy, deduplication, compliance with GDPR/CCPA).
- Track and report lead generation performance and KPIs.
- Research and adopt the latest scraping and automation techniques.

Qualifications & Skills:
- Proven experience in web scraping, data mining, and B2B lead generation.
- Strong knowledge of scraping frameworks/libraries such as BeautifulSoup, Scrapy, Selenium, Puppeteer, or Octoparse.
- Experience working with APIs, enrichment tools, and automation platforms (e.g., ZoomInfo, Apollo.io, Lusha, PhantomBuster, Zapier).
- Familiarity with CRM systems (HubSpot, Salesforce, Zoho, or similar).
- Strong understanding of the Solar, Call Center, BPO, CSaaS, BPS, and Data & Analytics industries, including Snowflake partners, Databricks consulting firms, Tableau/Power BI service providers, and GCP/AWS analytics specialists.
- Proficiency in Python, JavaScript, or other scripting languages for automation.
- Analytical mindset with strong attention to detail and data accuracy.
- Ability to manage and deliver large-scale data extraction projects.

Preferred Experience:
- 2–5 years of experience in lead generation/web scraping for B2B companies.
- Prior experience targeting Data & Analytics service providers (Snowflake, Databricks, GCP, AWS Analytics, Tableau, Power BI consulting firms).
- Knowledge of ethical scraping practices and global data privacy laws.
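The extraction-plus-hygiene workflow this role describes (scrape, extract contacts, deduplicate) can be sketched with the standard library alone. This is an illustration under stated assumptions: production pipelines use frameworks like Scrapy or BeautifulSoup plus enrichment APIs, and the page content and regex here are hypothetical.

```python
# Sketch of lead extraction and deduplication from scraped HTML, using only
# the standard library. Illustrative: real scrapers use Scrapy/BeautifulSoup
# and enrichment tools; the sample page and email regex are hypothetical.
import re
from html.parser import HTMLParser

class LeadParser(HTMLParser):
    """Collect visible text chunks from an HTML page."""
    def __init__(self) -> None:
        super().__init__()
        self.chunks: list[str] = []

    def handle_data(self, data: str) -> None:
        if data.strip():
            self.chunks.append(data.strip())

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def extract_leads(html: str) -> list[str]:
    """Return unique, lowercased email addresses in first-seen order."""
    parser = LeadParser()
    parser.feed(html)
    seen: dict[str, None] = {}
    for chunk in parser.chunks:
        for email in EMAIL_RE.findall(chunk):
            seen.setdefault(email.lower(), None)  # dedupe (data hygiene step)
    return list(seen)

page = """
<html><body>
  <p>Contact: Sales@Example.com</p>
  <p>Support: sales@example.com (duplicate)</p>
  <p>BD lead: vp.growth@solarco.in</p>
</body></html>
"""
print(extract_leads(page))  # ['sales@example.com', 'vp.growth@solarco.in']
```

Normalizing case before deduplication is what keeps `Sales@Example.com` and `sales@example.com` from producing two CRM records; compliance checks (GDPR/CCPA suppression lists) would slot in after this step.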