India
Not disclosed
On-site
Full Time
**Job Title:** Data Analytics Solution Architect – Banking Domain
**Location:** India
**Employment Type:** Full-time
**Experience Level:** 10+ years

**Only candidates with proven experience in the banking domain will be considered.**
**Immediate joiners or candidates with a notice period of 15 days or less are strongly preferred.**

About the Role:
We are seeking a seasoned Data Analytics Solution Architect with deep experience in the banking and financial services industry to lead the design and implementation of cutting-edge data analytics solutions. The ideal candidate will bring a strong blend of data architecture expertise, business acumen in banking, and hands-on experience with modern analytics platforms and cloud technologies.

Key Responsibilities:
- Design end-to-end data and analytics solutions tailored to banking use cases such as credit risk analysis, customer 360, fraud detection, regulatory compliance, and portfolio management.
- Work closely with business stakeholders, data engineers, data scientists, and BI developers to translate business needs into scalable and secure data architectures.
- Define the data strategy and architecture blueprint, including data ingestion, storage, processing, and consumption layers.
- Champion the adoption of modern data platforms and tools (e.g., GCP, Azure Synapse, Snowflake, Databricks, Power BI, Tableau, Looker).
- Ensure data governance, quality, lineage, and security standards are embedded into solution design.
- Lead technical discussions with enterprise architects, solution architects, and delivery teams.
- Stay updated on banking industry trends, regulatory requirements (e.g., Basel, AML, KYC), and emerging technologies.
- Develop reference architectures, design patterns, and reusable components for analytics solutions.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field.
- 10+ years of experience in data architecture, data engineering, or analytics roles.
- 5+ years of experience specifically in the banking or financial services domain.
- Proven track record of designing large-scale data platforms (on-premises or cloud).
- Strong knowledge of data modeling, ETL/ELT, data lakes, and data warehousing concepts.
- Hands-on experience with cloud data platforms (Azure, AWS, or GCP).
- Proficiency in SQL, Python/Scala, Git, and modern data tools (e.g., Apache Spark, Kafka, Airflow).
- Familiarity with reporting and visualization tools such as Power BI, Tableau, or Looker.
- Knowledge of data privacy, security, and compliance in financial services.

Preferred Skills:
- Certification in a cloud or data platform (e.g., GCP, Azure, AWS, Snowflake, or Databricks).
- Understanding of machine learning pipelines and AI-driven analytics in banking.
- Experience with real-time data streaming and event-driven architectures.
- Familiarity with core banking systems and APIs.
India
Not disclosed
Remote
Full Time
**Job Title:** AI / Generative AI Engineer
**Location:** Remote (Pan India)
**Job Type:** Full-time

**Note: Only immediate joiners or candidates with a notice period of 15 days or less will be considered.**

Overview:
We are seeking a highly skilled and motivated AI/Generative AI Engineer to join our innovative team. The ideal candidate will have a strong background in designing, developing, and deploying artificial intelligence and machine learning models, with a specific focus on cutting-edge Generative AI technologies. This role requires hands-on experience with one or more major cloud platforms (Google Cloud Platform - GCP, Amazon Web Services - AWS) and/or modern data platforms (Databricks, Snowflake). You will be instrumental in building and scaling AI solutions that drive business value and transform user experiences.

Key Responsibilities:

Design and Development:
- Design, build, train, and deploy scalable and robust AI/ML models, including traditional machine learning algorithms and advanced Generative AI models (e.g., Large Language Models - LLMs, diffusion models).
- Develop and implement algorithms for tasks such as natural language processing (NLP), text generation, image synthesis, speech recognition, and forecasting.
- Work extensively with LLMs, including fine-tuning, prompt engineering, retrieval-augmented generation (RAG), and performance evaluation.
- Develop and manage data pipelines for data ingestion, preprocessing, feature engineering, and model training, ensuring data quality and integrity.

Platform Expertise:
- Leverage cloud AI/ML services on GCP (e.g., Vertex AI, AutoML, BigQuery ML, Model Garden, Gemini), AWS (e.g., SageMaker, Bedrock, S3), Databricks, and/or Snowflake to build and deploy solutions.
- Architect and implement AI solutions ensuring scalability, reliability, security, and cost-effectiveness on the chosen platform(s).
- Optimize data storage, processing, and model serving components within the cloud or data platform ecosystem.

MLOps and Productionization:
- Implement MLOps best practices for model versioning, continuous integration/continuous deployment (CI/CD), monitoring, and lifecycle management.
- Deploy models into production environments and ensure their performance, scalability, and reliability.
- Monitor and optimize the performance of AI models in production, addressing issues related to accuracy, speed, and resource utilization.

Collaboration and Innovation:
- Collaborate closely with data scientists, software engineers, product managers, and business stakeholders to understand requirements, define solutions, and integrate AI capabilities into applications and workflows.
- Stay current with the latest advancements in AI, Generative AI, machine learning, and relevant cloud/data platform technologies.
- Lead and participate in the ideation and prototyping of new AI applications and systems.
- Ensure AI solutions adhere to ethical standards, responsible AI principles, and regulatory requirements, addressing issues such as data privacy, bias, and fairness.

Documentation and Communication:
- Create and maintain comprehensive technical documentation for AI models, systems, and processes.
- Effectively communicate complex AI concepts and results to both technical and non-technical audiences.

Required Qualifications:
- 8+ years of software development experience in one or more programming languages, with strong grounding in data structures, algorithms, and data architecture.
- 3+ years of experience with state-of-the-art GenAI techniques (e.g., LLMs, multi-modal models, large vision models) or GenAI-related concepts (language modeling, computer vision).
- 3+ years of experience with ML infrastructure (e.g., model deployment, model evaluation, optimization, data processing, debugging).
- Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Machine Learning, Data Science, or a related technical field.
- Proven experience as an AI Engineer, Machine Learning Engineer, or in a similar role.
- Strong programming skills in Python; familiarity with other languages such as Java, Scala, or R is a plus.
- Solid understanding of machine learning algorithms (supervised, unsupervised, reinforcement learning), deep learning concepts (e.g., CNNs, RNNs, Transformers), and statistical modeling.
- Hands-on experience developing and deploying Generative AI models and techniques, including working with Large Language Models (e.g., GPT, BERT, LLaMA).
- Proficiency with common AI/ML frameworks and libraries such as TensorFlow, PyTorch, scikit-learn, Keras, Hugging Face Transformers, and LangChain.
- Demonstrable experience with at least one of the following cloud/data platforms:
  - GCP: Vertex AI, BigQuery ML, Google Cloud Storage, and other GCP AI/ML services.
  - AWS: SageMaker, Bedrock, S3, and other AWS AI/ML services.
  - Databricks: building and scaling AI/ML solutions on the Databricks Lakehouse Platform, including MLflow.
  - Snowflake: data warehousing, data engineering for AI/ML workloads, and Snowpark.
- Experience with data engineering, including data acquisition, cleaning, transformation, and building ETL/ELT pipelines.
- Knowledge of MLOps tools and practices for model deployment, monitoring, and management.
- Familiarity with containerization technologies such as Docker and orchestration tools such as Kubernetes.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration abilities.