
Client of Prasha Consultancy Services Private Limited

6 Job openings at Client of Prasha Consultancy Services Private Limited
Senior Tech Lead Java, Pune/Chennai | Chennai, Tamil Nadu | 6 - 12 years | INR Not disclosed | On-site | Full Time

As a Full Stack Java Developer at our reputed IT MNC, you will demonstrate your expertise and leadership skills in the tech domain. This position offers the opportunity to work as a Tech Lead or Senior Tech Lead on our team. The role is based in Chennai and Pune, with a hybrid mode of working and general shift timings.

To excel in this role, you should have 6 to 12 years of experience in the field, along with a deep understanding of and proven proficiency in Core Java, Spring Boot, MVC, React, and Kafka.

Join us in this dynamic environment, where your technical prowess and collaborative spirit will play a key role in shaping the success of our projects and initiatives. If you are passionate about leveraging your skills to drive innovation and create impactful solutions, we look forward to welcoming you to our team.

Lead Data Scientist/AI Product Owner | Haryana | 10 - 14 years | INR Not disclosed | On-site | Full Time

As a Lead Data Scientist/AI Product Owner at a reputed IT MNC in Gurgaon, your primary responsibility will be AI model development and deployment, with a focus on computer vision and deep learning. You will lead the design and implementation of computer vision models for tasks such as object detection, tracking, segmentation, and action recognition, working with architectures including YOLO (v4, v5, v8), Vision Transformers, Mask R-CNN, Faster R-CNN, LSTMs, and spatio-temporal models for image and video analysis. You will also adapt models to challenging conditions, such as poor detection performance, through targeted training on varied scenarios.

Another key aspect of the role is reinforcement learning and model explainability: you will develop and integrate reinforcement learning models to optimize decision-making in dynamic AI environments, working with techniques such as Deep Q-Networks (DQN), Proximal Policy Optimization (PPO), A3C, SAC, and other RL methods.

You will also lead the development of a scalable data model for AI model training pipelines, overseeing the end-to-end data preparation process, including data gathering, annotation, and quality review before training. You will focus on improving the quality and volume of training data to continually raise model performance, and on designing the data model with future enhancements in mind.

In terms of technical skills, you are expected to have a proven track record of leading and delivering AI products, hands-on expertise with advanced AI models, experience managing and mentoring a small team of data scientists, proficiency in AI/ML frameworks such as TensorFlow, PyTorch, and Keras, effective collaboration with cross-functional teams, and strong problem-solving skills in image and video processing. Nice-to-have skills include experimentation, iterative improvement and testing, collaboration with engineering and product teams, and team leadership.

To qualify for this role, you should have at least 10 years of hands-on experience in AI with a solid background in object detection, image and video processing, and computer vision.
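To ground the computer-vision scope described above, here is a minimal, illustrative detection sketch in Python. It assumes the open-source ultralytics package and a hypothetical local image file named frame.jpg; it is not a prescribed stack for the role, only an example of the kind of object-detection workflow involved.

```python
# Minimal object-detection sketch with a pretrained YOLOv8 model (ultralytics).
# Assumes: `pip install ultralytics` and a local image named "frame.jpg" (hypothetical).
from ultralytics import YOLO

# Load a small pretrained detection model; weights are downloaded on first use.
model = YOLO("yolov8n.pt")

# Run inference on a single frame; the same call also accepts video paths or streams.
results = model("frame.jpg")

# Print class label, confidence, and bounding box for each detection.
for result in results:
    for box in result.boxes:
        cls_id = int(box.cls[0])
        conf = float(box.conf[0])
        x1, y1, x2, y2 = box.xyxy[0].tolist()
        print(f"{model.names[cls_id]}: {conf:.2f} at ({x1:.0f}, {y1:.0f}, {x2:.0f}, {y2:.0f})")
```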

Duck Creek Data Insights Extract Mapper Engineer | Pune, Maharashtra | 3 - 7 years | INR Not disclosed | On-site | Full Time

A US-based IT MNC is looking to hire a Duck Creek Data Insights Extract Mapper Engineer for multiple locations, including Noida, Gurgaon, Pune, and Bangalore. The ideal candidate has strong skills in Duck Creek, Data Insights, Extract Mapper, SQL, ETL processes, and Datahub; additional experience in Azure, DevOps, or the insurance domain would be advantageous.

As a Duck Creek Data Insights Extract Mapper Engineer, you will be expected to demonstrate expertise in the Duck Creek Insights product, SQL Server, T-SQL, XSL/XSLT, and MSBI. You should have a deep understanding of the Duck Creek Extract Mapper solution, including its architecture, manuscripts, operation, and server API. Proficiency in mapping techniques such as Static Value, XPath, and Expression, as well as field-based mapping and error logging, is essential for this role.

The successful candidate will also have a strong grasp of data modelling, data warehousing, data marts, and business intelligence to effectively address business challenges. Familiarity with the ETL and EDW toolsets within Duck Creek Data Insights, and a working knowledge of its product architecture flow, Data hub, and Extract Mapper, are key aspects of this position.

The key requirements include a minimum of 3 to 5 years of experience with the Duck Creek Insights product, solid technical knowledge of SQL databases and MSBI, and, preferably, experience in the insurance domain. Previous exposure to Duck Creek Data Insights and specific experience with Duck Creek would be considered advantageous. A strong understanding of database structures and data mining, excellent problem-solving skills, and strong written and verbal communication abilities are highly valued. The ideal candidate also has excellent organizational and analytical capabilities, is proficient in Agile methodology, and works effectively in large teams. If you meet these criteria and are looking to join a dynamic IT MNC, we encourage you to apply for the position of Duck Creek Data Insights Extract Mapper Engineer.
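Duck Creek's Extract Mapper is a proprietary tool, so the sketch below is only a generic illustration in Python of the XPath-based and static-value field mapping, with basic error logging, that the role describes. The XML structure, field names, and mapping rules are hypothetical and not the Duck Creek API.

```python
# Generic illustration of XPath-based and static-value field mapping with error logging.
# Assumes: `pip install lxml`; the XML layout and field names below are hypothetical.
from lxml import etree

policy_xml = b"""
<Policy>
  <PolicyNumber>POL-12345</PolicyNumber>
  <Insured><Name>Acme Ltd</Name></Insured>
  <Premium currency="INR">150000</Premium>
</Policy>
"""

# Field-based mapping: target column -> mapping rule (static value or XPath expression).
mappings = {
    "policy_number": {"type": "xpath", "expr": "/Policy/PolicyNumber/text()"},
    "insured_name": {"type": "xpath", "expr": "/Policy/Insured/Name/text()"},
    "source_system": {"type": "static", "value": "DuckCreek"},
}

root = etree.fromstring(policy_xml)
row, errors = {}, []
for column, rule in mappings.items():
    if rule["type"] == "static":
        row[column] = rule["value"]
    else:
        hits = root.xpath(rule["expr"])
        if hits:
            row[column] = hits[0]
        else:
            # Error logging for fields the mapping could not resolve.
            errors.append(f"No match for {column} ({rule['expr']})")

print(row)
print(errors)
```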

Ping Engineer (IAM) | Haryana | 4 - 8 years | INR Not disclosed | On-site | Full Time

You are a highly skilled and experienced CIAM Engineer with 4-7 years of experience, seeking to join a reputed US IT MNC in Gurgaon. Your role involves leading the implementation and deployment of the Ping Identity Platform for clients. Your responsibilities include designing, implementing, and maintaining custom Ping-based IAM solutions, developing custom workflows and connectors, and configuring and deploying Ping Access Management, Identity Management, and Directory Services products in various environments.

Your technical skills must include proficiency in SAML, OAuth, OpenID Connect, Linux/Unix and Windows Server environments, networking, and security concepts, along with a deep understanding of the Ping Identity Platform products. You should have a strong understanding of IAM concepts, protocols, and standards, as well as experience with Java, JavaScript, and other scripting languages. Experience with cloud platforms such as AWS and services including EC2, Auto Scaling, S3, MongoDB, RDS, Lambda, and API Gateway is also required.

To qualify for this role, you must hold a Bachelor's degree in Computer Science, Information Technology, or a related field, with a minimum of 4 years of experience in Identity and Access Management technologies, focusing on the Ping Identity Platform. Ping certifications, such as Ping Certified Access Management Specialist or Ping Certified Identity Management Specialist, would be advantageous for this position.
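As a generic illustration of the OAuth/OIDC skills listed above (not a Ping-specific API), the sketch below validates a JWT access token against a provider's JWKS endpoint using PyJWT. The issuer URL, JWKS path, and audience are hypothetical placeholders.

```python
# Generic sketch of validating an OAuth 2.0 / OIDC access token (JWT) against a
# provider's JWKS endpoint with PyJWT. Issuer, JWKS path, and audience are
# hypothetical placeholders, not a Ping-specific configuration.
# Assumes: `pip install pyjwt[crypto]`.
import jwt
from jwt import PyJWKClient

ISSUER = "https://auth.example.com"    # hypothetical identity provider issuer
JWKS_URL = f"{ISSUER}/jwks"            # hypothetical JWKS endpoint
AUDIENCE = "my-api"                    # hypothetical API audience


def validate_access_token(token: str) -> dict:
    """Verify signature, issuer, audience, and expiry; return the token claims."""
    jwks_client = PyJWKClient(JWKS_URL)
    signing_key = jwks_client.get_signing_key_from_jwt(token)
    return jwt.decode(
        token,
        signing_key.key,
        algorithms=["RS256"],
        audience=AUDIENCE,
        issuer=ISSUER,
    )


# Usage (with a real token issued by the authorization server):
# claims = validate_access_token(access_token)
# print(claims["sub"], claims.get("scope"))
```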

React Lead | Haryana | 6 - 10 years | INR Not disclosed | On-site | Full Time

A reputed US-based IT MNC is currently looking to hire a React Lead to join their team. The ideal candidate has 6-10 years of experience and strong skills in React JS, Redux, micro frontends, and product UI. As a Team Lead/Senior Team Lead position, the role also requires team management skills.

The position is based in either Chennai or Gurgaon and operates in a 5-day work-from-office setup within an ODC environment, with office timings from 12 pm to 9 pm. We are specifically looking for candidates who can join immediately or at short notice. If you meet these criteria and are passionate about leading a team in React development, we encourage you to apply for this exciting opportunity.

Solution Architect (Data Architecture) | Haryana | 12 - 18 years | INR Not disclosed | On-site | Full Time

As a Solution Architect (Data Architecture) at a US-based IT MNC located in Gurgaon, with a hybrid work mode, you will be responsible for owning the end-to-end data architecture for a cross-product reporting solution. Your key responsibilities will include:

- Owning the end-to-end data architecture for the cross-product reporting solution, including the lakehouse/warehouse, streaming and batch pipelines, semantic layer, and BI.
- Establishing robust data contracts and versioned schemas with product teams, and driving event standards for policy lifecycle, claims, and accounting events.
- Designing multi-tenant data isolation and security strategies, enforcing RBAC/ABAC, encryption, and key management aligned to SOC 2, GDPR, and PCI-DSS.
- Building the ingestion and transformation layer with streaming (Kafka/Event Hubs) and batch (ELT/ETL) into a bronze/silver/gold model, managing SCD, late-arriving data, and idempotency (a simplified pipeline sketch follows this listing).
- Standing up the semantic layer with governed, reusable metrics, and ensuring data quality and observability through checks, SLAs/SLOs, lineage, and monitoring.
- Owning governance and compliance, including metadata, lineage, catalog, PII handling, retention, and audit trails, and championing Purview (or equivalent) and data stewardship processes.
- Driving performance, reliability, and cost optimizations through strategies such as partitioning/clustering, query acceleration, workload management, and cost guardrails.
- Publishing the tooling roadmap across Azure-native and complementary tools, and enabling reference implementations, style guides, and model playbooks for product squads.

In addition, you will be required to have the following qualifications and skills:

- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 12+ years in data engineering/architecture, with 5+ years architecting analytics platforms for SaaS or enterprise products.
- Proven experience designing lakehouse/warehouse architectures and event-driven data pipelines at scale.
- Expertise with the Azure data stack and Power BI, and a deep understanding of insurance data.
- Hands-on experience with Python/SQL and CI/CD for data, and a grounding in data governance, lineage, and security.
- Excellent communication and stakeholder management skills.

Desirable qualifications and skills include knowledge of reinsurance, bordereaux reporting, DAMA-DMBOK practices, and metric standardization frameworks; certifications in Azure Data, Databricks, or Power BI; and Agile experience.
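As referenced in the responsibilities above, the following is a simplified, illustrative PySpark sketch of a bronze-to-silver step in a bronze/silver/gold (medallion) pipeline. The paths, event fields, and file format are hypothetical, and a real implementation would layer in data contracts, SCD handling, and data-quality checks.

```python
# Simplified bronze -> silver batch transformation in the bronze/silver/gold pattern.
# Paths, schema, and event fields are hypothetical; illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("policy-events-silver").getOrCreate()

# Bronze: raw policy lifecycle events landed as-is from Kafka/Event Hubs or batch files.
bronze = spark.read.format("parquet").load("/lake/bronze/policy_events")

silver = (
    bronze
    # Basic contract enforcement: drop records missing mandatory keys.
    .filter(F.col("event_id").isNotNull() & F.col("policy_id").isNotNull())
    # Idempotency: keep one row per event_id, since upstream re-deliveries are common.
    .dropDuplicates(["event_id"])
    # Normalise types and add processing metadata.
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .withColumn("event_date", F.to_date("event_ts"))
    .withColumn("ingested_at", F.current_timestamp())
)

# Silver: cleansed, conformed events, partitioned for downstream gold/semantic models.
(
    silver.write
    .mode("overwrite")
    .partitionBy("event_date")
    .format("parquet")
    .save("/lake/silver/policy_events")
)
```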