4.0 - 8.0 years
0 Lacs
Karnataka
On-site
We are looking for a skilled Data Governance Engineer to develop and oversee robust data governance frameworks on Google Cloud Platform (GCP). Your role will involve leveraging your expertise in data management, metadata frameworks, compliance, and security within cloud environments to ensure the implementation of high-quality, secure, and compliant data practices aligned with organizational objectives.

With a minimum of 4 years of experience in data governance, data management, or data security, you should have hands-on proficiency with GCP tools such as BigQuery, Dataflow, Dataproc, and Google Data Catalog. A strong command of metadata management, data lineage, and data quality tools like Collibra and Informatica is crucial, as is a deep understanding of data privacy laws and compliance frameworks and proficiency in SQL and Python for governance automation. Experience with RBAC, encryption, and data masking techniques, along with familiarity with ETL/ELT pipelines and data warehouse architectures, will be advantageous.

Your responsibilities will include developing and executing comprehensive data governance frameworks with a focus on metadata management, lineage tracking, and data quality. You will define, document, and enforce data governance policies, access control mechanisms, and security standards using GCP-native services such as IAM, DLP, and KMS, and manage metadata repositories using tools such as Collibra, Informatica, Alation, or Google Data Catalog. You will also collaborate with data engineering and analytics teams to ensure compliance with regulatory standards like GDPR, CCPA, and SOC 2, automate data classification, monitoring, and reporting using Python and SQL, support data stewardship initiatives, and optimize ETL/ELT pipelines and data workflows to adhere to governance best practices.

At GlobalLogic, we offer a culture of caring, emphasizing inclusivity and personal growth. You will have access to continuous learning and development opportunities, engaging and meaningful work, and a healthy work-life balance. Join our high-trust organization where integrity is paramount, and collaborate with us to engineer innovative solutions that have a lasting impact on industries worldwide.
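As a purely illustrative sketch of the governance automation this role describes (classification tagging and simple quality checks with Python and SQL on BigQuery), the snippet below uses the google-cloud-bigquery client; the project, dataset, table, and label values are assumptions, not part of the posting.

```python
# Hedged sketch: a simple data-quality metric and a classification label applied to a
# BigQuery table. All identifiers (project, dataset, table, label) are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-governance-project")

def null_rate(table: str, column: str) -> float:
    """Fraction of NULLs in a column -- one basic data-quality signal."""
    query = f"SELECT COUNTIF({column} IS NULL) / COUNT(*) AS null_rate FROM `{table}`"
    row = next(iter(client.query(query).result()))
    return row.null_rate

def tag_confidential(table_id: str) -> None:
    """Attach a classification label so downstream policies can key off it."""
    table = client.get_table(table_id)
    table.labels = {**(table.labels or {}), "data_classification": "confidential"}
    client.update_table(table, ["labels"])

if __name__ == "__main__":
    print(null_rate("example_dataset.customers", "email"))
    tag_confidential("example_dataset.customers")
```

In practice, checks like these would run inside a scheduled pipeline and feed a data catalog or quality dashboard rather than print to stdout.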
Posted 6 days ago
4.0 - 8.0 years
0 Lacs
Karnataka
On-site
Your role at Prudential is to design, build, and maintain data pipelines to ingest data from multiple sources into the cloud data platform. It is essential to ensure that these pipelines are constructed according to defined standards and documented comprehensively. Data governance standards must be adhered to and enforced to maintain data integrity and compliance. Additionally, you will be responsible for implementing data quality rules to ensure the accuracy and reliability of the data.

As part of your responsibilities, you will implement data security and protection controls around Databricks Unity Catalog. You will use Azure Data Factory, Azure Databricks, and other Azure services to build and optimize data pipelines. Proficiency in SQL, Python/PySpark, and other programming languages for data processing and transformation is crucial, and staying updated with the latest Azure technologies and best practices is essential for this role. You will also provide technical guidance and support to team members and stakeholders.

Detailed documentation of data pipelines, processes, and data quality rules must be maintained. Debugging, fine-tuning, and optimizing large-scale data processing jobs will be part of your routine tasks, as will generating reports and dashboards to monitor data pipeline performance and data quality metrics. Collaboration with data teams across Asia and Africa to understand data requirements and deliver solutions will be required in this role.

Overall, your role at Prudential will involve designing, building, and maintaining data pipelines, ensuring data integrity, implementing data quality rules, and collaborating with various teams to deliver effective data solutions.
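To make the "data quality rules" idea concrete, here is a minimal, hypothetical PySpark example of the kind of rule such a pipeline might enforce before promoting data; the catalog, table, and column names are invented and do not reflect Prudential's actual environment.

```python
# Illustrative data-quality rule in PySpark: keep rows with a non-null key and a
# non-negative amount, quarantine the rest. Table and column names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-rule-example").getOrCreate()

raw = spark.read.table("bronze.policies")  # assumed source table

valid = raw.filter(F.col("policy_id").isNotNull() & (F.col("premium") >= 0))
rejected = raw.subtract(valid)

valid.write.mode("overwrite").saveAsTable("silver.policies")
rejected.write.mode("append").saveAsTable("quarantine.policies_rejected")
```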
Posted 6 days ago
4.0 - 8.0 years
0 Lacs
Delhi
On-site
As a Senior AI Engineer / Data Scientist at our cutting-edge Ed-Tech company based in Delhi, you will play a crucial role in designing, developing, and deploying advanced AI/ML models that have a significant impact on our business. Your focus will be on utilizing AI technologies to enhance education through various initiatives such as automated question generation, assessment engines, personalized learning solutions, AI tutors, and auto-grading of answer sheets.

Your key responsibilities will include designing, optimizing, and implementing AI and machine learning models tailored specifically for Ed-Tech use cases. This will involve working on projects related to question generation, auto-evaluation systems, adaptive learning engines, AI tutors, and automated answer sheet grading. Additionally, you will be responsible for integrating AI models with frontend interfaces and backend server logic, handling both structured and unstructured data using SQL and NoSQL databases, and ensuring high standards of data security, privacy, and compliance.

To succeed in this role, you should possess a Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field, along with 4-6 years of hands-on experience in AI/ML development. You should have a strong background in Generative AI, time series forecasting, recommendation systems, and real-world AI solutions, as well as proficiency in Python and ML frameworks like TensorFlow, PyTorch, and scikit-learn. Knowledge of machine learning algorithms, data structures, model tuning, and evaluation metrics is essential, and experience with Knowledge Graphs and Docker deployments would be beneficial.

Furthermore, you should be adept at developing and maintaining microservices-based architectures, possess strong testing skills using platforms like unittest, and excel in communication, documentation, and team collaboration. Prior experience in Ed-Tech or deploying AI solutions in production environments, as well as active contributions to open-source projects or a well-maintained GitHub portfolio, would be considered advantageous.

If you are passionate about leveraging AI to solve real-world challenges in the education sector and thrive in a collaborative, innovative environment, we would love to hear from you. Join our team and be part of revolutionizing education through AI-driven technologies.
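As a hedged illustration of one of the simpler ideas behind auto-grading of answer sheets, the sketch below scores a free-text answer against a reference answer using TF-IDF cosine similarity in scikit-learn; a production system would rely on tuned language models rather than this baseline, and the sample sentences are invented.

```python
# Baseline auto-grading idea, for illustration only: TF-IDF cosine similarity between
# a reference answer and a student answer. Real systems would use trained models.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def grade(reference: str, answer: str) -> float:
    """Return a rough 0-1 similarity score between reference and student answer."""
    matrix = TfidfVectorizer().fit_transform([reference, answer])
    return float(cosine_similarity(matrix)[0, 1])

print(grade("Photosynthesis converts light energy into chemical energy.",
            "Plants turn sunlight into chemical energy during photosynthesis."))
```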
Posted 6 days ago
3.0 - 7.0 years
0 Lacs
Maharashtra
On-site
As a System Analyst - Data Warehouse at our company, you will be responsible for collaborating with stakeholders to understand business requirements and translate them into data warehouse design specifications. Your role will involve developing and maintaining the data warehouse architecture, including data models, ETL processes, and data integration strategies.

You will create, optimize, and manage ETL processes to extract, transform, and load data from various source systems into the data warehouse. Ensuring data quality and accuracy during the ETL process by implementing data cleansing and validation procedures will be a key part of your responsibilities. Designing and maintaining data models, schemas, and hierarchies to support efficient data retrieval and reporting will be crucial, and you will implement best practices for data modeling, including star schemas, snowflake schemas, and dimension tables.

Integrating data from multiple sources, both structured and unstructured, into the data warehouse will be part of your daily tasks. You will work with API endpoints, databases, and flat files to collect and process data efficiently. Monitoring and optimizing the performance of the data warehouse, identifying and resolving bottlenecks and performance issues, will be essential; you will implement indexing, partitioning, and caching strategies for improved query performance.

Enforcing data governance policies and security measures to protect sensitive data within the data warehouse will be a priority, and you will ensure compliance with data privacy regulations such as GDPR and HIPAA. Collaborating with business intelligence teams to provide support for reporting and analytics initiatives will also be part of your role, and you will assist in the creation of data marts and dashboards for end-users. Maintaining comprehensive documentation of data warehouse processes, data models, and ETL workflows will be crucial. Additionally, you will train and mentor junior data analysts and team members.

To qualify for this role, you should have a Bachelor's degree in computer science, information technology, or a related field. A minimum of 3 years of experience as a Data Warehouse Systems Analyst is required. Strong expertise in data warehousing concepts, methodologies, and tools, as well as proficiency in SQL, ETL tools (e.g., Informatica, Talend, SSIS), and data modeling techniques, is essential. Knowledge of data governance, data security, and compliance best practices is necessary. Excellent problem-solving and analytical skills, along with strong communication and interpersonal skills for effective collaboration with cross-functional teams, will be beneficial in this role.

Immediate joiners are preferred for this position. If you meet the qualifications and are looking to join a dynamic team in Mumbai, we encourage you to apply.
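For illustration of the star-schema modeling the role calls for, the sketch below creates a tiny fact table keyed to two dimension tables; the table and column names are invented, and SQLite stands in for whatever warehouse engine the team actually uses.

```python
# Minimal star-schema sketch: one fact table referencing two dimensions.
# SQLite is used only as a stand-in engine; all names are placeholders.
import sqlite3

ddl = """
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, customer_name TEXT, region TEXT);
CREATE TABLE dim_date     (date_key     INTEGER PRIMARY KEY, full_date TEXT, month TEXT, year INTEGER);
CREATE TABLE fact_sales (
    sales_key    INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    quantity     INTEGER,
    amount       REAL
);
"""

with sqlite3.connect(":memory:") as conn:
    conn.executescript(ddl)
    print("star schema created")
```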
Posted 6 days ago
1.0 - 2.0 years
5 - 10 Lacs
Hyderabad
Work from Office
Job Role: DLP Analyst - Microsoft Purview Tool
Experience: 1 to 3 Yrs
Key Skills: DLP Implementation, Writing Policies, Onboarding, Configuration, Data Classification
Notice Period: 0 to 15 days (must)
Should be willing to work in rotational shifts.
Office Address: Cyber Towers, Quadrant 3, 3rd floor, Madhapur, Hyderabad - 500081.

Job Overview:
Develop and implement data loss prevention strategies, policies, and procedures to protect sensitive data from unauthorized access, disclosure, or loss.
Collaborate with cross-functional teams to identify potential vulnerabilities, risks, and gaps in existing data protection measures, and provide recommendations for improvement.
Design and configure DLP solutions and tools to monitor, detect, and prevent data breaches or leaks across various platforms and endpoints.
Conduct regular assessments and audits to evaluate the effectiveness of data loss prevention controls and ensure compliance with applicable regulations and industry standards.
Collaborate with internal stakeholders to raise awareness and educate employees on data protection best practices, policies, and procedures.
Stay updated on emerging threats, trends, and technologies in the field of data security and loss prevention, and provide recommendations for proactive measures.
Participate in the evaluation, selection, and implementation of new data protection technologies and tools.
Prepare comprehensive reports and presentations for management, highlighting key findings, recommendations, and metrics related to data loss prevention initiatives.
Prepare and maintain Standard Operating Procedures (SOPs) related to DLP, ensuring they are up to date and accessible to all relevant stakeholders.
Develop and maintain the Responsibility Assignment Matrix (RACI) to clearly define roles and responsibilities for DLP initiatives, including incident response, policy enforcement, and employee training.

Skills:
Strong understanding of data security concepts, regulatory requirements (e.g., GDPR, HIPAA), and industry best practices.
Experience in designing and implementing data loss prevention strategies, policies, and procedures in a corporate environment.
Proficient in configuring and managing DLP technologies such as data classification, data discovery, data loss monitoring, and incident response.
Familiarity with network protocols, security technologies (e.g., firewalls, intrusion detection systems), and encryption methods.
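As a generic, hedged illustration of what a data-classification rule inside a DLP policy detects (not an example of Microsoft Purview's own configuration or API), the snippet below flags text containing card-number-like or email-like patterns; the patterns and labels are simplified assumptions.

```python
# Generic pattern-based classification sketch -- NOT Microsoft Purview's API.
# Patterns are simplified; real DLP engines use validated detectors and confidence levels.
import re

PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "email":       re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def classify(text: str) -> list:
    """Return the sensitive-data categories detected in a piece of text."""
    return [name for name, rx in PATTERNS.items() if rx.search(text)]

print(classify("Contact jane.doe@example.com, card 4111 1111 1111 1111"))
```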
Posted 6 days ago
1.0 - 3.0 years
2 - 5 Lacs
Jaipur
Work from Office
Design and implement endpoint security solutions
Develop and report enterprise-level metrics for endpoint security controls
Maintain the endpoint protection infrastructure
Ensure that security systems documentation is up to date
Maintain awareness of the latest security risks and exploits
Collaborate with network and systems administrators
Ensure that security solutions are integrated seamlessly
Bring up concerns to management regarding endpoint security
Participate in incident response efforts
Implement tasks critical to the company's endpoint technologies
Develop and implement security policies and procedures for end-users
Posted 6 days ago
5.0 - 10.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Shape the future of product delivery while crafting solutions that enhance and optimize customer experiences. Lead end-to-end processes, manage dependencies, and liaise with stakeholders as part of a team at the forefront of innovation.

As a Product Delivery Manager in the Cybersecurity & Tech Controls team, you will lead the integration of risk management practices into our organization, thereby enhancing the security and compliance of our products. You will also help drive closure of risk gaps within our products, support execution of audits and assessments, and enhance the efficiency of our Product Security Teams' programs. Applying a broad knowledge of technical principles, practices, and theories is essential to developing innovative solutions, along with leveraging analytical reasoning and adaptability skills to navigate through ambiguity and change. Your strong communication abilities will enable you to effectively collaborate with cross-functional teams and manage stakeholder relationships, ensuring alignment on project objectives and governance. By optimizing resources and managing risks, you will contribute to the successful delivery of high-impact projects that shape the future of the firm.

Job responsibilities:
Guide, coach, and oversee the creation and modification of control procedures (CPs), rather than writing them from scratch.
Engage with Product Security Leads (PSLs) to ensure adherence to the GTPC (Global Technology Policies & Controls) process, and ensure the development of high-quality controls for products.
Assist with normalizing language, wording, and measurement for consistency across controls; ensure that controls are maintained, regularly reviewed, and properly re-certified.
Understand which control procedures apply to products during the planning and building stages, and coach teams on how to identify relevant controls.
Integrate and engage effectively with Audit, assisting with the accuracy of Request for Information (RFI) reviews, facilitating communication by addressing questions to and from the audit team, and clarifying the audit scope.
Emphasize "Compliance from the Start". While not the Control Manager, the Risk Lead provides valuable advice in improving the RAS (Risk Assessment Structure), assists in mapping risks to control procedures (CPs), and addresses CORE (Control and Operational Risk Evaluation) issues.
Leverage strong communication skills to validate controls and ensure their effectiveness.

Required qualifications, capabilities, and skills:
Formal training or certification on product delivery management concepts and 5+ years of applied experience.
Expertise in technology risk management, information security, or a related field, with a focus on risk identification, assessment, and mitigation.
Proficient in risk management frameworks, industry standards, and regulatory requirements specific to the financial industry.
Strong critical thinking skills.
Excellent written and verbal communication abilities.
Proficient knowledge in data security, risk assessment and reporting, control evaluation, design, and governance.
Proven track record of implementing effective risk mitigation strategies.

Preferred qualifications, capabilities, and skills:
Knowledge of the product development life cycle, design, and data analytics.
Posted 6 days ago
3.0 - 8.0 years
5 - 9 Lacs
Chennai
Work from Office
Hiring for GCP Data Engineer - Chennai

Key Responsibilities:
Design and implement scalable and secure data pipelines using GCP-native tools such as Cloud Dataflow, Cloud Dataproc, BigQuery, and Cloud Composer
Build and manage ETL/ELT workflows to ingest data from various sources (structured and unstructured)
Optimize and monitor data pipelines for performance and cost
Ensure data quality and integrity across the data lifecycle
Collaborate with data scientists, analysts, and application teams for data access and transformation needs
Implement CI/CD for data engineering pipelines using tools like Cloud Build or Jenkins
Develop and maintain data models and schemas (dimensional or normalized)
Document data flows, architecture, and metadata for internal usage
Apply best practices in data security, access control, and compliance (e.g., GDPR, HIPAA)

Required Skills & Qualifications:
3+ years of experience in data engineering, with at least 2+ years on GCP
Strong, hands-on knowledge of GCP services: BigQuery, Cloud Storage, Pub/Sub, Dataflow, Dataproc, Cloud Functions
Proficiency in SQL, Python, and Java or Scala (preferred)
Experience with orchestration tools such as Apache Airflow or Cloud Composer
Experience with real-time data streaming and batch data processing
Good understanding of data warehousing, data lake architecture, and data governance
Familiarity with Terraform and Infrastructure as Code (IaC) is a plus
GCP certifications such as Professional Data Engineer or Associate Cloud Engineer are desirable
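As a hedged sketch of the orchestration piece mentioned above, here is a minimal Apache Airflow DAG of the sort that runs on Cloud Composer; the DAG id, schedule, and task body are placeholders rather than a definitive pipeline design.

```python
# Minimal illustrative Airflow DAG for a daily ingest job on Cloud Composer.
# Names, schedule, and the task body are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def load_to_bigquery(**_context):
    # A real task would call the BigQuery client, launch a Dataflow job, etc.
    print("ingest -> transform -> load")

with DAG(
    dag_id="daily_ingest_example",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="load_to_bigquery", python_callable=load_to_bigquery)
```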
Posted 6 days ago
3.0 - 8.0 years
4 - 7 Lacs
Gurugram
Work from Office
Role Overview:
We are seeking an AWS Data Engineer to join our team. The ideal candidate will have strong experience in AWS cloud services, data engineering, and ETL processes. This role involves designing, implementing, and maintaining data pipelines, ensuring data quality, and collaborating with cross-functional teams.

Key Responsibilities:
Design, develop, and maintain scalable data pipelines using AWS services
Implement ETL processes and ensure data quality and consistency
Optimize data infrastructure for performance and cost
Collaborate with data scientists and analysts to support their data needs
Implement and maintain data security measures
Document data architectures and processes

Requirements:
Education: Bachelor's/Master's degree in Computer Science, Engineering, or a related field
Experience: Minimum 3 years of experience with AWS data services; strong experience in data engineering and ETL processes; experience with big data technologies
Technical Skills: Expertise in AWS services (Redshift, S3, Glue, EMR, Lambda); strong programming skills in Python and SQL; experience with data modeling and warehousing; knowledge of data security best practices
Soft Skills: Strong problem-solving and analytical skills; excellent communication and collaboration abilities; self-motivated and able to work independently
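To ground the ETL responsibilities in something concrete, here is a hedged PySpark sketch of a small Glue-style batch step; the bucket paths and column names are invented, and real Glue jobs would typically add the Glue context, job bookmarks, and catalog integration.

```python
# Illustrative batch ETL step in PySpark (Glue-style): deduplicate raw JSON orders from S3
# and write curated Parquet. Paths and columns are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("aws-etl-example").getOrCreate()

orders = spark.read.json("s3://example-raw-bucket/orders/")  # assumed source path
cleaned = (
    orders
    .dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_ts"))
)
cleaned.write.mode("overwrite").parquet("s3://example-curated-bucket/orders/")
```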
Posted 6 days ago
3.0 - 8.0 years
6 - 9 Lacs
Gurugram
Work from Office
Role Overview:
We are looking for an Azure Data Engineer to join our team. The ideal candidate will have strong experience in Microsoft Azure cloud services, data engineering, and ETL processes. This role involves designing, implementing, and maintaining data solutions on Azure, ensuring data quality, and collaborating with cross-functional teams.

Key Responsibilities:
Design and implement data solutions using Azure services
Develop and maintain ETL/ELT pipelines
Optimize data architectures for performance and cost
Ensure data security and compliance
Collaborate with data scientists and analysts
Document technical specifications and processes

Requirements:
Education: Bachelor's/Master's degree in Computer Science, Engineering, or a related field
Experience: Minimum 3 years of experience with Azure data services; strong background in data engineering; experience with cloud data platforms
Technical Skills: Expertise in Azure services (Synapse Analytics, Data Factory, Databricks); strong SQL and Python programming skills; experience with data modeling and ETL; knowledge of data security best practices
Soft Skills: Strong analytical and problem-solving skills; excellent communication abilities; team collaboration capabilities
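As one small, hedged example of the ingestion side of such pipelines, the snippet below lands a file in Azure Blob Storage (the raw zone of a data lake) with the azure-storage-blob SDK; the storage account, container, and file names are assumptions.

```python
# Illustrative ingestion step: upload a raw file to Azure Blob Storage.
# Account URL, container, and blob names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient(
    account_url="https://exampleaccount.blob.core.windows.net",
    credential=DefaultAzureCredential(),
)
container = service.get_container_client("raw")

with open("orders_2024-01-01.csv", "rb") as data:
    container.upload_blob(name="orders/2024/01/01/orders.csv", data=data, overwrite=True)
```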
Posted 6 days ago
6.0 - 9.0 years
7 - 11 Lacs
Bengaluru
Work from Office
Req ID: 333419

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Software Development Advisor to join our team in Bangalore, Karnataka (IN-KA), India (IN).

COBOL, JCL, CICS, DB2, VSAM - Mainframe Developer (BRT)

Mandatory Skills:
Mainframe development expertise in COBOL, JCL, CICS, DB2, VSAM, utilities, Endevor, debuggers, schedulers, etc.
Ability to perform detailed analysis for maintenance and improvement projects in the batch and online environments
Employ your data security skills to keep our customers' information safe
You are detailed and thorough with unit, functional and regression testing
Proven experience writing COBOL/native stored procedures is a plus
You must be experienced in maintaining systems in the MVS mainframe environment.

Good to Have:
Knowledge of micro-services architecture concepts and emerging patterns such as APIs and web services (REST/SOAP)
Any knowledge or experience in cloud computing and/or Java programming is a plus.

Minimum Experience on Key Skills: 6-9 years

General Expectations:
1) Must have good communication skills
2) Must be ready to work in the 10:30 AM to 8:30 PM shift
3) Flexible to work at the client location (GV, Manyata or EGL, Bangalore)
4) Must be ready to work from the office in a hybrid work environment; full remote work is not an option
5) Expect full return to office in 2025

#LI-INPAS
Posted 6 days ago
4.0 - 8.0 years
10 - 11 Lacs
Hyderabad
Work from Office
Job Position Summary:
The MetLife Corporate Technology (CT) organization is evolving to enable MetLife's New Frontier strategy. With a strong vision in place, we are a global function focused on driving digital technology strategies for key corporate functions within MetLife, including Finance, Actuarial, Reinsurance, Legal, Human Resources, Employee Experience, Risk, Treasury, Audit, and Compliance. In partnership with our business leaders, we develop and deliver seamless technology experiences to our employees across the entire employee lifecycle. Our vision and mission is to create innovative, transformative, and contemporary technology solutions to empower our leaders and employees so they can focus on what matters most, our customers. We are technologists with strong business acumen focused on developing our talent to continually transform and innovate.

We are seeking a highly motivated and skilled Azure Data Engineer to join our growing team in Hyderabad. This position is perfect for talented professionals with 4-8 years of experience in designing, building, and maintaining scalable cloud-based data solutions. As an Azure Data Engineer at MetLife, you will collaborate with cross-functional teams to enable data transformation, analytics, and decision-making by leveraging Microsoft Azure's advanced technologies. You should be a strategic thinker, an effective communicator, and an expert in technological development.

Key Relationships: Internal stakeholders

Key Responsibilities:
Design, develop, and maintain efficient and scalable data pipelines using Azure Data Factory (ADF) for ETL/ELT processes.
Build and optimize data models and data flows in Azure Synapse Analytics, SQL Databases, and Azure Data Lake.
Work with large datasets to define, test, and implement data storage, transformation, and processing strategies using Azure-based services.
Create and manage data pipelines for ingesting, processing, and transforming data from various sources into a structured format.
Develop solutions for real-time and batch processing using tools like Azure Stream Analytics and Event Hubs.
Implement data security, governance, and compliance measures to ensure the integrity and accessibility of the organization's data assets.
Contribute to the migration of on-premises databases and ETL processes to the Azure cloud.
Build processes to identify, monitor, and resolve data inconsistencies and quality issues.
Collaborate with data architects, business analysts, and developers to deliver reliable and performant data solutions aligned with business requirements.
Monitor and optimize the performance and cost of Azure-based data solutions.
Document architectures, data flows, pipelines, and implementations for future reference and knowledge sharing.

Knowledge, Skills, and Abilities

Education: A Bachelor's/Master's degree in Computer Science or an equivalent engineering degree.

Candidate Qualifications:
Education: Bachelor's degree in Computer Science, Information Systems, or a related field
Experience (required):
4-8 years of experience in data engineering, with a strong focus on Azure-based services.
Proficiency in Azure Data Factory (ADF), Azure Synapse Analytics, Azure Data Lake, and Azure SQL Databases.
Strong knowledge of data modeling, ETL/ELT processes, and data pipeline design.
Hands-on experience with Python, SQL, and Spark for data manipulation and transformation.
Exposure to big data platforms like Hadoop, Databricks, or similar technologies.
Experience with real-time data streaming using tools like Azure Stream Analytics, Event Hubs, or Service Bus.
Familiarity with data governance best practices and security protocols within cloud environments.
Solid understanding of Azure DevOps for CI/CD pipelines around data workflows.
Strong problem-solving skills with attention to detail and a results-driven mindset.
Excellent collaboration, communication, and interpersonal skills for working with cross-functional teams.

Preferred:
Demonstrated experience in end-to-end cloud data warehouse migrations.
Familiarity with Power BI or other visualization tools for creating dashboards and reports.
Certification as an Azure Data Engineer Associate or Azure Solutions Architect is a plus.
Understanding of machine learning concepts and integrating AI/ML pipelines is an advantage.

Skills and Competencies:
Language: Proficiency at business level in English.
Communication: Ability to influence and help communicate the organization's direction and ensure results are achieved.
Collaboration: Proven track record of building collaborative partnerships and ability to operate effectively in a global environment.
Diverse environment: Can-do attitude and ability to work in a high-paced environment.

Tech Stack:
Development and delivery methods: Agile (Scaled Agile Framework)
DevOps and CI/CD: Azure DevOps
Development frameworks and languages: SQL, Spark, Python
Azure: Functional knowledge of cloud-based solutions
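As a hedged illustration of the real-time streaming requirement above, the snippet below consumes events with the azure-eventhub Python SDK; the connection string, hub name, and payload handling are placeholders, and a production consumer would add a checkpoint store and error handling.

```python
# Minimal Azure Event Hubs consumer sketch (azure-eventhub SDK). The connection string
# and hub name are placeholders; receive() blocks until the process is stopped.
from azure.eventhub import EventHubConsumerClient

def on_event(partition_context, event):
    print(f"partition {partition_context.partition_id}: {event.body_as_str()}")

client = EventHubConsumerClient.from_connection_string(
    conn_str="Endpoint=sb://example-namespace.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=...",
    consumer_group="$Default",
    eventhub_name="telemetry",
)

with client:
    client.receive(on_event=on_event, starting_position="-1")  # "-1" = read from start
```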
Posted 6 days ago
5.0 - 7.0 years
5 - 8 Lacs
Thane
Work from Office
Your responsibilities:
5-7 years of experience in Power Apps, Power Automate, and the Power Platform
Design, develop, and deploy automated workflows using Microsoft Power Automate.
Integrate Power Automate with various Microsoft tools like Power Apps, SharePoint, Teams, Outlook, and Dynamics 365, as well as third-party applications.
Create custom connectors and API integrations, and manage data flows.
Optimize and troubleshoot existing workflows to improve efficiency and performance.
Collaborate with business stakeholders to understand requirements and translate them into automation solutions.
Ensure data security, governance, and compliance within Power Automate solutions.
Document workflow processes, best practices, and troubleshooting guides.
Develop Power Automate workflows for process automation and efficiency improvement.
Implement security and governance best practices in Power Platform applications.
Optimize app performance by applying Power Apps formulas and best coding practices.
Troubleshoot and resolve issues related to Power Apps, Power Automate, and related services.
Provide training and documentation to end-users and stakeholders for solution adoption.

Work Location: Thane (Mumbai)
Posted 6 days ago
5.0 - 10.0 years
7 - 11 Lacs
Bengaluru
Work from Office
A little about the role: As a Technical Consultant - Extended Expertise in the Skyhigh Customer Value team, you will play a critical role in ensuring customers maximize the value of our solutions. You will guide customers through their product adoption journey, ensuring seamless deployment, integration, and continuous optimization. This role will develop deep relationships with our largest customers and complete the technical onboarding process for our smaller customers. By leveraging your technical expertise and customer-centric approach, you will drive operational success, best-practice adoption, and long-term customer satisfaction.

In this role:
Technical Consultants directly guide our customers throughout their product adoption journey, ensuring they achieve their security and network transformation goals.
Implement SWG, CASB, ZTNA, DLP, and other SSE security solutions based on the Solution Architect's design.
Configure security policies, user access controls, and enterprise integrations.
Perform policy migration and optimization to ensure compliance and best practices.
Conduct pilot testing, UAT, and final production rollout.
Deliver technical training and enablement to customer teams to effectively manage the product.
Provide ongoing technical guidance and troubleshooting support for complex issues.
Lead best-practice workshops and enablement sessions to upskill customer teams.
Provide continuous education and coaching to customers on existing and new features to ensure they maximize the value of the solution.
Provide context and help to the Support team to ensure that service requests are addressed and effectively communicated to the customer.
Manage and escalate customer concerns internally when necessary.
Act as the technical point of contact post-deployment, ensuring customers fully adopt, optimize, and expand their SSE solution.
Monitor solution performance and adoption, and provide proactive recommendations to improve efficiency, security, and compliance.
Participate in business reviews and conduct technical health checks to track progress and address gaps.
Identify opportunities for expansion by assessing additional use cases, security needs, and feature adoption.
Assist with change management and internal advocacy within customer teams to drive long-term adoption.
Collaborate with Customer Success Manager, Sales, and Product teams to align solutions with customer objectives.

Documentation, knowledge sharing, and continuous improvement:
Document deployment procedures, configuration settings, and optimization strategies.
Contribute to the internal knowledge base for Professional Services and TAM best practices.
Mentor and coach other Technical Consultants and Technical Account Managers to enhance team expertise.
Contribute to internal knowledge-sharing initiatives, training programs, and best practice discussions.
Provide feedback on customer pain points and feature requests to Product and Engineering teams.
Stay up to date with industry security trends, emerging threats, and SSE product advancements.
Experience required for a Technical Consultant:
5+ years of experience in a customer-facing technical role (Technical Consultant, Solution Architect, Technical Account Manager, Sales Engineer, or similar).
Strong background in network security, cloud security, and SSE/SASE solutions.
Experience with firewalls, proxies, CASB, DLP, Zero Trust Network Access (ZTNA), and SWG (Secure Web Gateway).
Proficiency in network protocols, authentication mechanisms, and security frameworks.
Hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud.
Strong troubleshooting and problem-solving skills with a customer-first mindset.
Excellent communication and presentation skills, with the ability to translate technical concepts for various audiences.
Experience coaching and training customers to use technical products effectively.
Ability to manage multiple accounts while prioritizing key customer needs.

It would be great if you also have the following, but they are not required:
Industry certifications such as CCNA Security, CompTIA Security+, AWS/Azure Security, CCSK/CCSP, or CISSP.
Experience working with large enterprise customers and managing multi-region deployments.
Experience working with PSA tools (OpenAir, FinancialForce, etc.).

We believe that the best solutions are developed by teams who embrace each other's unique experiences, skills, and abilities. We work hard to create a dynamic workforce where we encourage everyone to bring their authentic selves to work every day. We offer a variety of social programs, flexible work hours, and family-friendly benefits to all of our employees.

Medical, Dental and Vision Coverage

We're serious about our commitment to a workplace where everyone can thrive and contribute to our industry-leading products and customer support, which is why we prohibit discrimination and harassment based on race, color, religion, gender, national origin, age, disability, veteran status, marital status, pregnancy, gender expression or identity, sexual orientation, or any other legally protected status.
Posted 6 days ago
5.0 - 10.0 years
20 - 30 Lacs
Chennai
Work from Office
• Experience in cloud-based systems (GCP, BigQuery)
• Strong SQL programming skills
• Expertise in database programming and performance tuning techniques
• Possess knowledge of data warehouse architectures, ETL, and reporting/analytic tools
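As a small, hedged example of the performance-tuning techniques referenced above, the snippet below materializes a partitioned and clustered BigQuery table from a raw table; the project, dataset, and column names are assumptions.

```python
# Illustrative BigQuery performance-tuning step: build a partitioned, clustered table
# from raw events. Dataset, table, and column names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")
client.query(
    """
    CREATE TABLE IF NOT EXISTS analytics.events_by_day
    PARTITION BY DATE(event_ts)
    CLUSTER BY customer_id
    AS SELECT * FROM analytics.events_raw
    """
).result()
```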
Posted 6 days ago
15.0 - 19.0 years
0 Lacs
Haryana
On-site
NTT DATA is looking for a Strategic Pursuit Leader, Vice President to join their Client Growth team. The ideal candidate should have a strong focus on managing large deals ($50M+), orchestrating end-to-end strategic pursuit cycles, and architecting engagement strategies. The Pursuit Leader will play a crucial role in understanding value drivers, value proposition, competitive landscape, and decision processes. They will need to develop actionable Pursuit Strategies that articulate required business outcomes and continually refine their understanding of client executives' needs and beliefs.

Key Responsibilities:
- Own the creation and execution of strategic pursuit engagement plans.
- Engage with customer technology and business leaders to build relationships and understand their needs.
- Manage customer relationships from origination to closure of complex, high-value proposals.
- Create win strategies aligned with customer business goals and technology needs.
- Lead solution design and orchestration across NTT, partners, and client organizations.
- Collaborate with cross-functional teams to ensure proposed solutions align with customer requirements.
- Stay updated on trends and technologies to create innovative and competitive solutions.
- Work in a fast-paced environment driving deal strategies with multiple stakeholders.

Basic Qualifications:
- Minimum 15 years of experience in IT enterprise sales, complex deal pursuit, or consulting sales roles.
- Minimum 10 years of experience leading sales/new business deal pursuits.
- Minimum 8 years of pursuit experience managing and closing deals involving various technology domains.
- Bachelor's degree in computer science, engineering, or a related field.
- Ability to travel at least 50% of the time.

About NTT DATA:
NTT DATA is a trusted global innovator of business and technology services, serving 75% of the Fortune Global 100. They are committed to helping clients innovate, optimize, and transform for long-term success. With diverse experts in more than 50 countries, NTT DATA offers business and technology consulting, data and artificial intelligence solutions, industry solutions, and digital infrastructure services. As part of the NTT Group, they invest significantly in R&D to support organizations and society in moving confidently into the digital future. Visit us at us.nttdata.com.
Posted 1 week ago
3.0 - 8.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
You should have a total of 8+ years of development/design experience, with a minimum of 3 years of experience in Big Data technologies on-premises and in the cloud. Proficiency in Snowflake and strong SQL programming skills are required. In addition, you should have strong experience with data modeling and schema design, as well as extensive experience using data warehousing tools such as Snowflake, BigQuery, or Redshift. Experience with at least one BI tool such as Tableau, QuickSight, or Power BI is a must. You should also have strong experience implementing ETL/ELT processes and building data pipelines, including workflow management, job scheduling, and monitoring. A good understanding of Data Governance, Security and Compliance, Data Quality, Metadata Management, Master Data Management, and Data Catalog is essential.

Moreover, you should have a strong understanding of cloud services (AWS or Azure), including IAM and log analytics. Excellent interpersonal and teamwork skills are necessary for this role, as well as experience with leading and mentoring other team members. Good knowledge of Agile Scrum and strong communication skills are also required. Your day-to-day responsibilities will align with the expectations outlined above.

At GlobalLogic, we prioritize a culture of caring. From day one, you will experience an inclusive culture of acceptance and belonging, where you will have the chance to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. We are committed to your continuous learning and development. You will learn and grow daily in an environment with many opportunities to try new things, sharpen your skills, and advance your career at GlobalLogic.

GlobalLogic is known for engineering impact for and with clients around the world. As part of our team, you will have the chance to work on projects that matter and engage your curiosity and creative problem-solving skills. We believe in the importance of balance and flexibility. With many functional career areas, roles, and work arrangements, you can explore ways of achieving the perfect balance between your work and life. GlobalLogic is a high-trust organization where integrity is key. By joining us, you are placing your trust in a safe, reliable, and ethical global company.

GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world's largest and most forward-thinking companies. Since 2000, we have been at the forefront of the digital revolution, helping create innovative and widely used digital products and experiences. Today we continue to collaborate with clients in transforming businesses and redefining industries through intelligent products, platforms, and services.
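As a hedged sketch of the ELT-style pipeline work described above, the snippet below pushes one transformation through the Snowflake Python connector; the account, credentials, and table names are placeholders, and a real pipeline would run this from an orchestrator with proper secrets management.

```python
# Illustrative ELT step via the Snowflake Python connector. Connection parameters and
# table names are placeholders; credentials belong in a secrets manager, not in code.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="********",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="STAGING",
)
cur = conn.cursor()
try:
    cur.execute(
        """
        INSERT INTO curated.orders
        SELECT order_id, customer_id, TRY_TO_DATE(order_ts) AS order_date
        FROM staging.orders_raw
        WHERE order_id IS NOT NULL
        """
    )
finally:
    cur.close()
    conn.close()
```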
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
Kolkata, West Bengal
On-site
As a skilled Salesforce Developer with 4-6 years of experience, you will be responsible for designing, developing, and implementing custom Salesforce applications. Your expertise in Apex, Visualforce, Lightning Components, and integrations will be crucial in enhancing our Salesforce environment. Collaborating with cross-functional teams, you will play a key role in ensuring the success of our Salesforce projects.

Your key responsibilities will include:
- Designing, developing, testing, and deploying custom Salesforce applications using Apex, Visualforce, Lightning Components, LWC, and other Salesforce technologies.
- Configuring Salesforce to align with business requirements, including creating and modifying objects, Lightning flows, validation rules, and approval processes.
- Integrating Salesforce with other systems and third-party applications through REST/SOAP APIs, middleware, and ETL tools.
- Managing data integrity by implementing data migration and data cleansing strategies, as well as overseeing data modeling, security, and backups.
- Creating and maintaining technical documentation, such as design specifications, deployment plans, and system architecture diagrams.
- Providing ongoing support and troubleshooting for Salesforce-related issues, including debugging and performance optimization.
- Ensuring adherence to Salesforce development best practices by conducting code reviews, testing, and maintaining code quality standards.

Required Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 4-6 years of hands-on experience in Salesforce development.
- Salesforce Certified Platform Developer I or II.

Technical Skills:
- Proficiency in Apex, Visualforce, Lightning Components, LWC, and SOQL.
- Experience with Salesforce integrations using REST/SOAP APIs and familiarity with middleware tools.
- Knowledge of Salesforce declarative tools such as Lightning Flow and Workflow Rules.
- Understanding of the Salesforce security model, including profiles, roles, and sharing rules.
- Familiarity with CI/CD tools for Salesforce, such as Jenkins or Salesforce DX.
- Experience with Agile/Scrum development methodologies.

If you are a proactive, detail-oriented Salesforce Developer with a passion for innovation and a commitment to excellence, we invite you to join our dynamic team and contribute to the success of our Salesforce projects.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
Pune, Maharashtra
On-site
ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you'll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers, and consumers, worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning, bold ideas, courage, and passion to drive life-changing impact to ZS.

Our most valuable asset is our people. At ZS, we honor the visible and invisible elements of our identities, personal experiences, and belief systems: the ones that comprise us as individuals, shape who we are, and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about.

What You'll Do:
Collaborate with client-facing teams to understand solution context and contribute to technical requirement gathering and analysis.
Design and implement technical features leveraging best practices for the technology stack being used.
Work with technical architects on the team to validate the design and implementation approach.
Write production-ready code that is easily testable, understood by other developers, and accounts for edge cases and errors.
Ensure the highest quality of deliverables by following architecture/design guidelines, coding best practices, and participating in periodic design/code reviews.
Write unit tests as well as higher-level tests to handle expected edge cases and errors gracefully, as well as happy paths.
Use bug tracking, code review, version control, and other tools to organize and deliver work.
Participate in scrum calls and agile ceremonies, and effectively communicate work progress, issues, and dependencies.
Consistently contribute to researching and evaluating the latest technologies through rapid learning, conducting proof-of-concepts, and creating prototype solutions.

What You'll Bring:
Experience: 2+ years of relevant hands-on experience. A CS foundation is a must.
Strong command over distributed computing frameworks like Spark (preferred) or others.
Strong analytical and problem-solving skills.
Ability to quickly learn and become hands-on with new technology and be innovative in creating solutions.
Strong in at least one programming language (Python, Java, Scala, etc.) and programming basics such as data structures.
Hands-on experience in building modules for data management solutions such as data pipelines, orchestration, and ingestion patterns (batch, real-time).
Experience in designing and implementing solutions on a distributed computing and cloud services platform such as (but not limited to) AWS, Azure, or GCP.
Good understanding of RDBMS; some experience with ETL is preferred.

Additional Skills:
Understanding of DevOps, CI/CD, and data security, and experience in designing on a cloud platform.
AWS Solutions Architect certification with an understanding of the broader AWS stack.
Knowledge of data modeling and data warehouse concepts.
Willingness to travel to other global offices as needed to work with clients or other internal project teams.

Perks & Benefits:
ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth, and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths, and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.

Travel:
Travel is a requirement at ZS for client-facing ZSers; business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.

Considering Applying:
At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.

To Complete Your Application:
Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered.

NO AGENCY CALLS, PLEASE.

Find out more at www.zs.com.
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
Karnataka
On-site
You have approximately 10 years of experience as a Senior IT Enterprise Solution Architect, specializing in designing IT infrastructure solutions for both on-premises and cloud environments. Your expertise lies in areas such as digital certificates, PKI, IAM, and IT security within the banking industry. Your responsibilities include designing, implementing, and documenting solution and integration architectures, evaluating the impact of design decisions, ensuring architecture compliance, and communicating system architecture to stakeholders through diagrams and models.

As a part of the Strategic Internal Certificate Management Governance Project team, your main objective will be to develop a solution architecture for the project and lead the technical team during implementation. You will be required to stay updated on technological advancements to ensure high efficiency and solution quality. Additionally, you should possess a desire to work in an innovative agile environment, be proactive, positive, and have a can-do attitude.

Your role will involve leading the engagement efforts in providing consulting solutions to customers, from problem definition to solution design, development, and deployment. You will also be responsible for reviewing proposals, identifying change management requirements, and coaching team members to provide high-quality consulting solutions adhering to organizational guidelines.

Your technical requirements include expertise in Technology Data Security, Public Key Infrastructure, Digital Signatures, Cryptography, and Transport Layer Security. Desirable skills encompass Technology Architecture, Finacle eB, Digital Certificates Signature Technology, Identity Management, and IAM Architecture Consultancy.

In summary, your primary focus will be on developing and leading the implementation of solution architectures for IT infrastructure solutions, ensuring compliance, communicating effectively with stakeholders, and staying abreast of technological advancements to deliver high-quality solutions in a dynamic environment.
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
Karnataka
On-site
The responsibilities of the role involve designing and implementing Azure Synapse Analytics solutions for data processing and reporting. You will be required to optimize ETL pipelines, SQL pools, and Synapse Spark workloads while ensuring data quality, security, and governance best practices are followed. Collaborating with business stakeholders to develop data-driven solutions and mentoring a team of data engineers are key aspects of this position.

To excel in this role, you should possess 6-10 years of experience in Data Engineering, BI, or Cloud Analytics. Expertise in Azure Synapse, Azure Data Factory, SQL, and ETL processes is essential, and experience with Fabric is strongly desirable. Strong leadership, problem-solving, and stakeholder management skills are required, and knowledge of Power BI, Python, or Spark is a plus.

A deep understanding of data modelling techniques, design and development of ETL pipelines, Azure resource cost management, and writing complex SQL queries are important competencies. Familiarity with authorization and security best practices for Azure components, master data/metadata management, and data governance is crucial. Being able to manage a complex and rapidly evolving business and actively lead, develop, and support team members is vital. An agile mindset and the ability to adapt to constant changes in risks and forecasts are expected.

Thorough knowledge of data warehouse architecture, principles, and best practices is necessary, along with expertise in designing star and snowflake schemas, identifying facts and dimensions, and selecting appropriate granularity levels. Ensuring data integrity within the dimensional model by validating data and identifying inconsistencies is part of the role. You will work closely with Product Owners and data engineers to translate business needs into effective dimensional models.

Joining MRI Software offers the opportunity to lead AI-driven data integration projects in real estate technology, work in a collaborative and innovative environment with global teams, and access competitive compensation, career growth opportunities, and exposure to cutting-edge technologies. The ideal candidate should hold a Bachelor's/Master's degree in Software Engineering, Computer Science, or a related area.

The benefits of this position include hybrid working arrangements, an annual performance-related bonus, 6x Flexi any days, medical insurance coverage for extended family members, and an engaging, fun, and inclusive culture at MRI Software.

MRI Software delivers innovative applications and hosted solutions that empower real estate companies to enhance their business. With a flexible technology platform and an open and connected ecosystem, we cater to the unique needs of real estate businesses globally. With offices across various countries and a diverse team, we provide expertise and insight to support our clients effectively. MRI Software is proud to be an Equal Employment Opportunity employer.
Posted 1 week ago
0.0 - 3.0 years
0 Lacs
Maharashtra
On-site
The Acquisition Process Officer in the Branch Banking department plays a crucial role in managing account opening and customer service processes for the Corporate Salary segment within a branch/region. The primary objective of this role is to contribute towards the larger branch banking channel objectives by ensuring high process efficiencies and delivering exceptional customer experiences.

Responsibilities of the Acquisition Process Officer include processing forms for account opening and service requests for the Corporate Salary segment with a First Time Right approach, ensuring errors are kept within acceptable norms. They are also responsible for resolving queries and discrepancies raised by stakeholders in a timely manner, as well as collaborating with Channel Partners to address any discrepancies promptly to facilitate smooth client onboarding.

Moreover, the Acquisition Process Officer is tasked with ensuring compliance with banking regulations and policies related to Anti Money Laundering (AML), Know Your Customer (KYC), and Data & Information Security, among others. They support the Acquisition Process Manager in driving initiatives to reduce operational costs and develop strategies to enhance profitability. Staying updated on products, policies, and market competition is essential for this role, along with ensuring strict adherence to internal guidelines and regulations.

The ideal candidate for this position should hold a Bachelor's degree in Engineering, Technology, Mathematics, Commerce, Arts, Science, Biology, Business, Computers, or Management. Relevant work experience in the range of 0 to 2 years is preferred to excel in this role.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
You are an exceptional, innovative, and passionate individual looking to grow with NTT DATA. If you want to be part of an inclusive, adaptable, and forward-thinking organization, this opportunity is for you. As a Salesforce Datacloud & Agentforce Solution Architect, you will be responsible for designing, developing, and implementing AI-powered conversational experiences within the Salesforce platform. Your role will involve utilizing Agentforce capabilities to create automated customer interactions across various channels, leveraging strong technical skills in Salesforce development and natural language processing (NLP) to build effective virtual agents. Your core responsibilities will include architecting and building data integration solutions using Salesforce Data Cloud, unifying customer data from diverse sources. You will implement data cleansing, matching, and enrichment processes to enhance data quality, design and manage data pipelines for efficient data ingestion, transformation, and loading, and collaborate with cross-functional teams to translate business requirements into effective data solutions. Monitoring data quality, identifying discrepancies, and enforcing data governance policies will also be key aspects of your role. Minimum Skills Required: - Expertise in Salesforce Data Cloud features such as data matching, cleansing, enrichment, and data quality rules - Understanding of data modeling concepts and the ability to design data models within Salesforce Data Cloud - Proficiency in utilizing Salesforce Data Cloud APIs and tools for data integration from various sources - Knowledge of data warehousing concepts and data pipeline development Relevant Experience: - Implementing Salesforce Data Cloud for customer 360 initiatives - Designing and developing data integration solutions - Managing data quality issues and collaborating with business stakeholders - Building and customizing Agentforce conversational flows - Training and refining natural language processing models - Monitoring Agentforce performance and analyzing customer interaction data - Seamlessly integrating Agentforce with other Salesforce components - Thoroughly testing Agentforce interactions before deployment Skills to Highlight: - Expertise in Salesforce administration, development, and architecture - Deep knowledge of Agentforce features and configuration options - Familiarity with NLP concepts - Proven ability in conversational design and data analysis - Experience in designing, developing, and deploying solutions on Salesforce Data Cloud platform - Collaboration with stakeholders and building custom applications and integrations - Development and optimization of data models - Implementation of data governance and security best practices - Troubleshooting, debugging, and performance tuning Join NTT DATA, a trusted global innovator in business and technology services, committed to helping clients innovate, optimize, and transform for long-term success. With a diverse team and a strong partner ecosystem, NTT DATA offers a range of services including business and technology consulting, data and artificial intelligence solutions, as well as application and infrastructure management. Be part of a leading provider of digital and AI infrastructure, transforming organizations and society for a digital future. Visit us at us.nttdata.com.,
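A recurring theme in this role is implementing data cleansing, matching, and enrichment to improve data quality. The sketch below illustrates only the matching idea in plain Python — it does not use Salesforce Data Cloud APIs, and the record fields and threshold are assumptions made for the example:

```python
from difflib import SequenceMatcher

# Hypothetical customer records arriving from two source systems.
records = [
    {"id": "A-1", "name": "Jon Smith",  "email": "jon.smith@example.com"},
    {"id": "B-7", "name": "John Smith", "email": "jon.smith@example.com"},
    {"id": "A-2", "name": "Priya Rao",  "email": "priya.rao@example.com"},
]

def similarity(a: str, b: str) -> float:
    """Simple fuzzy string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def candidate_matches(rows, name_threshold=0.85):
    """Pair up records that share an email or have very similar names."""
    matches = []
    for i in range(len(rows)):
        for j in range(i + 1, len(rows)):
            same_email = rows[i]["email"] == rows[j]["email"]
            similar_name = similarity(rows[i]["name"], rows[j]["name"]) >= name_threshold
            if same_email or similar_name:
                matches.append((rows[i]["id"], rows[j]["id"]))
    return matches

print(candidate_matches(records))   # e.g. [('A-1', 'B-7')]
```

Production-grade matching in Data Cloud would typically rely on its identity resolution capabilities rather than hand-rolled comparisons; the point here is just the shape of the problem — candidate pairs are generated, scored, and then merged or flagged for stewardship review.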
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
As a part of ZS, you will have the opportunity to work in a place driven by passion that aims to change lives. ZS is a management consulting and technology firm that is dedicated to enhancing life and its quality. The core strength of ZS lies in its people, who work collectively to develop transformative solutions for patients, caregivers, and consumers worldwide. By adopting a client-first approach, ZS employees bring impactful results to every engagement by partnering closely with clients to design custom solutions and technological products that drive value and yield positive outcomes in key areas of their business. Your role at ZS will require you to bring inquisitiveness for learning, innovative ideas, courage, and dedication to make a life-changing impact. At ZS, the individuals are highly valued, recognizing both the visible and invisible facets of their identities, personal experiences, and belief systems. These elements shape the uniqueness of each individual and contribute to the diverse tapestry within ZS. ZS acknowledges and celebrates personal interests, identities, and the thirst for knowledge as integral components of success within the organization. Learn more about the diversity, equity, and inclusion initiatives at ZS, along with the networks that support ZS employees in fostering community spaces, accessing necessary resources for growth, and amplifying the messages they are passionate about. As an Architecture & Engineering Specialist specializing in ML Engineering at ZS's India Capability & Expertise Center (CEC), you will be part of a team that constitutes over 60% of ZS employees across three offices in New Delhi, Pune, and Bengaluru. The CEC plays a pivotal role in collaborating with colleagues from North America, Europe, and East Asia to deliver practical solutions to clients that drive the company's operations. Upholding standards of analytical, operational, and technological excellence, the CEC leverages collective knowledge to enable ZS teams to achieve superior outcomes for clients. Joining ZS's Scaled AI practice within the Architecture & Engineering Expertise Center will immerse you in a dynamic ecosystem focused on generating continuous business value for clients through innovative machine learning, deep learning, and engineering capabilities. In this role, you will collaborate with data scientists to craft cutting-edge AI models, develop and utilize advanced ML platforms, establish and implement sophisticated ML pipelines, and oversee the entire ML lifecycle. 
**Responsibilities:** - Design and implement technical features using best practices for the relevant technology stack - Collaborate with client-facing teams to grasp the solution context, contribute to technical requirement gathering and analysis - Work alongside technical architects to validate design and implementation strategies - Write production-ready code that is easily testable, comprehensible to other developers, and addresses edge cases and errors - Ensure top-notch quality deliverables by adhering to architecture/design guidelines, coding best practices, and engaging in periodic design/code reviews - Develop unit tests and higher-level tests to handle expected edge cases, errors, and optimal scenarios - Utilize bug tracking, code review, version control, and other tools for organizing and delivering work - Participate in scrum calls, agile ceremonies, and effectively communicate progress, issues, and dependencies - Contribute consistently by researching and evaluating the latest technologies, conducting proofs-of-concept, and creating prototype solutions - Aid the project architect in designing modules/components of the overall project/product architecture - Break down large features into estimable tasks, lead estimation, and defend them with clients - Independently implement complex features with minimal guidance, such as service or application-wide changes - Systematically troubleshoot code issues/bugs using stack traces, logs, monitoring tools, and other resources - Conduct code/script reviews of senior engineers within the team - Mentor and cultivate technical talent within the team **Requirements:** - Minimum 5+ years of hands-on experience in deploying and productionizing ML models at scale - Proficiency in scaling GenAI or similar applications to accommodate high user traffic, large datasets, and reduce response time - Strong expertise in developing RAG-based pipelines using frameworks like LangChain & LlamaIndex - Experience in crafting GenAI applications such as answering engines, extraction components, and content authoring - Expertise in designing, configuring, and utilizing ML Engineering platforms like Sagemaker, MLFlow, Kubeflow, or other relevant platforms - Familiarity with Big data technologies including Hive, Spark, Hadoop, and queuing systems like Apache Kafka/Rabbit MQ/AWS Kinesis - Ability to quickly adapt to new technologies, innovate in solution creation, and independently conduct POCs on emerging technologies - Proficiency in at least one Programming language such as PySpark, Python, Java, Scala, etc., and solid foundations in Data Structures - Hands-on experience in building metadata-driven, reusable design patterns for data pipeline, orchestration, and ingestion patterns (batch, real-time) - Experience in designing and implementing solutions on distributed computing and cloud services platforms (e.g., AWS, Azure, GCP) - Hands-on experience in constructing CI/CD pipelines and awareness of application monitoring practices **Additional Skills:** - AWS/Azure Solutions Architect certification with a comprehensive understanding of the broader AWS/Azure stack - Knowledge of DevOps CI/CD, data security, and experience in designing on cloud platforms - Willingness to travel to global offices as required to collaborate with clients or internal project teams **Perks & Benefits:** ZS provides a holistic total rewards package encompassing health and well-being, financial planning, annual leave, personal growth, and professional development. 
The organization offers robust skills development programs, various career progression options, internal mobility paths, and a collaborative culture that empowers individuals to thrive both independently and as part of a global team. ZS is committed to fostering a flexible and connected work environment that enables employees to combine work from home and on-site presence at clients/ZS offices for the majority of the week. This approach allows for the seamless integration of the ZS culture and innovative practices through planned and spontaneous face-to-face interactions. **Travel:** Travel is an essential aspect of working at ZS, especially for client-facing roles. Business needs dictate the priority for travel, and while some projects may be local, all client-facing employees should be prepared to travel as required. Travel opportunities provide avenues to strengthen client relationships, gain diverse experiences, and enhance professional growth through exposure to different environments and cultures. **Application Process:** Candidates must either possess or be able to obtain work authorization for their intended country of employment. To be considered, applicants must submit an online application, including a complete set of transcripts (official or unofficial). *Note: NO AGENCY CALLS, PLEASE.* For more information, visit [ZS Website](www.zs.com).,
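The requirements above call for hands-on experience with ML engineering platforms such as MLflow for managing the model lifecycle. A minimal sketch of experiment tracking is shown below, assuming a local MLflow setup; the dataset, model, and metric names are placeholders rather than anything specific to ZS projects:

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Toy data in place of a real feature pipeline.
X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("demo-classifier")

with mlflow.start_run():
    params = {"n_estimators": 100, "max_depth": 5}
    model = RandomForestClassifier(**params, random_state=42).fit(X_train, y_train)

    mlflow.log_params(params)                    # hyperparameters for this run
    acc = accuracy_score(y_test, model.predict(X_test))
    mlflow.log_metric("accuracy", acc)           # evaluation metric
    mlflow.sklearn.log_model(model, "model")     # serialized model artifact
```

Each run lands in the MLflow tracking store, which makes it straightforward to compare candidate models before promoting one to a registry or a deployment pipeline.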
Posted 1 week ago
12.0 - 16.0 years
0 Lacs
maharashtra
On-site
At Capgemini Engineering, the world leader in engineering services, a global team of engineers, scientists, and architects collaborate to empower the world's most innovative companies to reach their full potential. From cutting-edge technologies like autonomous cars to life-saving robots, our digital and software technology experts are known for their out-of-the-box thinking, providing unique R&D and engineering services across various industries. Embark on a career with us, where every day presents new opportunities to make a meaningful difference.

As a part of our team, you will be responsible for developing and implementing techniques and analytics applications that convert raw data into valuable insights using data-oriented programming languages and visualization software. Utilize data mining, data modeling, natural language processing, and machine learning to extract and analyze information from large datasets, both structured and unstructured. Visualize, interpret, and report data findings, and potentially create dynamic data reports.

We are currently seeking a highly experienced AI/ML Solution Architect to take the lead in designing and implementing scalable, enterprise-grade AI and machine learning solutions. The ideal candidate should possess a solid background in data science, cloud computing, and AI/ML frameworks, demonstrating the ability to translate business requirements into technical solutions effectively.

**Primary Responsibilities:**
- Design and architect end-to-end AI/ML solutions customized to meet specific business needs.
- Lead the development and deployment of machine learning, deep learning, and generative AI models.
- Collaborate with cross-functional teams, including data engineers, software developers, and business stakeholders.
- Ensure seamless integration of AI models into existing systems and workflows.
- Conduct architecture reviews, performance tuning, and scalability assessments.
- Stay abreast of the latest trends, tools, and technologies in the field of AI/ML.

**Primary Skills (Must-Have):**
- Strong expertise in AI/ML algorithms, encompassing deep learning, NLP, and computer vision.
- Proficiency in Python and ML libraries such as TensorFlow, PyTorch, Keras, and Scikit-learn.
- Experience with cloud platforms, particularly Azure (mandatory); familiarity with AWS/GCP is desirable.
- Hands-on experience with LLMs (e.g., Azure OpenAI, Mistral) and GenAI deployment.
- API development using FastAPI, Flask; integration with third-party tools (e.g., SharePoint, SNOW).
- Solid understanding of data security (at rest and in motion) and data flow diagrams.

**Secondary Skills (Good-to-Have):**
- UI development using Angular, Streamlit, HTML, CSS, and JavaScript.
- Experience with containerization tools like Docker, Azure Container Registry/Instance.
- Familiarity with monitoring tools such as Splunk and NewRelic.
- Knowledge of Agentic AI, RAG (Retrieval-Augmented Generation), and MFA (Multi-Factor Authentication).
- Cost estimation and infrastructure planning for AI deployments (on-prem/cloud).

**Qualifications:**
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 12-14 years of experience in AI/ML solution development and architecture.
- Excellent communication and stakeholder management skills.

Join Capgemini, a global partner in business and technology transformation, dedicated to accelerating organizations' transition to a digital and sustainable world while delivering tangible impact for enterprises and society.
With a diverse team of over 340,000 members across 50 countries, Capgemini is a trusted leader with a heritage spanning more than 55 years. Clients rely on Capgemini to unlock technology's value and address their entire business needs, offering end-to-end services and solutions with expertise in AI, generative AI, cloud, and data, supported by industry knowledge and a strong partner ecosystem.,
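Among the must-have skills in this posting is API development with FastAPI or Flask for exposing models and integrating with other systems. The following is a minimal, hedged sketch of such an endpoint — the request schema and the stand-in model are assumptions for illustration, not part of the role's actual stack:

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Scoring API (illustrative)")

class PredictionRequest(BaseModel):
    features: list[float]          # flat feature vector for one record

class PredictionResponse(BaseModel):
    score: float

def fake_model_predict(features: list[float]) -> float:
    """Stand-in for a real model; returns a dummy score."""
    return sum(features) / (len(features) or 1)

@app.post("/predict", response_model=PredictionResponse)
def predict(req: PredictionRequest) -> PredictionResponse:
    # In production this would call a loaded model (e.g., a pickled estimator
    # or an Azure-hosted endpoint) behind authentication and input validation.
    return PredictionResponse(score=fake_model_predict(req.features))

# Run locally with: uvicorn app:app --reload   (assuming this file is app.py)
```

In a real deployment the stand-in function would be replaced by a loaded model or a call to a hosted inference endpoint, with authentication, monitoring, and cost controls layered on top.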
Posted 1 week ago