
15,339 GCP Jobs - Page 43

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 - 9.0 years

6 - 7 Lacs

Gurugram

Work from Office


We are seeking a skilled and proactive GCP DevOps Engineer. The ideal candidate will have hands-on experience with Google Cloud Platform (GCP), Infrastructure as Code (IaC), CI/CD pipelines, container orchestration, and cloud security. You will play a crucial role in automating infrastructure, optimizing deployment workflows, and maintaining highly available and scalable systems.

Key Responsibilities:
- Design, implement, and manage scalable infrastructure on Google Cloud Platform (GCP).
- Develop and maintain CI/CD pipelines using tools like Jenkins, GitLab CI, Cloud Build, etc.
- Write and maintain Infrastructure as Code (IaC) using Terraform, Deployment Manager, or similar tools.
- Implement containerization and orchestration using Docker and Kubernetes (GKE).
- Monitor system performance and ensure system reliability and availability.
- Collaborate with development, QA, and security teams to streamline DevOps practices.
- Automate cloud operations and improve deployment and rollback processes.
- Set up and manage logging, monitoring, and alerting with tools such as Stackdriver (Cloud Operations), Prometheus, Grafana, etc.
- Ensure compliance with security best practices and manage IAM roles, policies, and service accounts.

Required Skills and Qualifications:
- 4+ years of experience in DevOps or Cloud Engineering.
- Proven hands-on experience with Google Cloud Platform (GCP) services.
- Strong knowledge of CI/CD tools, version control systems (Git), and DevOps methodologies.
- Experience with Terraform or Cloud Deployment Manager for IaC.
- Good understanding of Docker and Kubernetes (GKE preferred).
- Scripting skills in languages like Python, Bash, or Go.
- Familiarity with cloud security, networking, and load balancing on GCP.
- Knowledge of monitoring/logging tools (e.g., Stackdriver, Prometheus).
- Strong problem-solving and communication skills.
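Since the role lists Python/Bash scripting for cloud automation, here is a minimal, illustrative Python sketch of the kind of housekeeping script such work can involve: flagging Cloud Storage objects older than a retention window. The bucket name, prefix, and retention period are hypothetical, and it assumes the google-cloud-storage client library with default application credentials.

```python
from datetime import datetime, timezone

from google.cloud import storage  # pip install google-cloud-storage


def stale_objects(bucket_name: str, prefix: str, max_age_days: int = 30):
    """Yield object names under `prefix` older than `max_age_days`."""
    client = storage.Client()  # uses Application Default Credentials
    for blob in client.list_blobs(bucket_name, prefix=prefix):
        age = datetime.now(timezone.utc) - blob.time_created
        if age.days > max_age_days:
            yield blob.name


if __name__ == "__main__":
    # Hypothetical bucket/prefix; in practice these would come from config or flags.
    for name in stale_objects("my-build-artifacts", prefix="ci/", max_age_days=30):
        print("candidate for cleanup:", name)
```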

Posted 2 days ago

Apply

4.0 - 6.0 years

10 - 11 Lacs

Kolkata, Mumbai, New Delhi

Work from Office


We are looking for members with hands-on Data Engineering experience who will work on internal and customer-based projects for Bridgenext. We are looking for someone who cares about the quality of code and who is passionate about providing the best solution to meet client needs, anticipating their future needs based on an understanding of the market. Someone who has worked on Hadoop projects, including processing and data representation using various AWS services.

Must-Have Skills:
- 4+ years of experience in data engineering, with a focus on big data technologies (e.g., Spark, Kafka)
- 2+ years of Databricks experience is a must
- Strong understanding of data architecture, ETL processes, and data warehousing
- Proficiency in programming languages such as Python or Java
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and big data tools
- Excellent communication, interpersonal, and leadership skills
- Ability to work in a fast-paced environment and manage multiple priorities

Professional Skills:
- Solid written, verbal, and presentation communication skills
- Strong team and individual player
- Maintains composure during all types of situations and is collaborative by nature
- High standards of professionalism, consistently producing high-quality results
- Self-sufficient and independent, requiring very little supervision or intervention
- Demonstrates flexibility and openness to bring creative solutions to address issues
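A minimal PySpark sketch of the kind of batch transformation this role describes (Spark jobs over data landed in cloud object storage); the paths and column names are hypothetical, not a Bridgenext pipeline.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-daily-aggregate").getOrCreate()

# Hypothetical raw zone: CSV order events landed by an upstream ingestion job.
orders = (
    spark.read.option("header", True)
    .csv("s3://example-raw-zone/orders/")
    .withColumn("amount", F.col("amount").cast("double"))
    .withColumn("order_date", F.to_date("order_ts"))
)

# Aggregate to a curated, analytics-friendly table.
daily = orders.groupBy("order_date").agg(
    F.count("*").alias("order_count"),
    F.sum("amount").alias("total_amount"),
)

daily.write.mode("overwrite").parquet("s3://example-curated-zone/orders_daily/")
spark.stop()
```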

Posted 2 days ago

Apply

1.0 - 4.0 years

0 Lacs

Bengaluru

Remote


Job Summary: We are seeking a high-energy and ambitious Business Development Representative (BDR) to join our sales team. As a BDR, you will be responsible for generating leads, qualifying prospects, and helping drive revenue growth for our cloud-based SaaS and IT services. This is an excellent opportunity for a sales professional with a passion for technology and a knack for consultative selling.

Key Responsibilities:
- Prospect, identify, and qualify new sales opportunities through inbound and outbound efforts (calls, emails, LinkedIn, etc.)
- Work closely with the sales and marketing teams to develop and execute lead generation strategies
- Conduct discovery calls to understand client needs and pain points
- Schedule product demos and meetings for Account Executives
- Maintain accurate records in the CRM system and generate regular reports
- Nurture prospects through the sales pipeline with timely follow-ups
- Stay up-to-date with industry trends, SaaS products, and cloud technologies

Key Requirements:
- 1-4 years of experience in B2B SaaS or IT sales/business development
- Strong understanding of cloud services (AWS, Azure, GCP) and/or SaaS solutions
- Excellent communication, presentation, and interpersonal skills
- Self-motivated, goal-oriented, and a quick learner
- Experience with CRM tools (e.g., HubSpot, Salesforce, Zoho)
- Proven track record of generating and qualifying leads
- Bachelor's degree in Business, Marketing, IT, or a related field

Interested candidates can send their resumes to chaitra.br@ind.bloomsolutions.com

Posted 2 days ago

Apply

5.0 - 10.0 years

25 - 30 Lacs

Bengaluru

Work from Office


Hello. We're Haleon. A new world-leading consumer health company. Shaped by all who join us. Together, we're improving everyday health for billions of people. By growing and innovating our global portfolio of category-leading brands - including Sensodyne, Panadol, Advil, Voltaren, Theraflu, Otrivin, and Centrum - through a unique combination of deep human understanding and trusted science. What's more, we're achieving it in a company that we're in control of. In an environment that we're co-creating. And a culture that's uniquely ours. Care to join us. It isn't a question. With category-leading brands such as Sensodyne, Voltaren and Centrum, built on trusted science and human understanding, and combined with our passion, knowledge and expertise, we're uniquely placed to do this and to grow a strong, successful business. This is an exciting time to join us and help shape the future. It's an opportunity to be part of something special.

About the role: This is an excellent opportunity for a Senior Machine Learning Engineer to make a key contribution to a strategic digital programme expected to transform the integrated business planning process at Haleon. The successful candidate will lead the deployment of machine learning solutions that will transform our integrated business planning process. In this role, you will be responsible for designing and implementing end-to-end machine learning pipelines for various applications.

Key Responsibilities of the role:
- Lead the deployment of ML solutions on the enterprise infrastructure, in line with ML engineering standards and best practices (version control, testing, deployment, maintenance).
- Manage the entire lifecycle of data science/machine learning models, including monitoring, data gathering for retraining, and updates.
- Implement ML engineering best practices, such as coding standards, code reviews, and automated testing.
- Define and implement metrics to evaluate the functional performance and computational resource efficiency of ML and AI components.
- Coordinate and manage competing priorities across a portfolio of projects.
- Influence across organisations with proven collaboration skills; comfortable working with ambiguity; collaborate with cross-functional teams, including data scientists, data engineers, and business stakeholders.

People and Team management:
- Contribute to a highly collaborative team with a culture of ownership, initiative and responsibility.
- Contribute to the development of our team's ML engineering standards of reusable DS, ML, and AI assets.
- Motivate, coach, and mentor colleagues within the ML Engineering Team to develop technical excellence.
- Manage a team of up to 3 Machine Learning Engineers.

Necessary Qualifications & Skills:
- BSc, MSc or PhD degree in mathematics, computer science, or another scientific discipline that provides solid foundations on relevant aspects of Data Science.
- 5-10 years of industry experience with proven experience implementing machine learning engineering pipelines on large datasets.
- Strong understanding of ML engineering best practices.
- Strong collaboration skills and comfortable working with ambiguity, making quick, informed decisions considering trade-offs.
- Experience with deploying machine learning solutions on enterprise infrastructure.
- Strong programming skills in Python or similar programming languages.
- Ability to manage competing priorities across a portfolio of projects.

Highly desirable Qualifications & Skills:
- Strong experience with cloud-based infrastructure and distributed computing systems, such as Azure, AWS, or GCP.
- Experience with containerization and orchestration tools, such as Docker and Kubernetes.
- Understanding of software engineering principles, such as modular design, clean code, and testing.
- Familiarity with DevOps practices and tools, such as continuous integration and deployment (CI/CD) pipelines and configuration management tools like Ansible or Terraform.
- Excellent problem-solving skills and the ability to debug complex issues in production environments.
- Strong communication skills and the ability to collaborate with cross-functional teams.
- Familiarity with monitoring and logging tools, such as Grafana, Prometheus, and ELK Stack.
- Knowledge of security best practices in machine learning engineering.
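The responsibilities above centre on end-to-end ML pipelines and explicit evaluation metrics. As a small, generic illustration (not Haleon's actual stack), a scikit-learn pipeline keeps preprocessing and the model together as one tested, versionable unit, and the held-out metric is the kind of number you would log and monitor after deployment.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)  # stand-in for a real planning dataset
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Preprocessing and model travel together, so training and serving stay consistent.
pipeline = Pipeline([
    ("scale", StandardScaler()),
    ("model", LogisticRegression(max_iter=1000)),
])
pipeline.fit(X_train, y_train)

# A functional-performance metric of the kind the role asks you to define and track.
auc = roc_auc_score(y_test, pipeline.predict_proba(X_test)[:, 1])
print(f"hold-out ROC AUC: {auc:.3f}")
```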

Posted 2 days ago

Apply

2.0 - 4.0 years

8 - 12 Lacs

Bengaluru

Work from Office


Note: This is a hybrid role, combining remote and on-site work, requiring 3 days in the office, and relocation to Pune.

We are looking for a seasoned software engineer to join the team that owns building the Agent experience in support products. Someone who identifies as a JavaScript developer with good knowledge of frontend technologies and an interest in understanding back-end architecture for putting the holistic picture together. Ruby skills are a plus! Agent Workspace enables Zendesk Support agents to work seamlessly across Zendesk channels, all within a single ticket interface. It is a critical piece of how customers use Zendesk Support, the most important part of the overall product user experience, and fundamentally what makes our customers successful. As a team, we are a close-knit group that values inclusivity and diversity of backgrounds and opinions. We deliberately cultivate a highly collaborative and productive working style. The team has a proven history of developing highly reliable and scalable frontend and extending complex areas of the product. The features managed by our team power some of the critical capabilities in the support product. The work we do has a high impact on agents' efficiency!

What you get to do every day:
- Drive the modernization and evolution of our largest monolithic application by building rich, scalable, and performant frontend features using React, Redux, TypeScript, and GraphQL.
- Collaborate closely with product managers, designers, and backend engineers to deliver seamless user experiences that solve real customer problems.
- Write clean, maintainable, and well-tested code to implement new features and enhance existing ones within our frontend stack.
- Participate actively in code reviews, pair programming, and design discussions to uphold high-quality engineering standards.
- Identify and resolve frontend performance bottlenecks, ensuring fast, responsive, and accessible user interfaces.
- Evangelize best practices in frontend development, including state management, component architecture, and testing strategies.

What you bring to the role:
- 2 to 4+ years of professional experience focused on frontend development, ideally in SaaS or complex web application environments.
- Strong expertise in JavaScript and TypeScript, with deep knowledge of React and its ecosystem (Redux, React Router, hooks, etc.).
- Experience working with GraphQL APIs and integrating them effectively into frontend applications.
- Familiarity with modern frontend tooling and build systems (Webpack, Babel, ESLint, etc.).
- Solid understanding of responsive design, cross-browser compatibility, and accessibility standards.
- Passion for writing clear, maintainable code and producing thorough automated tests (unit, integration, end-to-end).
- Excellent communication skills, able to articulate technical concepts clearly to both technical and non-technical stakeholders.
- Bonus: Experience with Ember.js, cloud platforms (AWS, GCP), or containerization technologies is a plus but not required.
- Bonus: Contributions to open source frontend projects or active participation in frontend communities.

Posted 2 days ago

Apply

8.0 - 10.0 years

10 - 14 Lacs

Hyderabad

Work from Office


Software Engineering Advisor - HIH - Evernorth

About Evernorth: Evernorth Health Services, a division of The Cigna Group (NYSE: CI), creates pharmacy, care, and benefits solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention, and treatment of illness and disease more accessible to millions of people.

Position Summary: Data engineer on the Data Integration team.

Job Description & Responsibilities:
- Work with business and technical leadership to understand requirements.
- Design to the requirements and document the designs.
- Write product-grade, performant code for data extraction, transformation, and loading using Spark and PySpark.
- Do data modeling as needed for the requirements.
- Write performant queries using Teradata SQL, Hive SQL, and Spark SQL against Teradata and Hive.
- Implement DevOps pipelines to deploy code artifacts onto the designated platform/servers such as AWS, Azure, or GCP.
- Troubleshoot issues, provide effective solutions, and monitor jobs in the production environment.
- Participate in sprint planning sessions, refinement/story-grooming sessions, daily scrums, demos, and retrospectives.

Experience Required: Overall 8-10 years of experience.

Experience Desired:
- Strong development experience in Spark, PySpark, shell scripting, and Teradata.
- Strong experience in writing complex and effective SQL (Teradata SQL, Hive SQL, and Spark SQL) and stored procedures.
- Health care domain knowledge is a plus.

Primary Skills:
- Excellent work experience on Databricks for Data Lake implementations.
- Experience in Agile and working knowledge of DevOps tools (Git, Jenkins, Artifactory).
- Experience in AWS (S3, EC2, SNS, SQS, Lambda, ECS, Glue, IAM, and CloudWatch) / GCP / Azure.
- Databricks (Delta Lake, Notebooks, Pipelines, cluster management, Azure/AWS integration).

Additional Skills:
- Experience with Jira and Confluence.
- Exercises considerable creativity, foresight, and judgment in conceiving, planning, and delivering initiatives.

Location & Hours of Work: Hybrid, Hyderabad (11:30 AM - 8:30 PM).

Posted 2 days ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

Remote


Work on Real-World Problems with Global Tech Experts

Join a leading U.S.-based technology company as a Python Developer / AI Engineer, where you’ll tackle real-world challenges and build innovative solutions alongside top global experts. This is a fully remote, contract-based opportunity ideal for developers passionate about Python, data analysis, and AI-driven work.

Key Responsibilities:
- Write efficient, production-grade Python code to solve complex problems.
- Analyze public datasets and extract meaningful insights using Python and SQL.
- Collaborate with researchers and global teams to iterate on data-driven ideas.
- Document all code and development decisions in Jupyter Notebooks or similar platforms.
- Maintain high-quality standards and contribute to technical excellence.

Job Requirements:
- Open to all levels: junior, mid-level, or senior engineers.
- Degree in Computer Science, Engineering, or equivalent practical experience.
- Proficient in Python programming for scripting, automation, or backend development.
- Experience with SQL/NoSQL databases is a plus.
- Familiarity with cloud platforms (AWS, GCP, Azure) is advantageous.
- Must be able to work 5+ hours overlapping with Pacific Time (PST/PT).
- Strong communication and collaboration skills in a remote environment.

Perks & Benefits:
- Work on cutting-edge AI and data projects impacting real-world use cases.
- Collaborate with top minds from Meta, Stanford, and Google.
- 100% remote – work from anywhere.
- Contract role with flexibility and no traditional job constraints.
- Competitive compensation in USD, aligned with global tech standards.

Selection Process: Shortlisted developers may be asked to complete an assessment. If you clear the assessment, you will be contacted for contract assignments with expected start dates, durations, and end dates. Some contract assignments require fixed weekly hours, averaging 20/30/40 hours per week for the duration of the contract assignment.
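A small sketch of the "analyze public datasets with Python and SQL" part of the role, using pandas plus the standard-library sqlite3 module; the file name and columns are hypothetical.

```python
import sqlite3

import pandas as pd

# Hypothetical public dataset downloaded as CSV.
trips = pd.read_csv("city_trips.csv", parse_dates=["pickup_time"])

# Load into an in-memory SQLite database so insights can be expressed in SQL.
con = sqlite3.connect(":memory:")
trips.to_sql("trips", con, index=False)

monthly = pd.read_sql_query(
    """
    SELECT strftime('%Y-%m', pickup_time) AS month,
           COUNT(*)                       AS trip_count,
           ROUND(AVG(fare), 2)            AS avg_fare
    FROM trips
    GROUP BY month
    ORDER BY month
    """,
    con,
)
print(monthly.head())
con.close()
```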

Posted 2 days ago

Apply

10.0 - 18.0 years

30 - 40 Lacs

Bengaluru, Delhi / NCR, Mumbai (All Areas)

Hybrid


Role & responsibilities Job Description: We are seeking a talented and experienced BI Manager to join our dynamic team. The ideal candidate will have a strong background in BI tools, along with proficiency in SQL and ETL processes or tools Responsibilities: Work in an IC role or lead and mentor a team of BI developers, ensuring adherence to best practices in BI development and data visualization. Collaborate with stakeholders to define business requirements and design impactful dashboards using Power BI. Use Python to automate data extraction and transformation. Oversee ETL processes and integrate data from various sources into Power BI. Enhance Power BI reports with Python scripts for advanced analytics and custom calculations. Ensure strong data governance, quality, and user-friendly visualization practices. Communicate complex data concepts clearly to both technical and non-technical teams. Good documentation skills to deliver BRDs, architecture, specifications document, project plan etc. Lead the creation of RFPs for BI tools and solutions, working with vendors to select the best technology. Manage POV exercises to demonstrate the value of BI solutions and secure stakeholder buy-in. Develop MVPs to quickly deliver core BI functionalities, allowing for iterative feedback and faster delivery. Qualifications: 9+ years of experience in DWH/BI domain with a focus on Power BI development. Proficiency in at least two BI tools: Qlik, Power BI, Tableau. Showcase experience in 2-3 complete life cycle implementation of BI projects. Willingness and readiness to cross-skill in multiple BI tools and platforms. Experience with Power Platform (Power Apps, Power Automate), including building apps and automating workflows. Advanced SQL skills for querying and manipulating data from relational databases. Basic Python skills for ETL automation, with experience in Pandas, NumPy, and pyodbc. Proven experience leading BI teams and managing priorities. Experience in preparing RFPs, leading POVs, and delivering MVPs for BI solutions. Strong analytical and problem-solving skills, with attention to detail. Excellent communication and presentation skills. Good-to-Have: Knowledge of other BI tools and platforms. Experience working with cloud environments (e.g., AWS, Azure, GCP, Snowflake). Knowledge of Alteryx, Informatica or other ETL tools . Knowledge in embedding BI visualizations and reports into web applications. Exposure to big data tools like Hadoop or Spark. Familiarity with AI-powered assistant tools (e.g., GenAI copilot).
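For the "use Python to automate data extraction and transformation" responsibility above, a minimal sketch with pandas and pyodbc; the connection string, table, and output file are hypothetical, and in practice the transformed output would usually land in a warehouse table or dataset that Power BI refreshes.

```python
import pandas as pd
import pyodbc

# Hypothetical SQL Server source feeding a Power BI dataset.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sales-db.example.local;DATABASE=sales;Trusted_Connection=yes"
)

raw = pd.read_sql("SELECT region, order_date, amount FROM dbo.orders", conn)

# Transform: roll daily orders up to monthly totals per region.
monthly = (
    raw.assign(month=pd.to_datetime(raw["order_date"]).dt.to_period("M").astype(str))
       .groupby(["region", "month"], as_index=False)["amount"]
       .sum()
)

monthly.to_csv("monthly_sales_feed.csv", index=False)  # picked up by the BI refresh
conn.close()
```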

Posted 2 days ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Bison Global Search is seeking a Principal AI Engineer for a leading product company in Chennai. They work on cutting-edge technologies in the BIOS industry.

Role details:
- Location: Chennai (please do not apply if you are not willing to relocate to Chennai)
- Company: Product company (leader in BIOS products)
- Designation: Principal AI Engineer
- Skills required: Python + RAG (Retrieval-Augmented Generation) + Agentic AI
- Experience: 8+ years of experience as an AI engineer, plus guiding and mentoring a team of 3-5 AI engineers

Please find the complete JD below. If this interests you, please apply.

We are looking for a highly skilled Principal AI Engineer with deep expertise in Retrieval-Augmented Generation (RAG) and Agentic AI to lead our AI initiatives and drive innovation in FW and Data Center AI solutions. This role requires a strategic thinker who can design and deploy scalable AI architectures, integrate LLMs with retrieval-based techniques, and develop intelligent agentic systems that autonomously interact with data, APIs, and workflows. This role will lead the design and deployment of cutting-edge AI-driven solutions, focusing on LLMs for code synthesis, automated testing, and intelligent autonomous agents that enhance software development workflows, with strong technical expertise, strategic vision, and leadership to build and deploy AI-driven products that align with business goals.

Key Responsibilities:

AI Strategy and Leadership:
- Define and execute AI strategies focused on RAG-based retrieval, code generation, and AI-assisted software engineering.
- Work with stakeholders to align AI capabilities with business objectives and software development needs.
- Research and integrate cutting-edge LLMs and autonomous AI agent architectures into development processes.

RAG & Agentic AI Development:
- Develop RAG pipelines that enhance AI's ability to retrieve relevant knowledge and generate context-aware responses.
- Build and optimize agentic AI systems that can interact with APIs, databases, and development environments (such as LangChain, OpenAI APIs, etc.).
- Implement AI-powered search, chatbots, and decision-support tools for software engineers.
- Fine-tune LLMs (GPT, Llama, Mistral, Claude, Gemini, etc.) for domain-specific applications.
- Optimize retrieval mechanisms to enhance response accuracy, grounding AI outputs in real-world data.

Code Generation & Test Case Automation:
- Leverage LLMs to generate high-quality, production-ready code.
- Develop AI-driven test case generation tools that automatically create and validate unit tests, integration tests, and regression tests.
- Integrate AI-driven code assistants and programming agents into IDE and CI/CD workflows.
- Optimize prompt engineering and fine-tuning strategies for LLMs to improve code quality and efficiency.

MLOps & Scalable AI Systems:
- Architect and deploy scalable AI models and retrieval pipelines using cloud-based MLOps pipelines (AWS/GCP/Azure, Docker, Kubernetes).
- Optimize LLMs for real-time AI inferencing, ensuring low-latency and high-performance AI solutions.

Collaboration: Work cross-functionally with product teams, software engineers, and business stakeholders to integrate AI solutions into products.

Mentorship: Guide and mentor a team of 3-5 AI engineers in LLM fine-tuning, retrieval augmentation, and autonomous AI agents. Establish best practices for AI-assisted software development, secure AI integration, and bias mitigation.

Research & Innovation: Commitment to staying updated with the latest AI and machine learning research and advancements. Ability to think creatively and propose innovative solutions to complex problems.

Model Development: Ability to design, train, and evaluate various AI models, including LLMs and standalone models; familiarity with model training tools and frameworks like Hugging Face Trainer, Fairseq, etc.

Required Qualifications:
- Education: Master's or Ph.D. in Computer Science, AI, Machine Learning, or a related field.
- Experience: 8+ years of experience in AI and machine learning, with at least 2 years of experience working on LLMs, code generation, RAG, or AI-powered automation.

Technical skills:
- Proficiency in Python, TensorFlow, PyTorch, and LangChain.
- Experience with LLM fine-tuning for code generation.
- Strong expertise in vector databases (FAISS, Weaviate, Chroma, Pinecone, Milvus) and retrieval models.
- Hands-on experience with AI-powered code assistants (Copilot, Code Llama, Codex, GPT-4).
- Knowledge of automated software testing, AI-driven test case generation, and AI-assisted debugging.
- Experience with multi-agent AI systems (LangGraph, CrewAI, AutoGen, OpenAI Assistants API) for autonomous coding tasks.
- Knowledge of GoLang for building high-performance, scalable components and unit test case generation using CMocka is a plus.
- Hands-on model development, working with business stakeholders to define KPIs and deliver multi-modal (text and image) and ensemble models.
- Develop novel approaches to solve firmware lifecycle management code generation and customer support issues.
- Implement advanced natural language processing and computer vision models to extract insights from diverse data sources, user-generated data, and images.
- Automate model lifecycle management.
- Stay updated with AI and machine learning technology advancements to drive firmware lifecycle management.

Analytical & Problem-Solving:
- Analytical Thinking: Strong analytical skills to interpret complex data and derive actionable insights.
- Problem-Solving: Ability to troubleshoot and resolve technical issues related to AI models and systems.

Research & Innovation:
- Continuous Learning: Commitment to staying updated with the latest research and advancements in AI and machine learning.
- Innovation: Ability to think creatively and propose innovative solutions to complex problems.

Soft Skills:
- Communication: Excellent verbal and written communication skills.
- Adaptability: Ability to adapt to changing technologies and project requirements.
- Team Player: Strong interpersonal skills and the ability to work well in a team environment.

Preferred Qualifications:
- Experience with deploying and maintaining AI models in production environments.
- Familiarity with RAG-specific techniques like knowledge distillation or multi-hop retrieval.
- Understanding of reinforcement learning and active learning techniques for model improvement.
- Previous experience with large-scale NLP systems and AI-powered search engines.
- Contributions to AI research, patents, or open-source development.
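A compact sketch of the retrieval half of a RAG pipeline of the kind described above, using sentence-transformers for embeddings and FAISS as the vector index (FAISS appears in the JD's vector-database list); the document chunks, model name, and the final LLM call are placeholders, not the company's actual stack.

```python
import faiss
from sentence_transformers import SentenceTransformer

docs = [
    "BIOS updates are staged through the firmware update agent.",
    "Rollback is supported by keeping the previous firmware image in a backup slot.",
    "Thermal policies are configured per chassis profile.",
]  # placeholder knowledge-base chunks

model = SentenceTransformer("all-MiniLM-L6-v2")
emb = model.encode(docs, convert_to_numpy=True, normalize_embeddings=True)

index = faiss.IndexFlatIP(emb.shape[1])  # inner product == cosine on normalized vectors
index.add(emb)

def retrieve(query: str, k: int = 2) -> list[str]:
    q = model.encode([query], convert_to_numpy=True, normalize_embeddings=True)
    _, ids = index.search(q, k)
    return [docs[i] for i in ids[0]]

question = "How do I roll back a BIOS update?"
context = "\n".join(retrieve(question))
prompt = f"Answer using only the context below.\n\nContext:\n{context}\n\nQuestion: {question}"
# `prompt` would then go to whichever LLM the team standardizes on (GPT, Llama, etc.).
```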

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Dear Candidate,

Greetings from Vinga Software Solutions!

Product details: Bautomate is an Intelligent Business Process Automation Software - a comprehensive hyperautomation platform housed within a single software system, and an AI-powered process automation solution. Vinga Software Solutions is the parent company of the Bautomate product (https://www.vinga.biz/about/), and you'll be in the role of Vinga Software Solutions.

About The Product: Bautomate offers cognitive automation solutions designed to automate repetitive tasks, eliminate bottlenecks, and enable seamless workflows. The product combines artificial intelligence, business process management, robotic process automation (RPA), and optical character recognition (OCR) to streamline and optimize various business processes. It provides a transformative solution to empower businesses of all sizes across industries to achieve unprecedented levels of productivity and success.

Unique features of Bautomate's business process automation solutions include:
- Workflow Automation: Bautomate's intuitive drag-and-drop interface enables users to easily automate complex workflows, leveraging pre-built components for effective intelligent automation.
- Data Integration: Seamless integration with all existing systems and applications ensures smooth data transfer and real-time information exchange, enhancing collaboration and decision-making.
- Intelligent Analytics: By harnessing advanced analytics capabilities, businesses can gain valuable insights into their processes, identify areas for improvement, and make data-driven decisions. It allows organizations to optimize their operations and drive growth based on comprehensive data analysis.
- Cognitive Automation: Our comprehensive solution encompasses Intelligent Document Capture utilizing OCR and NLP, Predictive Analytics for Forecasting, Computer Vision and Image Processing, Anomaly Detection, and an Intelligent Decision Engine.
- Scalability and Flexibility: The Bautomate platform is highly scalable, accommodating the evolving needs of businesses as they grow. It offers flexible deployment options, including on-premises and cloud-based solutions.

About Us: We are a leading provider of business process automation, helping firms streamline operations, boost efficiency, and spur growth. Our suite includes AP automation, purchase order automation, P2P automation, invoice automation, IVR testing automation, forms, etc.

AI/ML Developer – Lead (LLM & Gen AI)
Experience Required: 5 to 9 years
Job Location: Madurai

Role Overview: We are looking for a Senior AI/ML Developer with expertise in Large Language Models (LLMs) and Generative AI. The ideal candidate should have experience in developing and deploying AI-driven solutions.

Key Responsibilities:
- Design and develop AI/ML models focusing on LLMs and Generative AI.
- Collaborate with data scientists to optimize model performance.
- Deploy AI solutions on cloud platforms (AWS, GCP, Azure).
- Lead AI projects and mentor junior developers.

Required Skills:
- Expertise in LLMs, Gen AI, NLP, and Deep Learning.
- Strong coding skills in Python, TensorFlow, PyTorch.
- Experience in ML model deployment using Docker/Kubernetes.
- Knowledge of cloud-based AI/ML services.

Please fill in these details when applying:
- Total work experience
- Experience in AI/ML
- Experience in LLM & Generative AI
- Experience in NLP, deep learning, Python/PyTorch/TensorFlow
- Current CTC
- Expected CTC
- Last working day / notice period
- Current location
- Native place
- Reason for job change
- Marital status
- Do you have any offer in hand?

Skills: Machine Learning (ML), Artificial Intelligence (AI), Generative AI, Python, Large Language Models (LLM) tuning, Deep Learning and Natural Language Processing (NLP)

Posted 2 days ago

Apply

0 years

0 Lacs

Delhi, India

Remote


Work on Real-World Problems with Global Tech Experts

Join a leading U.S.-based technology company as a Python Developer / AI Engineer, where you’ll tackle real-world challenges and build innovative solutions alongside top global experts. This is a fully remote, contract-based opportunity ideal for developers passionate about Python, data analysis, and AI-driven work.

Key Responsibilities:
- Write efficient, production-grade Python code to solve complex problems.
- Analyze public datasets and extract meaningful insights using Python and SQL.
- Collaborate with researchers and global teams to iterate on data-driven ideas.
- Document all code and development decisions in Jupyter Notebooks or similar platforms.
- Maintain high-quality standards and contribute to technical excellence.

Job Requirements:
- Open to all levels: junior, mid-level, or senior engineers.
- Degree in Computer Science, Engineering, or equivalent practical experience.
- Proficient in Python programming for scripting, automation, or backend development.
- Experience with SQL/NoSQL databases is a plus.
- Familiarity with cloud platforms (AWS, GCP, Azure) is advantageous.
- Must be able to work 5+ hours overlapping with Pacific Time (PST/PT).
- Strong communication and collaboration skills in a remote environment.

Perks & Benefits:
- Work on cutting-edge AI and data projects impacting real-world use cases.
- Collaborate with top minds from Meta, Stanford, and Google.
- 100% remote – work from anywhere.
- Contract role with flexibility and no traditional job constraints.
- Competitive compensation in USD, aligned with global tech standards.

Selection Process: Shortlisted developers may be asked to complete an assessment. If you clear the assessment, you will be contacted for contract assignments with expected start dates, durations, and end dates. Some contract assignments require fixed weekly hours, averaging 20/30/40 hours per week for the duration of the contract assignment.

Posted 2 days ago

Apply

0.0 - 2.0 years

0 Lacs

Mohali, Punjab

On-site


About Us Cywarden is a US-based cybersecurity company with a global presence, focused on delivering advanced security solutions to organizations across industries. We’re a team of passionate professionals committed to securing digital ecosystems with cutting-edge cloud, network, and application security strategies. Job Summary We are seeking a skilled and motivated Cloud Security Engineer with 2-3 years of experience to join our growing team. The ideal candidate should have hands-on experience in securing cloud infrastructure (AWS/Azure/GCP), monitoring threats, and implementing security best practices across cloud environments. Key Responsibilities Implement and manage cloud security measures to protect infrastructure, data, and applications. Monitor cloud environments for threats, vulnerabilities, and suspicious activities. Work with DevOps and engineering teams to integrate security in CI/CD pipelines. Perform regular cloud security assessments and audits. Respond to cloud-related security incidents and support investigation processes. Maintain up-to-date knowledge of cloud security trends and compliance standards. Assist in implementing identity and access management (IAM) policies and controls. Required Skills & Qualifications Bachelor’s degree in Computer Science, Information Security, or related field. 1–2 years of experience in a cloud security or cloud engineering role. Practical knowledge of AWS, Azure, or GCP cloud platforms. Familiarity with tools such as CloudTrail, GuardDuty, Security Hub, or similar. Understanding of IAM, VPC, encryption, and key management. Basic scripting skills (Python, Bash, etc.) are a plus. Good communication and documentation skills. Preferred Qualifications Certifications like AWS Certified Security – Specialty, Microsoft Certified: Security, or equivalent. Experience with container security (e.g., Docker, Kubernetes). Exposure to compliance frameworks (e.g., SOC 2, ISO 27001, NIST). Job Types: Full-time, Permanent Pay: ₹362,847.29 - ₹1,213,956.83 per year Schedule: Night shift Location: Mohali, Punjab (Required) Shift availability: Night Shift (Required) Work Location: In person
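Since the posting lists basic Python scripting as a plus alongside IAM and AWS-native tooling, here is a small, illustrative audit script that flags IAM users with no MFA device, using boto3; it assumes AWS credentials come from the environment, and any real assessment would add more checks before acting on the output.

```python
import boto3

iam = boto3.client("iam")

users_without_mfa = []
for page in iam.get_paginator("list_users").paginate():
    for user in page["Users"]:
        devices = iam.list_mfa_devices(UserName=user["UserName"])["MFADevices"]
        if not devices:
            users_without_mfa.append(user["UserName"])

# In a real assessment you would also confirm each user actually has console
# access (a login profile) before flagging, and feed results into a report.
print("IAM users with no MFA device:", users_without_mfa)
```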

Posted 2 days ago

Apply

3.0 - 5.0 years

0 Lacs

Udaipur, Rajasthan, India

Remote


At GKM IT , we believe great technology is built on clean logic, smart architecture, and collaboration that just clicks. We're hiring a Node.js Engineer - Senior I to help us power scalable, secure, and high-performing backend systems. If you enjoy building the brains behind brilliant applications, love writing elegant code, and thrive in cross-functional teams—this is your stage. You’ll be responsible for developing core server-side logic, integrating with front-end systems, and optimising for speed, security, and scalability. Requirements 3 to 5 years of hands-on experience with Node.js backend development Solid experience building and maintaining RESTful APIs and data-driven applications Design, develop, and maintain robust server-side application logic Collaborate with front-end developers to deliver seamless, end-to-end features Write clean, efficient, and testable code that’s easy to understand and maintain Optimize applications for speed, scalability, and low-latency performance Ensure data protection and application security best practices are followed Design and integrate data storage solutions that are reliable and efficient Strong knowledge of JavaScript and TypeScript Deep understanding of Node.js architecture and asynchronous programming Familiarity with HTML5, CSS3, and front-end/backend integration Hands-on experience with npm, package management, and related tools Experience with implementing authentication, authorization, and permission controls Knowledge of integrating various databases, data sources, and output methods for mobile/desktop Ability to build scalable, maintainable Node.js applications Proficient with Git and version control best practices A strong sense of product ownership and the ability to think beyond code Experience with containerization tools (Docker, Kubernetes) Familiarity with GraphQL or WebSockets Understanding of CI/CD pipelines and cloud deployment (AWS/GCP/Azure) Contributions to open-source Node.js libraries or frameworks & prior work on high-traffic or enterprise-grade applications Benefits We don’t just hire employees—we invest in people. At GKM IT, we’ve designed a benefits experience that’s thoughtful, supportive, and actually useful. Here’s what you can look forward to: Top-Tier Work Setup You’ll be equipped with a premium MacBook and all the accessories you need. Great tools make great work. Flexible Schedules & Remote Support Life isn’t 9-to-5. Enjoy flexible working hours, emergency work-from-home days, and utility support that makes remote life easier. Quarterly Performance Bonuses We don’t believe in waiting a whole year to celebrate your success. Perform well, and you’ll see it in your pay check—quarterly. Learning is Funded Here Conferences, courses, certifications—if it helps you grow, we’ve got your back. We even offer a dedicated educational allowance. Family-First Culture Your loved ones matter to us too. From birthday and anniversary vouchers (Amazon, BookMyShow) to maternity and paternity leaves—we’re here for life outside work. Celebrations & Gifting, The GKM IT Way Onboarding hampers, festive goodies (Diwali, Holi, New Year), and company anniversary surprises—it’s always celebration season here. Team Bonding Moments We love food, and we love people. Quarterly lunches, dinners, and fun company retreats help us stay connected beyond the screen. Healthcare That Has You Covered Enjoy comprehensive health insurance for you and your family—because peace of mind shouldn’t be optional. 
Extra Rewards for Extra Effort: Weekend work doesn’t go unnoticed, and great referrals don’t go unrewarded. From incentives to bonuses—you’ll feel appreciated.

Posted 2 days ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


About us: As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values different voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful.

Overview about TII: At Target, we have a timeless purpose and a proven strategy. And that hasn’t happened by accident. Some of the best minds from different backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target’s global team and has more than 4,000 team members supporting the company’s global strategy.

Pyramid overview: Roundel is Target’s entry into the media business with an impact of $1B+; an advertising sell-side business built on the principles of first-party (people-based) data, brand-safe content environments and proof that our marketing programs drive business results for our clients. We are here to drive business growth for our clients and redefine “value” in the industry by solving core industry challenges vs. copying current industry methods of operation. Roundel is a key growth initiative for Target and leads the industry to a better way of operating within the media marketplace. Target Tech is on a mission to offer the systems, tools and support that our clients, guests and team members need and deserve. We drive industry-leading technologies in support of every angle of the business, and help ensure that Target operates smoothly, securely, and reliably from the inside out.

Role Overview: As a Senior Engineer, you serve as a specialist in the engineering team that supports the product. You help develop and gain insight into the application architecture. You can distill an abstract architecture into concrete design and influence the implementation. You show expertise in applying the appropriate software engineering patterns to build robust and scalable systems. You are an expert in programming and apply your skills in developing the product. You have the skills to design and implement the architecture on your own, but choose to influence your fellow engineers by proposing software designs and providing feedback on software designs and/or implementations. You show good problem-solving skills and can help the team in triaging operational issues. You leverage your expertise in eliminating repeat occurrences.

We are looking for a highly skilled and motivated Senior Backend Developer with deep expertise in Java or Kotlin and modern backend technologies. You will be responsible for designing, building, and maintaining scalable backend systems that power our platform. If you're passionate about building high-performance APIs, optimizing data flow, and working with large-scale systems, we'd love to meet you.

Key Responsibilities:
- Design, develop, and maintain robust backend services using Java or Kotlin and the Spring Framework (Spring Boot, Spring Data, etc.).
- Develop RESTful APIs and backend components that are secure, scalable, and performant.
- Work with both SQL (e.g., PostgreSQL, MySQL) and NoSQL (e.g., MongoDB, Cassandra) databases.
- Work with Kafka and Kafka Streams.
- Integrate and optimize Elasticsearch for advanced search functionality.
- Write clean, maintainable, and testable code with proper documentation.
- Participate in system design, architecture discussions, and code reviews.
- Collaborate with product managers, frontend developers, and QA engineers to deliver seamless features.
- Ensure system reliability and performance tuning, and monitor services in production.
- Follow DevOps and CI/CD best practices.

Required Qualifications:
- 5+ years of backend development experience.
- Strong programming skills in Java or Kotlin.
- Deep understanding of the Spring ecosystem (Spring Boot, Spring Security, etc.).
- Solid experience in working with both relational and non-relational databases.
- Experience implementing Elasticsearch in production systems.
- Proficiency in designing and consuming RESTful APIs.
- Experience with microservices architecture and distributed systems.
- Strong problem-solving and debugging skills.
- Familiarity with version control tools like Git and CI/CD tools (Jenkins, GitHub Actions, etc.).

Good to Have:
- Experience with containerization and orchestration (Docker, Kubernetes).
- Exposure to cloud platforms (GCP, AWS, Azure).

Useful Links:
- Life at Target: https://india.target.com/
- Benefits: https://india.target.com/life-at-target/workplace/benefits
- Culture: https://india.target.com/life-at-target/belonging

Posted 2 days ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Pune, Gurugram

Work from Office


In one sentence: We are seeking a highly skilled and adaptable Senior Python Developer to join our fast-paced and dynamic team. The ideal candidate is a hands-on technologist with deep expertise in Python and a strong background in data engineering, cloud platforms, and modern development practices. You will play a key role in building scalable, high-performance applications and data pipelines that power critical business functions. You will be instrumental in designing and developing high-performance data pipelines from relational to graph databases, and leveraging Agentic AI for orchestration. You'll also define APIs using AWS Lambda and containerised services on AWS ECS. Join us on an exciting journey where you'll work with cutting-edge technologies including Generative AI, Agentic AI, and modern cloud-native architectures, while continuously learning and growing alongside a passionate team.

What will your job look like?

Key Attributes - Adaptability & Agility:
- Thrive in a fast-paced, ever-evolving environment with shifting priorities.
- Demonstrated ability to quickly learn and integrate new technologies and frameworks.
- Strong problem-solving mindset with the ability to juggle multiple priorities effectively.

Core Responsibilities:
- Design, develop, test, and maintain robust Python applications and data pipelines using Python/PySpark.
- Define and implement smart data pipelines from RDBMS to graph databases.
- Build and expose APIs using AWS Lambda and ECS-based microservices.
- Collaborate with cross-functional teams to define, design, and deliver new features.
- Write clean, efficient, and scalable code following best practices.
- Troubleshoot, debug, and optimise applications for performance and reliability.
- Contribute to the setup and maintenance of CI/CD pipelines and deployment workflows if required.
- Ensure security, compliance, and observability across all development activities.

All you need is...

Required Skills & Experience:
- Expert-level proficiency in Python with a strong grasp of object-oriented and functional programming.
- Solid experience with SQL and graph databases (e.g., Neo4j, Amazon Neptune).
- Hands-on experience with cloud platforms; AWS and/or Azure is a must.
- Proficiency in PySpark or similar data ingestion and processing frameworks.
- Familiarity with DevOps tools such as Docker, Kubernetes, Jenkins, and Git.
- Strong understanding of CI/CD, version control, and agile development practices.
- Excellent communication and collaboration skills.

Desirable Skills:
- Experience with Agentic AI, machine learning, or LLM-based systems.
- Familiarity with Apache Iceberg or similar modern data lakehouse formats.
- Knowledge of Infrastructure as Code (IaC) tools like Terraform or Ansible.
- Understanding of microservices architecture and distributed systems.
- Exposure to observability tools (e.g., Prometheus, Grafana, ELK stack).
- Experience working in Agile/Scrum environments.

Minimum Qualifications:
- 6 to 8 years of hands-on experience in Python development and data engineering.
- Demonstrated success in delivering production-grade software and scalable data solutions.
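An illustrative, minimal sketch of the "relational to graph" pipeline idea mentioned above: reading rows with pandas/SQLAlchemy and upserting nodes and relationships into Neo4j with the official Python driver. Connection strings, the table, and node labels are hypothetical, and a production pipeline would batch the writes (for example with UNWIND) rather than merging row by row.

```python
import pandas as pd
import sqlalchemy
from neo4j import GraphDatabase

# Hypothetical relational source.
engine = sqlalchemy.create_engine("postgresql://user:pass@rdbms-host/crm")
rows = pd.read_sql("SELECT customer_id, account_id FROM accounts", engine)

# Hypothetical graph target.
driver = GraphDatabase.driver("bolt://graph-host:7687", auth=("neo4j", "password"))

with driver.session() as session:
    for r in rows.itertuples(index=False):
        session.run(
            "MERGE (c:Customer {id: $cid}) "
            "MERGE (a:Account {id: $aid}) "
            "MERGE (c)-[:OWNS]->(a)",
            cid=r.customer_id,
            aid=r.account_id,
        )

driver.close()
```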

Posted 2 days ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Mumbai

Work from Office


Responsibilities : Manipulate and preprocess structured and unstructured data to prepare datasets for analysis and model training. Utilize Python libraries like PyTorch, Pandas, and NumPy for data analysis, model development, and implementation. Fine-tune large language models (LLMs) to meet specific use cases and enterprise requirements. Collaborate with cross-functional teams to experiment with AI/ML models and iterate quickly on prototypes. Optimize workflows to ensure fast experimentation and deployment of models to production environments. Implement containerization and basic Docker workflows to streamline deployment processes. Write clean, efficient, and production-ready Python code for scalable AI solutions. Good to Have: Exposure to cloud platforms like AWS, Azure, or GCP. Knowledge of MLOps principles and tools. Basic understanding of enterprise Knowledge Management Systems. Ability to work against tight deadlines. Ability to work on unstructured projects independently. Strong initiative and self-motivated Strong Communication & Collaboration acumen. Required Skills: Proficiency in Python with strong skills in libraries like PyTorch, Pandas, and NumPy. Experience in handling both structured and unstructured datasets. Familiarity with fine-tuning LLMs and understanding of modern NLP techniques. Basics of Docker and containerization principles. Demonstrated ability to experiment, iterate, and deploy code rapidly in a production setting. Strong problem-solving mindset with attention to detail. Ability to learn and adapt quickly in a fast-paced, dynamic environment.
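For the "fine-tune LLMs to meet specific use cases" responsibility above, a deliberately small Hugging Face sketch (classification head on DistilBERT over a public dataset) showing the Trainer workflow; the dataset, base model, and hyperparameters are placeholders rather than a recommended setup.

```python
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model_name = "distilbert-base-uncased"  # placeholder base model
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Public stand-in for an enterprise dataset; small slices keep the demo quick.
ds = load_dataset("imdb")
ds = ds.map(lambda b: tokenizer(b["text"], truncation=True, max_length=256), batched=True)

model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="finetune-out", num_train_epochs=1,
                           per_device_train_batch_size=16),
    train_dataset=ds["train"].shuffle(seed=42).select(range(2000)),
    eval_dataset=ds["test"].select(range(500)),
    tokenizer=tokenizer,  # lets Trainer pad batches dynamically
)
trainer.train()
print(trainer.evaluate())
```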

Posted 2 days ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Title: Senior Data Scientist
Location: Hyderabad, India
Experience Level: 3+ years

Job Summary: We are seeking a highly skilled Senior Data Scientist to join our dynamic team. The ideal candidate will have expertise in Machine Learning (ML), Deep Learning (DL), Natural Language Processing (NLP), Computer Vision, Transformers, Retrieval-Augmented Generation (RAG), and Generative AI (LLMs, Diffusion Models, etc.). Additionally, proficiency in Prompt Engineering is required. The role also demands secondary skills in Django, Database Management (MySQL, NoSQL), and REST API development.

Required Skills & Qualifications:
- Programming: Proficiency in Python.
- Machine Learning & Deep Learning: Strong experience in developing and deploying ML/DL models.
- NLP: Expertise in working with NLP frameworks such as Hugging Face Transformers, SpaCy, NLTK, etc.
- Computer Vision: Experience with OpenCV, TensorFlow, PyTorch, or other relevant frameworks.
- Generative AI: Hands-on experience with LLMs, diffusion models, GANs, VAEs, and other generative techniques.
- Transformers & RAG: Strong understanding of transformer architectures (BERT, GPT, LLaMA, etc.) and retrieval-augmented generation methods.
- Prompt Engineering: Ability to design, optimize, and implement effective prompts for AI applications.
- Django: Experience in building and maintaining backend services and APIs.
- Database Management: Proficiency in MySQL and NoSQL databases for efficient data handling.
- REST API Development: Experience in developing and integrating APIs for AI-powered applications.
- Cloud & Deployment: Knowledge of cloud platforms (AWS, GCP, Azure) and deployment tools like Docker & Kubernetes.

Skills: Natural Language Processing (NLP), Deep Learning, Large Language Models (LLM) tuning and Computer Vision
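As a tiny illustration of the Hugging Face Transformers experience the posting asks for, the pipeline API wraps a pretrained model behind a single call; the models downloaded here are library defaults, not a project-specific choice, and the example sentences are invented.

```python
from transformers import pipeline

# Text classification with a default pretrained sentiment model.
classifier = pipeline("sentiment-analysis")
print(classifier(["The claim was settled quickly.",
                  "Support never replied to my ticket."]))

# Named entity recognition, with word pieces grouped into whole entities.
ner = pipeline("ner", aggregation_strategy="simple")
print(ner("Send the quarterly report to Priya at the Hyderabad office by Friday."))
```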

Posted 2 days ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Requirements:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Minimum of 3 years of professional experience in developing and deploying AI/ML solutions.
- Strong proficiency in Python and relevant libraries such as TensorFlow, PyTorch, scikit-learn, and Transformers.
- Hands-on experience with at least one Python web framework (FastAPI or Flask) for building APIs and backend services.
- Solid understanding of machine learning algorithms, deep learning architectures, and GenAI concepts.
- Experience with containerization technologies (Docker) and orchestration platforms (Kubernetes).
- Proven experience working with the Azure cloud platform and its AI/ML and container services.
- Familiarity with data engineering concepts and tools for data processing and preparation.
- Experience with CI/CD pipelines and DevOps practices.
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration skills.
- Ability to work independently and as part of a team.

Preferred Qualifications:
- Experience with MLOps practices and tools for managing the ML lifecycle.
- Familiarity with other cloud platforms (e.g., AWS, GCP).
- Experience with specific AI/ML application domains (e.g., NLP, computer vision, time series analysis).
- Contributions to open-source AI/ML projects.
- Relevant certifications in AI/ML or cloud technologies.

Skills: Python, Large Language Models (LLM), TensorFlow, Docker, CI/CD and Natural Language Processing (NLP)
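A minimal FastAPI sketch of the "Python web framework for APIs and backend services" requirement, serving predictions from a previously trained scikit-learn model; the model file and feature shape are hypothetical, the code assumes a numeric target, and such a service would typically be packaged with Docker and deployed to Kubernetes or an Azure container service.

```python
import joblib
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="model-service")
model = joblib.load("model.pkl")  # hypothetical pre-trained scikit-learn estimator


class Features(BaseModel):
    values: list[float]  # one flat feature vector


@app.post("/predict")
def predict(payload: Features):
    prediction = model.predict([payload.values])[0]
    return {"prediction": float(prediction)}  # assumes a numeric label/target

# Local run: uvicorn main:app --host 0.0.0.0 --port 8000
```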

Posted 2 days ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Chennai

Work from Office


Skills: SDLC, PL/SQL, Solution Architecture, Java, Unix, .NET, SOA, Microsoft SQL Server, SQL, ASP.NET
Education Qualification: No data available
Certification: No data available

Role: Data Scientist
Educational Qualification: ME/BE/MCA
Experience Required: 4+ years
Shifts: Day shift

Skills & Responsibilities:
- Experience: 4+ years in machine learning, deep learning, or generative AI.
- Programming & Frameworks: Strong Python scripting with Pandas, NumPy, and OOP concepts. Experience with PyTorch, TensorFlow, Keras, Hugging Face Transformers. Proficiency in SQL queries when needed.
- Generative AI & NLP: Experience with LLMs, GANs, VAEs, Diffusion Models. Familiarity with OpenAI GPT, DALL-E, Stable Diffusion. Deep knowledge of NLP techniques and deep learning architectures (RNN, CNN, LSTM, GRU).
- Machine Learning & Statistics: Understanding of ML/DL algorithms, statistical analysis, and feature engineering. Theoretical knowledge of Random Forest, SVM, Boosting, Bagging, Regression (Linear & Logistic), and Unsupervised Learning.
- MLOps & Deployment: Familiarity with cloud platforms (AWS, Azure, GCP). Experience in MLOps, CI/CD, Docker, Kubernetes. Comfortable with Linux systems and GPU-based deep learning.
- Research & Ethics: Contributions to AI research, open-source projects, or Kaggle competitions. Awareness of AI ethics, bias mitigation, and model interpretability.
- Soft Skills & Work Environment: Ability to work independently and deliver results. Experience in an agile development environment. Knowledge of computer vision is a plus.

Posted 2 days ago

Apply

0 years

0 Lacs

India

On-site


Key Responsibilities:
- Design, develop, and deploy machine learning models for prediction, recommendation, anomaly detection, NLP, or image processing tasks.
- Work with large, complex datasets to extract insights and build scalable solutions.
- Collaborate with data engineers to create efficient data pipelines and feature engineering workflows.
- Evaluate model performance using appropriate metrics and improve models through iterative testing and tuning.
- Communicate findings, insights, and model outputs clearly to non-technical stakeholders.
- Stay up to date with the latest machine learning research, frameworks, and technologies.

Required Skills:
- Strong programming skills in Python (Pandas, NumPy, Scikit-learn, etc.).
- Hands-on experience with ML/DL frameworks like TensorFlow, PyTorch, XGBoost, or LightGBM.
- Experience in building, deploying, and maintaining end-to-end ML models in production.
- Solid understanding of statistics, probability, and mathematical modeling.
- Proficiency with SQL and data manipulation in large-scale databases.
- Familiarity with version control (Git), CI/CD workflows, and model tracking tools (MLflow, DVC, etc.).

Preferred Skills:
- Experience with cloud platforms like AWS, GCP, or Azure (e.g., SageMaker, Vertex AI).
- Knowledge of MLOps practices and tools for scalable ML deployments.
- Exposure to real-time data processing or streaming (Kafka, Spark).
- Experience with NLP, Computer Vision, or Time Series Forecasting.
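Because the skills list calls out model-tracking tools like MLflow, here is a minimal tracking sketch around a scikit-learn baseline; the experiment, model, and metric are illustrative only, not a prescribed workflow.

```python
import mlflow
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

with mlflow.start_run(run_name="rf-baseline"):
    n_estimators = 200
    clf = RandomForestClassifier(n_estimators=n_estimators, random_state=0)
    clf.fit(X_tr, y_tr)

    acc = accuracy_score(y_te, clf.predict(X_te))
    mlflow.log_param("n_estimators", n_estimators)
    mlflow.log_metric("accuracy", acc)
    # Runs land in the local ./mlruns store unless a tracking server is configured.

print("logged accuracy:", acc)
```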

Posted 2 days ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Bengaluru

Work from Office


Paytm is India's leading mobile payments and financial services distribution company. Pioneer of the mobile QR payments revolution in India, Paytm builds technologies that help small businesses with payments and commerce. Paytm's mission is to serve half a billion Indians and bring them to the mainstream economy with the help of technology.

About the Role: A business analyst focuses on data, statistical analysis and reporting to help investigate and analyze business performance, provide insights, and drive recommendations to improve performance.

Expectations:
1. Derive business insights from data with a focus on driving business-level metrics.
2. Ability to interact with and convince business stakeholders.
3. Developing insightful analysis about the business and its strategic and operational implications.
4. Partner with stakeholders at all levels to establish current and ongoing data support and reporting needs.
5. Analyze data from multiple angles, looking for trends that highlight areas of concern or opportunity.
6. Design, create and deliver data reports, dashboards, extracts and/or presentations that answer strategic questions.
7. Identifying data needs and driving data quality improvement projects.

Key Skills Required:
1. Ideally 2-5 years of experience working on data analytics and business intelligence. Candidates from B2C consumer internet product companies are preferred.
2. Proven work experience with MS Excel, Google Analytics, SQL, Data Studio, or any BI tool, in a business analyst or similar role.
3. Should be comfortable working in a fast-changing and ambiguous environment.
4. Critical thinking and very detail oriented.
5. In-depth understanding of datasets, data and business understanding.
6. Capable of demonstrating good business judgement.

Education: Applicants must have an engineering academic background with specialization in data science.

Why join us: We aim at bringing half a billion Indians into the mainstream economy, and everyone working here is striving to achieve that goal. Our success is rooted in our people's collective energy and unwavering focus on the customers, and that's how it will always be. We are the largest merchant acquirer in India.

Compensation: If you are the right fit, we believe in creating wealth for you. With an enviable 500 mn+ registered users, 21 mn+ merchants and depth of data in our ecosystem, we are in a unique position to democratize credit for deserving consumers and merchants - and we are committed to it. India's largest digital lending story is brewing here. It is your opportunity to be a part of the story!

Posted 2 days ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Noida, Bengaluru

Work from Office

Naukri logo

Paytm is India's leading mobile payments and financial services distribution company. A pioneer of the mobile QR payments revolution in India, Paytm builds technologies that help small businesses with payments and commerce. Paytm's mission is to serve half a billion Indians and bring them to the mainstream economy with the help of technology. About the Role: The business analyst focuses on data, statistical analysis, and reporting to help investigate and analyze business performance, provide insights, and drive recommendations to improve performance. Expectations: 1. Drive business insights from data with a focus on driving business-level metrics. 2. Ability to interact with and convince business stakeholders. 3. Develop insightful analysis about the business and its strategic and operational implications. 4. Partner with stakeholders at all levels to establish current and ongoing data support and reporting needs. 5. Analyze data from multiple angles, looking for trends that highlight areas of concern or opportunity. 6. Design, create, and deliver data reports, dashboards, and extracts, and/or deliver presentations that answer strategic questions. 7. Identify data needs and drive data quality improvement projects. Key Skills Required: 1. Ideally 2-5 years of experience working in data analytics and business intelligence. Candidates from B2C consumer internet product companies are preferred. 2. Proven work experience with MS Excel, Google Analytics, SQL, Data Studio, and any BI tool in a business analyst or similar role. 3. Comfortable working in a fast-changing and ambiguous environment. 4. Critical thinking and strong attention to detail. 5. In-depth understanding of datasets and the underlying business. 6. Capable of demonstrating good business judgement. Education: Applicants must have an engineering academic background with specialization in data science. Why join us: We aim to bring half a billion Indians into the mainstream economy, and everyone working here is striving to achieve that goal. Our success is rooted in our people's collective energy and unwavering focus on the customers, and that's how it will always be. We are the largest merchant acquirer in India. Compensation: If you are the right fit, we believe in creating wealth for you. With an enviable 500 mn+ registered users, 21 mn+ merchants, and depth of data in our ecosystem, we are in a unique position to democratize credit for deserving consumers & merchants - and we are committed to it. India's largest digital lending story is brewing here. It is your opportunity to be a part of the story!

Posted 2 days ago

Apply

4.0 - 9.0 years

6 - 14 Lacs

Hyderabad

Work from Office

Naukri logo

Title: .NET Developer (.NET + OpenShift or Kubernetes) | 4 to 12 years | Bengaluru & Hyderabad: Assess and understand the application implementation while working with architects and business experts. Analyse business and technology challenges and suggest solutions to meet strategic objectives. Build cloud-native applications meeting 12-factor/15-factor principles on OpenShift or Kubernetes. Migrate .NET Core and/or .NET Framework Web/API/Batch components deployed in PCF Cloud to OpenShift, working independently. Analyse and understand the code, identify bottlenecks and bugs, and devise solutions to mitigate and address these issues. Design and implement unit test scripts and automation using NUnit to achieve 80% code coverage. Perform back-end code reviews and ensure compliance with Sonar scans, Checkmarx, and Black Duck to maintain code quality. Write functional automation test cases for system integration using Selenium. Coordinate with architects and business experts across the application to translate key requirements. Required Qualifications: 4+ years of experience in .NET Core (3.1 and above) and/or .NET Framework (4.0 and above) development (coding, unit testing, functional automation) implementing microservices, REST APIs, batch/web components, reusable libraries, etc. Proficiency in C# with a good knowledge of VB.NET. Proficiency in cloud platforms (OpenShift, AWS, Google Cloud, Azure) and hybrid/multi-cloud strategies, with at least 3 years on OpenShift. Familiarity with cloud-native patterns, microservices, and application modernization strategies. Experience with monitoring and logging tools like Splunk, Log4j, Prometheus, Grafana, ELK Stack, AppDynamics, etc. Familiarity with infrastructure automation tools (e.g., Ansible, Terraform) and CI/CD tools (e.g., Harness, Jenkins, UDeploy). Proficiency in databases like MS SQL Server, Oracle 11g/12c, MongoDB, DB2. Experience in integrating front-end with back-end services. Experience working with code versioning using Git and GitHub. Familiarity with job scheduling through Autosys and PCF batch jobs. Familiarity with scripting languages like shell and with Helm chart modules. Works in the area of Software Engineering, which encompasses the development, maintenance and optimization of software solutions/applications. 1. Applies scientific methods to analyse and solve software engineering problems. 2. He/she is responsible for the development and application of software engineering practice and knowledge, in research, design, development and maintenance. 3. His/her work requires the exercise of original thought and judgement and the ability to supervise the technical and administrative work of other software engineers. 4. The software engineer builds skills and expertise in his/her software engineering discipline to reach the standard software engineer skills expectations for the applicable role, as defined in Professional Communities. 5. The software engineer collaborates and acts as a team player with other software engineers and stakeholders.

Posted 2 days ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Naukri logo

Dear Aspirant! We empower our people to stay resilient and relevant in a constantly changing world. We're looking for people who are always searching for creative ways to grow and learn. People who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you'd make a great addition to our vibrant international team. We are looking for a Senior Software Engineer (C# .NET). You'll make an impact by: Being responsible for the design of software solutions based on requirements and within the constraints of architectural/design guidelines. Deriving software requirements and software functional specifications, validating software requirements, and providing software feasibility analysis and software effort estimation. Understanding the requirements, user stories, high-level and low-level design, implementation, and unit and integration testing to deliver a high-quality product. Accurately translating software architecture into design and code. Guiding Scrum team members on all design topics and implementation consistency against the design/architecture. Identifying and implementing (unit/integration/automation) tests to ensure the solution addresses customer requirements. Providing documentation (requirement/design/test specification) inputs and ensuring delivery conforms to organization and project quality processes. Ensuring integration and submission of the solution into the software configuration management system within committed delivery timelines. Guiding the team in test automation design and towards its implementation. Use your skills to move the world forward! B.E/B.Tech/MCA/M.E/M.Tech/MSc Computer Science. Knowledge and Experience: 4-8+ years of software development experience with strong experience in C# developing Windows applications. Advanced C# knowledge with a solid understanding of object-oriented design and programming and design patterns. Exposure to .NET development (C# .NET, Entity Framework, ADO.NET, ASP.NET, MVVM, MVC). Exposure to implementing and deploying solutions in any one of the cloud environments (Azure/AWS/GCP). Exposure to building web applications using Angular or other client-side application frameworks. Work experience with unit testing frameworks and DevOps. Hands-on experience in developing and debugging applications using C++ will be an added advantage. Experience in Agile/Lean software development. Good knowledge of software configuration management and DevOps concepts. Good analytical and problem-solving skills. Good communication skills (oral and written) and a quick learner of new technologies and trends. Ability to effectively communicate and interact with various stakeholders. Create a better #TomorrowWithUs! This role is based in Bangalore, where you'll get the chance to work with teams impacting entire cities, countries - and the shape of things to come. We're Siemens, a collection of over 312,000 minds building the future, one day at a time in over 200 countries. All employment decisions at Siemens are based on qualifications, merit and business need. Bring your curiosity and imagination and help us shape tomorrow. Find out more about Siemens careers at www.siemens.com/careers and about the digital world of Siemens at www.siemens.com/careers/digitalminds

Posted 2 days ago

Apply

5.0 - 6.0 years

55 - 60 Lacs

Pune

Work from Office

Naukri logo

At Capgemini Invent, we believe difference drives change. As inventive transformation consultants, we blend our strategic, creative and scientific capabilities, collaborating closely with clients to deliver cutting-edge solutions. Join us to drive transformation tailored to our clients' challenges of today and tomorrow. Informed and validated by science and data. Superpowered by creativity and design. All underpinned by technology created with purpose. Data engineers are responsible for building reliable and scalable data infrastructure that enables organizations to derive meaningful insights, make data-driven decisions, and unlock the value of their data assets. Grade Specific: The role supports the team in building and maintaining data infrastructure and systems within an organization. Skills (competencies): Ab Initio, Agile (Software Development Framework), Apache Hadoop, AWS Airflow, AWS Athena, AWS Code Pipeline, AWS EFS, AWS EMR, AWS Redshift, AWS S3, Azure ADLS Gen2, Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Event Hub, Azure Stream Analytics, Azure Synapse, Bitbucket, Change Management, Client Centricity, Collaboration, Continuous Integration and Continuous Delivery (CI/CD), Data Architecture Patterns, Data Format Analysis, Data Governance, Data Modeling, Data Validation, Data Vault Modeling, Database Schema Design, Decision-Making, DevOps, Dimensional Modeling, GCP Bigtable, GCP BigQuery, GCP Cloud Storage, GCP Dataflow, GCP Dataproc, Git, Google Bigtable, Google Dataproc, Greenplum, HQL, IBM DataStage, IBM DB2, Industry Standard Data Modeling (FSLDM), Industry Standard Data Modeling (IBM FSDM), Influencing, Informatica IICS, Inmon methodology, JavaScript, Jenkins, Kimball, Linux - Red Hat, Negotiation, Netezza, NewSQL, Oracle Exadata, Performance Tuning, Perl, Platform Update Management, Project Management, PySpark, Python, R, RDD Optimization, CentOS, SAS, Scala Spark, Shell Script, Snowflake, Spark, Spark Code Optimization, SQL, Stakeholder Management, Sun Solaris, Synapse, Talend, Teradata, Time Management, Ubuntu, Vendor Management. Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong heritage of over 55 years, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fuelled by its market-leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.
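Purely as an illustration of the pipeline-building work described above, here is a minimal PySpark sketch under assumed inputs: the bucket paths and the event_date, user_id, and amount columns are hypothetical and not taken from this listing.

```python
# Minimal PySpark sketch: read raw events, aggregate daily metrics, write partitioned Parquet.
# Paths and column names (event_date, user_id, amount) are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-revenue-rollup").getOrCreate()

events = (
    spark.read.option("header", True)
    .csv("gs://example-bucket/raw/events/")        # hypothetical source location
    .withColumn("amount", F.col("amount").cast("double"))
)

daily_revenue = (
    events.groupBy("event_date")
    .agg(
        F.countDistinct("user_id").alias("active_users"),
        F.sum("amount").alias("revenue"),
    )
)

# Write results partitioned by date for downstream warehouse or BI loads.
daily_revenue.write.mode("overwrite").partitionBy("event_date").parquet(
    "gs://example-bucket/curated/daily_revenue/"   # hypothetical target location
)
spark.stop()
```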

Posted 2 days ago

Apply

Exploring GCP Jobs in India

The job market for Google Cloud Platform (GCP) professionals in India is rapidly growing as more and more companies are moving towards cloud-based solutions. GCP offers a wide range of services and tools that help businesses in managing their infrastructure, data, and applications in the cloud. This has created a high demand for skilled professionals who can work with GCP effectively.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Hyderabad
  4. Pune
  5. Chennai

Average Salary Range

The average salary range for GCP professionals in India varies based on experience and job role. Entry-level positions can expect a salary range of INR 5-8 lakhs per annum, while experienced professionals can earn anywhere from INR 12-25 lakhs per annum.

Career Path

Typically, a career in GCP progresses from a Junior Developer to a Senior Developer, then to a Tech Lead position. As professionals gain more experience and expertise in GCP, they can move into roles such as Cloud Architect, Cloud Consultant, or Cloud Engineer.

Related Skills

In addition to GCP, professionals in this field are often expected to have skills in:

  • Cloud computing concepts
  • Programming languages such as Python, Java, or Go
  • DevOps tools and practices
  • Networking and security concepts
  • Data analytics and machine learning

Interview Questions

  • What is Google Cloud Platform and its key services? (basic)
  • Explain the difference between Google Cloud Storage and Google Cloud Bigtable. (medium)
  • How would you optimize costs in Google Cloud Platform? (medium)
  • Describe a project where you implemented CI/CD pipelines in GCP. (advanced)
  • How does Google Cloud Pub/Sub work and when would you use it? (medium) (a minimal publish example is sketched after this list)
  • What is Cloud Spanner and how is it different from other database services in GCP? (advanced)
  • Explain the concept of IAM and how it is implemented in GCP. (medium)
  • How would you securely transfer data between different regions in GCP? (advanced)
  • What is Google Kubernetes Engine (GKE) and how does it simplify container management? (medium)
  • Describe a scenario where you used Google Cloud Functions in a project. (advanced)
  • How do you monitor performance and troubleshoot issues in GCP? (medium)
  • What is Google Cloud SQL and when would you choose it over other database options? (medium)
  • Explain the concept of VPC (Virtual Private Cloud) in GCP. (basic)
  • How do you ensure data security and compliance in GCP? (medium)
  • Describe a project where you integrated Google Cloud AI services. (advanced)
  • What is the difference between Google Cloud CDN and Google Cloud Load Balancing? (medium)
  • How do you handle disaster recovery and backups in GCP? (medium)
  • Explain the concept of auto-scaling in GCP and when it is useful. (medium)
  • How would you set up a multi-region deployment in GCP for high availability? (advanced)
  • Describe a project where you used Google Cloud Dataflow for data processing. (advanced)
  • What are the best practices for optimizing performance in Google Cloud Platform? (medium)
  • How do you manage access control and permissions in GCP? (medium)
  • Explain the concept of serverless computing and how it is implemented in GCP. (medium)
  • What is the difference between Google Cloud Identity and Access Management (IAM) and AWS IAM? (advanced)
  • How do you ensure data encryption at rest and in transit in GCP? (medium)
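
To ground the Pub/Sub question above, here is a minimal, hypothetical publisher sketch using the google-cloud-pubsub client library; the project and topic IDs are placeholders, not values from this page.

```python
# Minimal sketch: publish a message to a Pub/Sub topic and wait for the server ack.
# Assumes the google-cloud-pubsub package is installed and credentials are configured;
# "my-project" and "my-topic" are placeholder IDs.
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "my-topic")

# Payloads must be bytes; extra keyword arguments become string attributes on the message.
future = publisher.publish(topic_path, b"order-created", order_id="12345")
message_id = future.result()  # blocks until the service acknowledges the message
print(f"Published message {message_id} to {topic_path}")
```

A subscriber would typically consume such messages through a pull subscription, which is where the "when would you use it" part of the question (decoupling producers from consumers) comes in.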

Closing Remark

As the demand for GCP professionals continues to rise in India, now is the perfect time to upskill and pursue a career in this field. By mastering GCP and related skills, you can unlock numerous opportunities and build a successful career in cloud computing. Prepare well, showcase your expertise confidently, and land your dream job in the thriving GCP job market in India.


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies