6 - 10 years
9 - 18 Lacs
Bengaluru
Remote
Hiring: Performance Tester. Experience: 6+ years. Good understanding of GCP/other cloud platforms. Strong in Performance Center. Good skills in the Web-HTTP protocol and APIs. Very strong in Dynatrace or AppDynamics. Strong in LoadRunner & JMeter. Good communication skills. You must be willing to work a split shift, specifically from 10 AM to 2 PM and 6 PM to 10 PM.
Posted 1 month ago
7 - 11 years
8 - 11 Lacs
Hyderabad
Work from Office
Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients' most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com. About The Role: The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists. Skills: Java, cloud, Kotlin; 7 to 11 years. One of the following stacks: Java, Spring Boot, Kafka, Kotlin, MongoDB, GCP; or Java, Spring Boot, Kotlin, MongoDB, Cloud; or Java, Spring Boot, Kafka, MongoDB, GCP Cloud. Handle technical escalations through effective diagnosis and troubleshooting of client queries. Manage and resolve technical roadblocks/escalations as per SLA and quality requirements. If unable to resolve the issues, escalate them to TA & SES in a timely manner. Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions. Troubleshoot all client queries in a user-friendly, courteous and professional manner. Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business. Organize ideas and effectively communicate oral messages appropriate to listeners and situations. Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs. Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client. Mentor and guide Production Specialists on improving technical knowledge. Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists. Develop and conduct trainings (triages) within products for Production Specialists as per target. Inform the client about the triages being conducted. Undertake product trainings to stay current with product features, changes and updates. Enroll in product-specific and any other trainings per client requirements/recommendations. Identify and document the most common problems and recommend appropriate resolutions to the team. Update job knowledge by participating in self-learning opportunities and maintaining personal networks.
Deliver:
No. | Performance Parameter | Measure
1 | Process | No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2 | Team Management | Productivity, efficiency, absenteeism
3 | Capability Development | Triages completed, Technical Test performance
Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention: of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA: as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro.
Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 1 month ago
7 - 12 years
5 - 15 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
We are hiring for a PostgreSQL DBA. Experience range & skills (mandatory, with no exceptions): 7+ years as a PostgreSQL DBA with cloud & SQL. Education: BE/B.Tech/MCA/M.Tech/MSc./MS. Responsibilities: A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation and design, and deployment. You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs. You will create requirement specifications from the business needs, define the to-be processes, and produce detailed functional designs based on requirements. You will support configuring solution requirements on the products; identify any issues, diagnose their root cause, seek clarifications, and then identify and shortlist solution alternatives. You will also contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Posted 1 month ago
1 - 3 years
5 - 10 Lacs
Noida, Gurugram, Delhi / NCR
Work from Office
Job Description: Junior Backend Developer. Overview: We are seeking a dedicated and talented Junior Backend Developer with 1-2 years of experience to join our growing team. The ideal candidate will be responsible for building and maintaining the server-side logic, ensuring high performance and responsiveness to requests from the front-end. You will collaborate with front-end developers to integrate user-facing elements with server-side logic. Key Responsibilities: Develop and maintain server-side applications using backend technologies such as Node.js, Express, and databases. Write clean, scalable, and efficient code for backend logic. Design and implement APIs to support front-end functionality and integrate with other services. Optimize application performance, ensuring fast and responsive interactions. Debug and resolve technical issues, ensuring smooth operation of the applications. Collaborate with front-end developers to integrate user-facing elements with server-side logic. Participate in code reviews, providing and receiving constructive feedback. Stay updated with emerging trends and technologies in backend development. Assist in the deployment of applications and monitor performance to ensure stability. Contribute to the improvement of development processes and best practices. Required Skills and Qualifications: Experience: 1-2 years of professional experience in backend development. Proficiency in Backend Technologies: Strong understanding of Node.js and Express.js or similar backend frameworks. Database Management: Experience with databases such as MongoDB, MySQL, or PostgreSQL, including database design and optimization. API Development: Practical experience in designing and building RESTful APIs. Version Control: Proficiency in using Git for version control and collaboration. Problem-Solving: Strong analytical and problem-solving skills with attention to detail. Communication: Excellent verbal and written communication skills to work effectively in a team environment. Security Best Practices: Basic understanding of security principles and practices in backend development. Testing: Familiarity with testing frameworks and tools for backend services, such as Mocha, Chai, or Jest. Preferred Skills: Additional Frameworks: Experience with other backend frameworks like Django, Flask, or Ruby on Rails is a plus. Cloud Services: Knowledge of cloud services (AWS, Azure, Google Cloud) and deployment processes. Containerization: Familiarity with containerization tools like Docker and orchestration tools like Kubernetes. Agile/Scrum: Experience working in an Agile/Scrum development environment. DevOps: Basic understanding of DevOps practices and tools for continuous integration and deployment (CI/CD). Education: Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent work experience.
Posted 1 month ago
3 - 7 years
6 - 7 Lacs
Chennai
Work from Office
AI/ML Developer. Position Description: The AI/ML Developer will develop, train, validate, and deploy AI and ML models that meet the project's analytical needs, ensuring accuracy, scalability, and efficiency. Role & responsibilities: Data Processing & Analysis: Collect, clean, and preprocess structured and unstructured data. Conduct exploratory data analysis to identify trends and patterns. Model Development: Design, train, validate, and fine-tune machine learning models using frameworks such as TensorFlow, PyTorch, Scikit-learn, or Hugging Face. Develop and optimize deep learning architectures for classification, regression, NLP, or computer vision tasks. Deployment & Integration: Deploy models using tools like Docker, FastAPI, Flask, or TensorFlow Serving. Integrate models into existing applications, pipelines, or APIs. Implement monitoring and performance evaluation of deployed models. Documentation & Collaboration: Write clear and comprehensive documentation for models, data pipelines, and APIs. Work collaboratively with data engineers, software developers, and domain experts to ensure alignment with project objectives. Innovation & Research: Stay up to date with the latest AI/ML research and tools. Contribute to prototyping and innovation initiatives within the organization. Deliverables: Clean, well-documented code and model scripts. Trained and validated AI/ML models with performance reports. Data pipelines and preprocessing scripts. Deployment-ready models integrated with APIs or platforms. Monthly progress reports on development and outcomes. Final report with documentation on model architecture, datasets, results, and recommendations. Preferred candidate profile: Education: Bachelor's or Master's degree in Computer Science, Data Science, Artificial Intelligence, or a related field. Experience: Minimum 3 years of experience in machine learning or AI development. Proficiency in Python and ML frameworks such as TensorFlow, PyTorch, and Scikit-learn. Experience with data preprocessing, feature engineering, and model evaluation. Experience with cloud services (AWS, Azure, GCP) for AI model deployment; good understanding of data structures, algorithms, and software engineering practices. Experience with database systems (SQL, NoSQL) and big data tools (e.g., Spark, Hadoop) is a plus. Personal Qualities: Strong analytical and problem-solving skills. Excellent programming and debugging abilities. Ability to communicate technical concepts to non-technical stakeholders. Attention to detail and commitment to reproducible research/code. Strong teamwork and collaborative mindset.
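For context on the train/validate/evaluate cycle this role centers on, here is a minimal scikit-learn sketch; the dataset and hyperparameters are placeholders for illustration, not part of the posting.

```python
# A minimal sketch of the model train/validate/evaluate loop described above.
# The dataset and hyperparameters are stand-ins, not project specifics.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)  # placeholder for real project data
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# The report below is the kind of artifact behind the "trained and validated
# models with performance reports" deliverable.
print(classification_report(y_test, model.predict(X_test)))
```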
Posted 1 month ago
3 - 8 years
15 - 30 Lacs
Pune, Gurugram, Bengaluru
Hybrid
Salary: 15 to 30 LPA. Exp: 3 to 8 years. Location: Gurgaon/Bangalore/Pune/Chennai. Notice: Immediate to 30 days! Key Responsibilities & Skillsets: Common Skillsets: 3+ years of experience in analytics, SAS, PySpark, Python, Spark, SQL and associated data engineering jobs. Must have experience with managing and transforming big data sets using PySpark, Spark-Scala, NumPy, and pandas. Excellent communication & presentation skills. Experience in managing Python codebases and collaborating with customers on model evolution. Good knowledge of database management and Hadoop/Spark, SQL, HIVE, Python (expertise). Superior analytical and problem-solving skills. Should be able to work on a problem independently and prepare client-ready deliverables with minimal or no supervision. Good communication skills for client interaction. Data Management Skillsets: Ability to understand data models and identify ETL optimization opportunities. Exposure to ETL tools is preferred. Should have a strong grasp of advanced SQL functionalities (joins, nested queries, and procedures). Strong ability to translate functional specifications/requirements to technical requirements.
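As a concrete example of the "managing and transforming big data sets using PySpark" skill above, here is a hedged sketch of a typical aggregation job; the input path, columns, and table layout are invented for illustration.

```python
# Illustrative PySpark transformation: aggregate a large transactions dataset.
# The input path and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("customer-metrics").getOrCreate()

df = spark.read.parquet("/data/transactions")  # hypothetical input location

# Roll raw transactions up to one row per customer per month.
monthly = (
    df.withColumn("month", F.date_trunc("month", F.col("txn_date")))
      .groupBy("customer_id", "month")
      .agg(
          F.sum("amount").alias("total_spend"),
          F.countDistinct("txn_id").alias("txn_count"),
      )
)

monthly.write.mode("overwrite").partitionBy("month").parquet("/data/curated/monthly")
```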
Posted 1 month ago
1 - 3 years
3 - 6 Lacs
Jewar
Work from Office
Responsibilities: Develop, maintain & enhance services, APIs, & databases. Design & implement scalable, high-performance & secure architecture. Optimize database queries & ensure data integrity. Troubleshoot, resolve issues & performance bottlenecks.
Posted 1 month ago
5 - 9 years
5 - 15 Lacs
Gurugram
Work from Office
DevOps Team Lead, 5-9 years of experience. Top 3 skills: Hands-on experience with CI/CD, Kubernetes (GKE) and Terraform-based GCP deployments. Well-versed with cloud-based databases, Kafka or other event-based systems, and the observability stack (Grafana, Prometheus, Loki, Tempo, etc.), including deployments and automation with a focus on performance monitoring. Familiarity with query languages such as SQL and PromQL. Job Description, Roles & Responsibilities: Design, build, and maintain infrastructure and tools to enable continuous integration, delivery, and deployment of software products. Collaborate with technology owners, developers, testers, and other stakeholders to understand the requirements and specifications of the software projects. Monitor, troubleshoot, and optimize the performance and availability of the systems and applications. Research and evaluate new technologies and methodologies that can improve the efficiency and quality of the DevOps processes. Essential Skills: Proficient in using various tools and technologies for DevOps, such as Git, Jenkins, Docker, Kubernetes, Ansible, Prometheus, AWS, GCP, etc. Proven experience managing multiple environments and managing DevOps Engineers, or in a similar role in a software development environment. Strong problem-solving and troubleshooting skills.
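To make the PromQL familiarity above concrete, here is a small hedged sketch that runs a latency-percentile query against Prometheus's standard HTTP API from Python; the endpoint URL and metric name are assumptions for illustration.

```python
# Run a PromQL query against Prometheus's /api/v1/query HTTP endpoint.
# The server URL and metric name are hypothetical.
import requests

PROM_URL = "http://prometheus.example.internal:9090"  # assumed endpoint

# p95 request latency over the last 5 minutes: a common SLO-style query.
promql = (
    "histogram_quantile(0.95, "
    "sum(rate(http_request_duration_seconds_bucket[5m])) by (le))"
)

resp = requests.get(f"{PROM_URL}/api/v1/query", params={"query": promql}, timeout=10)
resp.raise_for_status()

# Each result carries the label set and the current [timestamp, value] pair.
for result in resp.json()["data"]["result"]:
    print(result["metric"], result["value"])
```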
Posted 1 month ago
4 - 6 years
12 - 20 Lacs
Hyderabad
Work from Office
Role & responsibilities: Development of modern applications in the back-end and front-end. Development, implementation and optimisation of innovative IoT products, web apps and new features. Implement cleaner solutions for problems using recommended system design concepts. Technical product design, solution architecture, specifications and implementation of Livello solutions. Work with a cross-functional team to define, build, test, and deploy universal applications. Version control with Git is part of your daily work, along with continuous integration. Ensure the implementation of technical standards, quality assurance and best practices. Preferred candidate profile: Bachelor's/Master's degree in Computer Science or a comparable field of study. 4-5 years of experience in agile software development. Proficiency in Node.js, TypeScript, React.js, React Native/Flutter, MongoDB, and Docker, along with any cloud experience (AWS, GCP, Azure). Understanding of software architecture and design patterns. Experience with test and deployment automation (GitLab, Fastlane, Jest). Experience with GraphQL and Kubernetes as well as state management solutions (Redux, Saga). Ability to work cooperatively and independently, analytical and logical thinking, willingness to lead and take on responsibility. Strong understanding of OOP concepts and their practical application in software development. Fluent in English. Nice to have: Experience in IoT-to-Cloud managed services. Knowledge of IoT device management and messaging protocols like AMQP or MQTT. Interest in designing dashboards for data visualization. Working experience with Nest.js. Perks and benefits: A responsible position in a fast-growing and highly innovative start-up. An agile and diverse team with colleagues from all over the world, working with our main office in Germany. English-speaking open work environment, with flat hierarchies and short decision-making paths. Creative freedom for your own ideas, projects and personal development. Quarterly awards recognizing the hard work and talent within the team.
Posted 1 month ago
8 - 13 years
15 - 25 Lacs
Gurugram, Chennai, Delhi / NCR
Work from Office
Role & responsibilities: Java Backend Developer with Java, Spring Boot, Spring, Microservices, and GCP Cloud experience. Preferred candidate profile: GCP certification.
Posted 1 month ago
8 - 10 years
27 - 32 Lacs
Hyderabad, Gurugram, Bengaluru
Work from Office
The Team: As a Senior Lead Machine Learning Engineer of the Document Platforms and AI Team, you will play a critical role in building the next generation of data extraction tools, working on cutting-edge ML-powered products and capabilities that power natural language understanding, information retrieval, and data sourcing solutions for the Enterprise Data Organization and our clients. This is an exciting opportunity to shape the future of data transformation and see your work make a real difference, all while having fun in a collaborative and engaging environment. You'll spearhead the development and deployment of production-ready AI products and pipelines, leading by example and mentoring a talented team. This role demands a deep understanding of machine learning principles, hands-on experience with relevant technologies, and the ability to inspire and guide others. You'll be at the forefront of a rapidly evolving field, learning and growing alongside some of the brightest minds in the industry. If you're passionate about AI, driven to make an impact, and thrive in a dynamic and supportive workplace, we encourage you to join us! The Impact: The Document Platforms and AI team has already delivered breakthrough products and significant business value over the last 3 years. In this role you will be developing our next generation of new products while enhancing existing ones, aiming to solve high-impact business problems. What's in it for you: Be a part of a global company and build solutions at enterprise scale. Collaborate with a highly skilled and technically strong team. Contribute to solving high-complexity, high-impact problems. Responsibilities: Build production-ready data acquisition and transformation pipelines from ideation to deployment. Be a hands-on problem solver and developer helping to extend and manage the data platforms. Apply best practices in data modeling and building ETL pipelines (streaming and batch) using cloud-native solutions. Technical leadership: Drive the technical vision and architecture for the extraction project, making key decisions about model selection, infrastructure, and deployment strategies. Model development: Design, develop, and evaluate state-of-the-art machine learning models for information extraction, leveraging techniques from NLP, computer vision (if applicable), and other relevant domains. Data preprocessing and feature engineering: Develop robust pipelines for data cleaning, preprocessing, and feature engineering to prepare data for model training. Model training and evaluation: Train, tune, and evaluate machine learning models, ensuring high accuracy, efficiency, and scalability. Deployment and monitoring: Deploy and maintain machine learning models in a production environment, monitoring their performance and ensuring their reliability. Research and innovation: Stay up-to-date with the latest advancements in machine learning and NLP, and explore new techniques and technologies to improve the extraction process. Collaboration: Work closely with product managers, data scientists, and other engineers to understand project requirements and deliver effective solutions. Code quality and best practices: Ensure high code quality and adherence to best practices for software development. Communication: Effectively communicate technical concepts and project updates to both technical and non-technical audiences.
What We're Looking For: 8-10 years of professional software work experience, with a strong focus on Machine Learning, Natural Language Processing (NLP) for information extraction, and MLOps. Expertise in Python and related NLP libraries (e.g., spaCy, NLTK, Transformers, Hugging Face). Experience with Apache Spark or other distributed computing frameworks for large-scale data processing. AWS/GCP Cloud expertise, particularly in deploying and scaling ML pipelines for NLP tasks. Solid understanding of the Machine Learning model lifecycle, including data preprocessing, feature engineering, model training, evaluation, deployment, and monitoring, specifically for information extraction models. Experience with CI/CD pipelines for ML models, including automated testing and deployment. Docker & Kubernetes experience for containerization and orchestration. OOP design patterns, Test-Driven Development and enterprise system design. SQL (any variant; bonus if this is a big data variant). Linux OS (e.g., bash toolset and other utilities). Version control system experience with Git, GitHub, or Azure DevOps. Excellent problem-solving, code review and debugging skills. Software craftsmanship, adherence to Agile principles and taking pride in writing good code. Techniques to communicate change to non-technical people. Nice to have: Core Java 17+, preferably Java 21+, and the associated toolchain. Apache Avro. Apache Kafka. Other JVM-based languages, e.g. Kotlin, Scala. C#, in particular .NET Core.
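For readers unfamiliar with the Transformers library named above, here is a hedged sketch of NLP-based information extraction with a named-entity pipeline; the model choice and sample text are illustrative, not the team's actual method.

```python
# A minimal information-extraction sketch using Hugging Face Transformers.
# The model is a publicly available NER checkpoint, chosen for illustration.
from transformers import pipeline

ner = pipeline("ner", model="dslim/bert-base-NER", aggregation_strategy="simple")

text = "The company reported quarterly results in New York on Thursday."
for entity in ner(text):
    # Each entity carries a label (ORG, LOC, ...), a confidence score,
    # and the extracted text span.
    print(entity["entity_group"], entity["word"], round(float(entity["score"]), 3))
```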
Posted 1 month ago
5 - 8 years
15 - 25 Lacs
Pune
Hybrid
Role & responsibilities: Data Pipeline Development: Design, develop, and maintain data pipelines utilizing Google Cloud Platform (GCP) services like Dataflow, Dataproc, and Pub/Sub. Data Ingestion & Transformation: Build and implement data ingestion and transformation processes using tools such as Apache Beam and Apache Spark. Data Storage Management: Optimize and manage data storage solutions on GCP, including BigQuery, Cloud Storage, and Cloud SQL. Security Implementation: Implement data security protocols and access controls with GCP's Identity and Access Management (IAM) and Cloud Security Command Center. System Monitoring & Troubleshooting: Monitor and troubleshoot data pipelines and storage solutions using GCP's Stackdriver and Cloud Monitoring tools. Generative AI Systems: Develop and maintain scalable systems for deploying and operating generative AI models, ensuring efficient use of computational resources. Gen AI Capability Building: Build generative AI capabilities among engineers, covering areas such as knowledge engineering, prompt engineering, and platform engineering. Knowledge Engineering: Gather and structure domain-specific knowledge to be utilized by large language models (LLMs) effectively. Prompt Engineering: Design effective prompts to guide generative AI models, ensuring relevant, accurate, and creative text output. Collaboration: Work with data experts, analysts, and product teams to understand data requirements and deliver tailored solutions. Automation: Automate data processing tasks using scripting languages such as Python. Best Practices: Participate in code reviews and contribute to establishing best practices for data engineering within GCP. Continuous Learning: Stay current with GCP service innovations and advancements across core data services (GCS, BigQuery, Cloud Storage, Dataflow, etc.). Skills and Experience: Experience: 5+ years of experience in Data Engineering or similar roles. Proficiency in GCP: Expertise in designing, developing, and deploying data pipelines, with strong knowledge of GCP core data services (GCS, BigQuery, Cloud Storage, Dataflow, etc.). Generative AI & LLMs: Hands-on experience with Generative AI models and large language models (LLMs) such as GPT-4, LLAMA3, and Gemini 1.5, with the ability to integrate these models into data pipelines and processes. Experience in web scraping. Technical Skills: Strong proficiency in Python and SQL for data manipulation and querying. Experience with distributed data processing frameworks like Apache Beam or Apache Spark is a plus. Security Knowledge: Familiarity with data security and access control best practices. Collaboration: Excellent communication and problem-solving skills, with a demonstrated ability to collaborate across teams. Project Management: Ability to work independently, manage multiple projects, and meet deadlines. Preferred Knowledge: Familiarity with Sustainable Finance, ESG Risk, CSRD, Regulatory Reporting, cloud infrastructure, and data governance best practices. Bonus Skills: Knowledge of Terraform is a plus. Education: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Experience: 3-5 years of hands-on experience in data engineering. Certification: Google Professional Data Engineer.
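To illustrate the Apache Beam / Dataflow pipeline work this posting describes, here is a minimal hedged sketch; the bucket paths and record format are assumptions, and the same code can target Dataflow by adding runner options.

```python
# A minimal Apache Beam pipeline: read, parse, filter, and write text records.
# Paths and the CSV layout are hypothetical. Runs locally with DirectRunner;
# add runner='DataflowRunner' plus GCP options to deploy on Dataflow.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions()

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/raw/events.csv")
        | "Parse" >> beam.Map(lambda line: line.split(","))
        | "DropEmptyKeys" >> beam.Filter(lambda rec: rec[0] != "")
        | "Format" >> beam.Map(lambda rec: ",".join(rec))
        | "Write" >> beam.io.WriteToText("gs://example-bucket/clean/events")
    )
```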
Posted 1 month ago
9 - 14 years
17 - 32 Lacs
Chennai, Bengaluru
Work from Office
Job Title : DevOps Engineer GCP | Terraform | Jenkins | CI/CD Location: Bangalore/Chennai Job Summary: We are seeking a skilled and motivated DevOps Engineer with hands-on experience in Google Cloud Platform (GCP), Infrastructure as Code using Terraform, Jenkins automation, and CI/CD pipelines. You will play a key role in designing, building, and maintaining scalable, secure, and efficient cloud infrastructure to support continuous delivery and deployment. Key Responsibilities: Design and implement scalable infrastructure on GCP using Terraform. Develop, manage, and maintain CI/CD pipelines using Jenkins, Git, and related tools. Collaborate with developers, QA, and other IT teams to automate and optimize deployments. Manage version control, code integration, and release processes across environments. Monitor infrastructure performance, troubleshoot issues, and implement best practices in reliability and security. Ensure infrastructure is compliant with internal security and governance requirements. Maintain documentation for infrastructure and operational processes. Required Skills and Qualifications: 9+ years of experience in DevOps or Cloud Engineering roles. Strong experience with Google Cloud Platform (GCP) services (e.g., Compute Engine, Cloud Storage, VPC, IAM). Proven expertise in Terraform for writing and maintaining infrastructure as code. Hands-on experience with Jenkins for pipeline creation, job automation, and integration. Deep understanding of CI/CD concepts and best practices. Proficiency with Git, shell scripting, and automation tools. Working knowledge of Docker and containerization. Experience in monitoring/logging (e.g., Stackdriver, Prometheus, Grafana) is a plus. Preferred Qualifications: GCP certification (Associate or Professional Cloud Engineer). Experience with Kubernetes and GKE. Familiarity with configuration management tools (e.g., Ansible, Chef). Knowledge of security best practices in cloud environments. Soft Skills: Strong problem-solving and troubleshooting skills. Excellent communication and collaboration abilities. Ability to work independently and in a fast-paced, agile environment.
Posted 1 month ago
6 - 10 years
15 - 20 Lacs
Gurugram, Delhi / NCR
Work from Office
Role & responsibilities: 1. Pipeline Development and Support: Design, build, and optimize scalable ETL pipelines on Databricks using PySpark, SQL, and Delta Lake. Work with structured and semi-structured insurance data (policy, claims, actuarial, risk, customer data) from multiple sources. Implement data quality checks, governance, and monitoring across pipelines. Collaborate with data scientists, actuaries, and business stakeholders to translate analytics requirements into data models. Develop and deliver compelling visualizations and dashboards using Databricks SQL, Power BI, Tableau, or similar tools. Monitor and troubleshoot pipeline issues, ensuring data integrity and resolving bottlenecks or failures. Optimize Databricks clusters for performance and cost efficiency. Support ML model deployment pipelines in collaboration with data science teams. Document pipelines, workflows, and architecture following best practices. 2. SQL: Write complex SQL queries to extract, transform, and load (ETL) data for reporting, analytics, or downstream applications. Optimize SQL queries for performance, especially when working with large datasets in Snowflake or other relational databases. Create and maintain database schemas, tables, views, and stored procedures to support business requirements. 3. Data Integration: Integrate data from diverse sources (e.g., on-premises databases, cloud storage like S3 or Azure Blob, or third-party APIs) into a unified system. Ensure data consistency, quality, and availability by implementing data validation and cleansing processes. 4. Good to have skills: Insurance domain experience; candidates with P&C or L&A domain experience will be preferred. Team player with strong communication skills. Experience with MLflow, feature stores, and model monitoring. Hands-on experience with data governance tools (e.g., Unity Catalog, Collibra). Familiarity with regulatory and compliance requirements in insurance data. Skills Typically Required: 5+ years of experience in data engineering, with at least 2+ years hands-on with Databricks and Spark. Strong proficiency in PySpark, SQL, Delta Lake, and data modeling. Solid understanding of cloud platforms (Azure, AWS, or GCP) and data lake architectures. Experience integrating Databricks with BI tools (Power BI, Tableau, Looker) for business-facing dashboards. Knowledge of insurance data (L&A, P&C) and industry metrics is highly preferred. Familiarity with DevOps tools (Git, CI/CD pipelines) and orchestration tools (Airflow, Databricks Jobs). Strong communication skills to explain technical concepts to business stakeholders.
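As a sketch of the Databricks ETL-with-quality-checks pattern described above, here is a minimal hedged PySpark/Delta example; the table names, columns, and quality rule are invented for illustration.

```python
# Databricks-style ETL step with a simple data-quality gate.
# Table and column names are hypothetical stand-ins for insurance data.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

claims = spark.read.table("raw.claims")  # hypothetical source table

cleaned = (
    claims.dropDuplicates(["claim_id"])
          .filter(F.col("claim_amount") > 0)
          .withColumn("loaded_at", F.current_timestamp())
)

# Minimal quality check: fail the job rather than load bad data downstream.
null_ids = cleaned.filter(F.col("policy_id").isNull()).count()
if null_ids > 0:
    raise ValueError(f"{null_ids} claims missing policy_id")

cleaned.write.format("delta").mode("overwrite").saveAsTable("curated.claims")
```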
Posted 1 month ago
3 - 8 years
3 - 7 Lacs
Bengaluru
Work from Office
Project Role: Application Support Engineer. Project Role Description: Act as software detectives; provide a dynamic service identifying and solving issues within multiple components of critical business systems. Must have skills: Google Kubernetes Engine. Good to have skills: Kubernetes, Google Cloud Compute Services. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years full-time education. About The Role: Job Summary: We are seeking a motivated and talented GCP & Kubernetes Engineer to join our growing cloud infrastructure team. This role will be a key contributor in building and maintaining our Kubernetes platform, working closely with architects to design, deploy, and manage cloud-native applications on Google Kubernetes Engine (GKE). Responsibilities: Extensive hands-on experience with Google Cloud Platform (GCP) and Kubernetes implementations. Demonstrated expertise in operating and managing container orchestration engines such as Docker or Kubernetes. Knowledge or experience of various Kubernetes tools like Kubekafka, Kubegres, Helm, Ingress, Redis, Grafana, and Prometheus. Proven track record in supporting and deploying various public cloud services. Experience in building or managing self-service platforms to boost developer productivity. Proficiency in using Infrastructure as Code (IaC) tools like Terraform. Skilled in diagnosing and resolving complex issues in automation and cloud environments. Advanced experience in architecting and managing highly available and high-performance multi-zonal or multi-regional systems. Strong understanding of infrastructure CI/CD pipelines and associated tools. Collaborate with internal teams and stakeholders to understand user requirements and implement technical solutions. Experience working in GKE and Edge/GDCE environments. Assist development teams in building and deploying microservices-based applications in public cloud environments. Technical Skillset: Minimum of 3 years of hands-on experience in migrating or deploying GCP cloud-based solutions. At least 3 years of experience in architecting, implementing, and supporting GCP infrastructure and topologies. Over 3 years of experience with GCP IaC, particularly with Terraform, including writing and maintaining Terraform configurations and modules. Experience in deploying container-based systems such as Docker or Kubernetes on both private and public clouds (GCP GKE). Familiarity with CI/CD tools (e.g., GitHub) and processes. Certifications: GCP ACE certification is mandatory. CKA certification is highly desirable. HashiCorp Terraform certification is a significant plus.
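In the spirit of the diagnostic support work above, here is a small hedged sketch using the official Kubernetes Python client to list pods that are not Running; the namespace is an assumption, and cluster credentials (e.g., a GKE kubeconfig) must already be configured.

```python
# List non-Running pods in a namespace via the official Kubernetes Python client.
# The namespace is hypothetical; auth comes from the current kubeconfig context.
from kubernetes import client, config

config.load_kube_config()  # e.g., a context created by `gcloud container clusters get-credentials`
v1 = client.CoreV1Api()

for pod in v1.list_namespaced_pod(namespace="default").items:
    if pod.status.phase != "Running":
        # Surface pods stuck in Pending/Failed/etc. for triage.
        print(pod.metadata.name, pod.status.phase)
```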
Posted 1 month ago
4 - 9 years
16 - 31 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Role & responsibilities: Execute project-specific development activities in accordance with applicable standards and quality parameters. Develop and review code; set up the right environment for the projects. Ensure delivery within schedule by adhering to the engineering and quality standards. Own and deliver end-to-end projects within GCP for the Payments Data Platform. Once a month, be available on the support rota for a week for GCP 24x7 on-call. Basic knowledge of Payments ISO standards, message types, etc. Able to work under pressure on deliverables (P1 violations/incidents). Should be fluent and clear in communication, written and verbal. Should be able to follow Agile ways of working. Must have hands-on experience with Java and GCP; shell script and Python knowledge a plus. Have in-depth knowledge of Java and Spring Boot. Should have experience with GCP Dataflow, Bigtable, BigQuery, etc. Should have experience managing large databases. Should have worked on requirements, design, and development of event-driven and near-real-time data patterns (ingress/egress).
Posted 1 month ago
4 - 9 years
10 - 14 Lacs
Pune
Hybrid
Job Description, Technical Skills: Top skills for this position: Google Cloud Platform (Composer, BigQuery, Airflow, Dataproc, Dataflow, GCS). Data warehousing knowledge. Hands-on experience in the Python language and SQL databases. Analytical technical skills to be able to predict the consequences of configuration changes (impact analysis), to identify root causes that are not obvious, and to understand the business requirements. Excellent communication with different stakeholders (business, technical, project). Good understanding of the overall Big Data and Data Science ecosystem. Experience with building and deploying containers as services using Swarm/Kubernetes. Good understanding of container concepts like building lean and secure images. Understanding of modern DevOps pipelines. Experience with streaming data pipelines using Kafka or Pub/Sub (mandatory for Kafka resources). Good to have: Professional Data Engineer or Associate Data Engineer certification. Roles and Responsibilities: Design, build & manage Big Data ingestion and processing applications on Google Cloud using BigQuery, Dataflow, Composer, Cloud Storage, and Dataproc. Performance tuning and analysis of Spark, Apache Beam (Dataflow) or similar distributed computing tools and applications on Google Cloud. Good understanding of Google Cloud concepts, environments and utilities to design cloud-optimal solutions for Machine Learning applications. Build systems to perform real-time data processing using Kafka, Pub/Sub, Spark Streaming or similar technologies. Manage the development life-cycle for agile software development projects. Convert proofs of concept into industrialized Machine Learning models (MLOps). Provide solutions to complex problems. Deliver customer-oriented solutions in a timely, collaborative manner. Proactive thinking, planning and understanding of dependencies. Develop & implement robust solutions in test & production environments.
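To make the Composer/Airflow-plus-BigQuery combination above concrete, here is a hedged minimal DAG sketch; the project, dataset, SQL, and schedule are placeholders, not the employer's actual pipeline.

```python
# A minimal Cloud Composer (Airflow) DAG that runs a daily BigQuery load job.
# Project, dataset, table, and SQL are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_events_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load = BigQueryInsertJobOperator(
        task_id="load_daily_events",
        configuration={
            "query": {
                # {{ ds }} is Airflow's templated execution date.
                "query": (
                    "SELECT * FROM `example-project.raw.events` "
                    "WHERE event_date = '{{ ds }}'"
                ),
                "destinationTable": {
                    "projectId": "example-project",
                    "datasetId": "curated",
                    "tableId": "events",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )
```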
Posted 1 month ago
10 - 20 years
25 - 40 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Hi, hope you are looking for a job change. We have an opening for a GCP Data Architect for an MNC, Pan India location. I'm sharing the JD with you; please have a look and revert with the details below and your updated resume. Apply only if you can join in 10 days. It is 5 days work from office. We don't process high-notice-period candidates.
Role: GCP Data Architect. Experience: 10+ years. Mode: Permanent. Work Location: Pan India. Notice Period: Immediate to 10 days. Mandatory Skills: GCP, architecture experience, Big Data, data modelling, BigQuery.
Details requested:
Full Name (as per Aadhaar card):
Email ID:
Mobile Number:
Alternate No:
Qualification:
Graduation Year:
Regular Course:
Total Experience:
Relevant Experience:
Current Organization:
Working as Permanent Employee:
Payroll Company:
Experience in GCP:
Experience in Architecture:
Experience in GCP Data Architecture:
Experience in Big Data:
Experience in BigQuery:
Experience in Data Management:
Official Notice Period:
Serving Notice Period:
Current Location:
Preferred Location:
Current CTC:
Expected CTC:
CTC Breakup:
PAN Card Number:
Date of Birth:
Any Offer in Hand:
Offered CTC:
LWD:
Can You Join Immediately:
Ready to Work from Office for 5 Days:
Job Description: GCP Data Architect. We are seeking a skilled Data Solution Architect to design solutions and lead the implementation on GCP. The ideal candidate will possess extensive experience in data architecture, solution design, and data management practices. Responsibilities: Architect and design end-to-end data solutions on the cloud platform, focusing on data warehousing and big data platforms. Collaborate with clients, developers, and architecture teams to understand requirements and translate them into effective data solutions. Develop high-level and detailed data architecture and design documentation. Implement data management and data governance strategies, ensuring compliance with industry standards. Architect both batch and real-time data solutions, leveraging cloud-native services and technologies. Design and manage data pipeline processes for historic data migration and data integration. Collaborate with business analysts to understand domain data requirements and incorporate them into the design deliverables. Drive innovation in data analytics by leveraging cutting-edge technologies and methodologies. Demonstrate excellent verbal and written communication skills to communicate complex ideas and concepts effectively. Stay updated on the latest advancements in data analytics, data architecture, and data management techniques. Requirements: Minimum of 5 years of experience in a Data Architect role, supporting warehouse and cloud data platforms/environments. Experience with common GCP services such as BigQuery, Dataflow, GCS, Service Accounts, and Cloud Functions. Extremely strong in BigQuery design and development. Extensive knowledge and implementation experience in data management, governance, and security frameworks. Proven experience in creating high-level and detailed data architecture and design documentation. Strong aptitude for business analysis to understand domain data requirements. Proficiency in data modelling using any modelling tool for conceptual, logical, and physical models is preferred. Hands-on experience with architecting end-to-end data solutions for both batch and real-time designs. Ability to collaborate effectively with clients, developers, and architecture teams to implement enterprise-level data solutions.
Familiarity with Data Fabric and Data Mesh architecture is a plus. Excellent verbal and written communication skills. Regards, Rejeesh S. Email: rejeesh.s@jobworld.jobs. Mobile: +91-9188336668
Posted 1 month ago
1 - 6 years
4 - 9 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Role & responsibilities: To handle field sales (cloud products). 1. Primarily a hunter and hustler personality with 1-8 years of experience in the SME & Enterprise segment. Strong enterprise sales background in the solutions/SaaS space, ideally with knowledge of AI/Azure/AWS/GCP/GWS. 2. Sell the Google Cloud products and services to new and existing clients. Identify and properly qualify Cloud opportunities. Present Cloud solutions at the executive level (C-level executives). Lead negotiations and overcome objections for deal closure. Manage complex sales cycles and multiple engagements simultaneously. Work with partner sales consultants to discover, identify and meet customer requirements. 3. Prepare accurate BOQs, sales forecasts and sales cycle reporting. Provide hand-holding to ensure the success of potential or current clients. Leverage and enhance partner relationships to drive additional value and revenue. 4. Forge strong working relationships with partners. Encourage and develop increased awareness of Microsoft Cloud services among partners. Collaborate with channel partners' executive, sales, and technical teams. Develop and execute successful targeted territory development plans/GTM to help achieve growth and revenue. Monitor and report sales activity within the system. 5. Generate new ARR and long-term TCVs by landing new clients. Create a territory-specific sales strategy aligned to Redington Limited GTM plans and execute on it. Grow business by signing new partnerships and leveraging existing ones. Preferred candidate profile: B2B Sales Specialists with knowledge of GCP/GWS products are preferred. Perks and benefits: Good salary and work-life balance.
Posted 1 month ago
4 - 8 years
10 - 20 Lacs
Hyderabad
Hybrid
Job Description: We are seeking a highly motivated and experienced ML Engineer/Data Scientist to join our growing ML/GenAI team. You will play a key role in designing, developing and productionizing ML applications by evaluating models, training and/or fine-tuning them. You will play a crucial role in developing Gen AI based solutions for our customers. As a senior member of the team, you will take ownership of projects, collaborating with engineers and stakeholders to ensure successful project delivery. What we're looking for: At least 3 years of experience in designing & building AI applications for customers and deploying them into production. At least 5 years of software engineering experience in building secure, scalable and performant applications for customers. Experience with document extraction using AI, conversational AI, vision AI, NLP or Gen AI. Design, develop, and operationalize existing ML models by fine-tuning and personalizing them. Evaluate machine learning models and perform necessary tuning. Develop prompts that instruct LLMs to generate relevant and accurate responses. Collaborate with data scientists and engineers to analyze and preprocess datasets for prompt development, including data cleaning, transformation, and augmentation. Conduct thorough analysis to evaluate LLM responses; iteratively modify prompts to improve LLM performance. Hands-on customer experience with RAG solutions or fine-tuning of LLM models. Build and deploy scalable machine learning pipelines on GCP or any equivalent cloud platform involving data warehouses, machine learning platforms, dashboards or CRM tools. Experience working with the end-to-end steps involving, but not limited to, data cleaning, exploratory data analysis, dealing with outliers, handling imbalances, analyzing data distributions (univariate, bivariate, multivariate), transforming numerical and categorical data into features, feature selection, model selection, model training and deployment. Proven experience building and deploying machine learning models in production environments for real-life applications. Good understanding of natural language processing, computer vision or other deep learning techniques. Expertise in Python, NumPy, pandas and various ML libraries (e.g., XGBoost, TensorFlow, PyTorch, Scikit-learn, LangChain). Familiarity with Google Cloud or any other cloud platform and its machine learning services. Excellent communication, collaboration, and problem-solving skills.
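As a deliberately minimal sketch of the retrieval step behind the RAG solutions mentioned above: the embed() function below is a hypothetical stand-in for a real embedding model (e.g., Vertex AI embeddings or sentence-transformers), and the final LLM call is omitted.

```python
# Minimal retrieval-augmented prompting sketch. embed() is a placeholder
# pseudo-embedding used only so the example runs self-contained; a real
# system would call an embedding model instead.
import numpy as np

def embed(text: str) -> np.ndarray:
    # Hypothetical placeholder: deterministic pseudo-embedding per text
    # within one process run. Not a real semantic embedding.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.standard_normal(128)

docs = ["Policy covers fire damage.", "Claims must be filed within 30 days."]
doc_vecs = np.stack([embed(d) for d in docs])

query = "How long do I have to file a claim?"
q = embed(query)

# Cosine similarity between the query and each document.
scores = doc_vecs @ q / (np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(q))
context = docs[int(np.argmax(scores))]

# The retrieved context is injected into the prompt sent to the LLM.
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
print(prompt)
```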
Posted 1 month ago
3 - 7 years
10 - 14 Lacs
Pune
Work from Office
About The Role: Job Title: GCP Data Engineer, AS. Location: Pune, India. Role Description: An Engineer is responsible for designing and developing entire engineering solutions to accomplish business goals. Key responsibilities of this role include ensuring that solutions are well architected, with maintainability and ease of testing built in from the outset, and that they can be integrated successfully into the end-to-end business process flow. They will have gained significant experience through multiple implementations and have begun to develop both depth and breadth in several engineering competencies. They have extensive knowledge of design and architectural patterns. They will provide engineering thought leadership within their teams and will play a role in mentoring and coaching less experienced engineers. What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy: Best-in-class leave policy. Gender-neutral parental leaves. 100% reimbursement under the childcare assistance benefit (gender neutral). Sponsorship for industry-relevant certifications and education. Employee Assistance Program for you and your family members. Comprehensive hospitalization insurance for you and your dependents. Accident and term life insurance. Complimentary health screening for those 35 years and above. Your key responsibilities: Design, develop and maintain data pipelines using the Python and SQL programming languages on GCP. Experience in Agile methodologies, ETL, ELT, data movement and data processing skills. Work with Cloud Composer to manage and process batch data jobs efficiently. Develop and optimize complex SQL queries for data analysis, extraction, and transformation. Develop and deploy Google Cloud services using Terraform. Implement CI/CD pipelines using GitHub Actions. Consume and host REST APIs using Python. Monitor and troubleshoot data pipelines, resolving any issues in a timely manner. Ensure team collaboration using Jira, Confluence, and other tools. Ability to quickly learn any new or existing technologies. Strong problem-solving skills. Write advanced SQL and Python scripts. Certification as a Professional Google Cloud Data Engineer will be an added advantage. Your skills and experience: 6+ years of IT experience as a hands-on technologist. Proficient in Python for data engineering. Proficient in SQL. Hands-on experience with GCP Cloud Composer, Dataflow, BigQuery, Cloud Functions, and Cloud Run; good to have GKE. Hands-on experience in REST API hosting and consumption. Proficient in Terraform (HashiCorp). Experienced in GitHub and GitHub Actions. Experienced in CI/CD. Experience in automating ETL testing using Python and SQL. Good to have: Apigee. Good to have: Bitbucket. How we'll support you: Training and development to help you excel in your career. Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs.
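To illustrate the Python-plus-SQL-on-BigQuery work this role describes, here is a hedged sketch using the official google-cloud-bigquery client; the project, dataset, and query are assumptions for illustration only.

```python
# Run a parameterized BigQuery query from Python with the official client.
# Project, dataset, table, and column names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # assumed project id

query = """
    SELECT trade_date, COUNT(*) AS trades
    FROM `example-project.raw.trades`
    WHERE trade_date = @run_date
    GROUP BY trade_date
"""
job = client.query(
    query,
    job_config=bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("run_date", "DATE", "2024-01-01")
        ]
    ),
)

for row in job.result():
    print(row.trade_date, row.trades)
```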
Posted 1 month ago
5 - 7 years
0 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Job Title: Data Visualization Software Developer Engineer. Location: Bangalore. Experience: 5-7 years. Job Description, Role Overview: We are looking for a skilled Data Visualization Software Developer Engineer with 6-8 years of experience in developing interactive dashboards and data-driven solutions using Looker and LookerML. The ideal candidate will have expertise in Google Cloud Platform (GCP) and BigQuery and a strong understanding of data visualization best practices. Experience in the media domain (OTT, DTH, Web) will be a plus. Key Responsibilities: Design, develop, and optimize interactive dashboards using Looker and LookerML. Work with BigQuery to create efficient data models and queries for visualization. Develop LookML models, explores, and derived tables to support business intelligence needs. Optimize dashboard performance by implementing best practices in data aggregation and visualization. Collaborate with data engineers, analysts, and business teams to understand requirements and translate them into actionable insights. Implement security and governance policies within Looker to ensure data integrity and controlled access. Leverage Google Cloud Platform (GCP) services to build scalable and reliable data solutions. Maintain documentation and provide training to stakeholders on using Looker dashboards effectively. Troubleshoot and resolve issues related to dashboard performance, data accuracy, and visualization constraints. Maintain and optimize existing Looker dashboards and reports to ensure continuity and alignment with business KPIs. Understand, audit, and enhance existing LookerML models to ensure data integrity and performance. Build new dashboards and data visualizations based on business requirements and stakeholder input. Collaborate with data engineers to define and validate data pipelines required for dashboard development and ensure the timely availability of clean, structured data. Document existing and new Looker assets and processes to support knowledge transfer, scalability, and maintenance. Support the transition/handover process by acquiring detailed knowledge of legacy implementations and ensuring a smooth takeover. Required Skills & Experience: 6-8 years of experience in data visualization and business intelligence using Looker and LookerML. Strong proficiency in writing and optimizing SQL queries, especially for BigQuery. Experience in Google Cloud Platform (GCP), particularly with BigQuery and related data services. Solid understanding of data modeling, ETL processes, and database structures. Familiarity with data governance, security, and access controls in Looker. Strong analytical skills and the ability to translate business requirements into technical solutions. Excellent communication and collaboration skills. Expertise in Looker and LookerML, including Explore creation, Views, and derived tables. Strong SQL skills for data exploration, transformation, and validation. Experience in BI solution lifecycle management (build, test, deploy, maintain). Excellent documentation and stakeholder communication skills for handovers and ongoing alignment. Strong data visualization and storytelling abilities, focusing on user-centric design and clarity. Preferred Qualifications: Experience working in the media industry (OTT, DTH, Web) and handling large-scale media datasets. Knowledge of other BI tools like Tableau, Power BI, or Data Studio is a plus. Experience with Python or scripting languages for automation and data processing.
Understanding of machine learning or predictive analytics is an advantage.
Posted 1 month ago
7 - 12 years
15 - 20 Lacs
Navi Mumbai, Bengaluru, Mumbai (All Areas)
Work from Office
Key Responsibilities: Design, implement, and maintain GCP cloud infrastructure using Infrastructure as Code (IaC) tools. Manage and optimize Kubernetes clusters on GKE (Google Kubernetes Engine). Build and maintain CI/CD pipelines for efficient application delivery. Monitor GCP infrastructure costs and drive optimization strategies. Develop observability solutions using GCP-native and third-party tools. Collaborate with engineering teams to streamline deployment and operations workflows. Enforce security best practices and ensure compliance with internal and industry standards. Design and implement high availability (HA) and disaster recovery (DR) architectures. Mandatory Technical Skills: GCP Services: Compute Engine, VPC, Cloud Storage, Cloud SQL, IAM, Cloud DNS, Cloud Monitoring. Infrastructure as Code: Terraform (preferred), Deployment Manager. Containerization: Docker, Kubernetes (GKE expertise required). CI/CD Tools: GitHub Actions, Cloud Build, Jenkins, or similar. Version Control: Git. Scripting Languages: Python, Bash. Monitoring & Logging: Stackdriver, Prometheus, Grafana, ELK Stack. Strong experience with automation and configuration management (Terraform, Ansible, etc.). Solid understanding of cloud security best practices. Experience designing fault-tolerant, resilient cloud-native architectures. 4-7 years in DevOps/Cloud Engineering roles. Minimum 2+ years hands-on with GCP infrastructure and services. Proven experience managing CI/CD pipelines and container-based deployments. Strong background in modern DevOps tools and cloud-native architectures.
Posted 1 month ago
3 - 8 years
8 - 18 Lacs
Bengaluru
Remote
JD for CAA (Cloud Architect Advisory). Required skill set & experience: SAP Basis, SAP HANA and S/4HANA skills, migrations/upgrades, and experience on any cloud (AWS, MS Azure). Good communication skills; able to articulate; experience in customer-facing roles. Scope for the career: Scope to learn new technologies on SAP. Will be working in a role elevated from Basis consultant. Exposure to multiple customers across various industries. Roles and Responsibilities: As a Technical Architect, the roles & responsibilities are: Bill of Material Review & Analysis: Review the technical accuracy of the Bill of Material. Provide feedback to the Sales Team on the Bill of Material. Do technical validation of the customer landscape. Analyze EarlyWatch reports & technical data. Work closely with the Sales Team on architecture, landscape, and the to-be state. Size the to-be landscape for migrations. Customer Deal Support (Pre-Signature): Prepare technical assessments. Conduct technical assessments with customers. Perform workshops with customers (networking, HA concepts, RACI reviews). Customer Deal Support (Post-Signature): Onboarding questionnaire support (assist customers with questions on onboarding). Answer customer questions relating to delivery. Provide delivery with key information during the sales cycle. Provide turnover to delivery of key items during the sales cycle. Answer delivery questions specific to the delivery. Technical Skill Set (CAA) Needed: Technical expertise in the SAP Basis area. Good understanding & hands-on experience required in the S/4HANA application & HANA database. Hands-on experience in any hyperscaler (AWS/Azure/GCP) is needed. Experience in SAP upgrades & migrations (OS/DB) is required. Experience in SaaS products (Ariba, Salesforce, C4S etc.) integration with an SAP landscape is a plus.
Posted 1 month ago
6 - 10 years
15 - 19 Lacs
Hyderabad, Ahmedabad
Hybrid
Summary: As a Senior SRE, you will ensure platform reliability, incident management, and performance optimization. You'll define SLIs/SLOs, contribute to robust observability practices, and drive proactive reliability engineering across services. Experience Required: 6-10 years of SRE or infrastructure engineering experience in cloud-native environments. Mandatory: Cloud: GCP (GKE, Load Balancing, VPN, IAM). Observability: Prometheus, Grafana, ELK, Datadog. Containers & Orchestration: Kubernetes, Docker. Incident Management: On-call, RCA, SLIs/SLOs. IaC: Terraform, Helm. Incident Tools: PagerDuty, OpsGenie. Nice to have: GCP Monitoring, SkyWalking. Service Mesh, API Gateway. GCP Spanner, MongoDB (basic).
Posted 1 month ago