Get alerts for new jobs matching your selected skills, preferred locations, and experience range. Manage Job Alerts
10.0 - 17.0 years
10 - 20 Lacs
Pune, Chennai, Bengaluru
Hybrid
Job Description:

Cloud Infrastructure & Deployment
- Design and implement secure, scalable, and highly available cloud infrastructure on GCP.
- Provision and manage compute, storage, network, and database services.
- Automate infrastructure using Infrastructure as Code (IaC) tools such as Terraform or Deployment Manager.

Architecture & Design
- Translate business requirements into scalable cloud solutions.
- Recommend GCP services aligned with application needs and cost optimization.
- Participate in high-level architecture and solution design discussions.

DevOps & Automation
- Build and maintain CI/CD pipelines (e.g., using Cloud Build, Jenkins, GitLab CI).
- Integrate monitoring, logging, and alerting (e.g., Stackdriver / Cloud Operations Suite).
- Enable autoscaling, load balancing, and zero-downtime deployments.

Security & Compliance
- Ensure compliance with security standards and best practices.

Migration & Optimization
- Support cloud migration projects from on-premise or other cloud providers to GCP.
- Optimize performance, reliability, and cost of GCP workloads.

Documentation & Support
- Maintain technical documentation and architecture diagrams.
- Provide L2/L3 support for GCP-based services and incidents.

Required Skills and Qualifications:
- Google Cloud Certification: Associate Cloud Engineer or Professional Cloud Architect/Engineer
- Hands-on experience with GCP services (Compute Engine, GKE, Cloud SQL, BigQuery, etc.)
- Strong command of Linux, shell scripting, and networking fundamentals
- Proficiency in Terraform, Cloud Build, Cloud Functions, or other GCP-native tools
- Experience with containers and orchestration – Docker, Kubernetes (GKE)
- Familiarity with monitoring/logging – Cloud Monitoring, Prometheus, Grafana
- Understanding of IAM, VPCs, firewall rules, service accounts, and Cloud Identity
- Excellent written and verbal communication skills
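The IaC automation this role calls for follows a plan-then-apply model: diff the desired state against what exists, then converge. A toy Python illustration of that idea (the resource names and state dicts are invented for the sketch, not a real Terraform or Deployment Manager API):

```python
# Toy illustration of the IaC plan/apply model.
# Resource names and configs below are hypothetical examples.

def plan(current: dict, desired: dict) -> dict:
    """Diff current infrastructure state against the desired state."""
    actions = {}
    for name, cfg in desired.items():
        if name not in current:
            actions[name] = "create"
        elif current[name] != cfg:
            actions[name] = "update"
    for name in current:
        if name not in desired:
            actions[name] = "destroy"
    return actions

def apply(current: dict, desired: dict) -> dict:
    """Converge to the desired state (idempotent: reapplying changes nothing)."""
    plan(current, desired)  # real tools show this plan for approval first
    return dict(desired)

current = {"vm-web": {"machine_type": "e2-small"}}
desired = {"vm-web": {"machine_type": "e2-medium"},
           "sql-main": {"tier": "db-f1-micro"}}
print(plan(current, desired))  # vm-web needs an update, sql-main a create
```

Running `plan` against an already-converged state yields an empty action set, which is the property that makes IaC runs safe to repeat.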
Posted 2 weeks ago
4.0 - 9.0 years
7 - 17 Lacs
Mumbai, Mumbai Suburban, Mumbai (All Areas)
Hybrid
Job Title: GCP Data Engineer Senior Associate / Manager
Experience: 4 to 11 years
Location: Mumbai
Notice period: Immediate to 30 days

Job Description
Designing, building, and deploying cloud solutions for enterprise applications, with expertise in Cloud Platform Engineering.
- Expertise in application migration projects, including optimizing technical reliability and improving application performance
- Good understanding of cloud security frameworks and cloud security standards
- Solid knowledge and extensive experience of GCP and its cloud services
- Experience with GCP services such as Compute Engine, Dataproc, Dataflow, BigQuery, Secret Manager, Kubernetes Engine, etc.
- Experience with Google storage products like Cloud Storage, Persistent Disk, Nearline, Coldline, and Cloud Filestore
- Experience with database products like Datastore, Cloud SQL, Cloud Spanner & Cloud Bigtable
- Experience implementing containers using cloud-native container orchestrators in GCP
- Strong cloud programming skills, with experience in API and Cloud Functions development using Python
- Hands-on experience with enterprise config & DevOps tools including Ansible, Bitbucket, Git, Jira, and Confluence
- Strong knowledge of cloud security practices and Cloud IAM policy preparation for GCP
- Knowledge and experience in API development, AI/ML, Data Lake, Data Analytics, and cloud monitoring tools like Stackdriver
- Ability to participate in fast-paced DevOps and System Engineering teams within Scrum agile processes
- Should have an understanding of data modelling and data warehousing concepts
- Understand the current application infrastructure and suggest changes to it
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
You will be joining NTT DATA as a GCP BigQuery Developer in Hyderabad, Telangana, India. As a Sr. Application Developer for GCP, your primary responsibilities will include applying your ETL experience and your expertise in Google Cloud Platform BigQuery, SQL, and Linux. In addition to these mandatory skills, experience with Cloud Run and Cloud Functions would be beneficial for this role. We are looking for a Senior ETL Developer with a strong hands-on background in Linux and SQL. While experience in GCP BigQuery is preferred, a solid conceptual understanding is required at a minimum. NTT DATA is a trusted global innovator providing business and technology services to 75% of the Fortune Global 100 companies. As a Global Top Employer, we have experts in over 50 countries and a robust partner ecosystem. Our services cover consulting, data and artificial intelligence, industry solutions, as well as application, infrastructure, and connectivity development, implementation, and management. Join us to be a part of our commitment to helping clients innovate, optimize, and transform for long-term success. Visit us at us.nttdata.com to learn more about our contributions to digital and AI infrastructure worldwide.
Posted 2 weeks ago
1.0 - 4.0 years
2 - 3 Lacs
Bengaluru
Remote
We are hiring a Full Stack Developer with strong exposure to AI tools, APIs, and product development workflows. This role is for someone who can independently design, build, and deploy full-stack applications, and also integrate AI-powered components such as RSVP agents, recommendation systems, conversational flows, and automation tools.

Responsibilities
- Build and maintain full-stack web apps using React, Node.js, Python
- Integrate AI/ML APIs like OpenAI, Cohere, LangChain, Pinecone, etc.
- Architect intelligent features using vector databases, RAG pipelines, and custom agent flows
- Work on both frontend and backend, and own deployment, testing & CI/CD
- Collaborate closely with product & automation teams

Must-Have Skills
- Strong proficiency in JavaScript/TypeScript, Python, Node.js
- Comfortable with NoSQL, PostgreSQL, Firebase, or Supabase
- Experience with API integrations, automation, and microservices
- Understanding of AI agentic flows, embeddings, and webhooks
- Experience deploying products on Vercel, Render, or similar

Good to Have
- Familiarity with tools like LangChain, LlamaIndex, or OpenAI Assistants
- Working knowledge of Next.js, Tailwind CSS, and prompt engineering

Work Culture
- Fully remote
- Flat structure
- Fast execution environment
- Product-first mindset
Posted 3 weeks ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
As an iOS Developer with our B2C company, you will be responsible for building the iOS app from scratch. You will architect and develop new flows and features on our iOS app, ensuring the performance, quality, and responsiveness of the application. Your contributions will be vital in designing, architecting, and developing apps that are elegant, efficient, secure, highly available, and maintainable. Collaborating with a team, you will define, design, and ship new features, identifying and correcting bottlenecks and fixing bugs along the way. To excel in this role, you must possess strong iOS fundamentals and have experience with offline storage, threading, and performance tuning. Your proficiency in Objective-C/Swift programming, Xcode, and the iOS SDK will be crucial. Additionally, you should be well-versed in RESTful APIs to connect iOS applications to back-end services and possess expertise in iOS UI design principles, patterns, and best practices. Knowledge of Cloud Firestore, Cloud Functions, Cloud Messaging APIs, and push notifications is essential. You should have ownership skills, demonstrating the ability to own problems end-to-end. Your enthusiasm and dedication to building a product used by millions of users are key, along with your problem-solving abilities and determination to achieve results. In return, we offer a high pace of learning, the opportunity to build a product from scratch, high autonomy and ownership, and the chance to work with a great and ambitious team on something that truly matters. You will receive a top-of-the-class market salary, meaningful ESOP ownership, and benefits including health insurance, paid sick time, paid time off, and provident fund. This is a full-time, permanent position with day shift hours from Monday to Friday, including a yearly bonus. A Bachelor's degree is preferred, and you should have at least 4 years of experience in iOS development. The work location is in person.
Posted 3 weeks ago
12.0 - 14.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Summary: We are seeking an experienced and highly motivated Delivery Lead to spearhead the successful implementation of Google Contact Center AI (CCAI) solutions, with a strong emphasis on CCAI Agent Assist and Dialogflow (ES/CX). This role is critical in bridging the gap between solution design and technical execution. The Delivery Lead will be responsible for leading project teams, managing client relationships, and ensuring the on-time, on-budget, and high-quality delivery of complex conversational AI and agent augmentation projects. You will act as the primary point of contact for project stakeholders, proactively identifying and mitigating risks, and ensuring that strategic objectives are met through technical excellence.

Key Responsibilities:

Project Leadership & Management (60%):
- Lead the full lifecycle of CCAI projects from initiation and planning through execution, monitoring, control, and closure.
- Develop and manage comprehensive project plans, including scope definition, detailed timelines, resource allocation, and budget tracking.
- Serve as the primary client contact for project delivery, establishing strong relationships, managing expectations, and providing regular progress updates.
- Lead and motivate diverse project teams (Solution Architects, NLU Specialists, Engineers, QA Analysts), fostering a collaborative and high-performing environment.
- Proactively identify, assess, and mitigate project risks and issues, implementing contingency plans to ensure successful outcomes.
- Manage project scope changes effectively, ensuring proper documentation and communication to all stakeholders.
- Conduct regular internal and external project review meetings, preparing and presenting status reports to senior management and clients.
- Ensure projects adhere to defined quality standards, best practices, and governance frameworks (e.g., Agile/Scrum).

Technical Oversight & Quality Assurance (30%):
- Understand and validate the technical solution architecture for CCAI Agent Assist and Dialogflow, ensuring it aligns with client requirements and business objectives.
- Provide technical guidance and oversight to the engineering and development teams, ensuring adherence to design specifications and best practices for NLU and conversational AI.
- Specifically oversee the implementation of Dialogflow agents (intents, entities, flows, fulfillment logic) and CCAI Agent Assist features (real-time knowledge base integration, smart reply suggestions, sentiment analysis, script nudges).
- Ensure seamless integration of CCAI solutions with existing contact center platforms (e.g., Genesys, Twilio, Salesforce Service Cloud, Zendesk) and enterprise systems.
- Work closely with QA to define comprehensive testing strategies (unit, integration, UAT, performance) for conversational AI flows and agent assistance capabilities.
- Facilitate technical problem-solving during project execution, collaborating with architects and engineers to overcome complex challenges.
- Ensure solutions are built for scalability, security, reliability, and maintainability.

Stakeholder Management & Communication (10%):
- Translate technical concepts and project updates into clear, concise language for non-technical stakeholders and business leadership.
- Negotiate and resolve conflicts effectively, maintaining positive client relationships.
- Collaborate with pre-sales teams to refine project scope and estimates during the planning phase.
- Facilitate knowledge transfer and training for client teams post-deployment, ensuring successful adoption and ongoing support.

Required Qualifications:
- Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
- 12+ years of experience in technical project management, delivery leadership, or a similar client-facing role.
- 3-5+ years of demonstrable experience leading the delivery of Google Cloud-based AI solutions, with specific hands-on project experience involving: Google CCAI Agent Assist (critical); Google Dialogflow (ES and/or CX) (critical).
- Strong understanding of conversational AI principles, NLU, and contact center operations.
- Proven experience managing complex projects with cross-functional technical teams.
- Familiarity with core Google Cloud Platform (GCP) services relevant to AI deployments (e.g., Cloud Functions, BigQuery, Pub/Sub).
- Experience with Agile/Scrum methodologies and tools (e.g., Jira, Confluence).
- Exceptional leadership, communication, interpersonal, and presentation skills (both written and verbal).
- Strong analytical, problem-solving, and negotiation abilities.
- Proven ability to manage multiple projects concurrently and adapt to changing priorities.

Preferred Qualifications:
- Master's degree or PMP/Agile certification (CSM, PMI-ACP).
- Google Cloud Certification (e.g., Professional Cloud Architect, Professional Collaboration Engineer).
- Hands-on experience with contact center platforms beyond CCAI (e.g., Genesys, Avaya, Cisco, Five9).
- Experience with other Google AI services (e.g., Speech-to-Text, Text-to-Speech, Vertex AI, Gemini models) and understanding of their integration potential.
- Technical background in software development (e.g., Python, Node.js) to understand implementation complexities.
- Experience in pre-sales activities, including solution scoping and effort estimation.
- Understanding of data privacy and security best practices in a contact center environment.
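The Dialogflow fulfillment logic this role oversees is, at its core, a webhook that maps a matched intent to a response. A minimal, framework-free sketch (the intent names and reply texts are made-up examples; the "queryResult"/"fulfillmentText" field names follow the Dialogflow ES webhook format):

```python
# Minimal sketch of Dialogflow ES-style webhook fulfillment.
# Intent names and replies are hypothetical examples.

REPLIES = {
    "check.balance": "Your balance is available in the app under Accounts.",
    "agent.handoff": "Connecting you to a live agent now.",
}

def handle_webhook(request_json: dict) -> dict:
    """Map the matched intent in a webhook request to a fulfillment response."""
    intent = request_json.get("queryResult", {}).get("intent", {}).get("displayName")
    text = REPLIES.get(intent, "Sorry, I didn't catch that.")
    return {"fulfillmentText": text}

resp = handle_webhook({"queryResult": {"intent": {"displayName": "agent.handoff"}}})
```

In a real deployment this function would sit behind an HTTP endpoint (e.g., a Cloud Function) and also handle parameters, contexts, and session state; the lookup-and-respond shape stays the same.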
Posted 3 weeks ago
7.0 - 12.0 years
27 - 35 Lacs
Bengaluru
Work from Office
Job Overview
We are hiring a seasoned Site Reliability Engineer with strong experience in building and operating scalable systems on Google Cloud Platform (GCP). You will be responsible for ensuring system availability, performance, and security in a complex microservices ecosystem, while collaborating cross-functionally to improve infrastructure reliability and developer velocity.

Key Responsibilities
- Design and maintain highly available, fault-tolerant systems on GCP using SRE best practices.
- Implement SLIs/SLOs, monitor error budgets, and lead post-incident reviews with RCA documentation.
- Automate infrastructure provisioning (Terraform/Deployment Manager) and CI/CD workflows.
- Operate and optimize Kubernetes (GKE) clusters, including autoscaling, resource tuning, and HPA policies.
- Integrate observability across microservices using Prometheus, Grafana, Stackdriver, and OpenTelemetry.
- Manage and fine-tune databases (MySQL/Postgres/BigQuery/Firestore) for performance and cost.
- Improve API reliability and performance through Apigee (proxy tuning, quota/policy handling, caching).
- Drive container best practices, including image optimization, vulnerability scanning, and registry hygiene.
- Participate in on-call rotations, capacity planning, and infrastructure cost reviews.

Must-Have Skills
- Minimum 8 years of total experience, with at least 3 years in SRE, DevOps, or Platform Engineering roles.
- Strong expertise in GCP services (GKE, IAM, Cloud Run, Cloud Functions, Pub/Sub, VPC, Monitoring).
- Advanced Kubernetes knowledge: pod orchestration, secrets management, liveness/readiness probes.
- Experience in writing automation tools/scripts in Python, Bash, or Go.
- Solid understanding of incident response frameworks and runbook development.
- CI/CD expertise with GitHub Actions, Cloud Build, or similar tools.

Good to Have
- Apigee hands-on experience: API proxy lifecycle, policies, debugging, and analytics.
- Database optimization: index tuning, slow query analysis, horizontal/vertical sharding.
- Distributed monitoring and tracing: familiarity with Jaeger, Zipkin, or GCP Trace.
- Service mesh (Istio/Linkerd) and secure workload identity configurations.
- Exposure to BCP/DR planning, infrastructure threat modeling, and compliance (ISO/SOC2).

Educational & Certification Requirements
- B.Tech / M.Tech / MCA in Computer Science or equivalent.
- GCP Professional Cloud certification preferred.
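The SLI/SLO and error-budget work this role describes reduces to simple arithmetic: an availability SLO implies a fixed amount of tolerable downtime per window. A minimal Python illustration (the SLO value and window are example numbers, not from this posting):

```python
# Error-budget arithmetic behind SLO monitoring.
# A 99.9% availability SLO over 30 days tolerates 0.1% downtime.

def error_budget_minutes(slo: float, window_days: int = 30) -> float:
    """Allowed downtime (in minutes) for an availability SLO over a window."""
    total_minutes = window_days * 24 * 60
    return total_minutes * (1 - slo)

def budget_remaining(slo: float, downtime_minutes: float,
                     window_days: int = 30) -> float:
    """Fraction of the error budget still unspent (negative = budget blown)."""
    budget = error_budget_minutes(slo, window_days)
    return 1 - downtime_minutes / budget

print(round(error_budget_minutes(0.999), 1))  # 43.2 minutes per 30 days
```

When `budget_remaining` approaches zero, SRE practice is to slow feature rollouts and prioritize reliability work until the budget recovers.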
Posted 3 weeks ago
10.0 - 15.0 years
12 - 22 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Job Description:

Cloud Infrastructure & Deployment
- Design and implement secure, scalable, and highly available cloud infrastructure on GCP.
- Provision and manage compute, storage, network, and database services.
- Automate infrastructure using Infrastructure as Code (IaC) tools such as Terraform or Deployment Manager.

Architecture & Design
- Translate business requirements into scalable cloud solutions.
- Recommend GCP services aligned with application needs and cost optimization.
- Participate in high-level architecture and solution design discussions.

DevOps & Automation
- Build and maintain CI/CD pipelines (e.g., using Cloud Build, Jenkins, GitLab CI).
- Integrate monitoring, logging, and alerting (e.g., Stackdriver / Cloud Operations Suite).
- Enable autoscaling, load balancing, and zero-downtime deployments.

Security & Compliance
- Ensure compliance with security standards and best practices.

Migration & Optimization
- Support cloud migration projects from on-premise or other cloud providers to GCP.
- Optimize performance, reliability, and cost of GCP workloads.

Documentation & Support
- Maintain technical documentation and architecture diagrams.
- Provide L2/L3 support for GCP-based services and incidents.

Required Skills and Qualifications:
- Google Cloud Certification: Associate Cloud Engineer or Professional Cloud Architect/Engineer
- Hands-on experience with GCP services (Compute Engine, GKE, Cloud SQL, BigQuery, etc.)
- Strong command of Linux, shell scripting, and networking fundamentals
- Proficiency in Terraform, Cloud Build, Cloud Functions, or other GCP-native tools
- Experience with containers and orchestration – Docker, Kubernetes (GKE)
- Familiarity with monitoring/logging – Cloud Monitoring, Prometheus, Grafana
- Understanding of IAM, VPCs, firewall rules, service accounts, and Cloud Identity
- Excellent written and verbal communication skills
Posted 3 weeks ago
10.0 - 12.0 years
12 - 13 Lacs
Pune
Remote
Job Description:

Cloud Infrastructure & Deployment
- Design and implement secure, scalable, and highly available cloud infrastructure on GCP.
- Provision and manage compute, storage, network, and database services.
- Automate infrastructure using Infrastructure as Code (IaC) tools such as Terraform or Deployment Manager.

Architecture & Design
- Translate business requirements into scalable cloud solutions.
- Recommend GCP services aligned with application needs and cost optimization.
- Participate in high-level architecture and solution design discussions.
Posted 3 weeks ago
10.0 - 18.0 years
12 - 22 Lacs
Bengaluru
Hybrid
GCP Certified, with senior architect experience in design and architecture:
- GCP services (Compute Engine, GKE, Cloud SQL, BigQuery, etc.)
- Containers and orchestration
- Terraform, Cloud Build, Cloud Functions, or other GCP-native tools
- IAM, VPCs, firewall rules, service accounts, and Cloud Identity
- Grafana or any monitoring tool
Posted 4 weeks ago
5.0 - 9.0 years
15 - 20 Lacs
Chennai
Hybrid
Position Description: Require a GCP Cloud Infra Engineer with the below skills:
• Creating and using cloud-native solutions
• Establishing criteria with key stakeholders
• Identifying weak points in systems and infrastructure
• Creating mitigation plans
• Assisting the security team
• Overseeing the implementation of services or solutions
• Creating and managing monitoring, capacity planning, configuration management, and scaling
• Creating CI/CD pipelines and workflows
• Monitoring skills to fix system performance issues, Dynatrace log monitoring, and SRE dashboard creation for all API services

Skills Required: CI/CD, Cloud Functions, GitHub, Cloud Computing, Cloud Infrastructure, Google Cloud Platform
Skills Preferred: Cloud Architecture
Experience Required: 5+ Years; Bachelor's Degree
Posted 1 month ago
4.0 - 6.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Major skillset: GCP, PySpark, SQL, Python, Cloud Architecture, ETL, Automation
- 4+ years of experience in data engineering and management, with a strong focus on Spark for building production-ready data pipelines.
- Experienced in analyzing large data sets from multiple data sources and building automated testing and validations.
- Knowledge of the Hadoop ecosystem and components like HDFS, Spark, Hive, Sqoop.
- Strong Python experience.
- Hands-on SQL and HQL to write optimized queries.
- Strong hands-on experience with GCP: BigQuery, Dataproc, Airflow DAGs, Dataflow, GCS, Pub/Sub, Secret Manager, Cloud Functions, Beam.
- Ability to work in a fast-paced, collaborative environment with various stakeholders to define strategic optimization initiatives.
- Deep understanding of distributed computing, memory tuning, and Spark optimization.
- Familiar with CI/CD workflows and Git.
- Experience in designing modular, automated, and secure ETL frameworks.
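The "automated testing and validations" of large data sets mentioned above usually reduces to rule-based row checks. A plain-Python sketch of the idea (the column names and rules are hypothetical; a real pipeline would express the same checks in Spark or SQL):

```python
# Toy row-level data validation: null checks plus a range/type rule.
# Column names ("id", "amount") and rules are invented for this sketch.

def validate(rows, required=("id", "amount")):
    """Split rows into (valid_rows, errors) by simple quality rules."""
    valid, errors = [], []
    for i, row in enumerate(rows):
        if any(row.get(col) is None for col in required):
            errors.append((i, "missing required field"))
        elif not isinstance(row["amount"], (int, float)) or row["amount"] < 0:
            errors.append((i, "bad amount"))
        else:
            valid.append(row)
    return valid, errors

rows = [
    {"id": 1, "amount": 10.5},
    {"id": 2, "amount": None},
    {"id": 3, "amount": -4},
]
valid, errors = validate(rows)
```

Routing failed rows to an `errors` collection rather than dropping them silently is what makes the validation auditable.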
Posted 1 month ago
6.0 - 10.0 years
15 - 20 Lacs
Hyderabad
Work from Office
Develop, optimize, and maintain scalable data pipelines using Python and PySpark. Design and implement data processing workflows leveraging GCP services such as BigQuery, Dataflow, Cloud Functions, and Cloud Storage.
Posted 1 month ago
3.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description: We are seeking a skilled and proactive GCP Cloud Engineer with 3-5 years of hands-on experience in managing and optimizing cloud infrastructure using Google Cloud Platform (GCP). The ideal candidate will be responsible for designing, deploying, and maintaining secure and scalable cloud environments, collaborating with cross-functional teams, and driving automation and reliability across our cloud infrastructure.

Key Responsibilities:
- Design and implement cloud-native solutions on Google Cloud Platform
- Deploy and manage infrastructure using Terraform, Cloud Deployment Manager, or similar IaC tools
- Manage GCP services such as Compute Engine, GKE (Kubernetes), Cloud Storage, Pub/Sub, Cloud Functions, BigQuery, etc.
- Optimize cloud performance, cost, and scalability
- Ensure security best practices and compliance across the GCP environment
- Monitor and troubleshoot issues using Stackdriver (Cloud Monitoring)
- Collaborate with development, DevOps, and security teams
- Automate workflows and CI/CD pipelines using tools like Jenkins, GitLab CI, or Cloud Build

Technical Requirements:
- 3-5 years of hands-on experience with GCP
- Strong expertise in Terraform, GCP networking, and cloud security
- Proficient in container orchestration using Kubernetes (GKE)
- Experience with CI/CD, DevOps practices, and shell scripting or Python
- Good understanding of IAM, VPCs, firewall rules, and service accounts
- Familiarity with monitoring/logging tools like Stackdriver or Prometheus
- Strong problem-solving and troubleshooting skills

Additional Responsibilities:
- GCP Professional certification (e.g., Professional Cloud Architect, Cloud Engineer)
- Experience with hybrid-cloud or multi-cloud architecture
- Exposure to other cloud platforms (AWS, Azure) is a plus
- Strong communication and teamwork skills

Preferred Skills: Google Cloud Platform (GCP), Java, Spring Boot, .NET, Python
Posted 1 month ago
3.0 - 6.0 years
30 - 39 Lacs
Bengaluru
Work from Office
Responsibilities: * Design, develop, test & maintain full-stack applications using Python, React.js, Node.js, NestJS, TypeScript & Next.js with REST APIs on AWS/Azure/GCP cloud platforms.
Annual bonus
Posted 1 month ago
5.0 - 8.0 years
10 - 20 Lacs
Noida, Bhubaneswar, Greater Noida
Work from Office
- GA4, Firebase Analytics
- BigQuery (SQL, optimization, partitioning, clustering)
- Looker / Looker Studio (dashboarding, modeling)
- GCP tools: Cloud Storage, Pub/Sub, Dataflow, Cloud Functions
- GCP certifications, A/B testing, and product analytics preferred
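BigQuery partitioning and clustering, listed above, are declared in the table DDL. A small Python helper sketches the shape of such a statement (the dataset, table, and column names are invented for the example):

```python
# Sketch: assembling a partitioned + clustered BigQuery CREATE TABLE
# statement. "analytics.events" and its columns are hypothetical.

def create_table_ddl(table, columns, partition_col, cluster_cols):
    """Build a BigQuery-style DDL string with partitioning and clustering."""
    cols = ", ".join(f"{name} {typ}" for name, typ in columns)
    return (
        f"CREATE TABLE {table} ({cols}) "
        f"PARTITION BY DATE({partition_col}) "
        f"CLUSTER BY {', '.join(cluster_cols)}"
    )

ddl = create_table_ddl(
    "analytics.events",
    [("event_ts", "TIMESTAMP"), ("user_id", "STRING"), ("event_name", "STRING")],
    "event_ts",
    ["user_id", "event_name"],
)
print(ddl)
```

Partitioning on the timestamp lets queries that filter by date scan only the relevant partitions, and clustering on `user_id`/`event_name` further prunes data within each partition, which is where the cost savings the skill list alludes to come from.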
Posted 1 month ago
3.0 - 6.0 years
30 - 39 Lacs
Gurugram
Work from Office
Responsibilities: * Design, develop, test & maintain full-stack applications using Python, React.js, Node.js, NestJS, TypeScript & Next.js with REST APIs on AWS/Azure/GCP cloud platforms.
Annual bonus
Posted 1 month ago
4.0 - 8.0 years
10 - 18 Lacs
Hyderabad
Hybrid
About the Role: We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems.

Key Responsibilities:
- Design, develop, test, and maintain scalable ETL data pipelines using Python.
- Work extensively with Google Cloud Platform (GCP) services such as:
  - Dataflow for real-time and batch data processing
  - Cloud Functions for lightweight serverless compute
  - BigQuery for data warehousing and analytics
  - Cloud Composer for orchestration of data workflows (based on Apache Airflow)
  - Google Cloud Storage (GCS) for managing data at scale
  - IAM for access control and security
  - Cloud Run for containerized applications
- Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery.
- Implement and enforce data quality checks, validation rules, and monitoring.
- Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions.
- Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects.
- Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL.
- Document pipeline designs, data flow diagrams, and operational support procedures.

Required Skills:
- 4-6 years of hands-on experience in Python for backend or data engineering projects.
- Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.).
- Solid understanding of data pipeline architecture, data integration, and transformation techniques.
- Experience with version control systems like GitHub and knowledge of CI/CD practices.
- Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).

Good to Have (Optional Skills):
- Experience working with the Snowflake cloud data platform.
- Hands-on knowledge of Databricks for big data processing and analytics.
- Familiarity with Azure Data Factory (ADF) and other Azure data engineering tools.
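The extract-transform-load flow with cleansing and quality checks described above can be sketched end to end in a few lines. In this toy pipeline the source records, field names, and in-memory "warehouse" list are illustrative stand-ins for real sources and a BigQuery load job:

```python
# Toy ETL pipeline: extract -> transform/cleanse -> load.
# Field names and the in-memory warehouse are illustrative only.

def extract():
    """Stand-in for reading raw records from a source system."""
    return [{"name": " Alice ", "spend": "12.50"},
            {"name": "", "spend": "x"}]

def transform(records):
    """Cleanse: trim names, parse amounts, drop rows that fail checks."""
    cleaned = []
    for r in records:
        name = r["name"].strip()
        try:
            spend = float(r["spend"])
        except ValueError:
            continue  # quality rule: drop unparseable amounts
        if name:  # quality rule: drop empty names
            cleaned.append({"name": name, "spend": spend})
    return cleaned

def load(records, warehouse):
    """Stand-in for a warehouse load job; returns rows loaded."""
    warehouse.extend(records)
    return len(records)

warehouse = []
loaded = load(transform(extract()), warehouse)
```

Keeping extract, transform, and load as separate functions is what makes each stage independently testable, which is the point of the "data quality checks and monitoring" responsibility above.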
Posted 1 month ago
7.0 - 9.0 years
8 - 10 Lacs
Navi Mumbai
Work from Office
Develop scalable APIs and backend systems for AI-powered apps. Handle authentication, content automation, language handling, media streaming, and AI API integrations. Collaborate with mobile and AI teams in a fast-paced environment.
Posted 1 month ago
7.0 - 9.0 years
10 - 12 Lacs
Navi Mumbai
Hybrid
Build high-performance mobile apps using Flutter. Integrate AI APIs, manage backend tasks, and handle native module development. Work in a fast-paced setup with full ownership of features and cross-functional collaboration.
Posted 1 month ago
5.0 - 10.0 years
20 - 30 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Role & responsibilities

Key Skills:
- 3 years of experience building modern applications utilizing GCP services like Cloud Build, Cloud Functions / Cloud Run, GKE, Logging, GCS, Cloud SQL & IAM.
- Primary proficiency in Python and experience with a secondary language such as Golang or Java.
- In-depth knowledge and hands-on experience with GKE/K8s.
- High emphasis on software engineering fundamentals such as code and configuration management, CI/CD/automation, and automated testing.
- Working with operations, security, compliance, and architecture groups to develop secure, scalable, and supportable solutions.
- Working on and delivering solutions in a complex enterprise environment.
- Proficiency in designing and developing scalable and decoupled microservices, and adeptness in implementing event-driven architecture to ensure seamless and responsive service interactions.
- Proficiency in designing scalable and robust solutions leveraging cloud-native technologies and architectures.
- Expertise in managing diverse stakeholder expectations, and adeptness at prioritizing tasks to align with strategic objectives and deliver optimal outcomes.

Good-to-have knowledge, skills, and experience:
- Ability to integrate Kafka to handle real-time data.
- Proficiency in monitoring tools.
- Experience using Robot Framework for automated UAT is highly desirable.
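The decoupled, event-driven service interactions described above rest on a publish/subscribe pattern. A minimal in-process sketch of the idea (a production system would use Kafka or Pub/Sub as the broker; the topic and handler here are made up):

```python
# Minimal in-process publish/subscribe bus illustrating event-driven
# decoupling. Topic name "order.created" is a hypothetical example.

from collections import defaultdict

class EventBus:
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, handler):
        """Register a handler to be called for every event on a topic."""
        self._subs[topic].append(handler)

    def publish(self, topic, event):
        """Deliver an event to all handlers subscribed to the topic."""
        for handler in self._subs[topic]:
            handler(event)

bus = EventBus()
audit_log = []
bus.subscribe("order.created", lambda e: audit_log.append(e["id"]))
bus.publish("order.created", {"id": "o-1"})
```

The publisher never references its consumers, only the topic; that indirection is what lets services be added or removed without changing each other, which is the "seamless and responsive service interactions" goal above.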
Posted 1 month ago
5.0 - 8.0 years
5 - 8 Lacs
Mumbai, Maharashtra, India
On-site
Key Accountabilities
- Design, create, code, and support a variety of GCP, ETL, and SQL solutions.
- Apply agile techniques or methods to project execution.
- Collaborate effectively in a distributed global team environment.
- Communicate technical concepts effectively with business stakeholders and influence decision-making.
- Analyze existing processes and development requirements to enhance efficiency.
- Manage multiple stakeholders and tasks, navigating ambiguity and complexity.
- Translate business needs into insights by collaborating with architects, solution managers, and analysts.
- Maintain strong technical skills and share knowledge within the team.
- Resolve issues by working with system users, IT department, vendors, and service providers.
- Support existing data warehouse jobs and related processes.
- Utilize task/job scheduling tools like Talend, Tidal, Airflow, and Linux.
- Lead small projects/initiatives and contribute to enterprise implementations.
- Research modern development technologies and techniques proactively.
- Foster a continuous improvement and automation mindset to streamline processes.
- Train internal teams, IT functions, and business users.
- Be familiar with real-time and streaming data processes.

Minimum Qualifications
- 5-8+ years of relevant experience as a Data Engineer or similar role.
- Hands-on experience with modern cloud data engineering services.
- Understanding of SAP landscape and data governance tools.
- Basic understanding of cybersecurity requirements.
- Excellent communication, analytical, and stakeholder management skills.

Skill Proficiency
- Expert level: SQL, Python, data warehousing concepts
- Intermediate level: GCP (Cloud Storage, modeling, real-time), BigQuery, S3 / Blob Storage, Composer, Cloud Functions (Lambda / Azure Functions), dbt
- Basic level / preferred: data modeling concepts

Preferred Qualifications
- GCP Data Engineer certification
- Understanding of the CPG (Consumer Packaged Goods) industry
Posted 1 month ago
5.0 - 12.0 years
5 - 12 Lacs
Mumbai, Maharashtra, India
On-site
KEY ACCOUNTABILITIES
Design, create, code, and support a variety of data pipelines and models on any cloud technology (GCP preferred).
Partner with business analysts, architects, and other key project stakeholders to deliver business initiatives.
Seek to learn new skills, mentor newer team members, build domain expertise, and document processes.
Actively build knowledge of D&T resources, people, and technology.
Participate in the evaluation, implementation, and deployment of emerging tools and processes in the big data space.
Collaboratively troubleshoot technical and performance issues in the data space.
Lean into ambiguity and partner with others to find solutions.
Identify opportunities to contribute work to the broader GMI data community.
Manage multiple stakeholders and tasks, navigating ambiguity and complexity.
Lead small projects/initiatives and contribute effectively to the implementation of enterprise projects.
Support existing data warehouses and related jobs.
Be familiar with real-time and streaming data processes.
Proactively research up-to-date technologies and techniques for development.
Bring an automation mindset and a continuous-improvement mentality to streamline and eliminate waste in all processes.

MINIMUM QUALIFICATIONS
Identified as the technical/project lead for global projects.
Actively coaches and mentors a team of developers.
Proactively identifies potential issues, deadline slippage, and opportunities in projects/tasks, and takes timely decisions.
Demonstrates strong attention to detail and delivery accuracy.
Collaborates with business stakeholders and develops strong working relationships.
Self-motivated team player with the ability to overcome challenges and achieve desired results.
5-12 years of total experience in the ETL/data space, with a minimum of 2+ years of relevant experience in the cloud space.
Excellent communication skills, verbal and written.
Excellent analytical skills.

Expert level: cloud (storage, modeling, real-time), GCP preferred; data storage (S3 / Blob Storage); BigQuery; SQL; Composer; Cloud Functions (Lambda / Azure Functions); data warehousing concepts.
Intermediate level: Python; Kafka; Pub/Sub.
Basic level: dbt.

PREFERRED QUALIFICATIONS
GCP Data Engineer certification or other GCP certification
Understanding of the CPG industry
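The Kafka / Pub/Sub work this role lists usually reduces to aggregating an unbounded event stream over fixed time windows. A minimal sketch of a tumbling-window count in plain Python (the event shape and key names are illustrative assumptions, not from the posting):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Count (timestamp, key) events per fixed-size window -- the kind of
    aggregation a Kafka or Pub/Sub consumer performs downstream."""
    counts = defaultdict(int)
    for ts, key in events:
        # Bucket each event by the start of its window.
        window_start = ts - (ts % window_secs)
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(0, "click"), (30, "click"), (65, "click"), (70, "view")]
print(tumbling_window_counts(events))
# {(0, 'click'): 2, (60, 'click'): 1, (60, 'view'): 1}
```

A production consumer would read the events from a Kafka topic or Pub/Sub subscription and also handle late and out-of-order arrivals, which this sketch ignores.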
Posted 1 month ago
0.0 years
0 Lacs
Pune, Maharashtra, India
On-site
About VOIS:
VOIS (Vodafone Intelligent Solutions) is a strategic arm of Vodafone Group Plc, creating value and enhancing quality and efficiency across 28 countries, and operating from 7 locations: Albania, Egypt, Hungary, India, Romania, Spain, and the UK. Over 29,000 highly skilled individuals are dedicated to being Vodafone Group's partner of choice for talent, technology, and transformation. We deliver the best services across IT, Business Intelligence Services, Customer Operations, Business Operations, HR, Finance, Supply Chain, HR Operations, and many more. Established in 2006, VOIS has evolved into a global, multi-functional organisation, a Centre of Excellence for Intelligent Solutions focused on adding value and delivering business outcomes for Vodafone.

About VOIS India:
In 2009, VOIS started operating in India and has since established global delivery centres in Pune, Bangalore, and Ahmedabad. With more than 14,500 employees, VOIS India supports global markets and group functions of Vodafone and delivers best-in-class customer experience through multi-functional services in the areas of Information Technology, Networks, Business Intelligence and Analytics, Digital Business Solutions (Robotics & AI), Commercial Operations (Consumer & Business), Intelligent Operations, Finance Operations, Supply Chain Operations, HR Operations, and more.

Job Description
Role purpose:
Create detailed data architecture documentation, including data models, data flow diagrams, and technical specifications.
Create and maintain data models for databases, data warehouses, and data lakes, defining relationships between data entities to optimize data retrieval and analysis.
Design and implement data pipelines to integrate data from multiple sources, ensuring data consistency and quality across systems.
Collaborate with business stakeholders to define the overall data strategy, aligning data needs with business requirements.
Support migration of new and changed software; elaborate and perform production checks.
Effectively communicate complex data concepts to both technical and non-technical stakeholders.
GCP knowledge/experience with Cloud Composer, BigQuery, Pub/Sub, and Cloud Functions.
Strong communicator, experienced in leading and negotiating decisions and effective outcomes.
Strong overarching data architecture knowledge and experience, with the ability to govern the application of architecture principles within projects.

VOIS Equal Opportunity Employer Commitment, India:
VOIS is proud to be an Equal Employment Opportunity Employer. We celebrate differences and we welcome and value diverse people and insights. We believe that being authentically human and inclusive powers our employees' growth and enables them to create a positive impact on themselves and society. We do not discriminate based on age, colour, gender (including pregnancy, childbirth, or related medical conditions), gender identity, gender expression, national origin, race, religion, sexual orientation, status as an individual with a disability, or other applicable legally protected characteristics.

As a result of living and breathing our commitment, our employees have helped us get certified as a Great Place to Work in India for four years running. We have also been highlighted among the Top 10 Best Workplaces for Millennials, Equity, and Inclusion; Top 50 Best Workplaces for Women; Top 25 Best Workplaces in IT & IT-BPM; and the 10th Overall Best Workplace in India by the Great Place to Work Institute in 2024. These achievements position us among a select group of trustworthy and high-performing companies which put their employees at the heart of everything they do. By joining us, you are part of our commitment. We look forward to welcoming you into our family, which represents a variety of cultures, backgrounds, perspectives, and skills! Apply now, and we'll be in touch!
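The role's "data consistency and quality across systems" responsibility usually starts with schema validation at the point where source rows enter the pipeline. A minimal sketch, assuming a simple column-to-type schema (the column names are hypothetical, not from the posting):

```python
def validate_rows(rows, schema):
    """Split source rows into accepted/rejected against a column->type
    schema -- a minimal stand-in for pipeline data-quality checks."""
    good, bad = [], []
    for row in rows:
        ok = set(row) == set(schema) and all(
            isinstance(row[col], typ) for col, typ in schema.items()
        )
        (good if ok else bad).append(row)
    return good, bad

schema = {"customer_id": int, "msisdn": str}
rows = [
    {"customer_id": 1, "msisdn": "9999"},
    {"customer_id": "2", "msisdn": "8888"},  # wrong type: rejected
]
good, bad = validate_rows(rows, schema)
print(len(good), len(bad))  # 1 1
```

Rejected rows would typically be routed to a quarantine table for review rather than silently dropped, so data quality issues surface to the owning source system.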
Posted 1 month ago
1.0 - 6.0 years
1 - 6 Lacs
Gurgaon / Gurugram, Haryana, India
On-site
We are looking for a Senior Flutter & Firebase Developer with expertise in full-stack mobile development. The ideal candidate should be skilled in building scalable, secure applications while leading technical decisions and ensuring UI excellence.

Primary Skills (Must-Have)
Flutter/Dart: strong expertise in the Flutter framework and Dart programming.
State management: proficiency in BLoC/Redux for managing app state.
Firebase services: experience with Firestore, Authentication, Cloud Functions, and Realtime Database.
Cloud architecture: ability to design scalable and secure backend solutions.
Security implementation: knowledge of app security best practices.
UI/UX mastery: ability to craft pixel-perfect, highly responsive UIs.
Technical leadership: experience leading teams and making architectural decisions.

Secondary Skills (Good-to-Have)
RESTful APIs & GraphQL
Backend development (Node.js/Python/Go)
CI/CD pipelines & DevOps
Docker & Kubernetes
Performance optimization & debugging
Unit & integration testing

Role & Responsibilities
Develop and maintain high-performance mobile applications using Flutter.
Design, architect, and implement scalable Firebase-based solutions.
Ensure security best practices and optimize app performance.
Lead technical discussions and mentor junior developers.
Collaborate with cross-functional teams for seamless development.
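The BLoC pattern this posting asks for is a unidirectional flow: the UI dispatches events, the bloc maps them to new states, and listeners re-render on each state. Real BLoCs are written in Dart with streams; the language-agnostic shape can be sketched in Python (event names and the counter example are illustrative):

```python
class CounterBloc:
    """Minimal event->state sketch of the BLoC pattern: events go in,
    states come out, and the UI only observes state changes."""
    def __init__(self):
        self.state = 0
        self._listeners = []

    def listen(self, fn):
        # In Flutter this is a StreamBuilder/BlocBuilder subscription.
        self._listeners.append(fn)

    def add(self, event):
        # Events are plain strings here; a real BLoC uses typed event classes.
        if event == "increment":
            self.state += 1
        elif event == "decrement":
            self.state -= 1
        for fn in self._listeners:
            fn(self.state)

bloc = CounterBloc()
seen = []
bloc.listen(seen.append)
bloc.add("increment")
bloc.add("increment")
bloc.add("decrement")
print(bloc.state, seen)  # 1 [1, 2, 1]
```

The point of the pattern is that widgets never mutate state directly, which keeps business logic testable independently of the UI layer.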
Posted 1 month ago