
38 Cloud Pubsub Jobs

JobPe aggregates listings for easy access, but you apply directly on each employer's job portal.

5.0 - 7.0 years

4 - 9 Lacs

Bengaluru

Work from Office

Educational Requirements: MCA, MTech, Bachelor of Engineering, Bachelor of Technology, BCA, BSc, BTech
Service Line: Application Development and Maintenance

Responsibilities:
A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions, and facilitate deployment resulting in client delight. You will develop proposals by owning parts of the proposal document and giving inputs on solution design in your areas of expertise. You will plan configuration activities, configure the product as per the design, conduct conference room pilots, and assist in resolving queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates that suit the customer's budgetary requirements and align with the organization's financial guidelines. You will actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
Preferred Locations: Bangalore, Hyderabad, Chennai, Pune
Experience Required:
- 3 to 5 years: Pure hands-on expertise on the skill; able to deliver without any support
- 5 to 9 years: Design knowledge, estimation techniques, leading and guiding the team on the technical solution
- 9 to 13 years: Architecture, solutioning, and (optionally) proposals
Containerization and microservice development on AWS/Azure/GCP is preferred. In-depth knowledge of design issues and best practices; solid understanding of object-oriented programming; familiarity with design and architectural patterns and the software development process; experience implementing automated testing platforms and unit tests. Strong experience building and developing applications using technologies like Python. Knowledge of RESTful APIs and the ability to design cloud-ready applications using cloud SDKs and microservices. Exposure to cloud compute services such as VMs, PaaS services, containers, serverless, and storage services on AWS/Azure/GCP. Good understanding of application development design patterns.

Technical and Professional Requirements:
Primary Skill: Python
Secondary Skills: AWS/Azure/GCP
Preferred Skills: Technology->Machine Learning->Python
Generic Skills: Technology->Cloud Platform->AWS App Development; Technology->Cloud Platform->Azure Development & Solution Architecting; Technology->Cloud Platform->GCP DevOps
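This listing asks for cloud-ready Python services built around messaging (the page's theme is Cloud Pub/Sub). One pattern worth knowing for such roles: Pub/Sub-style brokers deliver messages at least once, so consumers must handle duplicates. Below is an illustrative, self-contained sketch of an idempotent consumer (class and field names are hypothetical, not any vendor's API):

```python
# Illustrative sketch: idempotent handling of at-least-once message delivery.
# Pub/Sub-style brokers may redeliver a message, so the consumer tracks IDs
# it has already seen and skips the work on redelivery.

class IdempotentConsumer:
    def __init__(self, handler):
        self.handler = handler   # business logic to run once per unique message
        self._seen = set()       # in production this would be durable storage
        self.processed = 0

    def receive(self, message_id: str, payload: dict) -> bool:
        """Process a message; return True if handled, False if duplicate."""
        if message_id in self._seen:
            return False         # duplicate redelivery: acknowledge, skip work
        self.handler(payload)
        self._seen.add(message_id)
        self.processed += 1
        return True

events = []
consumer = IdempotentConsumer(events.append)
consumer.receive("m-1", {"order": 42})
consumer.receive("m-1", {"order": 42})   # redelivered duplicate, ignored
consumer.receive("m-2", {"order": 43})
print(consumer.processed)                # -> 2
```

A real deployment would keep the seen-ID set in a durable store (or rely on exactly-once delivery features where available), but the dedupe logic is the same.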

Posted 4 days ago

Apply

5.0 - 7.0 years

8 - 13 Lacs

Bengaluru

Work from Office

Educational Requirements: Bachelor of Engineering, BTech, Bachelor of Technology, BCA, BSc, MTech, MCA
Service Line: Application Development and Maintenance

Responsibilities:
A day in the life of an Infoscion:
- Migrate applications to the AWS/Azure/GCP cloud
- Gather user requirements, envisioning system features and functionality
- Identify bottlenecks and bugs, and recommend system solutions by comparing the advantages and disadvantages of custom development
- Contribute to team meetings, troubleshooting development and production problems across multiple environments and operating platforms
- Understand architecture requirements and ensure effective design, development, validation, and support activities
- Understand and analyze client requirements, refactoring systems for workload migration/modernization to the cloud (AWS, Azure, GCP)
- Own end-to-end feature development and resolve challenges faced during implementation
- Create detailed design artifacts, work on development, perform code reviews, and implement validation and support activities
- Contribute to thought leadership within the area of technology specialization
If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
Preferred Locations: Bangalore, Hyderabad, Chennai, Pune
Experience Required:
- 3 to 5 years: Pure hands-on expertise on the skill; able to deliver without any support
- 5 to 9 years: Design knowledge, estimation techniques, leading and guiding the team on the technical solution
- 9 to 13 years: Architecture, solutioning, and (optionally) proposals
Containerization and microservice development on AWS/Azure/GCP is preferred. In-depth knowledge of design issues and best practices; solid understanding of object-oriented programming; familiarity with design and architectural patterns and the software development process; experience implementing automated testing platforms and unit tests. Strong experience building and developing applications using technologies like DevOps tooling and Terraform. Knowledge of RESTful APIs and the ability to design cloud-ready applications using cloud SDKs. Exposure to cloud compute services such as VMs, PaaS services, containers, serverless, and storage services on AWS/Azure/GCP. Good understanding of application development design patterns.

Technical and Professional Requirements:
Primary Skill: AWS/Azure/GCP + DevOps + Terraform
Preferred Skills: Technology->Cloud Platform->Azure App Development->Azure API Management; Technology->Cloud Platform->GCP App Development
Generic Skills: Technology->Cloud Platform->AWS App Development; Technology->Cloud Platform->Azure DevOps; Technology->Cloud Platform->GCP DevOps
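This listing centers on infrastructure-as-code with Terraform. The core idea behind any IaC tool's `plan` step is a diff between desired and actual state. A toy, purely illustrative Python sketch of that reconciliation (not Terraform's actual algorithm or API):

```python
# Conceptual sketch of what an IaC tool like Terraform does on `plan`:
# diff the desired state against the actual state and emit
# create/update/delete actions. Resource names below are hypothetical.

def plan(desired: dict, actual: dict) -> dict:
    """Compare two {resource_name: config} maps and return the change set."""
    return {
        "create": sorted(set(desired) - set(actual)),        # in desired only
        "delete": sorted(set(actual) - set(desired)),        # in actual only
        "update": sorted(k for k in desired.keys() & actual.keys()
                         if desired[k] != actual[k]),        # config drifted
    }

desired = {"vm-web": {"size": "e2-small"}, "bucket-logs": {"region": "us"}}
actual  = {"vm-web": {"size": "e2-micro"}, "vm-old": {"size": "n1"}}
print(plan(desired, actual))
# -> {'create': ['bucket-logs'], 'delete': ['vm-old'], 'update': ['vm-web']}
```

Real tools add dependency ordering, providers, and state locking on top, but the desired-vs-actual diff is the mental model interviewers usually probe.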

Posted 6 days ago

Apply

5.0 - 9.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Educational Requirements: Bachelor of Engineering, Bachelor of Technology, MSc, MTech, MCA
Service Line: Application Development and Maintenance

Responsibilities:
- Migrate applications to the AWS cloud
- Gather user requirements, envisioning system features and functionality
- Identify bottlenecks and bugs, and recommend system solutions by comparing the advantages and disadvantages of custom development
- Contribute to team meetings, troubleshooting development and production problems across multiple environments and operating platforms
- Understand architecture requirements and ensure effective design, development, validation, and support activities
- Understand and analyze client requirements, refactoring systems for workload migration/modernization to the cloud (AWS)
- Own end-to-end feature development and resolve challenges faced during implementation
- Create detailed design artifacts, work on development, perform code reviews, and implement validation and support activities
- Contribute to thought leadership within the area of technology specialization

Additional Responsibilities:
Skills: Containerization and microservice development on AWS is preferred. In-depth knowledge of design issues and best practices; solid understanding of object-oriented programming; familiarity with design and architectural patterns and the software development process; experience implementing automated testing platforms and unit tests. Strong experience building applications using technologies like Java/J2EE and Spring Boot/Python. Knowledge of RESTful APIs and the ability to design cloud-ready applications using cloud SDKs and microservices. Exposure to cloud compute services such as VMs, PaaS services, containers, serverless, and storage services on AWS. Good understanding of application development design patterns.

Competencies: Good verbal and written communication skills; ability to communicate effectively with remote teams; high flexibility to travel; ability to work both independently and in a multi-disciplinary team environment.

Technical and Professional Requirements: AWS (ELB/RDS/EC2/S3/IAM), Java/J2EE, Spring Boot/Python
Preferred Skills: Technology->Cloud Security->AWS - Infrastructure Security->AWS Systems Manager
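Roles like this one, calling AWS services such as EC2 and S3 from application code, lean heavily on retry-with-backoff for transient errors (throttling, brief outages). A hedged, self-contained sketch of the pattern; the delay schedule is illustrative, not an AWS-documented one:

```python
# Hedged sketch: capped exponential backoff, the usual retry pattern for
# transient cloud API errors. Delay values are illustrative only.
import time

def backoff_delays(base: float = 0.5, cap: float = 8.0, attempts: int = 6):
    """Return the wait (seconds) before each retry: base * 2^n, capped."""
    return [min(cap, base * (2 ** n)) for n in range(attempts)]

def call_with_retry(fn, attempts: int = 6):
    """Run fn(), retrying on exception with exponential backoff."""
    last_err = None
    for delay in backoff_delays(attempts=attempts):
        try:
            return fn()
        except Exception as err:   # real code would catch throttling errors only
            last_err = err
            time.sleep(0)          # would be time.sleep(delay); 0 keeps the demo fast
    raise last_err

print(backoff_delays())            # -> [0.5, 1.0, 2.0, 4.0, 8.0, 8.0]
```

Production SDKs (e.g., the AWS SDK's built-in retry modes) already implement this with jitter; the sketch just shows the shape of the logic.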

Posted 6 days ago

Apply

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

We are seeking a Senior Data Architect with over 7 years of experience in data architecture roles. You will design and implement scalable, secure, and cost-effective data architectures on Google Cloud Platform (GCP), and lead the design and development of data pipelines using tools such as BigQuery, Dataflow, and Cloud Storage. You will architect and implement data lakes, data warehouses, and real-time data processing solutions on GCP, ensure the data architecture aligns with business objectives, governance, and compliance requirements, and collaborate with stakeholders to define the data strategy and roadmap.

You will design and deploy BigQuery solutions for optimized performance and cost efficiency, build and maintain ETL/ELT pipelines for large-scale data processing, and use Cloud Pub/Sub, Dataflow, and Cloud Functions for real-time data integration. You will implement best practices for data security, privacy, and compliance in cloud environments, and integrate machine learning workflows with data pipelines and analytics tools. You will define data governance frameworks, manage data lineage, and lead data modeling efforts to ensure consistency, accuracy, and performance across systems. You will optimize cloud infrastructure for scalability, performance, and reliability, mentor junior team members, ensure adherence to architectural standards, and collaborate with DevOps teams on Infrastructure as Code (Terraform, Cloud Deployment Manager).

You will also ensure high availability and disaster recovery solutions are integrated into data systems, conduct technical reviews, audits, and performance tuning for data solutions, and design multi-region and multi-cloud data architectures. Staying current with emerging technologies and trends in data engineering and GCP will be crucial to driving innovation, including recommending new tools and services on GCP.

Preferred qualifications: Google Cloud certification. Primary skills: 7+ years of data architecture experience; expertise in BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, and related GCP services; strong proficiency in SQL, Python, or other data processing languages; experience with cloud security, data governance, and compliance frameworks; strong problem-solving skills and the ability to architect solutions for complex data environments. Leadership experience and excellent communication and collaboration skills are also highly valued.

Role: Senior Data Architect
Location: Trivandrum/Bangalore
Close Date: 14-03-2025
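The listing pairs BigQuery design with cost efficiency; the lever is that on-demand BigQuery queries are billed by bytes scanned, so partitioning and clustering pay off by shrinking the scan. A back-of-envelope sketch (the $/TiB rate varies by region and edition, so it is a parameter here, not an authoritative price):

```python
# Back-of-envelope sketch of BigQuery on-demand query cost: billing is by
# bytes scanned. The usd_per_tib rate is an assumption, not a quoted price.

TIB = 1024 ** 4   # one tebibyte in bytes

def query_cost_usd(bytes_scanned: int, usd_per_tib: float = 6.25) -> float:
    """Estimated cost of one on-demand query from its bytes-scanned estimate."""
    return (bytes_scanned / TIB) * usd_per_tib

# Partitioning/clustering cut cost by shrinking bytes scanned:
full_scan = 2 * TIB          # hypothetical unpartitioned table scan
pruned = full_scan // 50     # hypothetical scan after date-partition pruning
print(round(query_cost_usd(full_scan), 2))   # -> 12.5
print(round(query_cost_usd(pruned), 2))      # -> 0.25
```

The dry-run bytes estimate that BigQuery returns before running a query slots directly into a calculation like this.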

Posted 1 week ago

Apply

5.0 - 10.0 years

15 - 19 Lacs

Noida

Remote

Mandatory skills: Advanced Python & SQL; GCP services: BigQuery, Dataflow, Dataproc, and Pub/Sub.

Key Responsibilities:
- Design, develop, and optimize scalable data pipelines and ETL workflows on Google Cloud Platform (GCP), particularly leveraging BigQuery, Dataflow, Dataproc, and Pub/Sub
- Design and manage secure, efficient data integrations involving Snowflake and BigQuery
- Write, test, and maintain high-quality Python code for data extraction, transformation, and loading (ETL), analytics, and automation tasks
- Use Git for collaborative version control, code reviews, and managing data engineering projects
- Implement infrastructure-as-code practices using Pulumi for cloud resource management and automation within GCP environments
- Apply clean room techniques to design and maintain secure data sharing environments in alignment with privacy standards and client requirements
- Collaborate with cross-functional teams (data scientists, business analysts, product teams) to deliver data solutions, troubleshoot issues, and assure data integrity throughout the lifecycle
- Optimize performance of batch and streaming data pipelines, ensuring reliability and scalability
- Maintain documentation on processes, data flows, and configurations for operational transparency

Required Skills:
- 5+ years of strong hands-on experience with GCP core data services: BigQuery, Dataflow, Dataproc, and Pub/Sub
- Proficiency in data engineering development using Python
- Deep familiarity with Snowflake data modeling, secure data sharing, and advanced query optimization
- Proven experience with Git for source code management and collaborative development
- Demonstrated ability using Pulumi (or similar IaC tools) for deployment and support of cloud infrastructure
- Practical understanding of clean room concepts in cloud data warehousing, including privacy/compliance considerations
- Solid skills in debugging complex issues within data pipelines and cloud environments
- Effective communication and documentation skills

Great to Have:
- GCP certification (e.g., Professional Data Engineer)
- Experience working in regulated environments (telecom/financial/healthcare) with a data privacy and compliance focus
- Exposure to additional GCP services such as Cloud Storage, Cloud Functions, or Kubernetes
- Demonstrated success collaborating in agile, distributed teams
- Experience with data visualization tools (e.g., Tableau, Looker)
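The streaming side of this role (Dataflow plus Pub/Sub) comes down to windowed aggregation over event time. A conceptual, pure-Python sketch of a tumbling (fixed, non-overlapping) window; a real pipeline would express this with a framework such as Apache Beam's windowing rather than by hand:

```python
# Conceptual sketch of a Dataflow-style tumbling window: group events by
# fixed, non-overlapping event-time windows and count per key per window.
# Pure Python, for illustration; not a Beam/Dataflow API.
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """events: (epoch_ts, key) pairs -> {(window_start, key): count}."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_secs) * window_secs   # floor to window
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(0, "click"), (30, "click"), (61, "click"), (70, "view")]
print(tumbling_window_counts(events))
# -> {(0, 'click'): 2, (60, 'click'): 1, (60, 'view'): 1}
```

Real streaming systems add the hard parts this sketch omits: late data, watermarks, and triggers.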

Posted 1 week ago

Apply

1.0 - 4.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Project Role: Technology Support Engineer
Project Role Description: Resolve incidents and problems across multiple business system components and ensure operational stability. Create and implement Requests for Change (RFC) and update knowledge base articles to support effective troubleshooting. Collaborate with vendors and help service management teams with issue analysis and resolution.
Must-have skills: Google Cloud Platform Administration
Good-to-have skills: Google Cloud Functions
Minimum experience: 7.5 years
Educational Qualification: 15 years of full-time education

We are seeking an experienced Senior GCP Architect to lead the end-to-end design, deployment, and governance of scalable, secure, and cost-effective cloud solutions on Google Cloud Platform. This role combines deep cloud architecture experience with DevOps automation, compliance governance, and cloud financial management (FinOps). You will drive modernization efforts, influence cloud-native best practices, and ensure GCP infrastructure meets enterprise-grade security, resiliency, and cost-efficiency standards.

Key Responsibilities:

Cloud Architecture (GCP):
- Design and implement scalable and secure GCP-based architectures for hybrid and cloud-native workloads
- Select and configure core services: GKE, Cloud Functions, Cloud Run, VPC, IAM, Cloud Armor, Cloud SQL/Spanner, BigQuery, etc.
- Drive adoption of infrastructure-as-code using tools like Terraform and Deployment Manager

DevOps Enablement:
- Design CI/CD pipelines using Cloud Build, Jenkins, GitHub Actions, or GitLab CI
- Integrate security and compliance checks into DevOps pipelines (DevSecOps)
- Promote containerization (Docker, GKE, Artifact Registry) and automation strategies

Security, Compliance & Governance:
- Define and enforce security controls, policies, and auditing mechanisms using Cloud Identity, Forseti, Config Validator, and Security Command Center
- Ensure compliance with regulatory frameworks (e.g., HIPAA, SOC 2, ISO 27001, GDPR)
- Lead implementation of policy-as-code and guardrails using Org Policy, Cloud Asset Inventory, and Policy Simulator

Cloud Cost Optimization (FinOps):
- Drive cost visibility and optimization across GCP using billing reports, budgets, recommendations, and custom dashboards (Looker, BigQuery)
- Implement cost allocation models, showback/chargeback, and budget forecasting
- Identify underutilized resources and advise on purchasing strategies (e.g., committed use discounts, autoscaling, right-sizing)

Strategic & Leadership Responsibilities:
- Provide GCP expertise across enterprise programs and product teams
- Mentor cloud engineers, DevOps engineers, and security/compliance stakeholders
- Support cloud maturity assessments, the cloud center of excellence (CCoE), and platform engineering practices

Required Skills & Experience:
- Proven experience as a GCP Cloud Architect or Cloud Solutions Architect
- Strong DevOps background: CI/CD, IaC, container orchestration (GKE/Kubernetes)
- Hands-on experience with Terraform, Cloud SDK, and monitoring tools (e.g., Stackdriver, Prometheus, Grafana)
- Knowledge of GCP security, IAM, Org Policy, VPC-SC, and Security Command Center
- Proven record in compliance mapping and governance in enterprise settings
- Experience with cloud cost optimization tools like Apptio, Cloudability, or native GCP tools
- Strong scripting skills in Python, Bash, or Go

Preferred Qualifications:
- Google Certified Professional Cloud Architect
- FinOps Certified Practitioner or equivalent experience
- Certifications in Terraform, Kubernetes, or DevOps toolchains
- Experience in multi-cloud or hybrid environments (AWS/Azure in addition to GCP)

Soft Skills:
- Strong communication and stakeholder management
- Business acumen in balancing cloud value vs. cost
- Cross-functional collaboration with DevOps, Finance, Compliance, and Security teams
- Proactive leadership in driving cloud governance maturity

Qualification: 15 years of full-time education
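The FinOps duties above mention right-sizing underutilized resources. The decision rule behind such recommendations can be sketched in a few lines; the thresholds here are hypothetical illustrations, not Google's recommender logic:

```python
# Illustrative FinOps right-sizing rule of thumb. The 20%/80% thresholds
# are hypothetical, not any vendor's documented policy.

def rightsize(avg_cpu_pct: float, low: float = 20.0, high: float = 80.0) -> str:
    """Classify a VM by sustained average CPU utilization."""
    if avg_cpu_pct < low:
        return "downsize"   # candidate for a smaller machine type / CUD review
    if avg_cpu_pct > high:
        return "upsize"     # saturation risk; scale up or out
    return "keep"

for util in (7.5, 55.0, 93.0):
    print(util, rightsize(util))
# -> 7.5 downsize / 55.0 keep / 93.0 upsize
```

Real recommenders look at percentile utilization over weeks, plus memory and I/O, but the shape of the decision is the same.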

Posted 3 weeks ago

Apply

7.0 - 12.0 years

11 - 15 Lacs

Noida

Work from Office

Primary Skill(s): Lead Data Visualization Engineer with experience in Sigma BI
Experience: 7+ years of experience in data visualization with Sigma BI, Power BI, Tableau, or Looker

Job Summary: We are looking for a Lead Data Visualization Engineer with deep expertise in Sigma BI and a strong ability to craft meaningful, insight-rich visual stories for business stakeholders. This role will be instrumental in transforming raw data into intuitive dashboards and visual analytics, helping cross-functional teams make informed decisions quickly and effectively.

Key Responsibilities:
- Lead the design, development, and deployment of Sigma BI dashboards and reports tailored for various business functions
- Translate complex data sets into clear, actionable insights using advanced visualization techniques
- Collaborate with business stakeholders to understand goals, KPIs, and data requirements
- Build data stories that communicate key business metrics, trends, and anomalies
- Serve as a subject matter expert in Sigma BI and guide junior team members on best practices
- Ensure visualizations follow design standards, accessibility guidelines, and performance optimization
- Partner with data engineering and analytics teams to source and structure data effectively
- Conduct workshops and training sessions to enable business users to consume and interact with dashboards
- Drive the adoption of self-service BI tools and foster a data-driven decision-making culture

Required Skills & Experience:
- 7+ years of hands-on experience in business intelligence, with at least 2 years using Sigma BI
- Proven ability to build end-to-end dashboard solutions that tell a story and influence decisions
- Strong understanding of data modeling, SQL, and cloud data platforms (Snowflake, BigQuery, etc.)
- Demonstrated experience working with business users, gathering requirements, and delivering user-friendly outputs
- Proficiency in data storytelling, UX design principles, and visualization best practices
- Experience integrating Sigma BI with modern data stacks and APIs is a plus
- Excellent communication and stakeholder management skills

Preferred Qualifications:
- Experience with other BI tools (Tableau, Power BI, Looker)
- Familiarity with AWS cloud data ecosystems (AWS Databricks)
- Background in data analysis, statistics, or business analytics

Working Hours: 2 PM to 11 PM IST (approx. 4:30 AM to 1:30 PM ET)
Communication skills: Good

Mandatory Competencies:
- BI and Reporting Tools: Sigma BI, Power BI, Tableau
- Database: Database Programming - SQL
- Cloud - GCP: Cloud Data Fusion, Dataproc, BigQuery, Cloud Composer, Cloud Bigtable
- Data Science and Machine Learning: Databricks
- Cloud - AWS: ECS
- DMS: Data Analysis Skills
- Behavioral: Communication and collaboration

Posted 3 weeks ago

Apply

5.0 - 10.0 years

10 - 14 Lacs

Noida

Hybrid

Data Engineer (SaaS-based). Immediate joiners preferred.
Shift: 3 PM to 12 AM IST
Good to have: GCP Certified Data Engineer

Overview of the Role:
As a GCP Data Engineer, you'll focus on solving problems and creating value for the business by building solutions that are reliable and scalable enough for the size and scope of the company. You will create custom-built pipelines as well as migrate on-prem data pipelines to the GCP stack, as part of a team tackling intricate problems by designing and deploying reliable and scalable solutions tailored to the company's data landscape.

Required Skills:
- 5+ years of industry experience in software development, data engineering, business intelligence, or a related field, with experience in manipulating, processing, and extracting value from datasets
- Extensive experience in requirement discovery, analysis, and data pipeline solution design
- Design, build, and deploy internal applications to support our technology life cycle, collaboration and spaces, service delivery management, and data and business intelligence, among others
- Build modular code for reusable pipelines or complex ingestion frameworks that ease loading data into a data lake or data warehouse from multiple sources
- Work closely with analysts and business process owners to translate business requirements into technical solutions
- Coding experience in scripting languages (Python, SQL, PySpark)
- Expertise in Google Cloud Platform (GCP) data warehousing technologies (BigQuery, GCP Workflows, Cloud Scheduler, Secret Manager, Batch, Cloud Logging, Cloud SDK, Google Cloud Storage, IAM)
- Exposure to Google Dataproc and Dataflow
- Maintain the highest levels of development practice, including technical design, solution development, systems configuration, test documentation/execution, issue identification and resolution, and writing clean, modular, self-sustaining code with repeatable quality and predictability
- Understanding of CI/CD processes using Pulumi, GitHub, Cloud Build, Cloud SDK, and Docker
- Experience with SAS/SQL Server/SSIS is an added advantage

Qualifications:
- Bachelor's degree in Computer Science or a related technical field, or equivalent practical experience
- GCP Certified Data Engineer (preferred)
- Excellent verbal and written communication skills, with the ability to effectively advocate technical solutions to engineering teams and business audiences
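The "modular code for reusable pipelines" requirement above amounts to composing pluggable extract/transform/load stages. A minimal, purely illustrative sketch of that idea (the in-memory source and sink are hypothetical stand-ins for, say, GCS and BigQuery):

```python
# Minimal sketch of a modular ingestion framework: pluggable
# extract/transform/load callables composed into one reusable pipeline.
# Names and stages are illustrative, not a specific product's API.

def run_pipeline(extract, transforms, load):
    """Pull rows from extract(), thread them through each transform, load them."""
    rows = extract()
    for transform in transforms:
        rows = [transform(r) for r in rows]
    return load(rows)

# Hypothetical wiring: an in-memory source and sink standing in for
# object storage -> cleanup/enrichment -> warehouse.
sink = []
count = run_pipeline(
    extract=lambda: [{"id": "1", "amt": " 10 "}, {"id": "2", "amt": "5"}],
    transforms=[
        lambda r: {**r, "amt": int(r["amt"].strip())},   # clean and cast
        lambda r: {**r, "amt_x2": r["amt"] * 2},         # enrich
    ],
    load=lambda rows: (sink.extend(rows), len(rows))[1], # write, return row count
)
print(count, sink[0]["amt_x2"])   # -> 2 20
```

Swapping any stage (a new source format, an extra validation transform, a different sink) leaves the rest of the pipeline untouched, which is the point of the modular design.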

Posted 3 weeks ago

Apply

3.0 - 5.0 years

5 - 8 Lacs

Chennai

Work from Office

" PLEASE READ THE JOB DESTCRIPTION AND APPLY" Data Engineer Job Description Position Overview Yesterday is history, tomorrow is a mystery, but today is a gift. That's why we call it the present. - Master Oogway Join CustomerLabs' dynamic data team as a Data Engineer and play a pivotal role in transforming raw marketing data into actionable insights that power our digital marketing platform. As a key member of our data infrastructure team, you will design, develop, and maintain robust data pipelines, data warehouses, and analytics platforms that serve as the backbone of our digital marketing product development. Sometimes the hardest choices require the strongest wills. - Thanos (but we promise, our data decisions are much easier! ) In this role, you will collaborate with cross-functional teams including Data Scientists, Product Managers, and Marketing Technology specialists to ensure seamless data flow from various marketing channels, ad platforms, and customer touchpoints to our analytics dashboards and reporting systems. You'll be responsible for building scalable, reliable, and efficient data solutions that can handle high-volume marketing data processing and real-time campaign analytics. What You'll Do: - Design and implement enterprise-grade data pipelines for marketing data ingestion and processing - Build and optimize data warehouses and data lakes to support digital marketing analytics - Ensure data quality, security, and compliance across all marketing data systems - Create data models and schemas that support marketing attribution, customer journey analysis, and campaign performance tracking - Develop monitoring and alerting systems to maintain data pipeline reliability for critical marketing operations - Collaborate with product teams to understand digital marketing requirements and translate them into technical solutions Why This Role Matters: I can do this all day. - Captain America (and you'll want to, because this role is that rewarding!) 
You'll be the backbone behind the data infrastructure that powers CustomerLabs' digital marketing platform, making marketers' lives easier and better. Your work directly translates to smarter automation, clearer insights, and more successful campaigns - helping marketers focus on what they do best while we handle the complex data heavy lifting. Sometimes you gotta run before you can walk. - Iron Man (and sometimes you gotta build the data pipeline before you can analyze the data! ) Our Philosophy: We believe in the power of data to transform lives, just like the Dragon Warrior transformed the Valley of Peace. Every line of code you write, every pipeline you build, and every insight you enable has the potential to change how marketers work and succeed. We're not just building data systems - we're building the future of digital marketing, one insight at a time. Your story may not have such a happy beginning, but that doesn't make you who you are. It is the rest of your story, who you choose to be. - Soothsayer What Makes You Special: We're looking for someone who embodies the spirit of both Captain America's unwavering dedication and Iron Man's innovative genius. You'll need the patience to build robust systems (like Cap's shield ) and the creativity to solve complex problems (like Tony's suit). Most importantly, you'll have the heart to make a real difference in marketers' lives. Inner peace... Inner peace... Inner peace... - Po (because we know data engineering can be challenging, but we've got your back! 
) Key Responsibilities Data Pipeline Development - Design, build, and maintain robust, scalable data pipelines and ETL/ELT processes - Develop data ingestion frameworks to collect data from various sources (databases, APIs, files, streaming sources) - Implement data transformation and cleaning processes to ensure data quality and consistency - Optimize data pipeline performance and reliability Data Infrastructure Management - Design and implement data warehouse architectures - Manage and optimize database systems (SQL and NoSQL) - Implement data lake solutions and data governance frameworks - Ensure data security, privacy, and compliance with regulatory requirements Data Modeling and Architecture - Design and implement data models for analytics and reporting - Create and maintain data dictionaries and documentation - Develop data schemas and database structures - Implement data versioning and lineage tracking Data Quality, Security, and Compliance - Ensure data quality, integrity, and consistency across all marketing data systems - Implement and monitor data security measures to protect sensitive information - Ensure privacy and compliance with regulatory requirements (e.g., GDPR, CCPA) - Develop and enforce data governance policies and best practices Collaboration and Support - Work closely with Data Scientists, Analysts, and Business stakeholders - Provide technical support for data-related issues and queries Monitoring and Maintenance - Implement monitoring and alerting systems for data pipelines - Perform regular maintenance and optimization of data systems - Troubleshoot and resolve data pipeline issues - Conduct performance tuning and capacity planning Required Qualifications Experience - 2+ years of experience in data engineering or related roles - Proven experience with ETL/ELT pipeline development - Experience with cloud data platform (GCP) - Experience with big data technologies Technical Skills - Programming Languages : Python, SQL, Golang (preferred) - 
Databases: PostgreSQL, MySQL, Redis - Big Data Tools: Apache Spark, Apache Kafka, Apache Airflow, DBT, Dataform - Cloud Platforms: GCP (BigQuery, Dataflow, Cloud run, Cloud SQL, Cloud Storage, Pub/Sub, App Engine, Compute Engine etc.) - Data Warehousing: Google BigQuery - Data Visualization: Superset, Looker, Metabase, Tableau - Version Control: Git, GitHub - Containerization: Docker Soft Skills - Strong problem-solving and analytical thinking - Excellent communication and collaboration skills - Ability to work independently and in team environments - Strong attention to detail and data quality - Continuous learning mindset Preferred Qualifications Additional Experience - Experience with real-time data processing and streaming - Knowledge of machine learning pipelines and MLOps - Experience with data governance and data catalog tools - Familiarity with business intelligence tools (Tableau, Power BI, Looker, etc.) - Experience using AI-powered tools (such as Cursor, Claude, Copilot, ChatGPT, Gemini, etc.) to accelerate coding, automate tasks, or assist in system design ( We belive run with machine, not against machine ) Interview Process 1. Initial Screening: Phone/video call with HR 2. Technical Interview: Deep dive into data engineering concepts 3. Final Interview: Discussion with senior leadership Note: This job description is intended to provide a general overview of the position and may be modified based on organizational needs and candidate qualifications. Our Team Culture We are Groot. - We work together, we grow together, we succeed together. We believe in: - Innovation First - Like Iron Man, we're always pushing the boundaries of what's possible - Team Over Individual - Like the Avengers, we're stronger together than apart - Continuous Learning - Like Po learning Kung Fu, we're always evolving and improving - Making a Difference - Like Captain America, we fight for what's right (in this case, better marketing!) 
Growth Journey There is no charge for awesomeness... or attractiveness. - Po Your journey with us will be like Po's transformation from noodle maker to Dragon Warrior: - Level 1 : Master the basics of our data infrastructure - Level 2: Build and optimize data pipelines - Level 3 : Lead complex data projects and mentor others - Level 4: Become a data engineering legend (with your own theme music! ) What We Promise I am Iron Man. - We promise you'll feel like a superhero every day! - Work that matters - Every pipeline you build helps real marketers succeed - Growth opportunities - Learn new technologies and advance your career - Supportive team - We've got your back, just like the Avengers - Work-life balance - Because even superheroes need rest! Apply : https://customerlabs.freshteam.com/jobs
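This listing repeatedly stresses data quality in marketing pipelines. A tiny, self-contained sketch of the kind of row-level validation it describes; the schema is hypothetical, invented for illustration:

```python
# Small sketch of row-level data-quality checks: validate each marketing
# event row against required fields and types before it enters the pipeline.
# The schema below is hypothetical.

SCHEMA = {"user_id": str, "campaign": str, "clicks": int}

def validate(row: dict) -> list:
    """Return a list of problems; an empty list means the row is clean."""
    problems = []
    for field, ftype in SCHEMA.items():
        if field not in row or row[field] is None:
            problems.append(f"missing:{field}")
        elif not isinstance(row[field], ftype):
            problems.append(f"bad_type:{field}")
    return problems

good = {"user_id": "u1", "campaign": "summer", "clicks": 3}
bad = {"user_id": "u2", "clicks": "3"}
print(validate(good))   # -> []
print(validate(bad))    # -> ['missing:campaign', 'bad_type:clicks']
```

In a real pipeline these checks typically run at ingestion, with failing rows routed to a dead-letter table for inspection rather than silently dropped.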

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a Senior Software Engineer specializing in AI/ML development and leading Vertex AI & Gemini projects, you will play a crucial role in developing and deploying solutions using cutting-edge technologies. With 5-8 years of experience in Software Development/Engineering, you will be responsible for integrating GenAI components into enterprise-grade document automation workflows. Your expertise in Google Cloud Platform, Vertex AI, and Gemini models will be essential in contributing to scalable, cloud-native architectures for document ingestion, extraction, summarization, and transformation. Your future duties and responsibilities include developing and deploying solutions using Vertex AI, Gemini models, Document AI, and custom NLP/OCR components. You will also collaborate with architects, MLOps engineers, and business stakeholders to translate requirements into scalable code while ensuring secure and compliant handling of sensitive data in document processing workflows. Staying up to date with the latest Gemini/LLM advancements and integrating relevant innovations into projects will be a key aspect of your role. In order to be successful in this role, you must possess expertise in various skills including Google Cloud Platform (Vertex AI, Cloud Functions, Cloud Run, BigQuery, Document AI, Firestore), GenAI/LLMs (Google Gemini, PaLM, LangChain), OCR & NLP tools (Tesseract, GCP Document AI, spaCy, Hugging Face Transformers), Full Stack technologies (React or Next.js, Node.js or FastAPI, Firebase/Firestore), DevOps/MLOps practices (GitHub Actions, Vertex Pipelines, Docker, Terraform), and Data & Integration tools (REST APIs, GraphQL, Webhooks, Cloud Pub/Sub, JSON/Protobuf). 
With a solid background in full-stack development, hands-on experience in building products leveraging GenAI, NLP, and OCR, as well as proficiency in Kubernetes concepts and relational and non-relational databases, you will be well-equipped to tackle complex issues and adapt to rapidly evolving AI technologies. Your understanding of privacy regulations, security best practices, and ethical considerations in AI development will be crucial in developing production-ready systems. Additionally, experience working with Google Gemini models, document parsing, NLP, OCR, and GenAI-based transformation will further enhance your capabilities in this role. As an integral part of the team at CGI, you will have the opportunity to turn meaningful insights into action, shaping your career in a company focused on growth and innovation. With a startup mentality and a strong sense of ownership, you will contribute to delivering innovative solutions and building valuable relationships with teammates and clients, ultimately driving success in the world of IT and business consulting services.
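Document-automation roles like this one typically feed long documents through LLM summarization and extraction in pieces. As a rough, hypothetical illustration of that preprocessing step (independent of any specific Gemini or Document AI API; the size and overlap values are arbitrary placeholders, not tuned settings):

```python
# Hypothetical sketch: split a long document into overlapping chunks before
# sending each piece to a summarization model. Overlap ensures a sentence cut
# at a chunk boundary appears whole in at least one chunk.

def chunk_text(text: str, max_chars: int = 2000, overlap: int = 200) -> list[str]:
    """Return chunks of at most max_chars, each overlapping the next by `overlap` chars."""
    if max_chars <= overlap:
        raise ValueError("max_chars must exceed overlap")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + max_chars])
        start += max_chars - overlap  # step forward, keeping the overlap region
    return chunks
```

Each chunk would then be summarized independently and the partial summaries combined, a common pattern for documents larger than a model's context window.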

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

As a Software Engineer at HSBC, you will play a crucial role in designing, implementing, and managing scalable, secure, and reliable cloud infrastructure on the Google Cloud Platform (GCP). Your responsibilities will include collaborating with development teams to optimize applications for cloud deployment, setting up and configuring various GCP services, and ensuring compliance with security policies and best practices. To excel in this role, you should have proven experience as a Cloud Engineer with a strong focus on GCP. Your expertise should include a deep understanding of cloud architecture and services such as Compute Engine, Kubernetes Engine, Cloud Storage, and BigQuery. Additionally, you will be expected to automate infrastructure provisioning using tools like Terraform, Google Cloud Deployment Manager, or similar, and implement CI/CD pipelines for efficient software delivery. The successful candidate will possess proficiency in scripting languages like Python and Bash, as well as the ability to troubleshoot and resolve issues related to cloud infrastructure and services. Google Cloud certifications, particularly the Google Cloud Professional Cloud Architect certification, are considered a plus. Staying updated with the latest GCP features, services, and best practices is essential for this role. Knowledge of other cloud platforms like AWS or Azure will be an added advantage. If you are a skilled and experienced Google Cloud Engineer seeking a career where your expertise is valued, HSBC offers an inclusive and diverse environment where employees are respected, valued, and provided with opportunities for continuous professional development and growth. Join us at HSBC and realize your ambitions in a workplace that prioritizes employee well-being and career advancement. For more information about career opportunities at HSBC, visit www.hsbc.com/careers.

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

pune, maharashtra

On-site

One of our prestigious clients, a TOP MNC Giant with a global presence, is currently seeking a Lead Enterprise Architect to join their team in Pune, Mumbai, or Bangalore. **Qualifications and Certifications:** **Education:** - Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field. **Experience:** - A minimum of 10+ years of experience in data engineering, with at least 4 years of hands-on experience with GCP cloud platforms. - Proven track record in designing and implementing data workflows using GCP services such as BigQuery, Dataform, Cloud Dataflow, Cloud Pub/Sub, and Cloud Composer. **Certifications:** - Google Cloud Professional Data Engineer certification is preferred. **Key Skills:** **Mandatory Skills:** - Advanced proficiency in Python for developing data pipelines and automation. - Strong SQL skills for querying, transforming, and analyzing large datasets. - Hands-on experience with various GCP services including Cloud Storage, Dataflow, Cloud Pub/Sub, Cloud SQL, BigQuery, Dataform, Compute Engine, and Kubernetes Engine (GKE). - Familiarity with CI/CD tools like Jenkins, GitHub, or Bitbucket. - Proficiency in Docker, Kubernetes, Terraform, or Ansible for containerization, orchestration, and infrastructure as code (IaC). - Knowledge of workflow orchestration tools such as Apache Airflow or Cloud Composer. - Strong understanding of Agile/Scrum methodologies. **Nice-to-Have Skills:** - Experience with other cloud platforms like AWS or Azure. - Familiarity with data visualization tools such as Power BI, Looker, or Tableau. - Understanding of machine learning workflows and their integration with data pipelines. **Soft Skills:** - Strong problem-solving and critical-thinking abilities. - Excellent communication skills to effectively collaborate with both technical and non-technical stakeholders. - Proactive attitude towards innovation and continuous learning. 
- Ability to work independently and as part of a collaborative team. If you are interested in this opportunity, please reply with your updated CV and provide the following details: - Total Experience: - Relevant experience in Data Engineering: - Relevant experience in GCP cloud platforms: - Relevant experience as an Enterprise Architect: - Availability to join ASAP: - Preferred location (Pune / Mumbai / Bangalore): We will contact you once we receive your CV along with the above-mentioned details. Thank you, Kavita.A

Posted 1 month ago

Apply

1.0 - 5.0 years

0 Lacs

ahmedabad, gujarat

On-site

As a Data Engineer at Synoptek, you will be responsible for designing, developing, and maintaining robust and scalable data pipelines on the Google Cloud Platform (GCP). You will leverage your hands-on experience with GCP services such as BigQuery, Jitterbit, Cloud Dataflow, Cloud Pub/Sub, and Cloud Storage to build efficient data processing solutions. Collaborating with cross-functional teams, you will translate their data needs into technical requirements, ensuring data quality, integrity, and security throughout the data lifecycle. Your role will involve developing and optimizing ETL/ELT processes to extract, transform, and load data from various sources into data warehouses and data lakes. Additionally, you will build and maintain data models and schemas to support business intelligence and analytics, while troubleshooting data quality issues and performance bottlenecks. To excel in this position, you should have a Bachelor's degree in Computer Science, Engineering, or a related field, along with 3 to 4 years of experience as a Data Engineer focusing on GCP. Proficiency in Python, SQL, and BigQuery is essential, as well as hands-on experience with data ingestion, transformation, and loading tools like Jitterbit and Apache Beam. A strong understanding of data warehousing and data lake concepts, coupled with experience in data modeling and schema design, will be beneficial. The ideal candidate will exhibit excellent problem-solving and analytical skills, working both independently and collaboratively with internal and external teams. Familiarity with acquiring and managing data from various sources, as well as the ability to identify trends in complex datasets and propose business solutions, are key attributes for success in this role. At Synoptek, we value employees who embody our core DNA behaviors, including clarity, integrity, innovation, accountability, and a results-focused mindset. 
We encourage continuous learning, adaptation, and growth in a fast-paced environment, promoting a culture of teamwork, flexibility, respect, and collaboration. If you have a passion for data engineering, a drive for excellence, and a commitment to delivering impactful results, we invite you to join our dynamic team at Synoptek. Work hard, play hard, and let's achieve superior outcomes together.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

punjab

On-site

As a GCP Data Engineer in Australia, you will be responsible for leveraging your experience in Google Cloud Platform (GCP) to handle various aspects of data engineering. Your role will involve working on data migration projects from legacy systems such as SQL and Oracle. You will also be designing and building ETL pipelines for data lake and data warehouse solutions on GCP. In this position, your expertise in GCP data and analytics services will be crucial. You will work with tools like Cloud Dataflow, Cloud Dataprep, Apache Beam/Cloud Composer, Cloud BigQuery, Cloud Fusion, Cloud Pub/Sub, Cloud Storage, and Cloud Functions. Additionally, you will utilize the cloud-native GCP CLI/gsutil for operations and scripting languages like Python and SQL to enhance data processing efficiencies. Furthermore, your experience with data governance practices, metadata management, data masking, and encryption will be essential. You will utilize GCP tools such as Cloud Data Catalog and GCP KMS to ensure data security and compliance. Overall, this role requires a strong foundation in GCP technologies and a proactive approach to data engineering challenges in a dynamic environment.
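The data-masking duties mentioned in postings like this often begin with something as simple as irreversibly hashing PII columns before rows reach the warehouse. A minimal, hypothetical sketch (the field names are invented; in a real GCP pipeline the salt would come from Secret Manager or a Cloud KMS-wrapped key, never from source code):

```python
# Illustrative column-level masking: replace PII values with salted SHA-256
# digests so records can still be joined on the masked value without
# exposing the original.
import hashlib

PII_FIELDS = {"email", "phone"}  # hypothetical column names

def mask_row(row: dict, salt: str) -> dict:
    """Return a copy of `row` with PII fields replaced by hex digests."""
    masked = dict(row)
    for field in PII_FIELDS & row.keys():
        digest = hashlib.sha256((salt + str(row[field])).encode("utf-8"))
        masked[field] = digest.hexdigest()
    return masked
```

Hashing with a shared salt keeps the masking deterministic (the same email always maps to the same digest, so joins and deduplication still work) while remaining irreversible, which is one common way to satisfy the kind of governance requirements the posting describes.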

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 8 Lacs

Chennai

Work from Office

" PLEASE READ THE JOB DESTCRIPTION AND APPLY" Data Engineer Job Description Position Overview Yesterday is history, tomorrow is a mystery, but today is a gift. That's why we call it the present. - Master Oogway Join CustomerLabs' dynamic data team as a Data Engineer and play a pivotal role in transforming raw marketing data into actionable insights that power our digital marketing platform. As a key member of our data infrastructure team, you will design, develop, and maintain robust data pipelines, data warehouses, and analytics platforms that serve as the backbone of our digital marketing product development. Sometimes the hardest choices require the strongest wills. - Thanos (but we promise, our data decisions are much easier! ) In this role, you will collaborate with cross-functional teams including Data Scientists, Product Managers, and Marketing Technology specialists to ensure seamless data flow from various marketing channels, ad platforms, and customer touchpoints to our analytics dashboards and reporting systems. You'll be responsible for building scalable, reliable, and efficient data solutions that can handle high-volume marketing data processing and real-time campaign analytics. What You'll Do: - Design and implement enterprise-grade data pipelines for marketing data ingestion and processing - Build and optimize data warehouses and data lakes to support digital marketing analytics - Ensure data quality, security, and compliance across all marketing data systems - Create data models and schemas that support marketing attribution, customer journey analysis, and campaign performance tracking - Develop monitoring and alerting systems to maintain data pipeline reliability for critical marketing operations - Collaborate with product teams to understand digital marketing requirements and translate them into technical solutions Why This Role Matters: I can do this all day. - Captain America (and you'll want to, because this role is that rewarding!) 
You'll be the backbone behind the data infrastructure that powers CustomerLabs' digital marketing platform, making marketers' lives easier and better. Your work directly translates to smarter automation, clearer insights, and more successful campaigns - helping marketers focus on what they do best while we handle the complex, data-heavy lifting. Sometimes you gotta run before you can walk. - Iron Man (and sometimes you gotta build the data pipeline before you can analyze the data!) Our Philosophy: We believe in the power of data to transform lives, just like the Dragon Warrior transformed the Valley of Peace. Every line of code you write, every pipeline you build, and every insight you enable has the potential to change how marketers work and succeed. We're not just building data systems - we're building the future of digital marketing, one insight at a time. Your story may not have such a happy beginning, but that doesn't make you who you are. It is the rest of your story, who you choose to be. - Soothsayer What Makes You Special: We're looking for someone who embodies the spirit of both Captain America's unwavering dedication and Iron Man's innovative genius. You'll need the patience to build robust systems (like Cap's shield) and the creativity to solve complex problems (like Tony's suit). Most importantly, you'll have the heart to make a real difference in marketers' lives. Inner peace... Inner peace... Inner peace... - Po (because we know data engineering can be challenging, but we've got your back!) Key Responsibilities Data Pipeline Development - Design, build, and maintain robust, scalable data pipelines and ETL/ELT processes - Develop data ingestion frameworks to collect data from various sources (databases, APIs, files, streaming sources) - Implement data transformation and cleaning processes to ensure data quality and consistency - Optimize data pipeline performance and reliability Data Infrastructure Management - Design and implement data warehouse architectures - Manage and optimize database systems (SQL and NoSQL) - Implement data lake solutions and data governance frameworks - Ensure data security, privacy, and compliance with regulatory requirements Data Modeling and Architecture - Design and implement data models for analytics and reporting - Create and maintain data dictionaries and documentation - Develop data schemas and database structures - Implement data versioning and lineage tracking Data Quality, Security, and Compliance - Ensure data quality, integrity, and consistency across all marketing data systems - Implement and monitor data security measures to protect sensitive information - Ensure privacy and compliance with regulatory requirements (e.g., GDPR, CCPA) - Develop and enforce data governance policies and best practices Collaboration and Support - Work closely with Data Scientists, Analysts, and Business stakeholders - Provide technical support for data-related issues and queries Monitoring and Maintenance - Implement monitoring and alerting systems for data pipelines - Perform regular maintenance and optimization of data systems - Troubleshoot and resolve data pipeline issues - Conduct performance tuning and capacity planning Required Qualifications Experience - 2+ years of experience in data engineering or related roles - Proven experience with ETL/ELT pipeline development - Experience with a cloud data platform (GCP) - Experience with big data technologies Technical Skills - Programming Languages: Python, SQL, Golang (preferred) - 
Databases: PostgreSQL, MySQL, Redis - Big Data Tools: Apache Spark, Apache Kafka, Apache Airflow, DBT, Dataform - Cloud Platforms: GCP (BigQuery, Dataflow, Cloud Run, Cloud SQL, Cloud Storage, Pub/Sub, App Engine, Compute Engine, etc.) - Data Warehousing: Google BigQuery - Data Visualization: Superset, Looker, Metabase, Tableau - Version Control: Git, GitHub - Containerization: Docker Soft Skills - Strong problem-solving and analytical thinking - Excellent communication and collaboration skills - Ability to work independently and in team environments - Strong attention to detail and data quality - Continuous learning mindset Preferred Qualifications Additional Experience - Experience with real-time data processing and streaming - Knowledge of machine learning pipelines and MLOps - Experience with data governance and data catalog tools - Familiarity with business intelligence tools (Tableau, Power BI, Looker, etc.) - Experience using AI-powered tools (such as Cursor, Claude, Copilot, ChatGPT, Gemini, etc.) to accelerate coding, automate tasks, or assist in system design (we believe in running with the machine, not against it) Interview Process 1. Initial Screening: Phone/video call with HR 2. Technical Interview: Deep dive into data engineering concepts 3. Final Interview: Discussion with senior leadership Note: This job description is intended to provide a general overview of the position and may be modified based on organizational needs and candidate qualifications. Our Team Culture We are Groot. - We work together, we grow together, we succeed together. We believe in: - Innovation First - Like Iron Man, we're always pushing the boundaries of what's possible - Team Over Individual - Like the Avengers, we're stronger together than apart - Continuous Learning - Like Po learning Kung Fu, we're always evolving and improving - Making a Difference - Like Captain America, we fight for what's right (in this case, better marketing!) 
Growth Journey There is no charge for awesomeness... or attractiveness. - Po Your journey with us will be like Po's transformation from noodle maker to Dragon Warrior: - Level 1: Master the basics of our data infrastructure - Level 2: Build and optimize data pipelines - Level 3: Lead complex data projects and mentor others - Level 4: Become a data engineering legend (with your own theme music!) What We Promise I am Iron Man. - We promise you'll feel like a superhero every day! - Work that matters - Every pipeline you build helps real marketers succeed - Growth opportunities - Learn new technologies and advance your career - Supportive team - We've got your back, just like the Avengers - Work-life balance - Because even superheroes need rest!
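Since several of these roles involve consuming marketing events from Cloud Pub/Sub, a small illustration may help: decoding the JSON envelope Pub/Sub delivers to a push-subscription endpoint. The envelope layout (base64-encoded `message.data` plus optional `attributes`) follows Pub/Sub's documented push format; the marketing event fields themselves are invented for the example.

```python
# Hedged sketch: unpack a Cloud Pub/Sub push-delivery request body into the
# application-level event and its message attributes.
import base64
import json

def decode_push_envelope(body: bytes) -> tuple[dict, dict]:
    """Return (event, attributes) from a Pub/Sub push request body."""
    envelope = json.loads(body)
    message = envelope["message"]
    # message.data is base64-encoded; it may be absent for attribute-only messages
    data = base64.b64decode(message.get("data", "")).decode("utf-8")
    event = json.loads(data) if data else {}
    return event, message.get("attributes", {})
```

In a real service this function would sit behind the HTTP handler of a Cloud Run or App Engine push endpoint, with the handler acknowledging the message by returning a 2xx status once the event is persisted.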

Posted 1 month ago

Apply

7.0 - 12.0 years

11 - 15 Lacs

Noida

Work from Office

Primary Skill(s): Lead Data Visualization Engineer with experience in Sigma BI Experience: 7+ years of experience in Data Visualization with Sigma BI, Power BI, Tableau, or Looker Job Summary: Lead Data Visualization Engineer with deep expertise in Sigma BI and a strong ability to craft meaningful, insight-rich visual stories for business stakeholders. This role will be instrumental in transforming raw data into intuitive dashboards and visual analytics, helping cross-functional teams make informed decisions quickly and effectively. Key Responsibilities: Lead the design, development, and deployment of Sigma BI dashboards and reports tailored for various business functions. Translate complex data sets into clear, actionable insights using advanced visualization techniques. Collaborate with business stakeholders to understand goals, KPIs, and data requirements. Build data stories that communicate key business metrics, trends, and anomalies. Serve as a subject matter expert in Sigma BI and guide junior team members on best practices. Ensure visualizations follow design standards, accessibility guidelines, and performance optimization. Partner with data engineering and analytics teams to source and structure data effectively. Conduct workshops and training sessions to enable business users in consuming and interacting with dashboards. Drive the adoption of self-service BI tools and foster a data-driven decision-making culture. Required Skills & Experience: 7+ years of hands-on experience in Business Intelligence, with at least 2 years using Sigma BI. Proven ability to build end-to-end dashboard solutions that tell a story and influence decisions. Strong understanding of data modeling, SQL, and cloud data platforms (Snowflake, BigQuery, etc.). Demonstrated experience working with business users, gathering requirements, and delivering user-friendly outputs. Proficient in data storytelling, UX design principles, and visualization best practices. 
Experience integrating Sigma BI with modern data stacks and APIs is a plus. Excellent communication and stakeholder management skills. Preferred Qualifications: Experience with other BI tools (such as Tableau, Power BI, Looker) is a plus. Familiarity with AWS Cloud Data Ecosystems (AWS Databricks). Background in Data Analysis, Statistics, or Business Analytics. Working Hours: 2 PM to 11 PM IST [~4.30 AM to 1.30 PM ET]. Communication skills: Good Mandatory Competencies BI and Reporting Tools - Power BI BI and Reporting Tools - Tableau Database - Database Programming - SQL Cloud - GCP - Cloud Data Fusion, Dataproc, BigQuery, Cloud Composer, Cloud Bigtable Data Science and Machine Learning - Databricks Cloud - AWS - ECS DMS - Data Analysis Skills Beh - Communication and collaboration

Posted 1 month ago

Apply

5.0 - 10.0 years

6 - 10 Lacs

Pune

Work from Office

Educational Requirements BSc,BCA,Master Of Engineering,MSc,MCA,MTech,BTech,Bachelor of Engineering Service Line Application Development and Maintenance Responsibilities A day in the life of an Infoscion As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs in solution design based on areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots and will assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations, POC/Proof of Technology workshops and prepare effort estimates which suit the customer budgetary requirements and are in line with the organization’s financial guidelines. Actively lead small projects and contribute to unit-level and organizational initiatives with an objective of providing high quality value adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! Additional Responsibilities: Preferred Locations: Bangalore, Hyderabad, Chennai, Pune Experience Required: 3 to 5 years of experience: Pure hands on and expertise on the skill, able to deliver without any support Experience Required: 5 - 9 years of experience: Design knowledge, estimation technique, leading and guiding the team on technical solution Experience Required: 9 - 13 years of experience: Architecture, Solutioning, (Optional) proposal Containerization, micro service development on AWS/Azure/GCP is preferred. In-depth knowledge of design issues and best practices Solid understanding of object-oriented programming Familiar with various design, architectural patterns and software development process. 
Implementing automated testing platforms and unit tests Strong experience in building and developing applications using technologies like Java/J2EE, Spring Boot, Microservices Knowledge about RESTful APIs and ability to design cloud ready applications using cloud SDKs Exposure to cloud compute services like VMs, PaaS services, containers, serverless and storage services on AWS/Azure/GCP Good understanding of application development design patterns Technical and Professional: Primary Skill: Core Java, Spring Boot, Microservices, (Optional) Database like SQL etc. Secondary Skills: AWS/Azure/GCP Preferred Skills: Technology-Java-Springboot Generic Skills: Technology-Cloud Platform-Azure Development & Solution Architecting Technology-Cloud Platform-Azure Devops-data on cloud-aws Technology-Cloud Platform-GCP Devops

Posted 2 months ago

Apply

5.0 - 7.0 years

4 - 9 Lacs

Bengaluru

Work from Office

Educational Requirements Bachelor of Engineering,BTech,Bachelor Of Technology,BCA,BSc,MTech,MCA Service Line Application Development and Maintenance Responsibilities A day in the life of an Infoscion As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs in solution design based on areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots and will assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations, POC/Proof of Technology workshops and prepare effort estimates which suit the customer budgetary requirements and are in line with the organization’s financial guidelines. Actively lead small projects and contribute to unit-level and organizational initiatives with an objective of providing high quality value adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! Additional Responsibilities: Preferred Locations: Bangalore, Hyderabad, Chennai, Pune Experience Required: 3 to 5 years of experience: Pure hands on and expertise on the skill, able to deliver without any support Experience Required: 5 - 9 years of experience: Design knowledge, estimation technique, leading and guiding the team on technical solution Experience Required: 9 - 13 years of experience: Architecture, Solutioning, (Optional) proposal Containerization, micro service development on AWS/Azure/GCP is preferred. In-depth knowledge of design issues and best practices Solid understanding of object-oriented programming Familiar with various design, architectural patterns and software development process. 
Implementing automated testing platforms and unit tests Strong experience in building and developing applications using technologies like Python Knowledge about RESTful APIs and ability to design cloud ready applications using cloud SDKs, microservices Exposure to cloud compute services like VMs, PaaS services, containers, serverless and storage services on AWS/Azure/GCP Good understanding of application development design patterns Technical and Professional: Primary Skill: Python Secondary Skills: AWS/Azure/GCP Preferred Skills: Technology-Machine Learning-Python Generic Skills: Technology-Cloud Platform-AWS App Development Technology-Cloud Platform-Azure Development & Solution Architecting Technology-Cloud Platform-GCP Devops

Posted 2 months ago

Apply

5.0 - 9.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Educational Bachelor of Engineering,Bachelor Of Technology,MSc,MTech,MCA Service Line Application Development and Maintenance Responsibilities Responsibilities Application migration to AWS cloud User requirements, envisioning system features and functionality. Identify bottlenecks and bugs, and recommend system solutions by comparing advantages and disadvantages of custom development Contributing to team meetings, troubleshooting development and production problems across multiple environments and operating platforms Understand Architecture and ensure effective Design, Development, Validation and Support activities Understand and analyze client requirements, refactor systems for workload migration / modernization to cloud (AWS) End-to-end feature development and resolving challenges faced in the implementation Create detailed design artifacts, work on development, and perform code reviews, implement validation and support activities Contribute to thought leadership within the area of technology specialization Additional Responsibilities: Skills: Containerization, micro service development on AWS is preferred. In-depth knowledge of design issues and best practices Solid understanding of object-oriented programming Familiar with various design, architectural patterns and software development process. 
Implementing automated testing platforms and unit tests Strong experience in building and developing applications using technologies like Java/J2EE, Springboot/Python Knowledge about RESTful APIs and ability to design cloud ready applications using cloud SDKs, microservices Exposure to cloud compute services like VMs, PaaS services, containers, serverless and storage services on AWS Good understanding of application development design patterns Competencies: Good verbal and written communication skills Ability to communicate with remote teams in an effective manner High flexibility to travel Ability to work both independently and in a multi-disciplinary team environment Technical and Professional: AWS - ELB/RDS/EC2/S3/IAM, Java/J2EE, Springboot/Python Preferred Skills: Technology-Cloud Security-AWS - Infrastructure Security-AWS Systems Manager

Posted 2 months ago

Apply

5.0 - 7.0 years

8 - 13 Lacs

Bengaluru

Work from Office

Educational Requirements Bachelor of Engineering,BTech,Bachelor Of Technology,BCA,BSc,MTech,MCA Service Line Application Development and Maintenance Responsibilities A day in the life of an Infoscion Responsibilities: Application migration to AWS/Azure/GCP cloud User requirements, envisioning system features and functionality. Identify bottlenecks and bugs, and recommend system solutions by comparing advantages and disadvantages of custom development Contributing to team meetings, troubleshooting development and production problems across multiple environments and operating platforms Understand Architecture and ensure effective Design, Development, Validation and Support activities Understand and analyze client requirements, refactor systems for workload migration / modernization to cloud (AWS, Azure, GCP) End-to-end feature development and resolving challenges faced in the implementation Create detailed design artifacts, work on development, and perform code reviews, implement validation and support activities Contribute to thought leadership within the area of technology specialization. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! Additional Responsibilities: Preferred Locations: Bangalore, Hyderabad, Chennai, Pune Experience Required: 3 to 5 years of experience: Pure hands on and expertise on the skill, able to deliver without any support Experience Required: 5 - 9 years of experience: Design knowledge, estimation technique, leading and guiding the team on technical solution Experience Required: 9 - 13 years of experience: Architecture, Solutioning, (Optional) proposal Containerization, micro service development on AWS/Azure/GCP is preferred. In-depth knowledge of design issues and best practices Solid understanding of object-oriented programming Familiar with various design, architectural patterns and software development process. 
Implementing automated testing platforms and unit tests Strong experience in building and developing applications using technologies like DevOps, Terraform Knowledge about RESTful APIs and ability to design cloud ready applications using cloud SDKs Exposure to cloud compute services like VMs, PaaS services, containers, serverless and storage services on AWS/Azure/GCP Good understanding of application development design patterns Technical and Professional: Primary Skill: AWS/Azure/GCP + DevOps + Terraform Preferred Skills: Technology-Cloud Platform-Azure App Development-Azure API Management Technology-Cloud Platform-GCP App Development Generic Skills: Technology-Cloud Platform-AWS App Development Technology-Cloud Platform-Azure Devops Technology-Cloud Platform-GCP Devops

Posted 2 months ago

Apply

5.0 - 9.0 years

5 - 9 Lacs

Chennai

Work from Office

Educational Requirements Bachelor of Engineering,MTech,MCA Service Line Application Development and Maintenance Responsibilities Responsibilities Application migration to AWS cloud User requirements, envisioning system features and functionality. Identify bottlenecks and bugs, and recommend system solutions by comparing advantages and disadvantages of custom development Contributing to team meetings, troubleshooting development and production problems across multiple environments and operating platforms Understand Architecture and ensure effective Design, Development, Validation and Support activities Understand and analyze client requirements, refactor systems for workload migration / modernization to cloud (AWS) End-to-end feature development and resolving challenges faced in the implementation Create detailed design artifacts, work on development, and perform code reviews, implement validation and support activities Contribute to thought leadership within the area of technology specialization Additional Responsibilities: Skills: Containerization, micro service development on AWS is preferred. In-depth knowledge of design issues and best practices Solid understanding of object-oriented programming Familiar with various design, architectural patterns and software development process. Implementing automated testing platforms and unit tests Strong experience in building and developing applications using technologies like Java/J2EE, Springboot/Python Knowledge about RESTful APIs and ability to design cloud ready applications using cloud SDKs, microservices Exposure to cloud compute services like VMs, PaaS services, containers, serverless and storage services on AWS Good understanding of application development design patterns Technical and Professional: AWS - ELB/RDS/EC2/S3/IAM, Java/J2EE, Springboot/Python Preferred Skills: Technology-Data On Cloud - Platform-AWS

Posted 2 months ago

Apply

5.0 - 10.0 years

10 - 14 Lacs

Noida

Hybrid

Data Engineer (SaaS-based). Immediate joiners preferred. Shift: 3 PM to 12 AM IST. Good to have: GCP Certified Data Engineer.
Overview of the Role:
As a GCP Data Engineer, you'll focus on solving problems and creating value for the business by building solutions that are reliable and scalable enough for the size and scope of the company. You will create custom-built pipelines as well as migrate on-prem data pipelines to the GCP stack, as part of a team tackling intricate problems by designing and deploying reliable, scalable solutions tailored to the company's data landscape.
Required Skills:
5+ years of industry experience in software development, data engineering, business intelligence, or a related field, with experience in manipulating, processing, and extracting value from datasets
Extensive experience in requirement discovery, analysis and data pipeline solution design
Design, build and deploy internal applications to support the technology life cycle, collaboration and spaces, service delivery management, and data and business intelligence, among others
Build modular code for reusable pipelines and ingestion frameworks that ease loading data into a data lake or data warehouse from multiple sources
Work closely with analysts and business process owners to translate business requirements into technical solutions
Coding experience in scripting languages (Python, SQL, PySpark)
Expertise in Google Cloud Platform (GCP) technologies in the data warehousing space (BigQuery, GCP Workflows, Cloud Scheduler, Secret Manager, Batch, Cloud Logging, Cloud SDK, Google Cloud Storage, IAM)
Exposure to Google Dataproc and Dataflow
Maintain the highest levels of development practice, including technical design, solution development, systems configuration, test documentation/execution, issue identification and resolution, and writing clean, modular, self-sustaining code with repeatable quality and predictability
Understanding of CI/CD processes using Pulumi, GitHub, Cloud Build, Cloud SDK and Docker
Experience with SAS/SQL Server/SSIS is an added advantage
Qualifications:
Bachelor's degree in Computer Science or a related technical field, or equivalent practical experience
GCP Certified Data Engineer (preferred)
Excellent verbal and written communication skills, with the ability to effectively advocate technical solutions to engineering teams and business audiences
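The "modular code for reusable ingestion pipelines" requirement above can be pictured as a small composition of a source, an ordered list of transforms, and a sink, so the same framework loads many datasets into a data lake or warehouse. The sketch below is illustrative only; `Pipeline` and `Record` are hypothetical names, not a GCP or Dataflow API.

```python
# Minimal sketch of a modular ingestion framework: a pipeline is a
# source (yields records), transforms (applied in order), and a sink.
# Swapping the source/sink reuses the same pipeline for new datasets.
from dataclasses import dataclass, field
from typing import Callable, Iterable, List

Record = dict  # a row of data, kept simple for illustration

@dataclass
class Pipeline:
    source: Callable[[], Iterable[Record]]
    transforms: List[Callable[[Record], Record]] = field(default_factory=list)
    sink: Callable[[Record], None] = print

    def run(self) -> int:
        """Pull records from the source, apply each transform in order,
        hand the result to the sink, and return the record count."""
        count = 0
        for record in self.source():
            for transform in self.transforms:
                record = transform(record)
            self.sink(record)
            count += 1
        return count

# Usage: normalise a field and load into an in-memory "warehouse".
warehouse: List[Record] = []
pipe = Pipeline(
    source=lambda: [{"sku": " A1 "}, {"sku": "B2"}],
    transforms=[lambda r: {**r, "sku": r["sku"].strip()}],
    sink=warehouse.append,
)
loaded = pipe.run()
```

In a real GCP deployment the source might read from Google Cloud Storage and the sink write to BigQuery, with the same transform chain in between.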

Posted 2 months ago

Apply

5.0 - 10.0 years

9 - 19 Lacs

Pune, Chennai, Bengaluru

Hybrid

Project Role: Cloud Platform Architect
Project Role Description: Oversee application architecture and deployment in cloud platform environments, including public cloud, private cloud and hybrid cloud. This can include cloud adoption plans, cloud application design, and cloud management and monitoring.
Must-have skills: Google Cloud Platform Architecture
Summary: As a Cloud Platform Architect, you will be responsible for overseeing application architecture and deployment in cloud platform environments, including public cloud, private cloud, and hybrid cloud. Your typical day will involve designing cloud adoption plans, managing and monitoring cloud applications, and ensuring cloud application design meets business requirements.
Roles & Responsibilities:
- Design and implement cloud adoption plans, including public cloud, private cloud, and hybrid cloud environments.
- Oversee cloud application design, ensuring it meets business requirements and aligns with industry best practices.
- Manage and monitor cloud applications, ensuring they are secure, scalable, and highly available.
- Collaborate with cross-functional teams to ensure cloud applications are integrated with other systems and services.
- Stay up to date with the latest advancements in cloud technology, integrating innovative approaches for sustained competitive advantage.
Professional & Technical Skills:
- Must-have: Strong experience in Google Cloud Platform Architecture.
- Good to have: Experience with other cloud platforms such as AWS or Azure.
- Experience in designing and implementing cloud adoption plans.
- Strong understanding of cloud application design and architecture.
- Experience in managing and monitoring cloud applications.
- Solid grasp of cloud security, scalability, and availability best practices.

Posted 2 months ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Hyderabad

Remote

Canterr is looking for talented and passionate professionals for exciting opportunities with a US-based MNC product company. You will be employed permanently by Canterr and deployed to a top-tier global tech client.
Key Responsibilities:
Design and develop data pipelines and ETL processes to ingest, process, and store large volumes of data
Implement and manage big data technologies such as Kafka, Dataflow, BigQuery, CloudSQL and Pub/Sub
Collaborate with stakeholders to understand data requirements and deliver high-quality data solutions
Monitor and troubleshoot data pipeline issues and implement solutions to prevent future occurrences
Required Skills and Experience (Google Cloud Platform is used for all software deployed at Wayfair):
Data storage and processing: BigQuery, CloudSQL, PostgreSQL, Dataproc, Pub/Sub
Data modeling: breaking business requirements (KPIs) down into data points; building a scalable data model
ETL tools: DBT, SQL
Data orchestration and ETL: Dataflow, Cloud Composer
Infrastructure and deployment: Kubernetes, Helm
Data access and management: Looker, Terraform
Ideal business domain experience: supply chain or warehousing. The project is focused on building a normalized data layer that ingests information from multiple Warehouse Management Systems (WMS) and projects it for back-office analysis.
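The Pub/Sub-centred pipelines described above rest on the publish/subscribe pattern: producers publish messages to a topic, and every subscription on that topic receives its own copy, decoupling producers from consumers. Below is a toy in-process sketch of that fan-out; `Broker` is an illustrative stand-in, not the `google-cloud-pubsub` client API.

```python
# Toy publish/subscribe broker: one queue per subscription, so each
# subscriber gets an independent copy of every message on the topic.
# GCP Pub/Sub provides the same fan-out semantics at scale, plus
# durability and at-least-once delivery that this sketch omits.
import queue
from collections import defaultdict
from typing import DefaultDict, List

class Broker:
    def __init__(self) -> None:
        self.subs: DefaultDict[str, List[queue.Queue]] = defaultdict(list)

    def subscribe(self, topic: str) -> queue.Queue:
        """Create a new subscription on the topic; returns its queue."""
        q: queue.Queue = queue.Queue()
        self.subs[topic].append(q)
        return q

    def publish(self, topic: str, message: bytes) -> None:
        """Fan the message out to every subscription on the topic."""
        for q in self.subs[topic]:
            q.put(message)

# Usage: two independent consumers of the same warehouse event stream.
broker = Broker()
inventory = broker.subscribe("wms-events")
analytics = broker.subscribe("wms-events")
broker.publish("wms-events", b'{"sku": "A1", "qty": 3}')
```

This decoupling is what lets a normalized WMS data layer feed both operational consumers and back-office analytics from a single event stream.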

Posted 3 months ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Mumbai

Work from Office

Job Summary: This position provides leadership in full systems life cycle management (e.g., analysis, technical requirements, design, coding, testing, and implementation of systems and applications software) to ensure delivery is on time and within budget. He/she directs component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements and ensure compliance. This position develops and leads AD project activities and integrations, guides teams to ensure effective communication and achievement of objectives, researches and supports the integration of emerging technologies, and provides knowledge and support for applications development, integration, and maintenance. He/she leads junior team members on project-related activities and tasks, guides and influences department and project teams, and facilitates collaboration with stakeholders.
Responsibilities:
GCP services (such as BigQuery, GKE, Spanner, Cloud Run, Dataflow), Angular, Java (REST APIs), SQL, Python, Terraform, Azure DevOps CI/CD pipelines
Leads systems analysis and design
Leads design and development of applications
Develops and ensures creation of application documents
Defines and produces integration builds
Monitors emerging technology trends
Leads maintenance and support
Primary Skills:
Minimum 5 years Java/Spring Boot/J2EE (full stack developer)
Minimum 2 years on the GCP platform (Cloud Pub/Sub, GKE, BigQuery); experience with Bigtable and Spanner is a plus
Working in an Agile environment; CI/CD experience
Qualifications: Bachelor's degree or international equivalent in Computer Science, Information Systems, Mathematics, Statistics, or a related field (preferred)

Posted 3 months ago

Apply
Page 1 of 2

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies