
38 Cloud Orchestration Jobs

Set Up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 7.0 years

0 Lacs

Thiruvananthapuram, Kerala

On-site

Armada is an edge computing startup that provides computing infrastructure to remote areas where connectivity and cloud infrastructure are limited, as well as areas where data needs to be processed locally for real-time analytics and AI at the edge. We are looking to bring on the most brilliant minds to help further our mission of bridging the digital divide with advanced technology infrastructure that can be rapidly deployed anywhere.

We are seeking a highly skilled and motivated Software Engineer to drive the design and implementation of our on-premises Compute-as-a-Service (CaaS) and GPU-as-a-Service (GPUaaS) offerings. In this role, you will be pivotal in building a robust and scalable infrastructure platform, enabling our engineering teams to efficiently deploy and manage applications. This position offers a fantastic opportunity to grow your career and work with cutting-edge technologies in cloud infrastructure, automation, and AI. You will be part of a collaborative team, tackling real-world challenges in a fast-paced environment and seeing your code have a direct impact on our products. This role is office-based at our Trivandrum, Kerala office.

Key Responsibilities:
- Build & Automate: Write, test, and maintain code for our edge cloud platform, focusing on automating the provisioning and management of GPU and CPU resources.
- Develop Core Services: Contribute to the backend services and APIs that power our compute offerings, helping to implement new features and improve existing ones.
- Troubleshoot & Support: Assist in diagnosing and resolving issues across our distributed infrastructure, learning how to ensure reliability and performance for our customers.
- Collaborate & Learn: Work closely with senior engineers and architects, participating in code reviews and design discussions to grow your skills in cloud-native technologies.
- Implement Tooling: Help develop the tools and scripts that our engineering teams use to deploy, monitor, and manage our edge datacenters.

Required Qualifications:
- Programming Foundation: Solid programming skills in Golang.
- Linux and Container Experience: Comfortable working in a Linux environment, with a good grasp of containerization technologies such as Docker and Kubernetes.
- Virtualization & IaaS Background: Good knowledge of virtualization technologies (KVM preferred) and cloud orchestration (a brief illustrative sketch follows this listing).
- Eagerness to Learn: Strong desire to learn about cloud computing, distributed systems, and automation. Curiosity is valued.
- Problem-Solving Skills: Enjoy breaking down problems, thinking logically, and collaborating with a team to find solutions.
- Education: A bachelor's degree in Computer Science, a related technical field, or equivalent practical experience (typically 3-6 years of experience for this level).

Preferred Qualifications:
- Familiarity with container technologies like Docker.
- Exposure to cloud platforms (AWS, GCP, Azure) or virtualization technologies.
- Experience with Kubernetes or other orchestration systems.
- Experience with Infrastructure as Code (IaC) tools like Terraform or Ansible.
- Exposure to the AI/ML space and the role GPUs play in it.

Compensation & Benefits: For India-based candidates, we offer a competitive base salary along with equity options, providing an opportunity to share in the success and growth of Armada.

You're a Great Fit if You're:
- A go-getter with a growth mindset, intellectually curious, with strong business acumen, who actively seeks opportunities to build relevant skills and knowledge.
- A detail-oriented problem-solver who can independently gather information, solve problems efficiently, and deliver results with a "get-it-done" attitude.
- Someone who thrives in a fast-paced environment, is energized by an entrepreneurial spirit, works quickly, and is excited to contribute to a growing company.
- A collaborative team player who focuses on business success and is motivated by team accomplishment over personal agenda.
- Highly organized and results-driven, with strong prioritization skills and a dedicated work ethic.

Equal Opportunity Statement.
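The KVM and orchestration requirements above lend themselves to a concrete illustration. Below is a minimal, hypothetical sketch, not Armada's code (the role's primary language is Golang; Python's libvirt bindings keep the example short), of enumerating KVM guests, the kind of inventory primitive a CaaS provisioning service builds on:

```python
# Hypothetical sketch: list KVM domains via libvirt-python.
# Assumes libvirt is installed and the qemu:///system URI is reachable.
import libvirt  # pip install libvirt-python

def list_kvm_domains(uri: str = "qemu:///system") -> list[dict]:
    """Return name and state for every defined KVM domain."""
    conn = libvirt.open(uri)
    try:
        inventory = []
        for dom in conn.listAllDomains():
            inventory.append({
                "name": dom.name(),
                "active": bool(dom.isActive()),
                # vCPU count is only queried for running guests
                "vcpus": dom.maxVcpus() if dom.isActive() else None,
            })
        return inventory
    finally:
        conn.close()

if __name__ == "__main__":
    for guest in list_kvm_domains():
        print(guest)
```

A real CaaS control plane would layer domain creation, GPU passthrough, and quota enforcement on top of inventory reads like this.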

Posted 21 hours ago

Apply

6.0 - 10.0 years

0 - 0 Lacs

Karnataka

On-site

As a Pre-Sales Solution Architect, your primary responsibility will be to lead pre-sales efforts in collaboration with the sales team to understand client requirements and design effective solutions. You will manage the preparation and submission of RFPs/RFIs, ensuring technical accuracy and alignment with business needs. Additionally, you will conduct customer demonstrations and oversee the deployment of solutions that meet client specifications. Analyzing prospective customers' business challenges and recommending cloud solutions tailored to their needs will also be a key part of your role.

Creating and delivering detailed solution documents, technical presentations, and proposals to key decision-makers will be crucial in your day-to-day tasks. You will also play a vital role in closing deals by providing technical expertise and supporting the sales team throughout the sales cycle. Staying updated on industry trends, particularly in IaaS and PaaS services, and sharing insights with the team will be essential. Furthermore, mentoring junior team members and facilitating knowledge sharing within the pre-sales team will be part of your responsibilities.

To succeed in this role, you must have 6-8 years of experience in a Pre-Sales Solution Architect or similar role. Strong knowledge of cloud migration and management of customer cloud environments is a must, along with proven experience with AWS IaaS and PaaS services. You should be able to create compelling solution design documents and technical presentations, and bring familiarity with cloud orchestration tools and strong sizing skills. Excellent written and verbal communication skills are required, with the ability to engage both technical and non-technical stakeholders. Strong analytical and problem-solving skills are also essential.

While not mandatory, exceptional interpersonal and communication abilities, a sales-oriented mindset with a focus on customer success, and a deep understanding of cloud concepts and emerging cloud solutions are considered nice to have.

In return, we offer a competitive salary package of 18-24 LPA with fixed compensation and performance-based incentives. Equity options are also available for long-term growth and investment in the company's success. You will be part of a dynamic work environment that encourages innovation and professional growth.

Skills required for this role include leadership, team management, communication (written and verbal), AWS IaaS and PaaS services, cloud orchestration, management of customer cloud environments, cloud migration, customer success, sizing skills, a sales mindset, problem-solving, analytical skills, interpersonal communication, and cloud sales.

Posted 3 days ago

Apply

7.0 - 10.0 years

20 - 27 Lacs

Noida

Work from Office

Job Responsibilities:

Technical Leadership:
- Provide technical leadership and mentorship to a team of data engineers.
- Design, architect, and implement highly scalable, resilient, and performant data pipelines; experience with GCP technologies (e.g., Dataproc, Cloud Composer, Pub/Sub, BigQuery) is a plus.
- Guide the team in adopting best practices for data engineering, including CI/CD, infrastructure-as-code, and automated testing.
- Conduct code reviews and design reviews, and provide constructive feedback to team members.
- Stay up to date with the latest technologies and trends in data engineering.

Data Pipeline Development:
- Develop and maintain robust and efficient data pipelines to ingest, process, and transform large volumes of structured and unstructured data from various sources.
- Implement data quality checks and monitoring systems to ensure data accuracy and integrity (a short illustrative sketch follows this listing).
- Collaborate with cross-functional teams and business stakeholders to understand data requirements and deliver data solutions that meet their needs.

Platform Building & Maintenance:
- Design and implement secure and scalable data storage solutions.
- Manage and optimize cloud infrastructure costs related to data engineering workloads.
- Contribute to the development and maintenance of data engineering tooling and infrastructure to improve team productivity and efficiency.

Collaboration & Communication:
- Effectively communicate technical designs and concepts to both technical and non-technical audiences.
- Collaborate effectively with other engineering teams, product managers, and business stakeholders.
- Contribute to knowledge sharing within the team and across the organization.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 7+ years of experience in data engineering and software development.
- 7+ years of experience coding in SQL and Python/Java.
- 3+ years of hands-on experience building and managing data pipelines in a cloud environment such as GCP.
- Strong programming skills in Python or Java, with experience developing data-intensive applications.
- Expertise in SQL and data modeling techniques for both transactional and analytical workloads.
- Experience with CI/CD pipelines and automated testing frameworks.
- Excellent communication, interpersonal, and problem-solving skills.
- Experience leading or mentoring a team of engineers.
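As referenced in the listing above, a data quality gate is a typical deliverable for this role. A minimal sketch, assuming an ingestion-time-partitioned BigQuery table and invented table/column names, might look like this:

```python
# Illustrative data quality check (not this employer's code):
# fail fast if a daily partition is empty or a key column has NULLs.
from google.cloud import bigquery  # pip install google-cloud-bigquery

def check_partition_quality(table: str, key_col: str, ds: str) -> None:
    client = bigquery.Client()  # uses Application Default Credentials
    sql = f"""
        SELECT
          COUNT(*) AS row_count,
          COUNTIF({key_col} IS NULL) AS null_keys
        FROM `{table}`
        WHERE DATE(_PARTITIONTIME) = @ds  -- assumes ingestion-time partitioning
    """
    job_config = bigquery.QueryJobConfig(
        query_parameters=[bigquery.ScalarQueryParameter("ds", "DATE", ds)]
    )
    row = list(client.query(sql, job_config=job_config).result())[0]
    if row.row_count == 0 or row.null_keys > 0:
        raise ValueError(
            f"Quality check failed for {table} on {ds}: "
            f"rows={row.row_count}, null_keys={row.null_keys}"
        )

# Hypothetical usage:
# check_partition_quality("analytics.events", "user_id", "2024-01-01")
```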

Posted 1 week ago

Apply

3.0 - 5.0 years

5 - 8 Lacs

Chennai

Work from Office

" PLEASE READ THE JOB DESTCRIPTION AND APPLY" Data Engineer Job Description Position Overview Yesterday is history, tomorrow is a mystery, but today is a gift. That's why we call it the present. - Master Oogway Join CustomerLabs' dynamic data team as a Data Engineer and play a pivotal role in transforming raw marketing data into actionable insights that power our digital marketing platform. As a key member of our data infrastructure team, you will design, develop, and maintain robust data pipelines, data warehouses, and analytics platforms that serve as the backbone of our digital marketing product development. Sometimes the hardest choices require the strongest wills. - Thanos (but we promise, our data decisions are much easier! ) In this role, you will collaborate with cross-functional teams including Data Scientists, Product Managers, and Marketing Technology specialists to ensure seamless data flow from various marketing channels, ad platforms, and customer touchpoints to our analytics dashboards and reporting systems. You'll be responsible for building scalable, reliable, and efficient data solutions that can handle high-volume marketing data processing and real-time campaign analytics. What You'll Do: - Design and implement enterprise-grade data pipelines for marketing data ingestion and processing - Build and optimize data warehouses and data lakes to support digital marketing analytics - Ensure data quality, security, and compliance across all marketing data systems - Create data models and schemas that support marketing attribution, customer journey analysis, and campaign performance tracking - Develop monitoring and alerting systems to maintain data pipeline reliability for critical marketing operations - Collaborate with product teams to understand digital marketing requirements and translate them into technical solutions Why This Role Matters: I can do this all day. - Captain America (and you'll want to, because this role is that rewarding!) You'll be the backbone behind the data infrastructure that powers CustomerLabs' digital marketing platform, making marketers' lives easier and better. Your work directly translates to smarter automation, clearer insights, and more successful campaigns - helping marketers focus on what they do best while we handle the complex data heavy lifting. Sometimes you gotta run before you can walk. - Iron Man (and sometimes you gotta build the data pipeline before you can analyze the data! ) Our Philosophy: We believe in the power of data to transform lives, just like the Dragon Warrior transformed the Valley of Peace. Every line of code you write, every pipeline you build, and every insight you enable has the potential to change how marketers work and succeed. We're not just building data systems - we're building the future of digital marketing, one insight at a time. Your story may not have such a happy beginning, but that doesn't make you who you are. It is the rest of your story, who you choose to be. - Soothsayer What Makes You Special: We're looking for someone who embodies the spirit of both Captain America's unwavering dedication and Iron Man's innovative genius. You'll need the patience to build robust systems (like Cap's shield ) and the creativity to solve complex problems (like Tony's suit). Most importantly, you'll have the heart to make a real difference in marketers' lives. Inner peace... Inner peace... Inner peace... - Po (because we know data engineering can be challenging, but we've got your back! 
) Key Responsibilities Data Pipeline Development - Design, build, and maintain robust, scalable data pipelines and ETL/ELT processes - Develop data ingestion frameworks to collect data from various sources (databases, APIs, files, streaming sources) - Implement data transformation and cleaning processes to ensure data quality and consistency - Optimize data pipeline performance and reliability Data Infrastructure Management - Design and implement data warehouse architectures - Manage and optimize database systems (SQL and NoSQL) - Implement data lake solutions and data governance frameworks - Ensure data security, privacy, and compliance with regulatory requirements Data Modeling and Architecture - Design and implement data models for analytics and reporting - Create and maintain data dictionaries and documentation - Develop data schemas and database structures - Implement data versioning and lineage tracking Data Quality, Security, and Compliance - Ensure data quality, integrity, and consistency across all marketing data systems - Implement and monitor data security measures to protect sensitive information - Ensure privacy and compliance with regulatory requirements (e.g., GDPR, CCPA) - Develop and enforce data governance policies and best practices Collaboration and Support - Work closely with Data Scientists, Analysts, and Business stakeholders - Provide technical support for data-related issues and queries Monitoring and Maintenance - Implement monitoring and alerting systems for data pipelines - Perform regular maintenance and optimization of data systems - Troubleshoot and resolve data pipeline issues - Conduct performance tuning and capacity planning Required Qualifications Experience - 2+ years of experience in data engineering or related roles - Proven experience with ETL/ELT pipeline development - Experience with cloud data platform (GCP) - Experience with big data technologies Technical Skills - Programming Languages : Python, SQL, Golang (preferred) - Databases: PostgreSQL, MySQL, Redis - Big Data Tools: Apache Spark, Apache Kafka, Apache Airflow, DBT, Dataform - Cloud Platforms: GCP (BigQuery, Dataflow, Cloud run, Cloud SQL, Cloud Storage, Pub/Sub, App Engine, Compute Engine etc.) - Data Warehousing: Google BigQuery - Data Visualization: Superset, Looker, Metabase, Tableau - Version Control: Git, GitHub - Containerization: Docker Soft Skills - Strong problem-solving and analytical thinking - Excellent communication and collaboration skills - Ability to work independently and in team environments - Strong attention to detail and data quality - Continuous learning mindset Preferred Qualifications Additional Experience - Experience with real-time data processing and streaming - Knowledge of machine learning pipelines and MLOps - Experience with data governance and data catalog tools - Familiarity with business intelligence tools (Tableau, Power BI, Looker, etc.) - Experience using AI-powered tools (such as Cursor, Claude, Copilot, ChatGPT, Gemini, etc.) to accelerate coding, automate tasks, or assist in system design ( We belive run with machine, not against machine ) Interview Process 1. Initial Screening: Phone/video call with HR 2. Technical Interview: Deep dive into data engineering concepts 3. Final Interview: Discussion with senior leadership Note: This job description is intended to provide a general overview of the position and may be modified based on organizational needs and candidate qualifications. Our Team Culture We are Groot. 
- We work together, we grow together, we succeed together. We believe in: - Innovation First - Like Iron Man, we're always pushing the boundaries of what's possible - Team Over Individual - Like the Avengers, we're stronger together than apart - Continuous Learning - Like Po learning Kung Fu, we're always evolving and improving - Making a Difference - Like Captain America, we fight for what's right (in this case, better marketing!) Growth Journey There is no charge for awesomeness... or attractiveness. - Po Your journey with us will be like Po's transformation from noodle maker to Dragon Warrior: - Level 1 : Master the basics of our data infrastructure - Level 2: Build and optimize data pipelines - Level 3 : Lead complex data projects and mentor others - Level 4: Become a data engineering legend (with your own theme music! ) What We Promise I am Iron Man. - We promise you'll feel like a superhero every day! - Work that matters - Every pipeline you build helps real marketers succeed - Growth opportunities - Learn new technologies and advance your career - Supportive team - We've got your back, just like the Avengers - Work-life balance - Because even superheroes need rest!
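For the streaming ingestion work flagged in this listing, here is a hedged, minimal sketch (project and subscription names are invented) of pulling marketing events from a Pub/Sub subscription with the official client library:

```python
# Illustrative streaming-pull consumer for marketing events.
import json
from concurrent.futures import TimeoutError
from google.cloud import pubsub_v1  # pip install google-cloud-pubsub

PROJECT_ID = "example-project"   # assumption: replace with a real project
SUBSCRIPTION = "ad-clicks-sub"   # assumption: an existing subscription

def handle(message: pubsub_v1.subscriber.message.Message) -> None:
    event = json.loads(message.data.decode("utf-8"))
    # Validation, enrichment, and buffering for a BigQuery load go here.
    print(event.get("campaign_id"), event.get("timestamp"))
    message.ack()

subscriber = pubsub_v1.SubscriberClient()
path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION)
streaming_pull = subscriber.subscribe(path, callback=handle)
try:
    streaming_pull.result(timeout=30)  # demo: stream for 30 seconds
except TimeoutError:
    streaming_pull.cancel()
    streaming_pull.result()  # block until shutdown completes
```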

Posted 1 week ago

Apply

3.0 - 8.0 years

10 - 20 Lacs

Noida, New Delhi, Gurugram

Hybrid

Role & responsibilities Strategically partner with the Customer Cloud Sales Team to identify and qualify business opportunities and identify key customer technical objections. Develop strategies to resolve technical obstacles and architect client solutions to meet complex business and technical requirements Lead the technical aspects of the sales cycle, including technical trainings, client presentations, technical bid responses, product and solution briefings, and proof-of-concept technical work Identify and respond to key technical objections from client, providing prescriptive guidance for successful resolutions tailored to specific client needs May directly work with Customer's Cloud products to demonstrate, design and prototype integrations in customer/partner environments Develop and deliver thorough product messaging to highlight advanced technical value propositions, using techniques such as: whiteboard and slide presentations, technical product demonstrations, white papers, trial management and RFI response documents Assess technical challenges to develop and deliver recommendations on integration strategies, enterprise architectures, platforms and application infrastructure required to successfully implement a complete solution Leverage technical expertise to provide best practice counsel to optimize advanced technical products effectiveness THER CRITICAL FUNCTIONS AND RESPONSIBILTIES Ensure customer data is accurate and actionable using Salesforce.com (SFDC) systems Leverage 3rd party prospect and account intelligence tools to extract meaningful insights and support varying client needs Navigate, analyse and interpret technical documentation for technical products, often including Customer Cloud products Enhance skills and knowledge by using a Learning Management Solution (LMS) for training and certification Serve as a technical and subject matter expert to support advanced trainings for team members on moderate to highly complex technical subjects Offer thought leadership in the advanced technical solutions, such as cloud computing Coach and mentor team members and advise managers on creating business and process efficiencies in internal workflows and training materials Collect and codify best practices between sales, marketing, and sales engineers Preferred candidate profile Required Qualifications Bachelors degree in Computer Science or other technical field, or equivalent practical experience (preferred) 3-5 years of experience serving in a technical Sales Engineer in an advanced technical environment Prior experience with advanced technologies, such as: Big Data, PaaS, and IaaS technologies, etc. Proven strong communication skills with a proactive and positive approach to task management (written and verbal)Confident presenter with excellent presentation and persuasion skills Strong work ethic and ability to work independently Perks and benefits

Posted 1 week ago

Apply

7.0 - 12.0 years

11 - 15 Lacs

Noida

Work from Office

Primary Skill(s): Lead Data Visualization Engineer with experience in Sigma BI

Experience: 7+ years of experience in Data Visualization, with experience in Sigma BI, Power BI, Tableau, or Looker

Job Summary: Lead Data Visualization Engineer with deep expertise in Sigma BI and a strong ability to craft meaningful, insight-rich visual stories for business stakeholders. This role will be instrumental in transforming raw data into intuitive dashboards and visual analytics, helping cross-functional teams make informed decisions quickly and effectively.

Key Responsibilities:
- Lead the design, development, and deployment of Sigma BI dashboards and reports tailored to various business functions.
- Translate complex data sets into clear, actionable insights using advanced visualization techniques.
- Collaborate with business stakeholders to understand goals, KPIs, and data requirements.
- Build data stories that communicate key business metrics, trends, and anomalies.
- Serve as a subject matter expert in Sigma BI and guide junior team members on best practices.
- Ensure visualizations follow design standards, accessibility guidelines, and performance optimization.
- Partner with data engineering and analytics teams to source and structure data effectively.
- Conduct workshops and training sessions to enable business users to consume and interact with dashboards.
- Drive the adoption of self-service BI tools and foster a data-driven decision-making culture.

Required Skills & Experience:
- 7+ years of hands-on experience in Business Intelligence, with at least 2 years using Sigma BI.
- Proven ability to build end-to-end dashboard solutions that tell a story and influence decisions.
- Strong understanding of data modeling, SQL, and cloud data platforms (Snowflake, BigQuery, etc.).
- Demonstrated experience working with business users, gathering requirements, and delivering user-friendly outputs.
- Proficiency in data storytelling, UX design principles, and visualization best practices.
- Experience integrating Sigma BI with modern data stacks and APIs is a plus.
- Excellent communication and stakeholder management skills.

Preferred Qualifications:
- Experience with other BI tools (such as Tableau, Power BI, Looker) is a plus.
- Familiarity with AWS cloud data ecosystems (AWS Databricks).
- Background in data analysis, statistics, or business analytics.

Working Hours: 2 PM to 11 PM IST (approximately 4:30 AM to 1:30 PM ET).

Communication skills: Good

Mandatory Competencies:
- BI and Reporting Tools: Power BI, Tableau
- Database: Database Programming, SQL
- Cloud (GCP): Cloud Data Fusion, Dataproc, BigQuery, Cloud Composer, Cloud Bigtable
- Data Science and Machine Learning: Databricks
- Cloud (AWS): ECS
- DMS: Data Analysis Skills
- Behavioural: Communication and collaboration

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a member of our team, you will be responsible for the logical design, sizing, interoperability, scaling, and security aspects of the solution. Your role will involve managing cloud environments in accordance with company security guidelines, and analyzing, optimizing, implementing, and maintaining multi-platform cloud-based back-end computing environments. Additionally, you will implement and maintain security technologies and sound engineering design for all cloud deployments.

You will be tasked with researching, auditing, testing, and setting standards for AWS and Azure cloud deployment frameworks and best practices to ensure compatibility and functionality in the enterprise. Independently developing reference materials for supported cloud technologies and leading the deployment and operation of various cloud-related infrastructure components will be part of your responsibilities. Reporting on the health of cloud services to leadership, supporting project execution activities, and making recommendations for improvements to security, scalability, manageability, and performance across a wide variety of cloud network, storage, and compute technologies are key aspects of this role.

You will be required to liaise with customers, architecture leadership, and technical teams, including systems and network administrators, security engineers, and IT support teams. Furthermore, you will build and configure build plans and code pipelines, and create automated solutions that can be framed and reused. Managing assigned projects, meeting deadlines with minimal supervision, and performing other duties as assigned will also be part of your responsibilities.

To be successful in this role, you must possess a strong understanding of IT technology, including hardware and software, with a holistic end-to-end focus on applications and services architecture. A minimum of 5 years of IT background, with at least 3 years of cloud infrastructure and engineering experience, is required. Experience with virtualization, cloud, server, storage, and networking technologies is essential. Knowledge of DevOps, SDLC, containers, microservices, API design, cloud computing models (IaaS, PaaS, SaaS), cloud orchestration, and monitoring technology is crucial. Experience in AWS, Azure, .NET, ITSM, and IT operations, along with programming skills, is a plus. Excellent communication skills and the ability to work individually, within a team, and in partnership with other business groups are also essential. Experience with Disaster Recovery and Business Continuity initiatives, the ability to develop policies and procedures, conduct risk assessments, and communicate effectively at the C-level are required. A BS/BA degree in Computer Science or equivalent experience is necessary, while certifications such as TOGAF, AWS, Infrastructure, and Cloud certifications are advantageous.

Posted 1 week ago

Apply

3.0 - 6.0 years

13 - 18 Lacs

Bengaluru

Work from Office

We are looking to hire a Data Engineer for the Platform Engineering team. It is a collection of highly skilled individuals, ranging from development to operations, with a security-first mindset, who strive to push the boundaries of technology. We champion a DevSecOps culture and raise the bar on how and when we deploy applications to production. Our core principles are centered around automation, testing, quality, and immutability, all via code. The role is responsible for building self-service capabilities that improve our security posture and productivity and reduce time to market, with automation at the core of these objectives. The individual collaborates with teams across the organization to ensure applications are designed for Continuous Delivery (CD) and are well-architected for their targeted platform, which can be on-premise or the cloud. If you are passionate about developer productivity, cloud-native applications, and container orchestration, this job is for you!

Principal Accountabilities: The incumbent is mentored by senior individuals on the team to capture the flow and bottlenecks in the holistic IT delivery process and define future tool sets.

Skills and Software Requirements:
- Experience with a language such as Python, Go, SQL, Java, or Scala
- GCP data services (BigQuery, Dataflow, Dataproc, Cloud Composer, Pub/Sub, Google Cloud Storage, IAM)
- Experience with Jenkins, Maven, Git, Ansible, or Chef
- Experience working with containers, orchestration tools (Kubernetes, Mesos, Docker Swarm, etc.), and container registries (GCR, Docker Hub, etc.)
- Experience with [SPI]aaS: Software-as-a-Service, Platform-as-a-Service, or Infrastructure-as-a-Service
- Acquire, cleanse, and ingest structured and unstructured data on the cloud
- Combine data from disparate sources into a single, unified, authoritative view of data (e.g., a Data Lake)
- Enable and support data movement from one system or service to another
- Experience implementing or supporting automated solutions to technical problems
- Experience working in a team environment, proactively executing tasks while meeting agreed delivery timelines
- Ability to contribute to effective and timely solutions
- Excellent oral and written communication skills

Posted 2 weeks ago

Apply

3.0 - 4.0 years

5 - 7 Lacs

Chandigarh

Work from Office

Key Responsibilities
- Design, develop, and maintain scalable ETL workflows using Cloud Data Fusion and Apache Airflow (a minimal DAG sketch follows this listing).
- Configure and manage various data connectors (e.g., Cloud Storage, Pub/Sub, JDBC, SaaS APIs) for batch and streaming data ingestion.
- Implement data transformation, cleansing, and enrichment logic in Python (and SQL) to meet analytic requirements.
- Optimize BigQuery data models (fact/dimension tables, partitioning, clustering) for performance and cost-efficiency.
- Monitor, troubleshoot, and tune pipeline performance; implement robust error-handling and alerting mechanisms.
- Collaborate with data analysts, BI developers, and architects to understand data requirements and deliver accurate datasets.
- Maintain documentation for data pipelines, schemas, and operational runbooks.
- Ensure data security and governance best practices are followed across the data lifecycle.

Minimum Qualifications
- 3+ years of hands-on experience in data engineering, with a focus on cloud-native ETL.
- Proven expertise with Google Cloud Data Fusion, including pipeline authoring and custom plugin development.
- Solid experience building and orchestrating pipelines in Apache Airflow (DAG design, operators, hooks).
- Strong Python programming skills for data manipulation and automation.
- Deep understanding of BigQuery: schema design, SQL scripting, performance tuning, and cost management.
- Familiarity with additional GCP services: Cloud Storage, Pub/Sub, Dataflow, and IAM.
- Experience with version control (Git), CI/CD pipelines, and DevOps practices for data projects.
- Excellent problem-solving skills, attention to detail, and the ability to work independently in a fast-paced environment.
- Immediate availability to join.

Preferred (Nice-to-Have)
- Experience with other data integration tools (e.g., Dataflow, Talend, Informatica).
- Knowledge of containerization (Docker, Kubernetes) for scalable data workloads.
- Familiarity with streaming frameworks (Apache Beam, Spark Streaming).
- Background in data modeling methodologies (Star/Snowflake schemas).
- Exposure to metadata management, data cataloguing, and data governance frameworks.
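As referenced in the responsibilities above, here is a minimal sketch of a daily Airflow DAG loading newline-delimited JSON from GCS into a date-partitioned BigQuery table. Bucket and table names are placeholders, and the Google provider package is assumed to be installed:

```python
# Illustrative daily GCS-to-BigQuery load
# (requires apache-airflow-providers-google).
from datetime import datetime
from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="daily_events_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_events = GCSToBigQueryOperator(
        task_id="gcs_to_bq",
        bucket="raw-events-bucket",                 # placeholder bucket
        source_objects=["events/{{ ds }}/*.json"],  # one folder per day
        source_format="NEWLINE_DELIMITED_JSON",
        # '$' targets the single partition for the run date
        destination_project_dataset_table="analytics.events${{ ds_nodash }}",
        write_disposition="WRITE_TRUNCATE",
        time_partitioning={"type": "DAY"},
    )
```

Writing each run into its own partition with WRITE_TRUNCATE keeps the load idempotent: re-running a day replaces only that day's data.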

Posted 2 weeks ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office

About The Role

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Google Dataproc, Apache Spark
Good-to-have skills: Apache Airflow
Minimum 5 year(s) of experience is required.
Educational Qualification: minimum 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Google Dataproc. Your typical day will involve working with Apache Spark and collaborating with cross-functional teams to deliver impactful data-driven solutions.

Roles & Responsibilities:
- Design, build, and configure applications to meet business process and application requirements using Google Dataproc.
- Collaborate with cross-functional teams to deliver impactful data-driven solutions.
- Utilize Apache Spark for data processing and analysis (a short PySpark sketch follows this listing).
- Develop and maintain technical documentation for applications.

Professional & Technical Skills:
- Strong experience in Apache Spark and Java for Spark.
- Strong experience with multiple database models (SQL, NoSQL, OLTP, and OLAP).
- Strong experience with data streaming architecture (Kafka, Spark, Airflow).
- Strong knowledge of cloud data platforms and technologies such as GCS, BigQuery, Cloud Composer, Dataproc, and other cloud-native offerings.
- Knowledge of Infrastructure as Code (IaC) and associated tools (Terraform, Ansible, etc.).
- Experience pulling data from a variety of data source types, including Mainframe (EBCDIC), fixed-length and delimited files, and databases (SQL, NoSQL, time-series).
- Experience performing analysis with large datasets in a cloud-based environment, preferably with an understanding of Google's Cloud Platform (GCP).
- Comfortable communicating with various stakeholders (technical and non-technical).
- GCP Data Engineer Certification is nice to have.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Google Dataproc and Apache Spark.
- The ideal candidate will possess a strong educational background in software engineering or a related field.
- This position is based at our Mumbai office.

Qualification: minimum 15 years of full-time education
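To make the Dataproc/Spark requirement concrete, here is a short, illustrative PySpark job (not from this employer) that aggregates JSON from GCS and writes to BigQuery via the spark-bigquery connector, which Dataproc clusters commonly provide; all paths and table names are placeholders:

```python
# Illustrative PySpark aggregation for a Dataproc cluster.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-daily-agg").getOrCreate()

# Placeholder input path; JSON lines with customer_id and amount fields.
orders = spark.read.json("gs://example-raw/orders/2024-01-01/")

daily = (
    orders.groupBy("customer_id")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count(F.lit(1)).alias("order_count"),
    )
)

(
    daily.write.format("bigquery")                        # spark-bigquery connector
    .option("table", "analytics.daily_customer_orders")   # placeholder table
    .option("temporaryGcsBucket", "example-tmp-bucket")   # needed for indirect writes
    .mode("overwrite")
    .save()
)
```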

Posted 3 weeks ago

Apply

8.0 - 13.0 years

7 - 12 Lacs

Pune

Work from Office

Job Title: Business Functional Analyst | Corporate Title: Associate | Location: Pune, India

Role Description

Business Functional Analysis is responsible for business solution design in complex project environments (e.g., transformational programmes). Work includes:
- Identifying the full range of business requirements and translating requirements into specific functional specifications for solution development and implementation
- Analysing business requirements and the associated impacts of the changes
- Designing and assisting businesses in developing optimal target-state business processes
- Creating and executing against roadmaps that focus on solution development and implementation
- Answering questions of methodological approach with varying levels of complexity
- Aligning with other key stakeholder groups (such as Project Management & Software Engineering) to support the link between the business divisions and the solution providers for all aspects of identifying, implementing, and maintaining solutions

What we'll offer you
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities
- Write clear and well-structured business requirements/documents.
- Convert roadmap features into smaller user stories.
- Analyse process issues and bottlenecks and make improvements.
- Communicate and validate requirements with relevant stakeholders.
- Perform data discovery, analysis, and modelling.
- Assist with project management for selected projects.
- Understand and translate business needs into data models supporting long-term solutions.
- Understand existing SQL/Python code and convert it into business requirements.
- Write advanced SQL and Python scripts.

Your skills and experience
- A minimum of 8+ years of experience in business analysis or a related field.
- Exceptional analytical and conceptual thinking skills.
- Proficient in SQL.
- Proficient in Python for data engineering.
- Experience in automating ETL testing using Python and SQL (a brief sketch follows this listing).
- Exposure to GCP services for cloud storage, data lakes, databases, and data warehousing, such as BigQuery, GCS, Dataflow, Cloud Composer, gsutil, shell scripting, etc.
- Previous experience in Procurement and Real Estate would be a plus.
- Competency in JIRA, Confluence, draw.io, and Microsoft applications including Word, Excel, PowerPoint, and Outlook.
- Previous banking domain experience is a plus.
- Good problem-solving skills.

How we'll support you
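For the ETL test automation requirement above, a brief sketch: a pytest check that reconciles row counts between a staging and a curated BigQuery table. Table names and the reconciliation rule are assumptions for illustration:

```python
# Illustrative automated ETL test with pytest and BigQuery.
import pytest
from google.cloud import bigquery

@pytest.fixture(scope="module")
def bq() -> bigquery.Client:
    return bigquery.Client()  # uses Application Default Credentials

def scalar(client: bigquery.Client, sql: str):
    """Run a query and return the first column of the first row."""
    return list(client.query(sql).result())[0][0]

def test_row_counts_match(bq):
    src = scalar(bq, "SELECT COUNT(*) FROM `staging.trades_raw`")  # placeholder
    tgt = scalar(bq, "SELECT COUNT(*) FROM `curated.trades`")      # placeholder
    assert src == tgt, f"ETL mismatch: source={src}, target={tgt}"
```

Real suites typically add checks for null rates, duplicate keys, and referential integrity alongside the count reconciliation.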

Posted 3 weeks ago

Apply

7.0 - 12.0 years

16 - 20 Lacs

Pune

Work from Office

Job Title: Data Engineer (ETL, Big Data, Hadoop, Spark, GCP), AS | Location: Pune, India

Role Description

The engineer is responsible for developing and delivering elements of engineering solutions to accomplish business goals. Awareness is expected of the bank's important engineering principles. Root cause analysis skills develop through addressing enhancements and fixes to products; reliability and resiliency are built into solutions through early testing, peer reviews, and automating the delivery life cycle. The successful candidate should be able to work independently on medium to large sized projects with strict deadlines, and should be able to work in a cross-application, mixed technical environment while demonstrating a solid hands-on development track record within an agile methodology. The role demands working alongside a geographically dispersed team. The position is required as part of the buildout of the Compliance tech internal development team in India. The overall team will primarily deliver improvements in compliance tech capabilities that are major components of the regular regulatory portfolio, addressing various regulatory common commitments and mandated monitors.

What we'll offer you
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities
- Analyze data sets; design and code stable and scalable data ingestion workflows, integrating them into existing workflows.
- Work with team members and stakeholders to clarify requirements and provide the appropriate ETL solution.
- Hands-on work across various data sourcing in Hadoop and GCP.
- Ensure new code is tested at both the unit and system levels; design, develop, and peer-review new code and functionality.
- Operate as a member of an agile scrum team.
- Apply root cause analysis skills to identify bugs and issues behind failures.
- Support production support and release management teams in their tasks.

Your skills and experience
- More than 7 years of coding experience in reputed organizations.
- Hands-on experience with Bitbucket and CI/CD pipelines.
- Proficient in Hadoop, Python, Spark, SQL, Unix, and Hive.
- Basic understanding of on-prem and GCP data security.
- Hands-on development experience on large ETL/big data systems; GCP is a big plus.
- Hands-on experience with Cloud Build, Artifact Registry, Cloud DNS, Cloud Load Balancing, etc.
- Hands-on experience with Dataflow, Cloud Composer, Cloud Storage, Dataproc, etc.
- Basic understanding of data quality dimensions such as consistency, completeness, accuracy, and lineage.
- Hands-on business and systems knowledge gained in a regulatory delivery environment.
- Banking experience, including regulatory and cross-product knowledge.
- Passionate about test-driven development.

How we'll support you

Posted 3 weeks ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Bengaluru

Work from Office

We are seeking a skilled DevOps Engineer with strong experience in Google Cloud Platform (GCP) to support AI/ML project infrastructure. The ideal candidate will work closely with data scientists, ML engineers, and developers to build and manage scalable, secure, and automated pipelines for AI/ML model training, testing, and deployment.

Responsibilities:
- Design and manage cloud infrastructure to support AI/ML workloads on GCP.
- Develop and maintain CI/CD pipelines for ML models and applications.
- Automate model training, validation, deployment, and monitoring processes using tools like Kubeflow, Vertex AI, Cloud Composer, and Airflow.
- Set up and manage infrastructure as code (IaC) using tools such as Terraform or Deployment Manager.
- Implement robust security, monitoring, logging, and alerting systems using Cloud Monitoring, Cloud Logging, Prometheus, Grafana, etc.
- Collaborate with ML engineers and data scientists to optimize compute environments (e.g., GPU/TPU instances, notebooks).
- Manage and maintain containerized environments using Docker and Kubernetes (GKE); a small operational sketch follows this listing.
- Ensure cost-efficient cloud resource utilization and governance.

Required Skills
- Bachelor's degree in engineering or a relevant field
- Must have 4 years of proven experience as a DevOps Engineer, with at least 1 year on GCP
- Strong experience with DevOps tools and methodologies in production environments
- Proficiency in scripting with Python, Bash, or Shell
- Experience with Terraform, Ansible, or other IaC tools
- Deep understanding of Docker, Kubernetes, and container orchestration
- Knowledge of CI/CD pipelines, automated testing, and model deployment best practices
- Familiarity with ML lifecycle tools such as MLflow, Kubeflow Pipelines, or TensorFlow Extended (TFX)
- Experience in designing conversational flows for AI agents/chatbots
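As a small illustration of the GKE operations side of this role (not the employer's tooling), the official Kubernetes Python client can sweep a cluster for unhealthy pods; access via a kubeconfig is assumed:

```python
# Illustrative GKE health sweep with the Kubernetes Python client.
from kubernetes import client, config  # pip install kubernetes

config.load_kube_config()  # inside a pod, use config.load_incluster_config()
core_v1 = client.CoreV1Api()

# Print any pod whose phase suggests it needs attention.
for pod in core_v1.list_pod_for_all_namespaces(watch=False).items:
    phase = pod.status.phase
    if phase not in ("Running", "Succeeded"):
        print(f"{pod.metadata.namespace}/{pod.metadata.name}: {phase}")
```

In practice a check like this would feed an alerting system rather than stdout, but the API surface is the same.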

Posted 3 weeks ago

Apply

3.0 - 5.0 years

5 - 5 Lacs

Kochi, Thiruvananthapuram

Work from Office

Role Proficiency: Independently develops error-free code with high-quality validation of applications; guides other developers and assists Lead 1 - Software Engineering.

Outcomes:
- Understand and provide input to application/feature/component designs, developing them in accordance with user stories/requirements.
- Code, debug, test, document, and communicate product/component/features at development stages.
- Select appropriate technical options for development, such as reusing, improving, or reconfiguring existing components.
- Optimise efficiency, cost, and quality by identifying opportunities for automation/process improvements and agile delivery models.
- Mentor Developer 1 - Software Engineering and Developer 2 - Software Engineering to effectively perform in their roles.
- Identify problem patterns and improve the technical design of the application/system.
- Proactively identify issues/defects/flaws in module/requirement implementation.
- Assist Lead 1 - Software Engineering on technical design; review activities and begin demonstrating Lead 1 capabilities in making technical decisions.

Measures of Outcomes:
- Adherence to engineering process and standards (coding standards)
- Adherence to schedule/timelines
- Adherence to SLAs where applicable
- Number of defects post-delivery
- Number of non-compliance issues
- Reduction of reoccurrence of known defects
- Quick turnaround of production bugs
- Meeting the defined productivity standards for the project
- Number of reusable components created
- Completion of applicable technical/domain certifications
- Completion of all mandatory training requirements

Outputs Expected:
- Code: Develop code independently for the above.
- Configure: Implement and monitor the configuration process.
- Test: Create and review unit test cases, scenarios, and execution.
- Domain relevance: Develop features and components with a good understanding of the business problem being addressed for the client.
- Manage Project: Manage module-level activities.
- Manage Defects: Perform defect RCA and mitigation.
- Estimate: Estimate time, effort, and resource dependence for one's own work and others' work, including modules.
- Document: Create documentation for own work as well as perform peer review of documentation of others' work.
- Manage knowledge: Consume and contribute to project-related documents, SharePoint libraries, and client universities.
- Status Reporting: Report status of tasks assigned; comply with project-related reporting standards/processes.
- Release: Execute the release process.
- Design: LLD for multiple components.
- Mentoring: Mentor juniors on the team; set FAST goals and provide feedback on mentees' FAST goals.

Skill Examples:
- Explain and communicate the design/development to the customer.
- Perform and evaluate test results against product specifications.
- Develop user interfaces, business software components, and embedded software components.
- Manage and guarantee high levels of cohesion and quality.
- Use data models.
- Estimate effort and resources required for developing/debugging features/components.
- Perform and evaluate tests in the customer or target environment.
- Team player.
- Good written and verbal communication abilities.
- Proactively ask for help and offer help.

Knowledge Examples:
- Appropriate software programs/modules
- Technical designing
- Programming languages
- DBMS
- Operating systems and software platforms
- Integrated development environments (IDE)
- Agile methods
- Knowledge of the customer domain and sub-domain where the problem is solved

Additional Comments: UST is looking for Java Senior Developers to build end-to-end business solutions and to work with one of the leading financial services organizations in the UK. The ideal candidate must possess a strong background in frontend and backend development technologies, along with excellent written and verbal communication skills and the ability to collaborate effectively with domain experts and technical experts in the team.

Responsibilities: As a Java developer, you will
- Maintain active relationships with the Product Owner to understand business requirements, lead requirement-gathering meetings, and review designs with the Product Owner
- Own backlog items and coordinate with other team members to develop the features planned for each sprint
- Perform technical design reviews and code reviews
- Mentor, lead, and guide the team on technical skills
- Be responsible for prototyping, developing, and troubleshooting software in the user interface or service layers
- Perform peer reviews on source code to ensure reuse, scalability, and the use of best practices
- Participate in collaborative technical discussions that focus on software user experience, design, architecture, and development
- Perform demonstrations for client stakeholders on project features and sub-features, utilizing the latest frontend and backend development technologies

Requirements:
- 5+ years of experience in Java/JEE development
- Skills in developing applications using multi-tier architecture
- 2+ years of experience in GCP service development is preferred
- Skills in developing applications on GCP are preferred
- Should be an expert in Cloud Composer, Dataflow, Dataproc, Cloud Pub/Sub, and DAG creation
- Python scripting knowledge is preferred
- Apache Beam knowledge is mandatory (a minimal Beam sketch follows this listing)
- Java/JEE, Spring, Spring Boot, REST/SOAP web services, Hibernate, SQL, Tomcat, application servers (WebSphere), SONAR, Agile, AJAX, Jenkins
- Skills in UML, application design/architecture, and design patterns
- Skills in unit testing applications using JUnit or similar technologies
- Capability to support QA teams with test plans, root cause analysis, and defect fixing
- Strong experience in responsive design and cross-browser web applications
- Strong knowledge of web service models
- Strong knowledge in creating and working with APIs
- Experience with cloud services, specifically Google Cloud
- Strong exposure to Agile and Scaled Agile based development models
- Familiarity with interfaces such as REST web services, Swagger profiles, and JSON payloads
- Familiarity with tools/utilities such as Bitbucket, Jira, and Confluence

Required Skills: Java, Spring, Spring Boot, Microservices
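Since Apache Beam knowledge is mandatory for this role, here is the minimal Beam sketch referenced above, written with Beam's Python SDK (the role itself is Java-centric); it counts events per type from a newline-delimited JSON file, with placeholder paths:

```python
# Illustrative Apache Beam pipeline (Python SDK); runs on the local
# DirectRunner by default, or on Dataflow with the appropriate options.
import json
import apache_beam as beam  # pip install apache-beam

with beam.Pipeline() as pipeline:
    (
        pipeline
        | "Read" >> beam.io.ReadFromText("gs://example/events.jsonl")  # placeholder
        | "Parse" >> beam.Map(json.loads)
        | "KeyByType" >> beam.Map(lambda e: (e.get("type", "unknown"), 1))
        | "CountPerType" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda kind, n: f"{kind},{n}")
        | "Write" >> beam.io.WriteToText("gs://example/out/event_counts")  # placeholder
    )
```

The same pipeline shape translates directly to the Java SDK, which this role would more likely use in production.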

Posted 4 weeks ago

Apply

5.0 - 8.0 years

4 - 7 Lacs

Noida

Work from Office

Company: Apptad Technologies Pvt Ltd. | Industry: Employment Firms/Recruitment Services Firms | Experience: 5 to 8 years

Job Title: GCP Admin
Job Location: Remote
Job Type: Full Time

Responsibilities:
- Manage and configure roles/permissions in GCP IAM, following the principle of least-privileged access.
- Manage the BigQuery service by optimizing slot assignments and SQL queries, adopting FinOps practices for cost control, troubleshooting and resolving critical data queries, etc. (an illustrative slot-usage query follows this listing).
- Collaborate with teams like Data Engineering, Data Warehousing, Cloud Platform Engineering, SRE, etc. for efficient data management and operational practices in GCP.
- Create automations and monitoring mechanisms for GCP data-related services, processes, and tasks.
- Work with development teams to design the GCP-specific cloud architecture.
- Provision and de-provision GCP accounts and resources for internal projects.
- Manage and operate multiple GCP subscriptions.
- Keep technical documentation up to date.
- Proactively stay up to date on GCP announcements, services, and developments.

Requirements:
- Must have 8+ years of work experience provisioning, operating, and maintaining systems in GCP.
- Must hold a valid certification as either a GCP Associate Cloud Engineer or a GCP Professional Cloud Architect.
- Must have hands-on experience with GCP services such as Identity and Access Management (IAM), BigQuery, Google Kubernetes Engine (GKE), etc.
- Must be able to provide support and guidance on GCP operations and services according to enterprise needs.
- Must have a working knowledge of Docker containers and Kubernetes.
- Must have strong communication skills and the ability to work both independently and in a collaborative environment.
- Fast learner and achiever who sets high personal goals.
- Must be able to work on multiple projects and consistently meet project deadlines.
- Must be willing to work on a shift basis, depending on project requirements.

Good to Have:
- Experience in Terraform automation for GCP infrastructure provisioning.
- Experience in Cloud Composer, Dataproc, Dataflow, Storage, and Monitoring services.
- Experience in building and supporting any form of data pipeline.
- Multi-cloud experience with AWS.
- New Relic monitoring.

Ref: 6566414
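To illustrate the BigQuery FinOps work mentioned above, one common starting point is ranking slot consumers from INFORMATION_SCHEMA. This is a hedged sketch; the region qualifier is an assumption to adjust per project:

```python
# Illustrative slot-usage report from BigQuery INFORMATION_SCHEMA.
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client()
sql = """
    SELECT
      user_email,
      ROUND(SUM(total_slot_ms) / 1000 / 3600, 1) AS slot_hours
    FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT  -- adjust region
    WHERE creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
      AND job_type = 'QUERY'
    GROUP BY user_email
    ORDER BY slot_hours DESC
    LIMIT 10
"""
for row in client.query(sql).result():
    print(f"{row.user_email}: {row.slot_hours} slot-hours")
```

Reports like this typically drive decisions about slot reservations, query rewrites, and chargeback.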

Posted 1 month ago

Apply

3.0 - 7.0 years

6 - 10 Lacs

Hyderabad

Work from Office

Job Summary
- Google Cloud Platform (GCP) Services: Hands-on experience with GCP Cloud Storage, Cloud Composer, and Dataflow.
- ETL and Data Pipeline Development: Strong knowledge of building end-to-end data pipelines, including data ingestion, transformation, and broadcasting across heterogeneous data systems.
- Database Technologies: Strong in complex SQL; proficiency in SQL and NoSQL databases (e.g., MongoDB, PostgreSQL), with the ability to write complex queries and optimize performance.
- Programming Skills (Python, Java, or Scala): Proficient in at least one programming language for developing scalable data solutions and automation scripts.
- CI/CD and DevOps Tools: Leverage tools like GitHub, CircleCI, and Harness to automate deployment workflows and manage data pipeline releases efficiently.
- Application Deployment & Observability: Experience in production deployment, issue triage, and the use of observability tools and best practices.

Posted 1 month ago

Apply

6.0 - 11.0 years

4 - 8 Lacs

Bengaluru

Work from Office

React Developer

Responsibilities:
- Develop new user-facing features using React.js
- Build reusable components and front-end libraries for future use
- Translate user stories and wireframes into high-quality code
- Create applications that provide a fantastic UI/UX and responsive design
- Integrate apps with third-party APIs and cloud APIs
- Apply core Computer Science concepts to improve consumer web apps
- Profile and improve our frontend performance
- Design for scalability and adherence to standards

Required Skills:
- Should be excellent in UI development using the React framework
- Should be strong in Redux or Flux
- Should be strong in JavaScript (ES6 and above)

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Pune

Work from Office

Tech stack: GCP Data Fusion, BigQuery, Dataproc, SQL/T-SQL, Cloud Run, Secret Manager, Git, Ansible Tower/Ansible scripts, Jenkins, Java, Python, Terraform, Cloud Composer/Airflow

Experience and Skills

Must Have:
- Proven (3+ years) hands-on experience designing, testing, and implementing data ingestion pipelines on GCP Data Fusion, CDAP, or similar tools, including ingestion, parsing, and wrangling of CSV-, JSON-, and XML-formatted data from RESTful and SOAP APIs, SFTP servers, etc. (an illustrative ingestion sketch follows this listing).
- In-depth understanding of modern data contract best practices, with proven experience (3+ years) independently directing, negotiating, and documenting best-in-class data contracts.
- Java experience (2+ years) in development, testing, and deployment (ideally custom plugins for Data Fusion).
- Proficiency with Continuous Integration (CI), Continuous Delivery (CD), and continuous testing tools, ideally for cloud-based data solutions.
- Experience working in an Agile environment and toolset.
- Strong problem-solving and analytical skills.
- Enthusiastic willingness to learn and develop technical and soft skills rapidly and independently, as needs require.
- Strong organisational and multi-tasking skills.
- Good team player who embraces teamwork and mutual support.

Nice to Have:
- Hands-on experience in Cloud Composer/Airflow, Cloud Run, and Pub/Sub.
- Hands-on development in Python and Terraform.
- Strong SQL skills for data transformation, querying, and optimization in BigQuery, with a focus on cost- and time-effective SQL coding and concurrency/data integrity (ideally in the BigQuery dialect).
- Data transformation/ETL/ELT pipeline development, testing, and implementation, ideally in BigQuery.
- Experience working in a DataOps model.
- Experience with Data Vault modelling and usage.
- Proficiency in Git for version control and collaboration.
- Proficiency with designing, creating, and maintaining CI/CD processes/pipelines in DevOps tools like Ansible/Jenkins for cloud-based applications (ideally GCP).
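The first must-have item references an ingestion sketch; here is a minimal, hedged example (API URL, bucket, and pagination scheme are all invented) of pulling paginated JSON from a REST API and landing it raw in GCS for a downstream Data Fusion or BigQuery load:

```python
# Illustrative REST-to-GCS landing step for a raw ingestion zone.
import json
import requests                    # pip install requests
from google.cloud import storage  # pip install google-cloud-storage

API_URL = "https://api.example.com/v1/records"  # invented endpoint
BUCKET = "example-landing-zone"                 # invented bucket

def ingest(run_date: str) -> None:
    records, page = [], 1
    while True:
        resp = requests.get(API_URL, params={"page": page}, timeout=30)
        resp.raise_for_status()
        batch = resp.json()
        if not batch:  # assumed convention: an empty page ends pagination
            break
        records.extend(batch)
        page += 1
    blob = storage.Client().bucket(BUCKET).blob(f"raw/{run_date}/records.json")
    blob.upload_from_string(
        "\n".join(json.dumps(r) for r in records),  # newline-delimited JSON
        content_type="application/json",
    )

ingest("2024-01-01")
```

Landing data raw and newline-delimited keeps the ingestion step dumb and replayable; parsing and contract enforcement happen downstream.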

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Bengaluru

Work from Office

Capgemini Invent

Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science and creative design, to help CxOs envision and build what's next for their businesses.

Your Role
- Solution Design & Architecture
- Implementation & Deployment
- Technical Leadership & Guidance
- Client Engagement & Collaboration
- Performance Monitoring & Optimization

Your Profile
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 3-8 years of experience in designing, implementing, and managing data solutions.
- 3-8 years of hands-on experience working with Google Cloud Platform (GCP) data services.
- Strong expertise in core GCP data services, including BigQuery (data warehousing), Cloud Storage (data lake), Dataflow (ETL/ELT), Cloud Composer (workflow orchestration with Apache Airflow), Pub/Sub and Dataflow (streaming data), Cloud Data Fusion (graphical data integration), and Dataproc (managed Hadoop and Spark).
- Proficiency in SQL and experience with data modeling techniques.
- Experience with at least one programming language (e.g., Python, Java, Scala).
- Experience with Infrastructure-as-Code (IaC) tools such as Terraform or Cloud Deployment Manager.
- Understanding of data governance, security, and compliance principles in a cloud environment.
- Experience with CI/CD pipelines and DevOps practices.
- Excellent problem-solving, analytical, and communication skills.
- Ability to work independently and as part of a collaborative team.

What you will love about working here
We recognize the significance of flexible work arrangements to provide support. Be it remote work or flexible work hours, you will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Generative AI.

About Capgemini
Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.

Posted 1 month ago

Apply

2.0 - 6.0 years

5 - 9 Lacs

Noida

Work from Office

About Us:
At TELUS Digital, we enable customer experience innovation through spirited teamwork, agile thinking, and a caring culture that puts customers first. TELUS Digital is the global arm of TELUS Corporation, one of the largest telecommunications service providers in Canada. We deliver contact center and business process outsourcing (BPO) solutions to some of the world's largest corporations in the consumer electronics, finance, telecommunications and utilities sectors. With global call center delivery capabilities, our multi-shore, multi-language programs offer safe, secure infrastructure, value-based pricing, skills-based resources and exceptional customer service, all backed by TELUS, our multi-billion-dollar telecommunications parent.

Required Skills:
- Design, develop, and support data pipelines and related data products and platforms.
- Design and build data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms.
- Perform application impact assessments, requirements reviews, and develop work estimates.
- Develop test strategies and site reliability engineering measures for data products and solutions.
- Participate in agile development "scrums" and solution reviews.
- Mentor junior Data Engineers.
- Lead the resolution of critical operations issues, including post-implementation reviews.
- Perform technical data stewardship tasks, including metadata management, security, and privacy by design.
- Design and build data extraction, loading, and transformation pipelines using Python and other GCP data technologies.
- Demonstrate SQL and database proficiency in various data engineering tasks.
- Automate data workflows by setting up DAGs in tools like Control-M, Apache Airflow, and Prefect (see the sketch after this listing).
- Develop Unix scripts to support various data operations.
- Model data to support business intelligence and analytics initiatives.
- Utilize infrastructure-as-code tools such as Terraform, Puppet, and Ansible for deployment automation.
- Expertise in GCP data warehousing technologies, including BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion, and Dataproc (good to have).

Qualifications:
- Bachelor's degree in Software Engineering, Computer Science, Business, Mathematics, or related field.
- 4+ years of data engineering experience.
- 2 years of data solution architecture and design experience.
- GCP Certified Data Engineer (preferred).
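To illustrate the workflow automation this listing names, here is a minimal Prefect flow in Python; the task bodies, retry count, and source path are hypothetical placeholders rather than a real pipeline.

from prefect import flow, task

@task(retries=2)
def extract(path: str) -> list[dict]:
    # Placeholder extraction; a real task would read from GCS or SFTP.
    return [{"id": 1, "amount": 42.0}]

@task
def transform(rows: list[dict]) -> list[dict]:
    # Placeholder transformation: drop non-positive amounts.
    return [r for r in rows if r["amount"] > 0]

@task
def load(rows: list[dict]) -> None:
    # Placeholder load; a real task would write to BigQuery or Cloud SQL.
    print(f"Loaded {len(rows)} rows")

@flow(name="daily-orders")
def daily_orders(path: str = "gs://my-bucket/orders/") -> None:
    load(transform(extract(path)))

if __name__ == "__main__":
    daily_orders()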

Posted 1 month ago

Apply

2.0 - 6.0 years

5 - 9 Lacs

Noida

Work from Office

About Us:
At TELUS Digital, we enable customer experience innovation through spirited teamwork, agile thinking, and a caring culture that puts customers first. TELUS Digital is the global arm of TELUS Corporation, one of the largest telecommunications service providers in Canada. We deliver contact center and business process outsourcing (BPO) solutions to some of the world's largest corporations in the consumer electronics, finance, telecommunications and utilities sectors. With global call center delivery capabilities, our multi-shore, multi-language programs offer safe, secure infrastructure, value-based pricing, skills-based resources and exceptional customer service, all backed by TELUS, our multi-billion-dollar telecommunications parent.

Required Skills:
- 5+ years of industry experience in data engineering, business intelligence, or a related field, with experience in manipulating, processing, and extracting value from datasets.
- Expertise in architecting, designing, building, and deploying internal applications to support technology life cycle management, service delivery management, data, and business intelligence.
- Experience in developing modular code for versatile pipelines or complex ingestion frameworks aimed at loading data into Cloud SQL and managing data migration from multiple on-premises sources (a small loading sketch follows this listing).
- Strong collaboration with analysts and business process owners to translate business requirements into technical solutions.
- Proficiency in coding with scripting languages (shell scripting, Python, SQL).
- Deep understanding and hands-on experience with Google Cloud Platform (GCP) technologies, especially in data migration and warehousing, including Database Migration Service (DMS), Cloud SQL, BigQuery, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage (GCS), IAM, Compute Engine, Cloud Data Fusion, and optionally Dataproc.
- Adherence to best development practices, including technical design, solution development, systems configuration, test documentation/execution, issue identification and resolution, and writing clean, modular, self-sustaining code.
- Familiarity with CI/CD processes using GitHub, Cloud Build, and the Google Cloud SDK.

Qualifications:
- Bachelor's degree in Computer Science or a related technical field, or equivalent practical experience.
- GCP Certified Data Engineer (preferred).
- Excellent verbal and written communication skills with the ability to effectively advocate technical solutions to research scientists, engineering teams, and business audiences.
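A minimal sketch of loading rows into Cloud SQL (PostgreSQL) from Python, one of the core tasks this listing describes, using SQLAlchemy over the Cloud SQL Python Connector. The instance name, credentials, database, and table are hypothetical.

import sqlalchemy
from google.cloud.sql.connector import Connector

connector = Connector()

def getconn():
    # Connect through the Cloud SQL Python Connector (pg8000 driver).
    return connector.connect(
        "my-project:us-central1:my-instance",  # hypothetical instance
        "pg8000",
        user="etl_user",        # hypothetical credentials; use Secret
        password="change-me",   # Manager or IAM auth in practice
        db="analytics",
    )

engine = sqlalchemy.create_engine("postgresql+pg8000://", creator=getconn)

with engine.begin() as conn:
    # Bulk insert two hypothetical rows into a pre-created table.
    conn.execute(
        sqlalchemy.text("INSERT INTO orders (id, amount) VALUES (:id, :amount)"),
        [{"id": 1, "amount": 42.0}, {"id": 2, "amount": 17.5}],
    )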

Posted 1 month ago

Apply

5.0 - 10.0 years

9 - 19 Lacs

Pune, Chennai, Bengaluru

Hybrid

Project Role: Cloud Platform Architect
Project Role Description: Oversee application architecture and deployment in cloud platform environments, including public cloud, private cloud and hybrid cloud. This can include cloud adoption plans, cloud application design, and cloud management and monitoring.
Must have skills: Google Cloud Platform Architecture

Summary: As a Cloud Platform Architect, you will be responsible for overseeing application architecture and deployment in cloud platform environments, including public cloud, private cloud, and hybrid cloud. Your typical day will involve designing cloud adoption plans, managing and monitoring cloud applications, and ensuring cloud application design meets business requirements.

Roles & Responsibilities:
- Design and implement cloud adoption plans, including public cloud, private cloud, and hybrid cloud environments.
- Oversee cloud application design, ensuring it meets business requirements and aligns with industry best practices.
- Manage and monitor cloud applications, ensuring they are secure, scalable, and highly available.
- Collaborate with cross-functional teams to ensure cloud applications are integrated with other systems and services.
- Stay up-to-date with the latest advancements in cloud technology, integrating innovative approaches for sustained competitive advantage.

Professional & Technical Skills:
- Must-Have Skills: Strong experience in Google Cloud Platform architecture.
- Good-to-Have Skills: Experience with other cloud platforms such as AWS or Azure.
- Experience in designing and implementing cloud adoption plans.
- Strong understanding of cloud application design and architecture.
- Experience in managing and monitoring cloud applications.
- Solid grasp of cloud security, scalability, and availability best practices.

Posted 1 month ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Hyderabad

Remote

Canterr is looking for talented and passionate professionals for exciting opportunities with a US-based MNC product company! You will be working permanently with Canterr and deployed to a top-tier global tech client.

Key Responsibilities:
- Design and develop data pipelines and ETL processes to ingest, process, and store large volumes of data.
- Implement and manage big data technologies such as Kafka, Dataflow, BigQuery, Cloud SQL, and Pub/Sub (see the sketch after this listing).
- Collaborate with stakeholders to understand data requirements and deliver high-quality data solutions.
- Monitor and troubleshoot data pipeline issues and implement solutions to prevent future occurrences.

Required Skills and Experience:
Generally, we use Google Cloud Platform (GCP) for all software deployed at Wayfair.
- Data storage and processing: BigQuery, Cloud SQL (PostgreSQL), Dataproc, Pub/Sub
- Data modeling: breaking business requirements (KPIs) down into data points; building a scalable data model
- ETL tools: DBT, SQL
- Data orchestration and ETL: Dataflow, Cloud Composer
- Infrastructure and deployment: Kubernetes, Helm
- Data access and management: Looker, Terraform

Ideal Business Domain Experience:
Supply chain or warehousing experience. The project is focused on building a normalized data layer which ingests information from multiple Warehouse Management Systems (WMS) and projects it for back-office analysis.
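As a small illustration of the Pub/Sub streaming pattern named above, the following Python sketch publishes a WMS event to a topic. The project id, topic name, and event payload are hypothetical.

import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "wms-events")  # hypothetical

def publish_event(event: dict) -> None:
    # Pub/Sub messages are bytes, so serialize the event as JSON first.
    data = json.dumps(event).encode("utf-8")
    future = publisher.publish(topic_path, data)
    future.result(timeout=30)  # block until the broker acknowledges

publish_event({"warehouse_id": "WH-01", "sku": "ABC-123", "qty": 5})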

Posted 1 month ago

Apply

6.0 - 8.0 years

8 - 14 Lacs

Chennai

Work from Office

Must have skills:
- 6+ years of experience required.
- Deep understanding of Google Cloud Platform tools and services.
- Solid knowledge of Terraform and IaC (for orchestration and configuration).
- Experience implementing and maintaining CI/CD solutions.
- Kubernetes management and deployments (a small tooling sketch follows this listing).
- Experience in Bash or Python.
- Strong communication skills.

Bonus:
- Working experience with Java, Kotlin, Airflow, Spark, Beam, or BigQuery.
- Deep understanding of Google Cloud Platform tools and services (>4 years).
- Solid knowledge of Terraform, IaC (for orchestration and configuration), and Kubernetes (>4 years).
- Experience implementing and maintaining CI/CD solutions (>4 years).

Responsibilities:
- Govern the infrastructure and resources in GCP (primarily) and AWS.
- Experiment fearlessly, take responsibility, and grow your expertise to the next level.
- Ensure that compliance and security standards are adhered to.
- Enable internal stakeholders to leverage GCP resources for their needs.
- Collaborate with the rest of the Data Platform team, in a cross-functional way, to deliver value and reach team goals.
- Drive projects and implementations within our team's areas of responsibility.
- On-call duties may be required.
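For a flavour of the Kubernetes tooling this role involves, here is a minimal Python sketch using the official Kubernetes client to report deployments with missing replicas. The namespace is hypothetical, and kubeconfig is assumed to be in its default location.

from kubernetes import client, config

def report_unhealthy(namespace: str = "data-platform") -> None:
    # Load credentials from ~/.kube/config; inside a cluster you would
    # use config.load_incluster_config() instead.
    config.load_kube_config()
    apps = client.AppsV1Api()
    for dep in apps.list_namespaced_deployment(namespace).items:
        desired = dep.spec.replicas or 0
        available = dep.status.available_replicas or 0
        if available < desired:
            print(f"{dep.metadata.name}: {available}/{desired} replicas available")

if __name__ == "__main__":
    report_unhealthy()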

Posted 1 month ago

Apply

5.0 - 10.0 years

35 - 40 Lacs

Pune

Work from Office

Job Title: Data Engineer (ETL, Big Data, Hadoop, Spark, GCP), AVP
Location: Pune, India

Role Description
The Engineer is responsible for developing and delivering elements of engineering solutions to accomplish business goals. Awareness of the bank's important engineering principles is expected. Root cause analysis skills develop through addressing enhancements and fixes to products; reliability and resiliency are built into solutions through early testing, peer reviews, and automation of the delivery life cycle. The successful candidate should be able to work independently on medium to large sized projects with strict deadlines, should be able to work in a cross-application, mixed technical environment, and must demonstrate a solid hands-on development track record while working in an agile methodology. The role demands working alongside a geographically dispersed team. The position is part of the buildout of the Compliance tech internal development team in India; the overall team will primarily deliver improvements in compliance tech capabilities that are major components of the regulatory portfolio, addressing various regulatory commitments and mandated monitors.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leaves
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those aged 35 and above

Your key responsibilities
- Analyze data sets and design and code stable, scalable data ingestion workflows, integrating them into existing workflows (a small ingestion sketch follows this listing).
- Work with team members and stakeholders to clarify requirements and provide the appropriate ETL solution.
- Work as a senior developer developing analytics algorithms on top of ingested data.
- Work as a senior developer for various data sourcing in Hadoop as well as GCP.
- Ensure new code is tested at both unit and system level; design, develop, and peer review new code and functionality.
- Operate as a member of an agile scrum team.
- Apply root cause analysis skills to identify bugs and issues for failures.
- Support production support and release management teams in their tasks.

Your skills and experience
- 6+ years of coding experience in reputed organizations.
- Hands-on experience with Bitbucket and CI/CD pipelines.
- Proficient in Hadoop, Python, Spark, SQL, Unix, and Hive.
- Basic understanding of on-prem and GCP data security.
- Hands-on development experience on large ETL/big data systems (GCP a big plus).
- Hands-on experience with Cloud Build, Artifact Registry, Cloud DNS, Cloud Load Balancing, etc.
- Hands-on experience with Dataflow, Cloud Composer, Cloud Storage, Dataproc, etc.
- Basic understanding of data quality dimensions such as consistency, completeness, accuracy, and lineage.
- Hands-on business and systems knowledge gained in a regulatory delivery environment.
- Banking experience with regulatory and cross-product knowledge.
- Passionate about test-driven development.
- Prior experience with release management tasks and responsibilities.
- Data visualization experience in Tableau is good to have.

How we'll support you
- Training and development to help you excel in your career.
- Coaching and support from experts in your team.
- A culture of continuous learning to aid progression.
- A range of flexible benefits that you can tailor to suit your needs.
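As an illustrative sketch of the Spark/GCP ingestion work this role describes, the following PySpark job (as might run on Dataproc) reads raw CSV, applies a basic data-quality filter, and writes partitioned Parquet. All paths and column names are hypothetical.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("trade_ingest").getOrCreate()

# Read raw CSV from a hypothetical landing bucket.
raw = (
    spark.read.option("header", True)
    .csv("gs://my-bucket/raw/trades/")
)

# Basic data-quality gate: valid dates and positive notionals only.
clean = (
    raw.withColumn("trade_date", F.to_date("trade_date"))
    .filter(F.col("notional").cast("double") > 0)
)

# Write curated, date-partitioned Parquet for downstream analytics.
(
    clean.write.mode("overwrite")
    .partitionBy("trade_date")
    .parquet("gs://my-bucket/curated/trades/")
)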

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
