
55 Cloud Orchestration Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

3.0 - 5.0 years

7 - 11 Lacs

Pune, Bengaluru

Work from Office

About the Team: Come join our growing team! The Distributed Cloud Business Unit at F5 is looking for a Software Engineer-II with experience in designing and developing distributed solutions. On our team, you will participate in the design and development of the data path modules of our distributed cloud offering, delivering secure, fast and reliable solutions to anyone, anywhere, at any time. You will make a meaningful impact by collaborating with Architects, SRE and application development teams to vet and validate test automation for our edge computing platform, which is used to deploy global, scalable and secure applications!

Position Summary: The Distributed Cloud Team is looking for a technically strong engineer who can work on our data path solutions across the development lifecycle of a multi-cloud distributed platform. Experience with config, control and data path, L7, cloud services, orchestration and security is highly desirable.

Professional Experience: Bachelor's and/or Master's degree in Computer Science/Engineering; 3 to 5 years of experience in software design and development.

Knowledge and Skills: Experience in designing and developing distributed software. Good understanding of computer networking (routing/switching) concepts, network security, load balancers, and proxies like Nginx; experience working on any L7 products. Extensive experience with programming languages like Golang and Python. Good understanding of virtualization technologies like KVM and Docker. Working knowledge of cloud orchestration systems such as OpenStack/Kubernetes. Experience working on well-known clouds like AWS/Azure/GCP would be a plus. Excellent written and verbal communication skills. Strong interpersonal, team building, and mentoring skills.

Responsibilities: New feature design and development; writing unit tests to cover the feature; feature ownership and assisting the support team on customer issues; analysis and debugging of reported issues; proactively identifying and resolving key technical issues; communicating and collaborating efficiently within F5.
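For context on the L7/data-path focus described above, here is a minimal, purely illustrative sketch of what an L7 reverse proxy does, written in Python for brevity (the role itself emphasizes Golang); the upstream address and listening port are hypothetical placeholders, not details from the posting.

```python
# Illustrative only: a tiny HTTP (L7) reverse proxy sketch.
# Assumptions: UPSTREAM is a hypothetical backend; error handling is omitted.
import http.server
import urllib.request

UPSTREAM = "http://localhost:8080"  # hypothetical upstream service

class ProxyHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # Forward the request path to the upstream and relay its response.
        with urllib.request.urlopen(UPSTREAM + self.path) as resp:
            body = resp.read()
            self.send_response(resp.status)
            for key, value in resp.getheaders():
                if key.lower() not in ("transfer-encoding", "connection"):
                    self.send_header(key, value)
            self.end_headers()
            self.wfile.write(body)

if __name__ == "__main__":
    # Listens on :9000 and relays GET requests to UPSTREAM.
    http.server.ThreadingHTTPServer(("", 9000), ProxyHandler).serve_forever()
```

Production data-path proxies add connection pooling, TLS termination, load balancing and health checks, which is the kind of work the posting describes.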

Posted 15 hours ago

Apply

2.0 - 5.0 years

7 - 11 Lacs

Bengaluru

Work from Office

About the Team: Come join our growing team! The Distributed Cloud Business Unit at F5 is looking for a Software Engineer-II with experience in designing and developing distributed solutions. On our team, you will participate in the design and development of the data path modules of our distributed cloud offering, delivering secure, fast and reliable solutions to anyone, anywhere, at any time. You will make a meaningful impact by collaborating with Architects, SRE and application development teams to vet and validate test automation for our edge computing platform, which is used to deploy global, scalable and secure applications!

Position Summary: The Distributed Cloud Team is looking for a technically strong engineer who can work on our data path solutions across the development lifecycle of a multi-cloud distributed platform. Experience with config, control and data path, L7, cloud services, orchestration and security is highly desirable.

Professional Experience: Bachelor's and/or Master's degree in Computer Science/Engineering; 3 to 5 years of experience in software design and development.

Knowledge and Skills: Experience in designing and developing distributed software. Good understanding of computer networking (routing/switching) concepts, network security, load balancers, and proxies like Nginx; experience working on any L7 products. Extensive experience with programming languages like Golang and Python. Good understanding of virtualization technologies like KVM and Docker. Working knowledge of cloud orchestration systems such as OpenStack/Kubernetes. Experience working on well-known clouds like AWS/Azure/GCP would be a plus. Excellent written and verbal communication skills. Strong interpersonal, team building, and mentoring skills.

Responsibilities: New feature design and development; writing unit tests to cover the feature; feature ownership and assisting the support team on customer issues; analysis and debugging of reported issues; proactively identifying and resolving key technical issues; communicating and collaborating efficiently within F5.

Posted 15 hours ago

Apply

2.0 - 7.0 years

4 - 9 Lacs

Coimbatore

Work from Office

Project Role: AI/ML Engineer. Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Should be able to apply GenAI models as part of the solution; work may also include (but is not limited to) deep learning, neural networks, chatbots, and image processing. Must have skills: Google Cloud Machine Learning Services. Good to have skills: GCP Dataflow, Google Pub/Sub, Google Dataproc. Minimum 2 year(s) of experience is required. Educational Qualification: 15 years full time education.

Summary: We are seeking a skilled GCP Data Engineer to join our dynamic team. The ideal candidate will design, build, and maintain scalable data pipelines and solutions on Google Cloud Platform (GCP). This role requires expertise in cloud-based data engineering and hands-on experience with GCP tools and services, ensuring efficient data integration, transformation, and storage for various business use cases.

Roles & Responsibilities: Design, develop, and deploy data pipelines using GCP services such as Dataflow, BigQuery, Pub/Sub, and Cloud Storage. Optimize and monitor data workflows for performance, scalability, and reliability. Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and implement solutions. Implement data security and governance measures, ensuring compliance with industry standards. Automate data workflows and processes for operational efficiency. Troubleshoot and resolve technical issues related to data pipelines and platforms. Document technical designs, processes, and best practices to ensure maintainability and knowledge sharing.

Professional & Technical Skills: a) Must Have: Proficiency in GCP tools such as BigQuery, Dataflow, Pub/Sub, Cloud Composer, and Cloud Storage. Expertise in SQL and experience with data modeling and query optimization. Solid programming skills in Python for data processing and ETL development. Experience with CI/CD pipelines and version control systems (e.g., Git). Knowledge of data warehousing concepts, ELT/ETL processes, and real-time streaming. Strong understanding of data security, encryption, and IAM policies on GCP. b) Good to Have: Experience with Dialogflow or CCAI tools. Knowledge of machine learning pipelines and integration with AI/ML services on GCP. Certifications such as Google Professional Data Engineer or Google Cloud Architect.

Additional Information: The candidate should have a minimum of 3 years of experience in Google Cloud Machine Learning Services and an overall experience of 3-5 years. The ideal candidate will possess a strong educational background in computer science, mathematics, or a related field, along with a proven track record of delivering impactful data-driven solutions. Qualifications: 15 years full time education.
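As a rough illustration of the kind of GCP pipeline this posting describes (Pub/Sub into BigQuery via Dataflow), here is a minimal Apache Beam sketch; the project, subscription, table and schema names are placeholders, not details from the posting.

```python
# Illustrative sketch: streaming Pub/Sub -> BigQuery pipeline with Apache Beam.
# Assumptions: placeholder project/subscription/table names; run on Dataflow by
# passing the usual --runner=DataflowRunner options.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                schema="user_id:STRING,event:STRING,ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```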

Posted 1 week ago

Apply

15.0 - 19.0 years

0 Lacs

Karnataka

On-site

At Aviatrix, we are at the forefront of transforming cloud networking. Our mission is to simplify the complexities of cloud networking through innovative and reliable technology solutions. We are looking for a Director of Software Engineering to build and lead our India software engineering team, playing a crucial role in our technical direction and the development of groundbreaking solutions in cloud networking. Work Model: In-Office Preferred (3 days in office).

Role and Responsibilities: As the Director of Software Engineering, you will: - Lead Technically and Strategically: Serve as a technical leader, setting the strategic direction for the engineering team and ensuring alignment with our global objectives. - Team Recruitment and Expansion: Oversee the recruitment process, building an initial team of 20-30 members with significant room for future growth. - Direct Performance Management: Implement performance management systems that foster high performance and continuous improvement in alignment with business goals. - Project Oversight: Ensure successful execution of projects, collaborating closely with cross-functional teams like product management and customer support to maintain exceptional service and customer satisfaction. - Collaboration and Communication: Maintain effective communication across all team levels, promoting a culture of collaboration and shared success.

Qualifications and Experience - A track record of leading software engineering teams, with at least 15 years in software development and 8 years in a senior management role. - Deep expertise in Cloud Service Provider environments and large-scale distributed systems. - A master's degree in Computer Science, Engineering, or a related discipline is preferred. - Expertise in cloud orchestration, observability, metrics, UI/UX, and AI/ML areas is required.

Skills and Attributes - Proven leadership skills with a capability to motivate and drive tech teams toward innovation. - Excellent problem-solving skills, with an analytical mindset apt for complex technical challenges. - Outstanding communication skills, proficient in articulating technical details to varied audiences. - Customer-centric approach, focusing on delivering exceptional customer value and service.

Reporting Structure: Reports directly to the Vice President of Engineering, ensuring strategic alignment and effective execution of projects.

Company Culture - We value collaboration, innovation, and a relentless focus on excellence. - We encourage a culture of continuous learning and adaptability to new technologies and methodologies. - Interested candidates are invited to apply by submitting a detailed resume and a cover letter outlining their relevant experience and professional motivations.

BENEFITS. US: We cover 100% of employee premiums and 88% of dependent(s) premiums for medical, dental and vision coverage, 401(k) match, short and long-term disability, life/AD&D insurance, $1,000/year education reimbursement, and a flexible vacation policy. Outside the US: We offer a comprehensive benefits package which (subject to regional variations) could include pension, private medical for you and dependents, generous holiday allowance, life assurance, long-term disability, and an annual wellbeing stipend. Your total compensation package will be based on job-related knowledge, education, certifications and location, per our aligned ranges.

About Aviatrix: Aviatrix is the cloud networking expert. We're on a mission to make cloud networking simple so companies stay agile. Trusted by more than 500 of the world's leading enterprises, our cloud networking platform creates the visibility, security, and control needed to adapt with ease and move ahead at speed. Combined with the Aviatrix Certified Engineer (ACE) Program, the industry's leading multicloud networking and security certification, Aviatrix empowers the cloud networking community to stay at the forefront of digital transformation.

WE WANT TO INCLUDE YOU: We embrace the fact that not everyone's journey took the same route or started at the same place. If your experience doesn't quite meet the requirements but the opportunity excites you and you believe you could be great, don't let that hold you back from applying. Tell us what you CAN bring and what makes you special. Aviatrix is a community where everyone's career can grow, and we want to help you achieve your goals and be your best YOU, however that looks. If you're seeking an opportunity where you can be excited to start work every morning with enthusiastic people, make a real difference and be part of something amazing, then let's talk. We want to get to know you and how we could grow together.

Posted 1 week ago

Apply

5.0 - 10.0 years

15 - 19 Lacs

Noida

Remote

Mandatory skills: Advanced Python & SQL; GCP services: BigQuery, Dataflow, Dataproc and Pub/Sub.

Key Responsibilities: Design, develop and optimize scalable data pipelines and ETL workflows using Google Cloud Platform (GCP), particularly leveraging BigQuery, Dataflow, Dataproc and Pub/Sub. Design and manage secure, efficient data integrations involving Snowflake and BigQuery. Write, test and maintain high-quality Python code for data extraction, transformation and loading (ETL), analytics and automation tasks. Use Git for collaborative version control, code reviews and managing data engineering projects. Implement infrastructure-as-code practices using Pulumi for cloud resource management and automation within GCP environments. Apply clean room techniques to design and maintain secure data sharing environments in alignment with privacy standards and client requirements. Collaborate with cross-functional teams (data scientists, business analysts, product teams) to deliver data solutions, troubleshoot issues and assure data integrity throughout the lifecycle. Optimize performance of batch and streaming data pipelines, ensuring reliability and scalability. Maintain documentation on processes, data flows and configurations for operational transparency.

Required Skills: Strong hands-on experience of 5+ years with GCP core data services: BigQuery, Dataflow, Dataproc and Pub/Sub. Proficiency in data engineering development using Python. Deep familiarity with Snowflake data modeling, secure data sharing and advanced query optimization. Proven experience with Git for source code management and collaborative development. Demonstrated ability using Pulumi (or similar IaC tools) for deployment and support of cloud infrastructure. Practical understanding of clean room concepts in cloud data warehousing, including privacy/compliance considerations. Solid skills in debugging complex issues within data pipelines and cloud environments. Effective communication and documentation skills.

Great to Have: GCP certification (e.g., Professional Data Engineer). Experience working in regulated environments (telecom/financial/healthcare) with a data privacy and compliance focus. Exposure to additional GCP services such as Cloud Storage, Cloud Functions or Kubernetes. Demonstrated success collaborating in agile, distributed teams. Experience with data visualization tools (e.g., Tableau, Looker) is nice to have.
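Since the posting pairs Python data engineering with Pulumi-based infrastructure-as-code on GCP, here is a minimal illustrative Pulumi sketch of that idea; the resource names are placeholders, and the pulumi/pulumi_gcp packages are assumed to be configured with GCP credentials.

```python
# Illustrative Pulumi program: declare the GCP resources a pipeline would use.
# Assumptions: placeholder names; credentials and stack config handled by the
# `pulumi` CLI as usual.
import pulumi
import pulumi_gcp as gcp

# BigQuery dataset that downstream ETL jobs would write into.
dataset = gcp.bigquery.Dataset(
    "analytics_dataset",
    dataset_id="analytics",
    location="US",
)

# Pub/Sub topic feeding the streaming ingestion path.
events_topic = gcp.pubsub.Topic("events-topic")

pulumi.export("dataset_id", dataset.dataset_id)
pulumi.export("topic_name", events_topic.name)
```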

Posted 1 week ago

Apply

3.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

About The Role. Project Role: Cloud Services Engineer. Project Role Description: Act as liaison between the client and Accenture operations teams for support and escalations. Communicate service delivery health to all stakeholders and explain any performance issues or risks. Ensure Cloud orchestration and automation capability is operating based on target SLAs with minimal downtime. Hold performance meetings to share performance and consumption data and trends. Must have skills: SUSE Linux Administration. Good to have skills: NA. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years full time education.

Summary: As a Cloud Services Engineer, you will act as a liaison between the client and Accenture operations teams for support and escalations. You will communicate service delivery health to all stakeholders, explain any performance issues or risks, and ensure Cloud orchestration and automation capability is operating based on target SLAs with minimal downtime. Hold performance meetings to share performance and consumption data and trends.

Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute to providing solutions to work-related problems. - Proactively identify and address potential issues in Cloud services. - Collaborate with cross-functional teams to optimize Cloud orchestration processes. - Develop and implement strategies to enhance Cloud automation capabilities. - Analyze performance data to identify trends and areas for improvement. - Provide technical guidance and support to junior team members.

Professional & Technical Skills: - Must Have Skills: Proficiency in SUSE Linux Administration. - Strong understanding of Cloud orchestration and automation. - Experience in managing and troubleshooting Cloud services. - Knowledge of scripting languages for automation tasks. - Hands-on experience with monitoring and alerting tools. - Good To Have Skills: Experience with DevOps practices.

Additional Information: - The candidate should have a minimum of 3 years of experience in SUSE Linux Administration. - This position is based at our Bengaluru office. - A 15 years full time education is required. Qualification: 15 years full time education.
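As a rough illustration of the "scripting languages for automation tasks" and "monitoring and alerting" items above, here is a minimal Python check of the sort an operations engineer might schedule on a Linux host; the mount points and threshold are hypothetical, not taken from the posting.

```python
# Illustrative sketch: a tiny disk-usage check for a Linux host.
# Assumptions: placeholder mount points and threshold; a scheduler (cron,
# systemd timer, or a monitoring agent) would act on the non-zero exit code.
import shutil
import sys

THRESHOLD_PCT = 90                 # alert when usage exceeds this percentage
MOUNTS = ["/", "/var", "/home"]    # hypothetical mount points to watch


def check_disk(mount: str) -> bool:
    usage = shutil.disk_usage(mount)
    pct = usage.used / usage.total * 100
    print(f"{mount}: {pct:.1f}% used")
    return pct < THRESHOLD_PCT


if __name__ == "__main__":
    results = [check_disk(m) for m in MOUNTS]
    sys.exit(0 if all(results) else 1)
```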

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a Cloud Services Engineer, you will act as a liaison between the client and Accenture operations teams for support and escalations. You will communicate service delivery health to all stakeholders, explain any performance issues or risks, and ensure Cloud orchestration and automation capability is operating based on target SLAs with minimal downtime. Hold performance meetings to share performance and consumption data and trends. Expected to perform independently and become an SME. Required active participation/contribution in team discussions. Contribute to providing solutions to work-related problems. Proactively identify and address potential issues in Cloud services. Collaborate with cross-functional teams to optimize Cloud orchestration processes. Develop and implement strategies to enhance Cloud automation capabilities. Analyze performance data to identify trends and areas for improvement. Provide technical guidance and support to junior team members. Professional & Technical Skills: - Must Have Skills: Proficiency in SUSE Linux Administration. - Strong understanding of Cloud orchestration and automation. - Experience in managing and troubleshooting Cloud services. - Knowledge of scripting languages for automation tasks. - Hands-on experience with monitoring and alerting tools. - Good To Have Skills: Experience with DevOps practices. Additional Information: - The candidate should have a minimum of 3 years of experience in SUSE Linux Administration. - This position is based at our Bengaluru office. - A 15 years full-time education is required.

Posted 2 weeks ago

Apply

7.0 - 12.0 years

11 - 15 Lacs

Noida

Work from Office

Primary Skill(s): Lead Data Visualization Engineer with experience in Sigma BI. Experience: 7+ years of experience in Data Visualization with experience in Sigma BI, Power BI, Tableau or Looker.

Job Summary: Lead Data Visualization Engineer with deep expertise in Sigma BI and a strong ability to craft meaningful, insight-rich visual stories for business stakeholders. This role will be instrumental in transforming raw data into intuitive dashboards and visual analytics, helping cross-functional teams make informed decisions quickly and effectively.

Key Responsibilities: Lead the design, development, and deployment of Sigma BI dashboards and reports tailored for various business functions. Translate complex data sets into clear, actionable insights using advanced visualization techniques. Collaborate with business stakeholders to understand goals, KPIs, and data requirements. Build data stories that communicate key business metrics, trends, and anomalies. Serve as a subject matter expert in Sigma BI and guide junior team members on best practices. Ensure visualizations follow design standards, accessibility guidelines, and performance optimization. Partner with data engineering and analytics teams to source and structure data effectively. Conduct workshops and training sessions to enable business users to consume and interact with dashboards. Drive the adoption of self-service BI tools and foster a data-driven decision-making culture.

Required Skills & Experience: 7+ years of hands-on experience in Business Intelligence, with at least 2 years using Sigma BI. Proven ability to build end-to-end dashboard solutions that tell a story and influence decisions. Strong understanding of data modeling, SQL, and cloud data platforms (Snowflake, BigQuery, etc.). Demonstrated experience working with business users, gathering requirements, and delivering user-friendly outputs. Proficient in data storytelling, UX design principles, and visualization best practices. Experience integrating Sigma BI with modern data stacks and APIs is a plus. Excellent communication and stakeholder management skills.

Preferred Qualifications: Experience with other BI tools (such as Tableau, Power BI, Looker) is a plus. Familiarity with AWS cloud data ecosystems (AWS Databricks). Background in Data Analysis, Statistics, or Business Analytics.

Working Hours: 2 PM - 11 PM IST [~4:30 AM - 1:30 PM ET]. Communication skills: Good.

Mandatory Competencies: BI and Reporting Tools - Power BI; BI and Reporting Tools - Tableau; Database - Database Programming - SQL; Cloud - GCP - Cloud Data Fusion, Dataproc, BigQuery, Cloud Composer, Cloud Bigtable; Data Science and Machine Learning - Databricks; Cloud - AWS - ECS; DMS - Data Analysis Skills; Beh - Communication and collaboration; BI and Reporting Tools - Sigma BI.

Posted 3 weeks ago

Apply

8.0 - 13.0 years

7 - 12 Lacs

Pune

Work from Office

Job Title: Business Functional Analyst. Corporate Title: Associate. Location: Pune, India.

Role Description: Business Functional Analysis is responsible for business solution design in complex project environments (e.g. transformational programmes). Work includes: identifying the full range of business requirements and translating requirements into specific functional specifications for solution development and implementation; analysing business requirements and the associated impacts of the changes; designing and assisting businesses in developing optimal target state business processes; creating and executing against roadmaps that focus on solution development and implementation; answering questions of methodological approach with varying levels of complexity; aligning with other key stakeholder groups (such as Project Management & Software Engineering) to support the link between the business divisions and the solution providers for all aspects of identifying, implementing and maintaining solutions.

What we'll offer you: 100% reimbursement under childcare assistance benefit (gender neutral); sponsorship for industry-relevant certifications and education; accident and term life insurance.

Your key responsibilities: Write clear and well-structured business requirements/documents. Convert roadmap features into smaller user stories. Analyse process issues and bottlenecks and make improvements. Communicate and validate requirements with relevant stakeholders. Perform data discovery, analysis, and modelling. Assist with project management for selected projects. Understand and translate business needs into data models supporting long-term solutions. Understand existing SQL/Python code and convert it into business requirements. Write advanced SQL and Python scripts.

Your skills and experience: A minimum of 8+ years of experience in business analysis or a related field. Exceptional analytical and conceptual thinking skills. Proficient in SQL. Proficient in Python for data engineering. Experience in automating ETL testing using Python and SQL. Exposure to GCP services for cloud storage, data lake, database and data warehouse needs, such as BigQuery, GCS, Dataflow, Cloud Composer, gsutil, shell scripting, etc. Previous experience in Procurement and Real Estate would be a plus. Competency in JIRA, Confluence, draw.io and Microsoft applications including Word, Excel, PowerPoint and Outlook. Previous banking domain experience is a plus. Good problem-solving skills.

How we'll support you.
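To illustrate the "automating ETL testing using Python and SQL" requirement above, here is a minimal reconciliation-style test sketch; the table names are placeholders, and the google-cloud-bigquery client stands in for whatever warehouse the actual pipelines use.

```python
# Illustrative sketch: a pytest-style ETL reconciliation check.
# Assumptions: placeholder table names; google-cloud-bigquery with application
# default credentials; real suites would also cover schema, nulls, and business rules.
from google.cloud import bigquery

client = bigquery.Client()


def row_count(table: str) -> int:
    query = f"SELECT COUNT(*) AS n FROM `{table}`"
    return list(client.query(query).result())[0]["n"]


def test_row_counts_match():
    # After the load, the curated table should agree with the staging source.
    source = row_count("my-project.staging.purchases")
    target = row_count("my-project.curated.purchases")
    assert source == target, f"row count mismatch: {source} vs {target}"
```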

Posted 3 weeks ago

Apply

12.0 - 15.0 years

0 - 20 Lacs

Noida

Work from Office

Roles and Responsibilities: Design, develop, test, deploy, and maintain large-scale data pipelines using GCP Dataflow. Collaborate with cross-functional teams to gather requirements and design solutions for complex data processing needs. Develop automated testing frameworks to ensure high-quality delivery of data products. Troubleshoot issues related to pipeline failures or errors in a timely manner.

Job Requirements: 12-15 years of experience in software development with expertise in data engineering on Google Cloud Platform (GCP). Strong understanding of GCP data and storage services such as BigQuery, Cloud Storage buckets, etc. Experience with cloud orchestration tools like Kubernetes Engine (GKE) or Cloud Run.
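As a small illustration of the "automated testing frameworks" responsibility in a Dataflow context, here is a minimal Apache Beam unit-test sketch; the transform under test is a deliberately trivial placeholder, not anything from the posting.

```python
# Illustrative sketch: unit-testing a Beam transform with TestPipeline.
# Assumptions: the to_upper transform is a stand-in for real pipeline logic.
import apache_beam as beam
from apache_beam.testing.test_pipeline import TestPipeline
from apache_beam.testing.util import assert_that, equal_to


def to_upper(record: str) -> str:
    return record.upper()


def test_to_upper_transform():
    with TestPipeline() as p:
        output = (
            p
            | beam.Create(["alpha", "beta"])
            | beam.Map(to_upper)
        )
        assert_that(output, equal_to(["ALPHA", "BETA"]))
```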

Posted 3 weeks ago

Apply

3.0 - 5.0 years

5 - 8 Lacs

Chennai

Work from Office

" PLEASE READ THE JOB DESTCRIPTION AND APPLY" Data Engineer Job Description Position Overview Yesterday is history, tomorrow is a mystery, but today is a gift. That's why we call it the present. - Master Oogway Join CustomerLabs' dynamic data team as a Data Engineer and play a pivotal role in transforming raw marketing data into actionable insights that power our digital marketing platform. As a key member of our data infrastructure team, you will design, develop, and maintain robust data pipelines, data warehouses, and analytics platforms that serve as the backbone of our digital marketing product development. Sometimes the hardest choices require the strongest wills. - Thanos (but we promise, our data decisions are much easier! ) In this role, you will collaborate with cross-functional teams including Data Scientists, Product Managers, and Marketing Technology specialists to ensure seamless data flow from various marketing channels, ad platforms, and customer touchpoints to our analytics dashboards and reporting systems. You'll be responsible for building scalable, reliable, and efficient data solutions that can handle high-volume marketing data processing and real-time campaign analytics. What You'll Do: - Design and implement enterprise-grade data pipelines for marketing data ingestion and processing - Build and optimize data warehouses and data lakes to support digital marketing analytics - Ensure data quality, security, and compliance across all marketing data systems - Create data models and schemas that support marketing attribution, customer journey analysis, and campaign performance tracking - Develop monitoring and alerting systems to maintain data pipeline reliability for critical marketing operations - Collaborate with product teams to understand digital marketing requirements and translate them into technical solutions Why This Role Matters: I can do this all day. - Captain America (and you'll want to, because this role is that rewarding!) You'll be the backbone behind the data infrastructure that powers CustomerLabs' digital marketing platform, making marketers' lives easier and better. Your work directly translates to smarter automation, clearer insights, and more successful campaigns - helping marketers focus on what they do best while we handle the complex data heavy lifting. Sometimes you gotta run before you can walk. - Iron Man (and sometimes you gotta build the data pipeline before you can analyze the data! ) Our Philosophy: We believe in the power of data to transform lives, just like the Dragon Warrior transformed the Valley of Peace. Every line of code you write, every pipeline you build, and every insight you enable has the potential to change how marketers work and succeed. We're not just building data systems - we're building the future of digital marketing, one insight at a time. Your story may not have such a happy beginning, but that doesn't make you who you are. It is the rest of your story, who you choose to be. - Soothsayer What Makes You Special: We're looking for someone who embodies the spirit of both Captain America's unwavering dedication and Iron Man's innovative genius. You'll need the patience to build robust systems (like Cap's shield ) and the creativity to solve complex problems (like Tony's suit). Most importantly, you'll have the heart to make a real difference in marketers' lives. Inner peace... Inner peace... Inner peace... - Po (because we know data engineering can be challenging, but we've got your back! 
) Key Responsibilities Data Pipeline Development - Design, build, and maintain robust, scalable data pipelines and ETL/ELT processes - Develop data ingestion frameworks to collect data from various sources (databases, APIs, files, streaming sources) - Implement data transformation and cleaning processes to ensure data quality and consistency - Optimize data pipeline performance and reliability Data Infrastructure Management - Design and implement data warehouse architectures - Manage and optimize database systems (SQL and NoSQL) - Implement data lake solutions and data governance frameworks - Ensure data security, privacy, and compliance with regulatory requirements Data Modeling and Architecture - Design and implement data models for analytics and reporting - Create and maintain data dictionaries and documentation - Develop data schemas and database structures - Implement data versioning and lineage tracking Data Quality, Security, and Compliance - Ensure data quality, integrity, and consistency across all marketing data systems - Implement and monitor data security measures to protect sensitive information - Ensure privacy and compliance with regulatory requirements (e.g., GDPR, CCPA) - Develop and enforce data governance policies and best practices Collaboration and Support - Work closely with Data Scientists, Analysts, and Business stakeholders - Provide technical support for data-related issues and queries Monitoring and Maintenance - Implement monitoring and alerting systems for data pipelines - Perform regular maintenance and optimization of data systems - Troubleshoot and resolve data pipeline issues - Conduct performance tuning and capacity planning Required Qualifications Experience - 2+ years of experience in data engineering or related roles - Proven experience with ETL/ELT pipeline development - Experience with cloud data platform (GCP) - Experience with big data technologies Technical Skills - Programming Languages : Python, SQL, Golang (preferred) - Databases: PostgreSQL, MySQL, Redis - Big Data Tools: Apache Spark, Apache Kafka, Apache Airflow, DBT, Dataform - Cloud Platforms: GCP (BigQuery, Dataflow, Cloud run, Cloud SQL, Cloud Storage, Pub/Sub, App Engine, Compute Engine etc.) - Data Warehousing: Google BigQuery - Data Visualization: Superset, Looker, Metabase, Tableau - Version Control: Git, GitHub - Containerization: Docker Soft Skills - Strong problem-solving and analytical thinking - Excellent communication and collaboration skills - Ability to work independently and in team environments - Strong attention to detail and data quality - Continuous learning mindset Preferred Qualifications Additional Experience - Experience with real-time data processing and streaming - Knowledge of machine learning pipelines and MLOps - Experience with data governance and data catalog tools - Familiarity with business intelligence tools (Tableau, Power BI, Looker, etc.) - Experience using AI-powered tools (such as Cursor, Claude, Copilot, ChatGPT, Gemini, etc.) to accelerate coding, automate tasks, or assist in system design ( We belive run with machine, not against machine ) Interview Process 1. Initial Screening: Phone/video call with HR 2. Technical Interview: Deep dive into data engineering concepts 3. Final Interview: Discussion with senior leadership Note: This job description is intended to provide a general overview of the position and may be modified based on organizational needs and candidate qualifications. Our Team Culture We are Groot. 
- We work together, we grow together, we succeed together. We believe in: - Innovation First - Like Iron Man, we're always pushing the boundaries of what's possible - Team Over Individual - Like the Avengers, we're stronger together than apart - Continuous Learning - Like Po learning Kung Fu, we're always evolving and improving - Making a Difference - Like Captain America, we fight for what's right (in this case, better marketing!) Growth Journey There is no charge for awesomeness... or attractiveness. - Po Your journey with us will be like Po's transformation from noodle maker to Dragon Warrior: - Level 1 : Master the basics of our data infrastructure - Level 2: Build and optimize data pipelines - Level 3 : Lead complex data projects and mentor others - Level 4: Become a data engineering legend (with your own theme music! ) What We Promise I am Iron Man. - We promise you'll feel like a superhero every day! - Work that matters - Every pipeline you build helps real marketers succeed - Growth opportunities - Learn new technologies and advance your career - Supportive team - We've got your back, just like the Avengers - Work-life balance - Because even superheroes need rest! Apply : https://customerlabs.freshteam.com/jobs
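Given the stack listed above (Airflow, BigQuery, marketing data ingestion), here is a minimal, purely illustrative Airflow DAG sketch; the DAG id, schedule and callables are placeholders rather than anything from the posting.

```python
# Illustrative sketch: a daily marketing-data ingestion DAG (Airflow 2.x style).
# Assumptions: placeholder task logic; real tasks would call ad-platform APIs
# and load the results into BigQuery.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_ad_spend(**context):
    # Placeholder: pull the previous day's ad-platform spend via its API.
    print("extracting ad spend for", context["ds"])


def load_to_bigquery(**context):
    # Placeholder: load the extracted file into a BigQuery partition.
    print("loading partition", context["ds"])


with DAG(
    dag_id="marketing_ad_spend_daily",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_ad_spend", python_callable=extract_ad_spend)
    load = PythonOperator(task_id="load_to_bigquery", python_callable=load_to_bigquery)
    extract >> load
```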

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Thiruvananthapuram, Kerala

On-site

Armada is an edge computing startup that provides computing infrastructure to remote areas where connectivity and cloud infrastructure are limited, as well as areas where data needs to be processed locally for real-time analytics and AI at the edge. We are looking to bring on the most brilliant minds to help further our mission of bridging the digital divide with advanced technology infrastructure that can be rapidly deployed anywhere. We are seeking a highly skilled and motivated Software Engineer to drive the design and implementation of our on-premises Compute as a Service (CaaS) and GPU as a Service (GPUaaS) offerings. In this role, you will be pivotal in building a robust and scalable infrastructure platform, enabling our engineering teams to efficiently deploy and manage applications. This position offers a fantastic opportunity to grow your career and work with cutting-edge technologies in cloud infrastructure, automation, and AI. You will be part of a collaborative team, tackling real-world challenges in a fast-paced environment and seeing your code have a direct impact on our products. This role is office-based at our Trivandrum, Kerala office.

Key Responsibilities: - Build & Automate: Write, test, and maintain code for our edge cloud platform, focusing on automating the provisioning and management of GPU and CPU resources. - Develop Core Services: Contribute to the backend services and APIs that power our compute offerings, helping to implement new features and improve existing ones. - Troubleshoot & Support: Assist in diagnosing and resolving issues across our distributed infrastructure, learning how to ensure reliability and performance for our customers. - Collaborate & Learn: Work closely with senior engineers and architects, participating in code reviews and design discussions to grow your skills in cloud-native technologies. - Implement Tooling: Help develop the tools and scripts that our engineering teams use to deploy, monitor, and manage our edge datacenters.

Required Qualifications: - Programming Foundation: Solid programming skills in Golang. - Linux and Container Technology Experience: Comfortable working in a Linux environment, with a good grasp of containerization technologies like Docker and Kubernetes. - Virtualization & IaaS Background: Good knowledge of virtualization technologies (KVM preferred) and cloud orchestration. - Eagerness to Learn: Strong desire to learn about cloud computing, distributed systems, and automation. Curiosity is valued. - Problem-Solving Skills: Enjoy breaking down problems, thinking logically, and collaborating with a team to find solutions. - Education: A bachelor's degree in Computer Science, a related technical field, or equivalent practical experience (typically 3-6 years of experience for this level).

Preferred Qualifications: - Familiarity with container technologies like Docker. - Exposure to cloud platforms (AWS, GCP, Azure) or virtualization technologies. - Experience in Kubernetes or other orchestration systems. - Experience with Infrastructure as Code (IaC) tools like Terraform or Ansible. - Exposure to the AI/ML space and the role GPUs play.

Compensation & Benefits: For India-based candidates: We offer a competitive base salary along with equity options, providing an opportunity to share in the success and growth of Armada.

You're a Great Fit if You're: - A go-getter with a growth mindset, intellectually curious, have strong business acumen, and actively seek opportunities to build relevant skills and knowledge. - A detail-oriented problem-solver who can independently gather information, solve problems efficiently, and deliver results with a "get-it-done" attitude. - Someone who thrives in a fast-paced environment, energized by an entrepreneurial spirit, capable of working quickly, and excited to contribute to a growing company. - A collaborative team player who focuses on business success and is motivated by team accomplishment over personal agenda. - Highly organized and results-driven, with strong prioritization skills and a dedicated work ethic.

Equal Opportunity Statement.
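Since the role centres on automating GPU/CPU resource management on Kubernetes, here is a small illustrative sketch using the Kubernetes Python client (Python rather than the Go the posting prefers, for brevity); cluster access and the nvidia.com/gpu resource name are assumptions, not details from the posting.

```python
# Illustrative sketch: inventory GPU capacity across cluster nodes.
# Assumptions: kubeconfig access; GPUs exposed under the common
# "nvidia.com/gpu" extended resource name (an assumption, not confirmed here).
from kubernetes import client, config


def list_gpu_capacity():
    config.load_kube_config()  # use config.load_incluster_config() inside a pod
    v1 = client.CoreV1Api()
    for node in v1.list_node().items:
        gpus = node.status.capacity.get("nvidia.com/gpu", "0")
        print(f"{node.metadata.name}: {gpus} GPU(s)")


if __name__ == "__main__":
    list_gpu_capacity()
```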

Posted 1 month ago

Apply

6.0 - 10.0 years

0 - 0 Lacs

Karnataka

On-site

As a Pre-Sales Solution Architect, your primary responsibility will be to lead the pre-sales efforts in collaboration with the sales team to understand client requirements and design effective solutions. You will manage the preparation and submission of RFPs/RFIs, ensuring technical accuracy and alignment with business needs. Additionally, you will conduct customer demonstrations and oversee the deployment of solutions that meet client specifications. Analyzing prospective customers' business challenges and recommending appropriate cloud solutions tailored to their needs will also be a key part of your role. Creating and delivering detailed solution documents, technical presentations, and proposals to key decision-makers will be crucial in your day-to-day tasks. You will also play a vital role in assisting in closing deals by providing technical expertise and supporting the sales team throughout the sales cycle. Staying updated on industry trends, particularly in IaaS and PaaS services, and sharing insights with the team will be essential. Furthermore, mentoring junior team members and facilitating knowledge sharing within the pre-sales team will be part of your responsibilities.

In order to succeed in this role, you must have 6-8 years of experience in a Pre-Sales Solution Architect or similar role. Strong knowledge of cloud migration and management of customer cloud environments is a must, along with proven experience with AWS IaaS and PaaS services. You should possess the ability to create compelling solution design documents and technical presentations, as well as familiarity with cloud orchestration tools and strong sizing skills. Excellent written and verbal communication skills are required, with the ability to engage both technical and non-technical stakeholders. Strong analytical and problem-solving skills are also essential for this position. While not mandatory, exceptional interpersonal and communication abilities, a sales-oriented mindset with a focus on customer success, and a deep understanding of cloud concepts and emerging cloud solutions would be considered nice-to-have requirements.

In return, we offer a competitive salary package ranging from 18-24 LPA with fixed compensation and performance-based incentives. Equity options are also available for long-term growth and investment in the company's success. You will be part of a dynamic work environment that encourages innovation and professional growth.

Skills required for this role include leadership, team management, communication, professional growth, AWS IaaS, cloud orchestration, PaaS services, management of customer cloud environments, customer success, cloud migration, AWS, verbal communication, equity options, cloud concepts, sizing skills, sales mindset, written communication, problem-solving skills, innovation, analytical skills, interpersonal communication, and cloud sales.

Posted 1 month ago

Apply

7.0 - 10.0 years

20 - 27 Lacs

Noida

Work from Office

Job Responsibilities:

Technical Leadership: • Provide technical leadership and mentorship to a team of data engineers. • Design, architect, and implement highly scalable, resilient, and performant data pipelines; experience using GCP technologies (e.g., Dataproc, Cloud Composer, Pub/Sub, BigQuery) is a plus. • Guide the team in adopting best practices for data engineering, including CI/CD, infrastructure-as-code, and automated testing. • Conduct code reviews, design reviews, and provide constructive feedback to team members. • Stay up-to-date with the latest technologies and trends in data engineering.

Data Pipeline Development: • Develop and maintain robust and efficient data pipelines to ingest, process, and transform large volumes of structured and unstructured data from various sources. • Implement data quality checks and monitoring systems to ensure data accuracy and integrity. • Collaborate with cross-functional teams and business stakeholders to understand data requirements and deliver data solutions that meet their needs.

Platform Building & Maintenance: • Design and implement secure and scalable data storage solutions. • Manage and optimize cloud infrastructure costs related to data engineering workloads. • Contribute to the development and maintenance of data engineering tooling and infrastructure to improve team productivity and efficiency.

Collaboration & Communication: • Effectively communicate technical designs and concepts to both technical and non-technical audiences. • Collaborate effectively with other engineering teams, product managers, and business stakeholders. • Contribute to knowledge sharing within the team and across the organization.

Required Qualifications: • Bachelor's or Master's degree in Computer Science, Engineering, or a related field. • 7+ years of experience in data engineering and software development. • 7+ years of experience coding in SQL and Python/Java. • 3+ years of hands-on experience building and managing data pipelines in a cloud environment like GCP. • Strong programming skills in Python or Java, with experience in developing data-intensive applications. • Expertise in SQL and data modeling techniques for both transactional and analytical workloads. • Experience with CI/CD pipelines and automated testing frameworks. • Excellent communication, interpersonal, and problem-solving skills. • Experience leading or mentoring a team of engineers.
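To illustrate the "data quality checks and monitoring systems" responsibility above, here is a minimal sketch of one such check; the table and column names are placeholders, and the google-cloud-bigquery client stands in for whichever warehouse the team actually uses.

```python
# Illustrative sketch: a null-key data-quality check that a pipeline or scheduler
# could run after each load. Assumptions: placeholder table/column names;
# application default credentials for BigQuery.
from google.cloud import bigquery


def check_no_null_keys(table: str, key_column: str) -> bool:
    client = bigquery.Client()
    sql = f"SELECT COUNT(*) AS bad FROM `{table}` WHERE {key_column} IS NULL"
    bad = list(client.query(sql).result())[0]["bad"]
    if bad:
        print(f"DATA QUALITY FAILURE: {bad} null {key_column} values in {table}")
    return bad == 0


if __name__ == "__main__":
    ok = check_no_null_keys("my-project.analytics.orders", "order_id")
    raise SystemExit(0 if ok else 1)
```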

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 8 Lacs

Chennai

Work from Office

" PLEASE READ THE JOB DESTCRIPTION AND APPLY" Data Engineer Job Description Position Overview Yesterday is history, tomorrow is a mystery, but today is a gift. That's why we call it the present. - Master Oogway Join CustomerLabs' dynamic data team as a Data Engineer and play a pivotal role in transforming raw marketing data into actionable insights that power our digital marketing platform. As a key member of our data infrastructure team, you will design, develop, and maintain robust data pipelines, data warehouses, and analytics platforms that serve as the backbone of our digital marketing product development. Sometimes the hardest choices require the strongest wills. - Thanos (but we promise, our data decisions are much easier! ) In this role, you will collaborate with cross-functional teams including Data Scientists, Product Managers, and Marketing Technology specialists to ensure seamless data flow from various marketing channels, ad platforms, and customer touchpoints to our analytics dashboards and reporting systems. You'll be responsible for building scalable, reliable, and efficient data solutions that can handle high-volume marketing data processing and real-time campaign analytics. What You'll Do: - Design and implement enterprise-grade data pipelines for marketing data ingestion and processing - Build and optimize data warehouses and data lakes to support digital marketing analytics - Ensure data quality, security, and compliance across all marketing data systems - Create data models and schemas that support marketing attribution, customer journey analysis, and campaign performance tracking - Develop monitoring and alerting systems to maintain data pipeline reliability for critical marketing operations - Collaborate with product teams to understand digital marketing requirements and translate them into technical solutions Why This Role Matters: I can do this all day. - Captain America (and you'll want to, because this role is that rewarding!) You'll be the backbone behind the data infrastructure that powers CustomerLabs' digital marketing platform, making marketers' lives easier and better. Your work directly translates to smarter automation, clearer insights, and more successful campaigns - helping marketers focus on what they do best while we handle the complex data heavy lifting. Sometimes you gotta run before you can walk. - Iron Man (and sometimes you gotta build the data pipeline before you can analyze the data! ) Our Philosophy: We believe in the power of data to transform lives, just like the Dragon Warrior transformed the Valley of Peace. Every line of code you write, every pipeline you build, and every insight you enable has the potential to change how marketers work and succeed. We're not just building data systems - we're building the future of digital marketing, one insight at a time. Your story may not have such a happy beginning, but that doesn't make you who you are. It is the rest of your story, who you choose to be. - Soothsayer What Makes You Special: We're looking for someone who embodies the spirit of both Captain America's unwavering dedication and Iron Man's innovative genius. You'll need the patience to build robust systems (like Cap's shield ) and the creativity to solve complex problems (like Tony's suit). Most importantly, you'll have the heart to make a real difference in marketers' lives. Inner peace... Inner peace... Inner peace... - Po (because we know data engineering can be challenging, but we've got your back! 
) Key Responsibilities Data Pipeline Development - Design, build, and maintain robust, scalable data pipelines and ETL/ELT processes - Develop data ingestion frameworks to collect data from various sources (databases, APIs, files, streaming sources) - Implement data transformation and cleaning processes to ensure data quality and consistency - Optimize data pipeline performance and reliability Data Infrastructure Management - Design and implement data warehouse architectures - Manage and optimize database systems (SQL and NoSQL) - Implement data lake solutions and data governance frameworks - Ensure data security, privacy, and compliance with regulatory requirements Data Modeling and Architecture - Design and implement data models for analytics and reporting - Create and maintain data dictionaries and documentation - Develop data schemas and database structures - Implement data versioning and lineage tracking Data Quality, Security, and Compliance - Ensure data quality, integrity, and consistency across all marketing data systems - Implement and monitor data security measures to protect sensitive information - Ensure privacy and compliance with regulatory requirements (e.g., GDPR, CCPA) - Develop and enforce data governance policies and best practices Collaboration and Support - Work closely with Data Scientists, Analysts, and Business stakeholders - Provide technical support for data-related issues and queries Monitoring and Maintenance - Implement monitoring and alerting systems for data pipelines - Perform regular maintenance and optimization of data systems - Troubleshoot and resolve data pipeline issues - Conduct performance tuning and capacity planning Required Qualifications Experience - 2+ years of experience in data engineering or related roles - Proven experience with ETL/ELT pipeline development - Experience with cloud data platform (GCP) - Experience with big data technologies Technical Skills - Programming Languages : Python, SQL, Golang (preferred) - Databases: PostgreSQL, MySQL, Redis - Big Data Tools: Apache Spark, Apache Kafka, Apache Airflow, DBT, Dataform - Cloud Platforms: GCP (BigQuery, Dataflow, Cloud run, Cloud SQL, Cloud Storage, Pub/Sub, App Engine, Compute Engine etc.) - Data Warehousing: Google BigQuery - Data Visualization: Superset, Looker, Metabase, Tableau - Version Control: Git, GitHub - Containerization: Docker Soft Skills - Strong problem-solving and analytical thinking - Excellent communication and collaboration skills - Ability to work independently and in team environments - Strong attention to detail and data quality - Continuous learning mindset Preferred Qualifications Additional Experience - Experience with real-time data processing and streaming - Knowledge of machine learning pipelines and MLOps - Experience with data governance and data catalog tools - Familiarity with business intelligence tools (Tableau, Power BI, Looker, etc.) - Experience using AI-powered tools (such as Cursor, Claude, Copilot, ChatGPT, Gemini, etc.) to accelerate coding, automate tasks, or assist in system design ( We belive run with machine, not against machine ) Interview Process 1. Initial Screening: Phone/video call with HR 2. Technical Interview: Deep dive into data engineering concepts 3. Final Interview: Discussion with senior leadership Note: This job description is intended to provide a general overview of the position and may be modified based on organizational needs and candidate qualifications. Our Team Culture We are Groot. 
- We work together, we grow together, we succeed together. We believe in: - Innovation First - Like Iron Man, we're always pushing the boundaries of what's possible - Team Over Individual - Like the Avengers, we're stronger together than apart - Continuous Learning - Like Po learning Kung Fu, we're always evolving and improving - Making a Difference - Like Captain America, we fight for what's right (in this case, better marketing!) Growth Journey There is no charge for awesomeness... or attractiveness. - Po Your journey with us will be like Po's transformation from noodle maker to Dragon Warrior: - Level 1 : Master the basics of our data infrastructure - Level 2: Build and optimize data pipelines - Level 3 : Lead complex data projects and mentor others - Level 4: Become a data engineering legend (with your own theme music! ) What We Promise I am Iron Man. - We promise you'll feel like a superhero every day! - Work that matters - Every pipeline you build helps real marketers succeed - Growth opportunities - Learn new technologies and advance your career - Supportive team - We've got your back, just like the Avengers - Work-life balance - Because even superheroes need rest!
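The stack above also lists Spark and Kafka for streaming workloads; as a rough illustration, here is a minimal PySpark Structured Streaming sketch (the broker address and topic are placeholders, and the spark-sql-kafka connector is assumed to be on the classpath).

```python
# Illustrative sketch: consume marketing events from Kafka with Structured Streaming.
# Assumptions: placeholder broker/topic; console sink used only for demonstration.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("marketing-events-stream").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "marketing-events")
    .load()
    .select(col("key").cast("string"), col("value").cast("string"), "timestamp")
)

query = (
    events.writeStream
    .format("console")   # in practice this would land in a lake or warehouse sink
    .outputMode("append")
    .start()
)
query.awaitTermination()
```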

Posted 1 month ago

Apply

3.0 - 8.0 years

10 - 20 Lacs

Noida, New Delhi, Gurugram

Hybrid

Role & responsibilities: Strategically partner with the Customer Cloud Sales Team to identify and qualify business opportunities and identify key customer technical objections. Develop strategies to resolve technical obstacles and architect client solutions to meet complex business and technical requirements. Lead the technical aspects of the sales cycle, including technical trainings, client presentations, technical bid responses, product and solution briefings, and proof-of-concept technical work. Identify and respond to key technical objections from the client, providing prescriptive guidance for successful resolutions tailored to specific client needs. May directly work with the Customer's Cloud products to demonstrate, design and prototype integrations in customer/partner environments. Develop and deliver thorough product messaging to highlight advanced technical value propositions, using techniques such as whiteboard and slide presentations, technical product demonstrations, white papers, trial management and RFI response documents. Assess technical challenges to develop and deliver recommendations on integration strategies, enterprise architectures, platforms and application infrastructure required to successfully implement a complete solution. Leverage technical expertise to provide best practice counsel to optimize advanced technical products' effectiveness.

Other critical functions and responsibilities: Ensure customer data is accurate and actionable using Salesforce.com (SFDC) systems. Leverage 3rd party prospect and account intelligence tools to extract meaningful insights and support varying client needs. Navigate, analyse and interpret technical documentation for technical products, often including Customer Cloud products. Enhance skills and knowledge by using a Learning Management Solution (LMS) for training and certification. Serve as a technical and subject matter expert to support advanced trainings for team members on moderately to highly complex technical subjects. Offer thought leadership in advanced technical solutions, such as cloud computing. Coach and mentor team members and advise managers on creating business and process efficiencies in internal workflows and training materials. Collect and codify best practices between sales, marketing, and sales engineers.

Preferred candidate profile / Required Qualifications: Bachelor's degree in Computer Science or another technical field, or equivalent practical experience (preferred). 3-5 years of experience serving as a technical Sales Engineer in an advanced technical environment. Prior experience with advanced technologies, such as Big Data, PaaS, and IaaS technologies, etc. Proven strong communication skills with a proactive and positive approach to task management (written and verbal). Confident presenter with excellent presentation and persuasion skills. Strong work ethic and ability to work independently.

Perks and benefits

Posted 1 month ago

Apply

7.0 - 12.0 years

11 - 15 Lacs

Noida

Work from Office

Primary Skill(s): Lead Data Visualization Engineer with experience in Sigma BI
Experience: 7+ years of experience in Data Visualization, with experience in Sigma BI, Power BI, Tableau or Looker

Job Summary: Lead Data Visualization Engineer with deep expertise in Sigma BI and a strong ability to craft meaningful, insight-rich visual stories for business stakeholders. This role will be instrumental in transforming raw data into intuitive dashboards and visual analytics, helping cross-functional teams make informed decisions quickly and effectively.

Key Responsibilities:
- Lead the design, development, and deployment of Sigma BI dashboards and reports tailored for various business functions.
- Translate complex data sets into clear, actionable insights using advanced visualization techniques.
- Collaborate with business stakeholders to understand goals, KPIs, and data requirements.
- Build data stories that communicate key business metrics, trends, and anomalies.
- Serve as a subject matter expert in Sigma BI and guide junior team members on best practices.
- Ensure visualizations follow design standards, accessibility guidelines, and performance optimization.
- Partner with data engineering and analytics teams to source and structure data effectively.
- Conduct workshops and training sessions to enable business users to consume and interact with dashboards.
- Drive the adoption of self-service BI tools and foster a data-driven decision-making culture.

Required Skills & Experience:
- 7+ years of hands-on experience in Business Intelligence, with at least 2 years using Sigma BI.
- Proven ability to build end-to-end dashboard solutions that tell a story and influence decisions.
- Strong understanding of data modeling, SQL, and cloud data platforms (Snowflake, BigQuery, etc.).
- Demonstrated experience working with business users, gathering requirements, and delivering user-friendly outputs.
- Proficient in data storytelling, UX design principles, and visualization best practices.
- Experience integrating Sigma BI with modern data stacks and APIs is a plus.
- Excellent communication and stakeholder management skills.

Preferred Qualifications:
- Experience with other BI tools (such as Tableau, Power BI, Looker) is a plus.
- Familiarity with AWS cloud data ecosystems (AWS Databricks).
- Background in Data Analysis, Statistics, or Business Analytics.

Working Hours: 2 PM - 11 PM IST [~4.30 AM - 1.30 PM ET]
Communication skills: Good

Mandatory Competencies
- BI and Reporting Tools - BI and Reporting Tools - Power BI
- BI and Reporting Tools - BI and Reporting Tools - Tableau
- Database - Database Programming - SQL
- Cloud - GCP - Cloud Data Fusion, Dataproc, BigQuery, Cloud Composer, Cloud Bigtable
- Data Science and Machine Learning - Data Science and Machine Learning - Databricks
- Cloud - AWS - ECS
- DMS - Data Analysis Skills
- Beh - Communication and collaboration
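To give a flavour of the BigQuery work behind such dashboards, here is a minimal sketch of querying a date-partitioned reporting table from Python, assuming the google-cloud-bigquery client library; the project, dataset, table and column names are hypothetical.

```python
# Minimal sketch (hypothetical names): pull a 30-day revenue trend for a dashboard
# from a date-partitioned, clustered BigQuery table.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")  # hypothetical project

sql = """
    SELECT region, DATE(order_ts) AS order_date, SUM(revenue) AS revenue
    FROM `my-analytics-project.sales.orders`
    WHERE DATE(order_ts) >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY region, order_date
    ORDER BY order_date
"""

# Filtering on the partition column keeps scanned bytes (and cost) down,
# which is what the performance/cost tuning above is about.
for row in client.query(sql).result():
    print(row.region, row.order_date, row.revenue)
```

BI tools such as Sigma, Power BI or Tableau generally issue similar SQL directly against the warehouse, so the same partitioning and clustering choices drive dashboard responsiveness.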

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

chennai, tamil nadu

On-site

As a member of our team, you will be responsible for defining the logical design, sizing, interoperability, scaling, and security aspects of the solution. Your role will involve managing cloud environments in accordance with company security guidelines, and analyzing, optimizing, implementing, and maintaining multi-platform, cloud-based back-end computing environments. Additionally, you will implement and maintain security technologies and sound engineering design for all cloud deployments.

You will be tasked with researching, auditing, testing, and setting standards for AWS and Azure cloud deployment frameworks and best practices to ensure compatibility and functionality in the enterprise. Independently developing reference materials for supported cloud technologies and leading the deployment and operation of various cloud-related infrastructure components will be part of your responsibilities. Reporting on the health of cloud services to leadership, supporting project execution activities, and making recommendations for improvements to security, scalability, manageability, and performance across a wide variety of cloud network, storage, and compute technologies are key aspects of this role.

You will be required to liaise with customers, architecture leadership, and technical teams, including systems and network administrators, security engineers, and IT support teams. Furthermore, you will build and configure build plans and code pipelines, and create automated solutions that can be framed and re-used. Managing assigned projects, meeting deadlines with minimal supervision, and performing other duties as assigned will also be part of your responsibilities.

To be successful in this role, you must possess a strong understanding of IT technology, including hardware and software, with a holistic end-to-end focus on applications and services architecture. A minimum of 5 years of IT background, with at least 3 years of cloud infrastructure and engineering experience, is required. Experience with virtualization, cloud, server, storage, and networking technologies is essential. Knowledge of DevOps, SDLC, containers, microservices, API design, cloud computing models (IaaS, PaaS, SaaS), cloud orchestration, and monitoring technology is crucial. Experience with AWS, Azure, .NET, ITSM, and IT operations, as well as programming skills, is a plus. Excellent communication skills and the ability to work individually, within a team, and in partnership with other business groups are also essential. Experience in Disaster Recovery and Business Continuity initiatives, the ability to develop policies and procedures, conduct risk assessments, and communicate effectively at the C-level are required. A BS/BA degree in Computer Science or equivalent experience is necessary, while certifications such as TOGAF, AWS, Infrastructure, and Cloud certifications are advantageous.
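As an illustration of the auditing and automation work described above, here is a minimal sketch using boto3 to flag EC2 instances that are missing a required tag; the region and the tagging policy are hypothetical.

```python
# Minimal sketch (hypothetical policy): list EC2 instances missing a required "Owner" tag.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # hypothetical region

paginator = ec2.get_paginator("describe_instances")
for page in paginator.paginate():
    for reservation in page["Reservations"]:
        for instance in reservation["Instances"]:
            tags = {t["Key"]: t["Value"] for t in instance.get("Tags", [])}
            if "Owner" not in tags:
                print(f"{instance['InstanceId']} is missing the Owner tag")
```

Equivalent checks on Azure would typically be implemented with the azure-mgmt SDKs or enforced through Azure Policy.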

Posted 1 month ago

Apply

3.0 - 6.0 years

13 - 18 Lacs

Bengaluru

Work from Office

We are looking to hire a Data Engineer for the Platform Engineering team. It is a collection of highly skilled individuals, ranging from development to operations, with a security-first mindset, who strive to push the boundaries of technology. We champion a DevSecOps culture and raise the bar on how and when we deploy applications to production. Our core principles are centered around automation, testing, quality, and immutability, all via code. The role is responsible for building self-service capabilities that improve our security posture and productivity and reduce time to market, with automation at the core of these objectives. The individual collaborates with teams across the organization to ensure applications are designed for Continuous Delivery (CD) and are well-architected for their targeted platform, which can be on-premise or the cloud. If you are passionate about developer productivity, cloud native applications, and container orchestration, this job is for you!

Principal Accountabilities:
- The incumbent is mentored by senior individuals on the team to capture the flow and bottlenecks in the holistic IT delivery process and define future tool sets.

Skills and Software Requirements:
- Experience with a language such as Python, Go, SQL, Java, or Scala
- GCP data services (BigQuery, Dataflow, Dataproc, Cloud Composer, Pub/Sub, Google Cloud Storage, IAM)
- Experience with Jenkins, Maven, Git, Ansible, or Chef
- Experience working with containers, orchestration tools (like Kubernetes, Mesos, Docker Swarm etc.) and container registries (GCR, Docker Hub etc.)
- Experience with [SPI]aaS: Software-as-a-Service, Platform-as-a-Service, or Infrastructure-as-a-Service
- Acquire, cleanse, and ingest structured and unstructured data on the cloud
- Combine data from disparate sources into a single, unified, authoritative view of data (e.g., a Data Lake)
- Enable and support data movement from one system or service to another
- Experience implementing or supporting automated solutions to technical problems
- Experience working in a team environment, proactively executing on tasks while meeting agreed delivery timelines
- Ability to contribute to effective and timely solutions
- Excellent oral and written communication skills
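For illustration, a minimal sketch of one of the GCP building blocks listed above: landing Pub/Sub messages into a Cloud Storage bucket as raw data-lake objects. The project, subscription and bucket names are hypothetical placeholders.

```python
# Minimal sketch (hypothetical names): land Pub/Sub messages as raw objects in GCS.
import uuid
from concurrent import futures
from google.cloud import pubsub_v1, storage

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path("my-project", "events-sub")
bucket = storage.Client().bucket("my-data-lake-raw")

def callback(message):
    # One object per message keeps the sketch simple; real pipelines batch writes.
    bucket.blob(f"raw/events/{uuid.uuid4()}.json").upload_from_string(message.data)
    message.ack()

streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
try:
    streaming_pull.result(timeout=60)  # consume for one minute, then stop
except futures.TimeoutError:
    streaming_pull.cancel()
```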

Posted 2 months ago

Apply

3.0 - 4.0 years

5 - 7 Lacs

Chandigarh

Work from Office

Key Responsibilities
- Design, develop, and maintain scalable ETL workflows using Cloud Data Fusion and Apache Airflow.
- Configure and manage various data connectors (e.g., Cloud Storage, Pub/Sub, JDBC, SaaS APIs) for batch and streaming data ingestion.
- Implement data transformations, cleansing, and enrichment logic in Python (and SQL) to meet analytic requirements.
- Optimize BigQuery data models (fact/dimension tables, partitioning, clustering) for performance and cost-efficiency.
- Monitor, troubleshoot, and tune pipeline performance; implement robust error-handling and alerting mechanisms.
- Collaborate with data analysts, BI developers, and architects to understand data requirements and deliver accurate datasets.
- Maintain documentation for data pipelines, schemas, and operational runbooks.
- Ensure data security and governance best practices are followed across the data lifecycle.

Minimum Qualifications
- 3+ years of hands-on experience in data engineering, with a focus on cloud-native ETL.
- Proven expertise with Google Cloud Data Fusion, including pipeline authoring and custom plugin development.
- Solid experience building and orchestrating pipelines in Apache Airflow (DAG design, operators, hooks).
- Strong Python programming skills for data manipulation and automation.
- Deep understanding of BigQuery: schema design, SQL scripting, performance tuning, and cost management.
- Familiarity with additional GCP services: Cloud Storage, Pub/Sub, Dataflow, and IAM.
- Experience with version control (Git), CI/CD pipelines, and DevOps practices for data projects.
- Excellent problem-solving skills, attention to detail, and the ability to work independently in a fast-paced environment.
- Immediate availability to join.

Preferred (Nice-to-Have)
- Experience with other data integration tools (e.g., Dataflow, Talend, Informatica).
- Knowledge of containerization (Docker, Kubernetes) for scalable data workloads.
- Familiarity with streaming frameworks (Apache Beam, Spark Streaming).
- Background in data modeling methodologies (Star/Snowflake schemas).
- Exposure to metadata management, data cataloguing, and data governance frameworks.
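For candidates newer to Airflow, here is a minimal sketch of the orchestration side of such a pipeline: a daily ingest-then-transform DAG. The DAG id, schedule and task logic are hypothetical, and the sketch assumes Airflow 2.4 or later.

```python
# Minimal sketch (hypothetical tasks): a daily ingest-then-transform DAG.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest_raw_files(**_):
    print("copy the day's files from the landing bucket into a raw table")

def build_reporting_tables(**_):
    print("run SQL that rebuilds partitioned/clustered reporting tables")

with DAG(
    dag_id="daily_marketing_etl",        # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                   # 'schedule_interval' on older Airflow 2.x
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest_raw_files", python_callable=ingest_raw_files)
    transform = PythonOperator(task_id="build_reporting_tables", python_callable=build_reporting_tables)
    ingest >> transform
```

In Cloud Data Fusion the equivalent steps would be authored as pipeline stages, with Airflow (or Cloud Composer) triggering and monitoring the runs.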

Posted 2 months ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office

About The Role
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Google Dataproc, Apache Spark
Good to have skills: Apache Airflow
Minimum 5 year(s) of experience is required
Educational Qualification: Minimum 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Google Dataproc. Your typical day will involve working with Apache Spark and collaborating with cross-functional teams to deliver impactful data-driven solutions.

Roles & Responsibilities:
- Design, build, and configure applications to meet business process and application requirements using Google Dataproc.
- Collaborate with cross-functional teams to deliver impactful data-driven solutions.
- Utilize Apache Spark for data processing and analysis.
- Develop and maintain technical documentation for applications.

Professional & Technical Skills:
- Strong experience in Apache Spark and Java for Spark.
- Strong experience with multiple database models (SQL, NoSQL, OLTP and OLAP).
- Strong experience with data streaming architecture (Kafka, Spark, Airflow).
- Strong knowledge of cloud data platforms and technologies such as GCS, BigQuery, Cloud Composer, Dataproc and other cloud-native offerings.
- Knowledge of Infrastructure as Code (IaC) and associated tools (Terraform, Ansible etc.).
- Experience pulling data from a variety of data source types, including Mainframe (EBCDIC), fixed-length and delimited files, and databases (SQL, NoSQL, time-series).
- Experience performing analysis with large datasets in a cloud-based environment, preferably with an understanding of Google Cloud Platform (GCP).
- Comfortable communicating with various stakeholders (technical and non-technical).
- GCP Data Engineer certification is a nice to have.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Google Dataproc and Apache Spark.
- The ideal candidate will possess a strong educational background in software engineering or a related field.
- This position is based at our Mumbai office.

Qualification: Minimum 15 years of full-time education
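As a sketch of the Spark-on-Dataproc work described above (shown in PySpark for brevity, even though the role emphasises Java for Spark), here is a minimal job that reads delimited files from Cloud Storage and writes a daily aggregate back. Bucket paths and column names are hypothetical.

```python
# Minimal sketch (hypothetical paths/columns): aggregate delimited order files on Dataproc.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-aggregation").getOrCreate()

orders = spark.read.option("header", True).csv("gs://my-raw-bucket/orders/*.csv")

daily_totals = (
    orders.withColumn("amount", F.col("amount").cast("double"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("total_amount"))
)

daily_totals.write.mode("overwrite").parquet("gs://my-curated-bucket/daily_totals/")
spark.stop()
```

A job like this would typically be submitted as a Dataproc Spark/PySpark job, with Cloud Composer or Airflow handling scheduling and dependencies.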

Posted 2 months ago

Apply

8.0 - 13.0 years

7 - 12 Lacs

Pune

Work from Office

Job Title: Business Functional Analyst
Corporate Title: Associate
Location: Pune, India

Role Description
Business Functional Analysis is responsible for business solution design in complex project environments (e.g. transformational programmes). Work includes:
- Identifying the full range of business requirements and translating requirements into specific functional specifications for solution development and implementation
- Analysing business requirements and the associated impacts of the changes
- Designing and assisting businesses in developing optimal target-state business processes
- Creating and executing against roadmaps that focus on solution development and implementation
- Answering questions of methodological approach with varying levels of complexity
- Aligning with other key stakeholder groups (such as Project Management & Software Engineering) to support the link between the business divisions and the solution providers for all aspects of identifying, implementing and maintaining solutions

What we'll offer you
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities
- Write clear and well-structured business requirements/documents.
- Convert roadmap features into smaller user stories.
- Analyse process issues and bottlenecks and make improvements.
- Communicate and validate requirements with relevant stakeholders.
- Perform data discovery, analysis, and modelling.
- Assist with project management for selected projects.
- Understand and translate business needs into data models supporting long-term solutions.
- Understand existing SQL/Python code and convert it into business requirements.
- Write advanced SQL and Python scripts.

Your skills and experience
- A minimum of 8+ years of experience in business analysis or a related field.
- Exceptional analytical and conceptual thinking skills.
- Proficient in SQL.
- Proficient in Python for data engineering.
- Experience in automating ETL testing using Python and SQL.
- Exposure to GCP services for cloud storage, data lake, database and data warehouse workloads, such as BigQuery, GCS, Dataflow, Cloud Composer, gsutil, shell scripting etc.
- Previous experience in Procurement and Real Estate would be a plus.
- Competency in JIRA, Confluence, draw.io and Microsoft applications including Word, Excel, PowerPoint and Outlook.
- Previous banking domain experience is a plus.
- Good problem-solving skills.

How we'll support you
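For the ETL-testing requirement above, here is a minimal sketch of an automated reconciliation check in Python. It uses an in-memory SQLite database with toy tables so the example is self-contained; in practice the same queries would run against the warehouse (e.g. BigQuery) through its client library.

```python
# Minimal sketch (toy data): compare row counts between a staging and a curated table.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE staging_trades (id INTEGER);
    CREATE TABLE curated_trades (id INTEGER);
    INSERT INTO staging_trades VALUES (1), (2), (3);
    INSERT INTO curated_trades VALUES (1), (2), (3);
""")

def row_count(table):
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

source_rows, target_rows = row_count("staging_trades"), row_count("curated_trades")
assert source_rows == target_rows, f"mismatch: staging={source_rows}, curated={target_rows}"
print("reconciliation check passed")
```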

Posted 2 months ago

Apply

7.0 - 12.0 years

16 - 20 Lacs

Pune

Work from Office

Job Title: Data Engineer (ETL, Big Data, Hadoop, Spark, GCP), AS
Location: Pune, India

Role Description
The Engineer is responsible for developing and delivering elements of engineering solutions to accomplish business goals. Awareness of the important engineering principles of the bank is expected. Root cause analysis skills are developed by addressing enhancements and fixes to products; reliability and resiliency are built into solutions through early testing, peer reviews and automating the delivery life cycle. The successful candidate should be able to work independently on medium to large sized projects with strict deadlines, work in a cross-application, mixed technical environment, and demonstrate a solid hands-on development track record while working in an agile methodology. The role demands working alongside a geographically dispersed team. The position is required as part of the build-out of the Compliance tech internal development team in India. The overall team will primarily deliver improvements in compliance tech capabilities that are major components of the regular regulatory portfolio, addressing various regulatory common commitments to mandate monitors.

What we'll offer you
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities
- Analyze data sets and design and code stable, scalable data ingestion workflows, integrating them into existing workflows.
- Work with team members and stakeholders to clarify requirements and provide the appropriate ETL solution.
- Hands-on work across various data sourcing in Hadoop as well as GCP.
- Ensure new code is tested at both unit and system level; design, develop and peer review new code and functionality.
- Operate as a member of an agile scrum team.
- Apply root cause analysis skills to identify bugs and issues behind failures.
- Support production support and release management teams in their tasks.

Your skills and experience
- More than 7 years of coding experience in reputed organizations.
- Hands-on experience with Bitbucket and CI/CD pipelines.
- Proficient in Hadoop, Python, Spark, SQL, Unix and Hive.
- Basic understanding of on-prem and GCP data security.
- Hands-on development experience on large ETL/big data systems; GCP is a big plus.
- Hands-on experience with Cloud Build, Artifact Registry, Cloud DNS, Cloud Load Balancing etc.
- Hands-on experience with Dataflow, Cloud Composer, Cloud Storage, Dataproc etc.
- Basic understanding of data quality dimensions like Consistency, Completeness, Accuracy, Lineage etc.
- Hands-on business and systems knowledge gained in a regulatory delivery environment.
- Banking experience with regulatory and cross-product knowledge.
- Passionate about test-driven development.

How we'll support you
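For the data-quality dimensions listed above, here is a minimal PySpark sketch of completeness and accuracy checks; the dataset path and column names are hypothetical.

```python
# Minimal sketch (hypothetical path/columns): completeness and accuracy checks in PySpark.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()
trades = spark.read.parquet("gs://my-bucket/curated/trades/")  # hypothetical dataset

total = trades.count()
missing_ids = trades.filter(F.col("trade_id").isNull()).count()    # completeness
negative_notionals = trades.filter(F.col("notional") < 0).count()  # accuracy

if total:
    print(f"completeness: {100.0 * (total - missing_ids) / total:.2f}% of rows have trade_id")
print(f"{negative_notionals} rows have a negative notional")
spark.stop()
```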

Posted 2 months ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Bengaluru

Work from Office

We are seeking a skilled DevOps Engineer with strong experience in Google Cloud Platform (GCP) to support AI/ML project infrastructure. The ideal candidate will work closely with data scientists, ML engineers, and developers to build and manage scalable, secure, and automated pipelines for AI/ML model training, testing, and deployment.

Responsibilities:
- Design and manage cloud infrastructure to support AI/ML workloads on GCP.
- Develop and maintain CI/CD pipelines for ML models and applications.
- Automate model training, validation, deployment, and monitoring processes using tools like Kubeflow, Vertex AI, Cloud Composer, Airflow, etc.
- Set up and manage infrastructure as code (IaC) using tools such as Terraform or Deployment Manager.
- Implement robust security, monitoring, logging, and alerting systems using Cloud Monitoring, Cloud Logging, Prometheus, Grafana, etc.
- Collaborate with ML engineers and data scientists to optimize compute environments (e.g., GPU/TPU instances, notebooks).
- Manage and maintain containerized environments using Docker and Kubernetes (GKE).
- Ensure cost-efficient cloud resource utilization and governance.

Required Skills
- Bachelor's degree in engineering or a relevant field
- 4+ years of proven experience as a DevOps Engineer, with at least 1 year on GCP
- Strong experience with DevOps tools and methodologies in production environments
- Proficiency in scripting with Python, Bash, or Shell
- Experience with Terraform, Ansible, or other IaC tools
- Deep understanding of Docker, Kubernetes, and container orchestration
- Knowledge of CI/CD pipelines, automated testing, and model deployment best practices
- Familiarity with ML lifecycle tools such as MLflow, Kubeflow Pipelines, or TensorFlow Extended (TFX)
- Experience in designing conversational flows for AI agents/chatbots
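As an illustration of the ML-pipeline automation mentioned above, here is a minimal sketch using the Kubeflow Pipelines (kfp) v2 SDK to define and compile a two-step pipeline; the component logic and names are hypothetical stand-ins.

```python
# Minimal sketch (hypothetical steps): define and compile a tiny Kubeflow pipeline.
from kfp import dsl, compiler

@dsl.component
def preprocess(rows: int) -> int:
    # Stand-in for real feature engineering.
    return rows * 2

@dsl.component
def train(rows: int) -> str:
    # Stand-in for real model training; returns a placeholder result.
    return f"trained on {rows} rows"

@dsl.pipeline(name="demo-training-pipeline")
def training_pipeline(rows: int = 1000):
    prep = preprocess(rows=rows)
    train(rows=prep.output)

if __name__ == "__main__":
    compiler.Compiler().compile(training_pipeline, "training_pipeline.yaml")
```

The compiled spec can then be submitted to a managed runner such as Vertex AI Pipelines or a Kubeflow deployment on GKE, typically from a CI/CD job.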

Posted 2 months ago

Apply

3.0 - 5.0 years

5 - 5 Lacs

Kochi, Thiruvananthapuram

Work from Office

Role Proficiency: Independently develops error-free code with high-quality validation of applications, guides other developers and assists Lead 1 - Software Engineering.

Outcomes:
- Understand and provide input to the application/feature/component designs; develop the same in accordance with user stories/requirements.
- Code, debug, test, document and communicate product/component/features at development stages.
- Select appropriate technical options for development, such as reusing, improving or reconfiguring existing components.
- Optimise efficiency, cost and quality by identifying opportunities for automation/process improvements and agile delivery models.
- Mentor Developer 1 - Software Engineering and Developer 2 - Software Engineering to effectively perform in their roles.
- Identify problem patterns and improve the technical design of the application/system.
- Proactively identify issues/defects/flaws in module/requirement implementation.
- Assist Lead 1 - Software Engineering on technical design.
- Review activities and begin demonstrating Lead 1 capabilities in making technical decisions.

Measures of Outcomes:
- Adherence to engineering process and standards (coding standards)
- Adherence to schedule/timelines
- Adherence to SLAs where applicable
- Number of defects post delivery
- Number of non-compliance issues
- Reduction of reoccurrence of known defects
- Quick turnaround of production bugs
- Meeting the defined productivity standards for the project
- Number of reusable components created
- Completion of applicable technical/domain certifications
- Completion of all mandatory training requirements

Outputs Expected:
- Code: Develop code independently for the above.
- Configure: Implement and monitor the configuration process.
- Test: Create and review unit test cases, scenarios and execution.
- Domain relevance: Develop features and components with a good understanding of the business problem being addressed for the client.
- Manage Project: Manage module-level activities.
- Manage Defects: Perform defect RCA and mitigation.
- Estimate: Estimate time, effort and resource dependence for one's own work and others' work, including modules.
- Document: Create documentation for own work as well as perform peer review of documentation of others' work.
- Manage knowledge: Consume and contribute to project-related documents, SharePoint libraries and client universities.
- Status Reporting: Report status of tasks assigned; comply with project-related reporting standards/process.
- Release: Execute the release process.
- Design: LLD for multiple components.
- Mentoring: Mentor juniors on the team; set FAST goals and provide feedback on the FAST goals of mentees.

Skill Examples:
- Explain and communicate the design/development to the customer.
- Perform and evaluate test results against product specifications.
- Develop user interfaces, business software components and embedded software components.
- Manage and guarantee high levels of cohesion and quality.
- Use data models.
- Estimate effort and resources required for developing/debugging features/components.
- Perform and evaluate tests in the customer or target environment.
- Team player.
- Good written and verbal communication abilities.
- Proactively ask for help and offer help.

Knowledge Examples:
- Appropriate software programs/modules
- Technical designing
- Programming languages
- DBMS
- Operating systems and software platforms
- Integrated development environments (IDE)
- Agile methods
- Knowledge of the customer domain and sub-domain where the problem is solved

Additional Comments:
UST is looking for senior Java developers to build end-to-end business solutions and to work with one of the leading financial services organisations in the UK. The ideal candidate must possess a strong background in frontend and backend development technologies, as well as excellent written and verbal communication skills with the ability to collaborate effectively with domain experts and technical experts in the team.

Responsibilities: As a Java developer, you will
- Maintain active relationships with the Product Owner to understand business requirements, lead requirement-gathering meetings and review designs with the product owner
- Own backlog items and coordinate with other team members to develop the features planned for each sprint
- Perform technical design reviews and code reviews
- Mentor, lead and guide the team on technical skills
- Be responsible for prototyping, developing, and troubleshooting software in the user interface or service layers
- Perform peer reviews on source code to ensure reuse, scalability and the use of best practices
- Participate in collaborative technical discussions that focus on software user experience, design, architecture, and development
- Perform demonstrations for client stakeholders on project features and sub-features, utilizing the latest frontend and backend development technologies

Requirements:
- 5+ years of experience in Java/JEE development
- Skills in developing applications using multi-tier architecture
- 2+ years of experience in GCP service development is preferred
- Skills in developing applications in GCP is preferred
- Should be an expert in Cloud Composer, Dataflow, Dataproc, Cloud Pub/Sub and DAG creation
- Python scripting knowledge is preferred
- Apache Beam knowledge is mandatory
- Java/JEE, Spring, Spring Boot, REST/SOAP web services, Hibernate, SQL, Tomcat, application servers (WebSphere), SONAR, Agile, AJAX, Jenkins...
- Skills in UML, application designing/architecture, design patterns
- Skills in unit testing applications using JUnit or similar technologies
- Capability to support QA teams with test plans, root cause analysis and defect fixing
- Strong experience in responsive design and cross-browser web applications
- Strong knowledge of web service models
- Strong knowledge in creating and working with APIs
- Experience with cloud services, specifically on Google Cloud
- Strong exposure to Agile and Scaled Agile based development models
- Familiar with interfaces such as REST web services, Swagger profiles and JSON payloads
- Familiar with tools/utilities such as Bitbucket, Jira and Confluence

Required Skills: Java, Spring, Spring Boot, Microservices
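Since Apache Beam knowledge is called out as mandatory and Python scripting as preferred, here is a minimal Beam pipeline in Python run on the direct runner; the inline records are toy data so the example is self-contained.

```python
# Minimal sketch (toy data): sum amounts per transaction type with Apache Beam.
import apache_beam as beam

with beam.Pipeline() as pipeline:  # direct runner by default; Dataflow in production
    (
        pipeline
        | "Create" >> beam.Create(["credit,100", "debit,40", "credit,25"])
        | "Parse" >> beam.Map(lambda line: line.split(","))
        | "KeyByType" >> beam.Map(lambda parts: (parts[0], int(parts[1])))
        | "SumPerType" >> beam.CombinePerKey(sum)
        | "Print" >> beam.Map(print)
    )
```

On GCP the same pipeline would normally be launched with the DataflowRunner and orchestrated from Cloud Composer.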

Posted 2 months ago

Apply