
288 Cloud Storage Jobs - Page 2

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

1.0 - 5.0 years

0 Lacs

Kolkata

Work from Office

Data entry of bills and updating MS Excel. Smart with MS Office: Outlook, MS Excel, MS Word, SharePoint, OneDrive. Should be an expert in attending audio and video virtual calls using MS Teams and other tools. Mandatory daily data entry (60-90 minutes).

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

The ideal candidate will be responsible for designing, developing, and maintaining an optimal data pipeline architecture. You will monitor incidents, perform root cause analysis, and implement appropriate actions to resolve issues related to abnormal job execution and data corruption. You will also automate jobs, notifications, and reports to improve efficiency; optimize existing queries; reverse engineer for data research and analysis; and calculate the downstream impact of issues for effective communication. Supporting failures, data quality issues, and environment health will be part of your role, as will maintaining ingestion and pipeline runbooks, portfolio summaries, and DBAR, and enabling the roadmap of infrastructure changes, enhancements, and updates. Building the infrastructure for optimal extraction, transformation, and loading of data from various sources using big data technologies, Python, or web-based APIs is essential, along with participating in peer code reviews and communicating requirements clearly.

As a candidate, you are expected to have a Bachelor's degree in Engineering/Computer Science or a related quantitative field. Technical skills required include a minimum of 8 years of programming experience with Python and SQL, experience with massively parallel processing systems like Spark or Hadoop, and a minimum of 6-7 years of hands-on experience with GCP, BigQuery, Dataflow, data warehousing, data modeling, Apache Beam, and Cloud Storage. Proficiency with source code control systems (Git) and CI/CD processes, involvement in designing, prototyping, and delivering software solutions within the big data ecosystem, and hands-on experience with generative AI models are also necessary. You should be able to perform code reviews to ensure code meets acceptance criteria, have experience with Agile development methodologies and tools, and work to improve data governance and quality to enhance data reliability.

EXL Analytics offers a dynamic and innovative environment where you will collaborate with experienced analytics consultants. You will gain insight into various business aspects, develop effective teamwork and time-management skills, and receive training in analytical tools and techniques. Our mentoring program provides guidance and coaching to every employee, fostering personal and professional growth. The opportunities for growth and development at EXL Analytics are limitless, setting the stage for a successful career within the company and beyond.
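For illustration, a minimal Apache Beam sketch of the kind of pipeline this role describes: reading raw files from Cloud Storage and loading them into BigQuery. The project, bucket, table, and schema names are hypothetical placeholders, not details from the posting.

```python
# Minimal sketch: batch pipeline from Cloud Storage into BigQuery with Apache Beam.
# Project, bucket, table, and schema names are illustrative placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_line(line: str):
    # Assume simple CSV rows: user_id,event,timestamp
    user_id, event, ts = line.split(",")
    return {"user_id": user_id, "event": event, "ts": ts}

options = PipelineOptions(runner="DataflowRunner", project="my-project",
                          region="us-central1", temp_location="gs://my-bucket/tmp")

with beam.Pipeline(options=options) as p:
    (p
     | "Read" >> beam.io.ReadFromText("gs://my-bucket/raw/events-*.csv")
     | "Parse" >> beam.Map(parse_line)
     | "Load" >> beam.io.WriteToBigQuery(
           "my-project:analytics.events",
           schema="user_id:STRING,event:STRING,ts:TIMESTAMP",
           write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
           create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED))
```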

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

You would be working on:
- Developing and implementing Generative AI / AI solutions on Google Cloud Platform
- Working with cross-functional teams to design and deliver AI-powered products and services
- Developing, versioning, and executing Python code
- Deploying models as endpoints in the dev environment
- Utilizing deep learning frameworks such as TensorFlow, PyTorch, or JAX
- Working on natural language processing (NLP) and machine learning (ML)
- Utilizing Cloud Storage, Compute Engine, Vertex AI, Cloud Functions, Pub/Sub, etc.
- Providing generative AI support in Vertex AI, with hands-on experience in Generative AI models such as Gemini and Vertex AI Search

Your profile should include:
- Experience in Generative AI development with Google Cloud Platform
- Experience delivering AI solutions on the Vertex AI platform
- Experience developing and deploying AI solutions with ML
- A solid understanding of Python

What you'll love about working here:
- You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group, along with personalized career guidance from our leaders.
- You will get comprehensive wellness benefits, including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new parent support via flexible work.
- You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.

Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud, and data, combined with deep industry expertise and a partner ecosystem.
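As a rough sketch of the Vertex AI work described above, the snippet below calls a Gemini model through the Vertex AI Python SDK; the project, region, and model name are illustrative assumptions, not Capgemini specifics.

```python
# Minimal sketch: calling a Gemini model on Vertex AI with the Python SDK.
# Project, location, and model name are illustrative placeholders.
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="my-project", location="us-central1")

model = GenerativeModel("gemini-1.5-pro")
response = model.generate_content("Summarize the key GCP services for GenAI workloads.")
print(response.text)
```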

Posted 1 week ago

Apply

3.0 - 5.0 years

25 - 30 Lacs

Gurugram

Work from Office

Backend Developer - Gurgaon (Technology)

- Develop and maintain backend services using Go (Golang)
- Build and scale RESTful APIs using the gin-gonic/gin framework
- Design NoSQL schemas and manage database operations with Firestore
- Deploy and manage services on Google Cloud Platform, especially Cloud Run and Cloud Storage
- Implement secure authentication using JWT, OAuth 2.0, and API security best practices
- Ensure code quality through version control (Git), testing, and code reviews
- Use Docker for containerization and manage multi-stage builds
- Work within Linux environments and utilize basic shell scripting

Requirements:
- 3+ years of production experience with Go (Golang)
- Strong knowledge of the Gin framework for

Posted 1 week ago

Apply

5.0 - 10.0 years

50 - 75 Lacs

Bengaluru

Work from Office

The Search Platform team is responsible for the Search experience on several product surfaces at Uber, enabling millions of users to find, discover, and explore rides, food, and more. Search has played an increasingly important role in bringing Uber closer to its vision. The ideal candidate will bring extensive expertise in search infrastructure, with a profound understanding of search algorithms, distributed systems, and scalability.

What you'll do: Design and implement new features in our search infrastructure solutions at Uber. Join the on-call rotation, driving continuous improvements in system availability, scalability, performance, and efficiency. Collaborate with other infrastructure teams, product teams, and product managers to drive adoption and standardize processes, and to design and implement high-impact, cross-product features.

What you'll need: BS or MS in Computer Science and 5+ years in a related technical discipline, or equivalent experience. Proficiency in one of the following programming languages: Java, Go, C/C++, or similar. Good scripting skills and the ability to pick up new ones. A systematic problem-solving approach and knowledge of algorithms, data structures, and complexity analysis. Experience with Apache Lucene, Elasticsearch, OpenSearch, Solr, and other search technologies is a plus. Experience with highly available/fault-tolerant, replicated data storage systems, large-scale data processing systems, or enterprise/cloud storage systems is also a strong plus.

Uber's mission is to reimagine the way the world moves for the better. Here, bold ideas create real-world impact, challenges drive growth, and speed fuels progress. What moves us, moves the world - let's move it forward, together. Offices continue to be central to collaboration and Uber's cultural identity. Unless formally approved to work fully remotely, Uber expects employees to spend at least half of their work time in their assigned office. For certain roles, such as those based at green-light hubs, employees are expected to be in-office 100% of their time. Please speak with your recruiter to better understand the in-office expectations for this role. *Accommodations may be available based on religious and/or medical conditions, or as required by applicable law.

Posted 1 week ago

Apply

4.0 - 7.0 years

12 - 17 Lacs

Pune

Work from Office

Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Senior Software Engineer. In this role, you will:
- Design, deploy, and manage scalable, secure, and reliable cloud infrastructure on Google Cloud Platform (GCP)
- Set up and configure various GCP services such as Cloud SQL, Cloud Pub/Sub, Cloud Datastore, and GKE clusters for multi-tenant environments
- Collaborate with development teams to ensure applications are designed and optimized for cloud deployment
- Implement and manage CI/CD pipelines using tools such as Google Cloud Build, Jenkins, or similar
- Monitor and optimize cloud resources for cost, performance, and security
- Automate infrastructure provisioning with Infrastructure as Code (IaC) tools such as Terraform, Google Cloud Deployment Manager, or similar
- Troubleshoot and resolve issues related to cloud infrastructure and services
- Ensure compliance with security policies and best practices

Requirements:
- Proven experience as a Cloud Engineer or in a similar role, with a focus on Google Cloud Platform (GCP)
- Strong understanding of cloud architecture and services, including Compute Engine, Kubernetes Engine, Cloud Storage, BigQuery, and more
- Proficiency in scripting languages such as Python, Bash, or similar
- Google Cloud certifications (e.g., Google Cloud Professional Cloud Architect) are a plus
- Commitment to staying up to date with the latest GCP features, services, and best practices
- Knowledge of other clouds such as AWS or Azure is an added advantage
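To give a flavor of the GCP services named above, here is a minimal sketch of publishing to Cloud Pub/Sub with the official Python client; the project and topic names are placeholders, not details from the posting.

```python
# Minimal sketch: publishing a message to a Cloud Pub/Sub topic.
# Project and topic names are illustrative placeholders.
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "orders")

# publish() returns a future; result() blocks until the message id is available.
future = publisher.publish(topic_path, b"order-created", source="checkout")
print(f"Published message id: {future.result()}")
```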

Posted 1 week ago

Apply

4.0 - 8.0 years

20 - 35 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Salary: 20 to 35 LPA | Experience: 3 to 7 years | Location: Gurgaon/Pune/Bengaluru | Notice: Immediate to 30 days

Job Profile: Experienced Data Engineer with a strong foundation in designing, building, and maintaining scalable data pipelines and architectures. Skilled in transforming raw data into clean, structured formats for analytics and business intelligence. Proficient in modern data tools and technologies such as SQL, T-SQL, Python, Databricks, and cloud platforms (Azure). Adept at data wrangling, modeling, ETL/ELT development, and ensuring data quality, integrity, and security. A collaborative team player with a track record of enabling data-driven decision-making across business units.

As a Data Engineer, you will work on assignments for one of our Utilities clients. Collaborating with cross-functional teams and stakeholders involves gathering data requirements, aligning business goals, and translating them into scalable data solutions. The role includes working closely with data analysts, data scientists, and business users to understand needs, designing robust data pipelines, and ensuring data is accessible, reliable, and well documented. Regular communication, iterative feedback, and joint problem-solving are key to delivering high-impact, data-driven outcomes that support organizational objectives. The position requires a proven track record of transforming processes and driving customer value and cost savings, with experience running end-to-end analytics for large-scale organizations.

Responsibilities:
- Design, build, and maintain scalable data pipelines to support analytics, reporting, and advanced modeling needs
- Collaborate with consultants, analysts, and clients to understand data requirements and translate them into effective data solutions
- Ensure data accuracy, quality, and integrity through validation, cleansing, and transformation processes
- Develop and optimize data models, ETL workflows, and database architectures across cloud and on-premises environments
- Support data-driven decision-making by delivering reliable, well-structured datasets and enabling self-service analytics
- Integrate seamlessly with cloud platforms (Azure), making it easy to build and deploy end-to-end data pipelines in the cloud
- Manage scalable Databricks clusters for large datasets and complex computations, optimizing performance and cost

Must have:
- Client engagement experience and collaboration with cross-functional teams
- Data engineering background in Databricks
- Ability to work effectively as an individual contributor or in collaborative team environments
- Effective communication and thought leadership with a proven record

Candidate Profile:
- Bachelor's/Master's degree in economics, mathematics, computer science/engineering, operations research, or related analytics areas
- 3+ years of experience, which must be in data engineering
- Hands-on experience with SQL, Python, Databricks, and cloud platforms such as Azure
- Prior experience managing and delivering end-to-end projects
- Outstanding written and verbal communication skills
- Able to work in a fast-paced, continuously evolving environment and ready to take on uphill challenges
- Able to understand cross-cultural differences and work with clients across the globe

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a Data Specialist, you will be responsible for utilizing your expertise in ETL fundamentals, SQL, BigQuery, Dataproc, Python, Data Catalog, data warehousing, and various other tools to contribute to the successful implementation of data projects. Your role will involve working with technologies such as Cloud Trace, Cloud Logging, Cloud Storage, and Data Fusion to build and maintain a modern data platform.

To excel in this position, you should have a minimum of 5 years of experience in the data engineering field, with a focus on the GCP cloud data implementation suite: BigQuery, Pub/Sub, Dataflow/Apache Beam, Airflow/Composer, and Cloud Storage. A strong understanding of very large-scale data architecture and hands-on experience with data warehouses, data lakes, and analytics platforms will be crucial to the success of our projects.

Key requirements:
- Minimum 5 years of experience in data engineering
- Hands-on experience with the GCP cloud data implementation suite
- Strong expertise in Google BigQuery, Python, Apache Airflow, and SQL (BigQuery preferred)
- Extensive hands-on experience with SQL and Python for working with data

If you are passionate about data and have a proven track record of delivering results in a fast-paced environment, we invite you to apply for this exciting opportunity to be a part of our dynamic team.
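A small illustrative sketch of the BigQuery work this role involves, using the google-cloud-bigquery client with a parameterized query; the project, dataset, and column names are assumptions for the example.

```python
# Minimal sketch: running a parameterized BigQuery query from Python.
# Project, dataset, and column names are illustrative placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

job_config = bigquery.QueryJobConfig(
    query_parameters=[bigquery.ScalarQueryParameter("day", "DATE", "2024-01-01")]
)
query = """
    SELECT user_id, COUNT(*) AS events
    FROM `my-project.analytics.events`
    WHERE DATE(ts) = @day
    GROUP BY user_id
"""
for row in client.query(query, job_config=job_config).result():
    print(row.user_id, row.events)
```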

Posted 1 week ago

Apply

5.0 - 10.0 years

10 - 18 Lacs

Pune

Work from Office

GCP Platform Cloud Engineer. GCP core services: IAM, VPC, GCE (Google Compute Engine), GCS (Google Cloud Storage), Cloud SQL, MySQL. CI/CD tooling: Cloud Build, GitHub Actions. Other tools: GitHub, Terraform, shell scripting, Ansible.
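For illustration, a minimal sketch of interacting with GCS from Python using the google-cloud-storage client; the project, bucket, and object names are placeholders, not details from the posting.

```python
# Minimal sketch: uploading and listing objects in a GCS bucket with the
# google-cloud-storage client. Project, bucket, and object names are placeholders.
from google.cloud import storage

client = storage.Client(project="my-project")
bucket = client.bucket("my-app-artifacts")

# Upload a local file to the bucket.
blob = bucket.blob("releases/app-v1.2.3.tar.gz")
blob.upload_from_filename("dist/app-v1.2.3.tar.gz")

# List what is currently stored under the releases/ prefix.
for obj in client.list_blobs("my-app-artifacts", prefix="releases/"):
    print(obj.name, obj.size)
```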

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a Cloud Engineer at AVP level in Bangalore, India, you will be responsible for designing, implementing, and managing cloud infrastructure and services on Google Cloud Platform (GCP). Your key responsibilities will include:
- Designing, deploying, and managing scalable, secure, and cost-effective cloud environments on GCP
- Developing Infrastructure as Code (IaC) using tools like Terraform
- Ensuring security best practices, IAM policies, and compliance with organizational and regulatory standards
- Configuring and managing VPCs, subnets, firewalls, VPNs, and interconnects for secure cloud networking
- Setting up CI/CD pipelines for automated deployments and implementing monitoring and alerting using tools like Stackdriver
- Optimizing cloud spending and designing disaster recovery and backup strategies
- Deploying and managing GCP databases and containerized applications using GKE and Cloud Run

You will be part of the Platform Engineering Team, which builds and maintains foundational infrastructure, tooling, and automation to enable efficient, secure, and scalable software development and deployment. The team focuses on creating a self-service platform for developers and operational teams, ensuring reliability, security, and compliance while improving developer productivity.

To excel in this role, you should have strong experience with GCP services, proficiency in scripting and Infrastructure as Code, knowledge of DevOps practices and CI/CD tools, an understanding of security, IAM, networking, and compliance in cloud environments, and experience with monitoring tools; strong problem-solving skills and Google Cloud certifications would be a plus. You will receive training, development, coaching, and support to help you excel in your career, along with a culture of continuous learning and a range of flexible benefits tailored to suit your needs. The company strives for a positive, fair, and inclusive work environment where employees are empowered to excel together every day. For further information about the company and its teams, please visit the company website: https://www.db.com/company/company.htm. The Deutsche Bank Group welcomes applications from all individuals and promotes a culture of shared successes and collaboration.

Posted 1 week ago

Apply

4.0 - 8.0 years

6 - 11 Lacs

Mumbai

Work from Office

Your Role: We are hiring a GCP Kubernetes Engineer with 9-12 years of experience. Ideal candidates should have strong expertise in cloud-native technologies, container orchestration, and infrastructure automation. This is a Pan India opportunity offering flexibility and growth. Join us to build scalable, secure, and innovative cloud solutions across diverse industries.

- Design, implement, and manage scalable, highly available systems on Google Cloud Platform (GCP).
- Work with GCP IaaS components: Compute Engine, VPC, VPN, Cloud Interconnect, Load Balancing, Cloud CDN, Cloud Storage, and backup/DR solutions.
- Utilize GCP PaaS services: Cloud SQL, App Engine, Cloud Functions, Pub/Sub, Firestore/Cloud Spanner, and Dataflow.
- Deploy and manage containerized applications using Google Kubernetes Engine (GKE), Helm charts, and Kubernetes tooling.
- Automate infrastructure provisioning using the gcloud CLI, Deployment Manager, or Terraform.
- Implement CI/CD pipelines using Cloud Build for automated deployments.
- Monitor infrastructure and applications using Cloud Monitoring, Logging, and related tools.
- Manage IAM, VPC Service Controls, Cloud Armor, and Security Command Center.
- Troubleshoot and resolve complex infrastructure and application issues.

Your Profile:
- 6+ years of cloud engineering experience with a strong focus on GCP.
- Proven hands-on expertise in GCP IaaS, PaaS, and GKE.
- Experience with monitoring, logging, and automation tools in GCP.
- Strong problem-solving, analytical, and communication skills.

What you'll love about working here: You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group, along with personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new parent support via flexible work. At Capgemini, you can work on cutting-edge projects in tech and engineering with industry leaders, or create solutions to overcome societal and environmental challenges.
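As a small illustration of the GKE work described above, the sketch below lists deployments with the official Kubernetes Python client, assuming cluster credentials were already fetched with gcloud; everything printed is whatever the cluster returns.

```python
# Minimal sketch: listing deployments on a GKE cluster with the official
# Kubernetes Python client, assuming kubectl credentials were already
# fetched (e.g. via `gcloud container clusters get-credentials`).
from kubernetes import client, config

config.load_kube_config()  # reads the active context from ~/.kube/config

apps = client.AppsV1Api()
for dep in apps.list_deployment_for_all_namespaces().items:
    ready = dep.status.ready_replicas or 0
    print(f"{dep.metadata.namespace}/{dep.metadata.name}: {ready}/{dep.spec.replicas} ready")
```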

Posted 1 week ago

Apply

3.0 - 5.0 years

5 - 8 Lacs

Chennai

Work from Office

" PLEASE READ THE JOB DESTCRIPTION AND APPLY" Data Engineer Job Description Position Overview Yesterday is history, tomorrow is a mystery, but today is a gift. That's why we call it the present. - Master Oogway Join CustomerLabs' dynamic data team as a Data Engineer and play a pivotal role in transforming raw marketing data into actionable insights that power our digital marketing platform. As a key member of our data infrastructure team, you will design, develop, and maintain robust data pipelines, data warehouses, and analytics platforms that serve as the backbone of our digital marketing product development. Sometimes the hardest choices require the strongest wills. - Thanos (but we promise, our data decisions are much easier! ) In this role, you will collaborate with cross-functional teams including Data Scientists, Product Managers, and Marketing Technology specialists to ensure seamless data flow from various marketing channels, ad platforms, and customer touchpoints to our analytics dashboards and reporting systems. You'll be responsible for building scalable, reliable, and efficient data solutions that can handle high-volume marketing data processing and real-time campaign analytics. What You'll Do: - Design and implement enterprise-grade data pipelines for marketing data ingestion and processing - Build and optimize data warehouses and data lakes to support digital marketing analytics - Ensure data quality, security, and compliance across all marketing data systems - Create data models and schemas that support marketing attribution, customer journey analysis, and campaign performance tracking - Develop monitoring and alerting systems to maintain data pipeline reliability for critical marketing operations - Collaborate with product teams to understand digital marketing requirements and translate them into technical solutions Why This Role Matters: I can do this all day. - Captain America (and you'll want to, because this role is that rewarding!) You'll be the backbone behind the data infrastructure that powers CustomerLabs' digital marketing platform, making marketers' lives easier and better. Your work directly translates to smarter automation, clearer insights, and more successful campaigns - helping marketers focus on what they do best while we handle the complex data heavy lifting. Sometimes you gotta run before you can walk. - Iron Man (and sometimes you gotta build the data pipeline before you can analyze the data! ) Our Philosophy: We believe in the power of data to transform lives, just like the Dragon Warrior transformed the Valley of Peace. Every line of code you write, every pipeline you build, and every insight you enable has the potential to change how marketers work and succeed. We're not just building data systems - we're building the future of digital marketing, one insight at a time. Your story may not have such a happy beginning, but that doesn't make you who you are. It is the rest of your story, who you choose to be. - Soothsayer What Makes You Special: We're looking for someone who embodies the spirit of both Captain America's unwavering dedication and Iron Man's innovative genius. You'll need the patience to build robust systems (like Cap's shield ) and the creativity to solve complex problems (like Tony's suit). Most importantly, you'll have the heart to make a real difference in marketers' lives. Inner peace... Inner peace... Inner peace... - Po (because we know data engineering can be challenging, but we've got your back! 
) Key Responsibilities Data Pipeline Development - Design, build, and maintain robust, scalable data pipelines and ETL/ELT processes - Develop data ingestion frameworks to collect data from various sources (databases, APIs, files, streaming sources) - Implement data transformation and cleaning processes to ensure data quality and consistency - Optimize data pipeline performance and reliability Data Infrastructure Management - Design and implement data warehouse architectures - Manage and optimize database systems (SQL and NoSQL) - Implement data lake solutions and data governance frameworks - Ensure data security, privacy, and compliance with regulatory requirements Data Modeling and Architecture - Design and implement data models for analytics and reporting - Create and maintain data dictionaries and documentation - Develop data schemas and database structures - Implement data versioning and lineage tracking Data Quality, Security, and Compliance - Ensure data quality, integrity, and consistency across all marketing data systems - Implement and monitor data security measures to protect sensitive information - Ensure privacy and compliance with regulatory requirements (e.g., GDPR, CCPA) - Develop and enforce data governance policies and best practices Collaboration and Support - Work closely with Data Scientists, Analysts, and Business stakeholders - Provide technical support for data-related issues and queries Monitoring and Maintenance - Implement monitoring and alerting systems for data pipelines - Perform regular maintenance and optimization of data systems - Troubleshoot and resolve data pipeline issues - Conduct performance tuning and capacity planning Required Qualifications Experience - 2+ years of experience in data engineering or related roles - Proven experience with ETL/ELT pipeline development - Experience with cloud data platform (GCP) - Experience with big data technologies Technical Skills - Programming Languages : Python, SQL, Golang (preferred) - Databases: PostgreSQL, MySQL, Redis - Big Data Tools: Apache Spark, Apache Kafka, Apache Airflow, DBT, Dataform - Cloud Platforms: GCP (BigQuery, Dataflow, Cloud run, Cloud SQL, Cloud Storage, Pub/Sub, App Engine, Compute Engine etc.) - Data Warehousing: Google BigQuery - Data Visualization: Superset, Looker, Metabase, Tableau - Version Control: Git, GitHub - Containerization: Docker Soft Skills - Strong problem-solving and analytical thinking - Excellent communication and collaboration skills - Ability to work independently and in team environments - Strong attention to detail and data quality - Continuous learning mindset Preferred Qualifications Additional Experience - Experience with real-time data processing and streaming - Knowledge of machine learning pipelines and MLOps - Experience with data governance and data catalog tools - Familiarity with business intelligence tools (Tableau, Power BI, Looker, etc.) - Experience using AI-powered tools (such as Cursor, Claude, Copilot, ChatGPT, Gemini, etc.) to accelerate coding, automate tasks, or assist in system design ( We belive run with machine, not against machine ) Interview Process 1. Initial Screening: Phone/video call with HR 2. Technical Interview: Deep dive into data engineering concepts 3. Final Interview: Discussion with senior leadership Note: This job description is intended to provide a general overview of the position and may be modified based on organizational needs and candidate qualifications. Our Team Culture We are Groot. 
- We work together, we grow together, we succeed together. We believe in: - Innovation First - Like Iron Man, we're always pushing the boundaries of what's possible - Team Over Individual - Like the Avengers, we're stronger together than apart - Continuous Learning - Like Po learning Kung Fu, we're always evolving and improving - Making a Difference - Like Captain America, we fight for what's right (in this case, better marketing!) Growth Journey There is no charge for awesomeness... or attractiveness. - Po Your journey with us will be like Po's transformation from noodle maker to Dragon Warrior: - Level 1 : Master the basics of our data infrastructure - Level 2: Build and optimize data pipelines - Level 3 : Lead complex data projects and mentor others - Level 4: Become a data engineering legend (with your own theme music! ) What We Promise I am Iron Man. - We promise you'll feel like a superhero every day! - Work that matters - Every pipeline you build helps real marketers succeed - Growth opportunities - Learn new technologies and advance your career - Supportive team - We've got your back, just like the Avengers - Work-life balance - Because even superheroes need rest!
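For a flavor of the pipeline work this posting describes, here is a minimal Airflow DAG sketch; the DAG id, task names, and task logic are hypothetical placeholders, not CustomerLabs code.

```python
# Minimal sketch: a daily Airflow DAG that extracts marketing events and
# loads them into a warehouse table. All names and logic are placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_events(**context):
    # Placeholder: pull campaign events from an ad-platform API.
    print("extracting events for", context["ds"])

def load_to_warehouse(**context):
    # Placeholder: write cleaned rows to the warehouse.
    print("loading events for", context["ds"])

with DAG(
    dag_id="marketing_events_daily",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_events)
    load = PythonOperator(task_id="load", python_callable=load_to_warehouse)
    extract >> load
```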

Posted 1 week ago

Apply

3.0 - 8.0 years

10 - 20 Lacs

Noida, New Delhi, Gurugram

Hybrid

Role & responsibilities:
- Strategically partner with the Customer Cloud Sales Team to identify and qualify business opportunities and identify key customer technical objections. Develop strategies to resolve technical obstacles and architect client solutions to meet complex business and technical requirements.
- Lead the technical aspects of the sales cycle, including technical trainings, client presentations, technical bid responses, product and solution briefings, and proof-of-concept technical work.
- Identify and respond to key technical objections from the client, providing prescriptive guidance for successful resolutions tailored to specific client needs.
- May work directly with the customer's cloud products to demonstrate, design, and prototype integrations in customer/partner environments.
- Develop and deliver thorough product messaging to highlight advanced technical value propositions, using techniques such as whiteboard and slide presentations, technical product demonstrations, white papers, trial management, and RFI response documents.
- Assess technical challenges to develop and deliver recommendations on integration strategies, enterprise architectures, platforms, and application infrastructure required to successfully implement a complete solution.
- Leverage technical expertise to provide best-practice counsel to optimize the effectiveness of advanced technical products.

Other critical functions and responsibilities:
- Ensure customer data is accurate and actionable using Salesforce.com (SFDC) systems
- Leverage third-party prospect and account intelligence tools to extract meaningful insights and support varying client needs
- Navigate, analyse, and interpret technical documentation for technical products, often including Customer Cloud products
- Enhance skills and knowledge by using a Learning Management Solution (LMS) for training and certification
- Serve as a technical and subject matter expert to support advanced trainings for team members on moderately to highly complex technical subjects
- Offer thought leadership in advanced technical solutions, such as cloud computing
- Coach and mentor team members and advise managers on creating business and process efficiencies in internal workflows and training materials
- Collect and codify best practices between sales, marketing, and sales engineers

Preferred candidate profile (required qualifications):
- Bachelor's degree in Computer Science or another technical field, or equivalent practical experience (preferred)
- 3-5 years of experience as a technical Sales Engineer in an advanced technical environment
- Prior experience with advanced technologies such as Big Data, PaaS, and IaaS
- Proven strong communication skills (written and verbal) with a proactive and positive approach to task management; a confident presenter with excellent presentation and persuasion skills
- Strong work ethic and ability to work independently

Perks and benefits

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

Maharashtra

On-site

The Data Chapter at our organization serves as a strategic partner, utilizing cutting-edge AI and ML technologies to drive data-led initiatives that help mitigate risk, generate revenue, and enhance operational efficiency. Our primary focus lies in ensuring data integrity and governance, delivering high-quality analysis, insights, and automation to facilitate data-driven decision-making throughout the organization. Through a collaborative approach, we provide timely and actionable solutions that align with the strategic objectives of the company.

As a member of our team, your responsibilities will include developing and implementing advanced data science models and algorithms to derive business insights and support decision-making processes. You will collaborate with cross-functional teams to identify and address complex business challenges using statistical analysis and machine learning techniques. Additionally, you will conduct risk and business model analysis to evaluate the impact of data-driven initiatives on the organization. We encourage exploration and experimentation with emerging technologies like generative AI to enhance our data science capabilities and foster innovation. Problem-solving skills, both independently and as part of a team, are essential for success in this role.

To qualify for this position, you should possess 2-6 years of experience in data science, machine learning, or a related field. Proficiency in statistical analysis, data mining, and the application of advanced analytical techniques is required. Hands-on experience in Python coding is a must, along with familiarity with cloud storage and computing; proficiency in Excel is preferred. You should also have practical experience in developing and deploying data science models and algorithms, including those involving generative AI. Strong problem-solving and critical thinking skills are highly valued, as are effective communication and collaboration abilities for working efficiently with cross-functional teams.

If you are seeking a challenging opportunity where you can contribute your expertise and grow professionally, we invite you to apply now. We offer a competitive salary and benefits package, along with the opportunity to work in a dynamic environment that supports your development and recognizes your achievements.
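As a small illustration of the Python modeling work mentioned above, the sketch below trains and scores a simple classifier on synthetic data with scikit-learn; the features and labels are made up for the example.

```python
# Minimal sketch: training and evaluating a simple risk-scoring model with
# scikit-learn. Synthetic data stands in for real business features.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 5))  # five synthetic features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
model = GradientBoostingClassifier().fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```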

Posted 1 week ago

Apply

7.0 - 11.0 years

0 Lacs

Bhubaneswar

On-site

You are a Senior Data Engineer with 7 to 11 years of experience, proficient in building and managing scalable, secure, and high-performance data pipelines and storage solutions. Your primary focus will be on Microsoft Azure data services.

In Requirement 1 (Azure Focus), your key responsibilities will include designing and implementing robust data pipelines using Azure Data Factory, Azure Data Lake, and other Azure services. You will develop and optimize large-scale data processing using Databricks and PySpark. Additionally, you will work with both relational (SQL) and NoSQL databases, write clean code in Python and/or Scala, and collaborate with cross-functional teams. To excel in this role, you must have 7-11 years of professional experience as a Data Engineer, deep expertise with Azure Data Services, solid experience with Databricks and PySpark, proficiency in SQL and NoSQL databases, and a strong background in Python, Scala, and object-oriented programming.

In Requirement 2 (GCP & Databricks Specialist), you will build, maintain, and optimize scalable data pipelines using Databricks and Google Cloud (BigQuery, Cloud Storage, etc.). You will collaborate with analytics and data science teams, perform exploratory data analysis (EDA), and ensure best practices in data modeling, governance, and lifecycle management. For this role, you should have 7-11 years of professional experience in data engineering, expert-level proficiency in Databricks, BigQuery, and Python, strong SQL skills with large datasets, exposure to data science concepts and EDA methodologies, and familiarity with CI/CD processes and version control systems.
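For illustration, a minimal PySpark cleaning step of the kind Requirement 1 describes: deduplicating raw records and writing partitioned output. The storage paths and column names are hypothetical placeholders.

```python
# Minimal sketch: a PySpark cleaning step that deduplicates raw records and
# writes partitioned Parquet. Paths assume an Azure Data Lake (abfss) setup
# and are purely illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("clean_events").getOrCreate()

raw = spark.read.json("abfss://raw@myaccount.dfs.core.windows.net/events/")
clean = (raw
         .dropDuplicates(["event_id"])
         .withColumn("event_date", F.to_date("event_ts"))
         .filter(F.col("user_id").isNotNull()))

(clean.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("abfss://curated@myaccount.dfs.core.windows.net/events/"))
```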

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As an AI/ML Engineer, you will be responsible for identifying, defining, and delivering AI/ML and GenAI use cases in collaboration with business and technical stakeholders. Your role will involve designing, developing, and deploying models using Google Cloud's Vertex AI platform. You will fine-tune and evaluate Large Language Models (LLMs) for domain-specific applications and ensure responsible AI practices and governance in solution delivery. Collaboration with data engineers and architects is essential to ensure robust and scalable pipelines, and you will document workflows and experiments for reproducibility and handover readiness. Your expertise in supervised, unsupervised, and reinforcement learning will be applied to develop solutions using Vertex AI features including AutoML, Pipelines, Model Registry, and Generative AI Studio.

In this role, you will work on GenAI workflows, including prompt engineering, fine-tuning, and model evaluation. Proficiency in Python is required for developing in ML frameworks such as TensorFlow, PyTorch, scikit-learn, and Hugging Face Transformers. Effective communication and collaboration across product, data, and business teams are crucial for the success of the projects.

The ideal candidate should have hands-on experience with Vertex AI on GCP for model training, deployment, endpoint management, and MLOps. Practical knowledge of PaLM, Gemini, or other LLMs via Vertex AI or open-source tools is preferred. Expertise in ML/GenAI libraries like scikit-learn, TensorFlow, PyTorch, Hugging Face, and LangChain is expected, as is experience with CI/CD for ML and containerization using Docker/Kubernetes; familiarity with GCP services like BigQuery, Cloud Functions, and Cloud Storage is advantageous. Knowledge of media datasets and real-world ML applications on OTT, DTH, and web platforms will be beneficial in this role.

Qualifications required for this position include a Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Data Science, or related fields, and at least 3 years of hands-on experience in ML/AI or GenAI projects. Relevant certifications in ML, GCP, or GenAI technologies are considered a plus.
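As a small illustration of the Hugging Face tooling named above, the sketch below scores text with a transformers pipeline; it uses the library's default sentiment model purely for demonstration.

```python
# Minimal sketch: scoring text with a Hugging Face pipeline, one of the
# building blocks this role combines with Vertex AI workflows. The model
# is the library default, used only for illustration.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
for text in ["The new release is fantastic.", "Streaming keeps buffering."]:
    result = classifier(text)[0]
    print(f"{text!r} -> {result['label']} ({result['score']:.2f})")
```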

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Kochi, Kerala

On-site

Beinex is seeking a skilled and motivated Google Cloud Consultant to join our dynamic team. As a Google Cloud Consultant, you will play a pivotal role in assisting our clients in harnessing the power of Google Cloud technologies to drive innovation and transformation. If you are passionate about cloud solutions, client collaboration, and cutting-edge technology, we invite you to join our journey.

Responsibilities:
- Collaborate with clients to understand their business objectives and technology needs, translating them into effective Google Cloud solutions
- Design, implement, and manage Google Cloud Platform (GCP) architectures, ensuring scalability, security, and performance
- Provide technical expertise and guidance to clients on GCP services, best practices, and cloud-native solutions, and adopt an Infrastructure as Code (IaC) approach to establish an advanced infrastructure for both internal and external stakeholders
- Conduct cloud assessments and create migration strategies for clients looking to transition their applications and workloads to GCP
- Work with cross-functional teams to plan, execute, and optimise cloud migrations, deployments, and upgrades
- Assist clients in optimising their GCP usage by analysing resource utilisation, recommending cost-saving measures, and enhancing overall efficiency
- Collaborate with development teams to integrate cloud-native technologies and solutions into application design and development processes
- Stay updated with the latest trends, features, and updates in the Google Cloud ecosystem and provide thought leadership to clients
- Troubleshoot and resolve technical issues related to GCP services and configurations
- Create and maintain documentation for GCP architectures, solutions, and best practices
- Conduct training sessions and workshops for clients to enhance their understanding of GCP technologies and usage

Key Skills Requirements:
- Profound expertise in Google Cloud Platform services, including but not limited to Compute Engine, App Engine, Kubernetes Engine, Cloud Storage, BigQuery, Pub/Sub, Cloud Functions, VPC, IAM, and Cloud Security
- Strong understanding of GCP networking concepts, including VPC peering, firewall rules, VPN, and hybrid cloud configurations
- Experience with Infrastructure as Code (IaC) tools such as Terraform or Google Cloud Deployment Manager
- Hands-on experience with containerisation technologies like Docker and Kubernetes
- Proficiency in scripting languages such as Python and Bash
- Familiarity with cloud monitoring, logging, and observability tools and practices
- Knowledge of DevOps principles and practices, including CI/CD pipelines and automation
- Strong problem-solving skills and the ability to troubleshoot complex technical issues
- Excellent communication skills to interact effectively with clients, team members, and stakeholders
- Previous consulting or client-facing experience is a plus
- Relevant Google Cloud certifications are highly desirable

Perks: Careers at Beinex
- Comprehensive health plans
- Learning and development
- Workation and outdoor training
- Hybrid working environment
- On-site travel opportunity
- Beinex branded merchandise

Posted 1 week ago

Apply

12.0 - 15.0 years

55 - 60 Lacs

Ahmedabad, Chennai, Bengaluru

Work from Office

Dear Candidate, we're hiring a Cloud Network Engineer to design and manage secure, performant cloud networks.

Key Responsibilities:
- Design VPCs, subnets, and routing policies.
- Configure load balancers, firewalls, and VPNs.
- Optimize traffic flow and network security.

Required Skills & Qualifications:
- Experience with cloud networking in AWS/Azure/GCP.
- Understanding of TCP/IP, DNS, VPNs.
- Familiarity with tools like Palo Alto, Cisco, or Fortinet.

Soft Skills:
- Strong troubleshooting and problem-solving skills.
- Ability to work independently and in a team.
- Excellent communication and documentation skills.

Note: If interested, please share your updated resume and a preferred time for a discussion. If shortlisted, our HR team will contact you.

Srinivasa Reddy Kandi
Delivery Manager
Integra Technologies

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You will be joining as a GCP Data Architect at TechMango, a rapidly growing IT Services and SaaS Product company located in Madurai and Chennai. The role requires over 12 years of experience, an immediate start, and working from the office. TechMango specializes in assisting global businesses with digital transformation, modern data platforms, product engineering, and cloud-first initiatives. In this role, you will lead data modernization efforts for a prestigious client, Livingston, in a highly strategic project.

As a GCP Data Architect, your primary responsibility will be to design and implement scalable, high-performance data solutions on Google Cloud Platform. You will collaborate closely with stakeholders to define data architecture, implement data pipelines, modernize legacy data systems, and guide data strategy aligned with enterprise goals.

Key Responsibilities:
- Lead end-to-end design and implementation of scalable data architecture on Google Cloud Platform (GCP)
- Define data strategy, standards, and best practices for cloud data engineering and analytics
- Develop data ingestion pipelines using Dataflow, Pub/Sub, Apache Beam, Cloud Composer (Airflow), and BigQuery
- Migrate on-prem or legacy systems to GCP (e.g., from Hadoop, Teradata, or Oracle to BigQuery)
- Architect data lakes, warehouses, and real-time data platforms
- Ensure data governance, security, lineage, and compliance (using tools like Data Catalog, IAM, DLP)
- Guide a team of data engineers and collaborate with business stakeholders, data scientists, and product managers
- Create documentation, high-level design (HLD) and low-level design (LLD), and oversee development standards
- Provide technical leadership in architectural decisions and future-proofing the data ecosystem

Required Skills & Qualifications:
- 10+ years of experience in data architecture, data engineering, or enterprise data platforms
- Minimum 3-5 years of hands-on experience with GCP data services
- Proficiency in BigQuery, Cloud Storage, Dataflow, Pub/Sub, Composer, and Cloud SQL/Spanner
- Python / Java / SQL
- Data modeling (OLTP, OLAP, star/snowflake schema)
- Experience with real-time data processing, streaming architectures, and batch ETL pipelines
- Good understanding of IAM, networking, security models, and cost optimization on GCP
- Prior experience leading cloud data transformation projects
- Excellent communication and stakeholder management skills

Preferred Qualifications:
- GCP Professional Data Engineer / Architect certification
- Experience with Terraform, CI/CD, GitOps, and Looker / Data Studio / Tableau for analytics
- Exposure to AI/ML use cases and MLOps on GCP
- Experience working in agile environments and client-facing roles

What We Offer:
- Opportunity to work on large-scale data modernization projects with global clients
- A fast-growing company with a strong tech and people culture
- Competitive salary, benefits, and flexibility
- Collaborative environment that values innovation and leadership

Posted 1 week ago

Apply

8.0 - 12.0 years

35 - 60 Lacs

Bengaluru

Work from Office

Job Summary: As a Software Engineer, your responsibilities will include developing and maintaining cloud-based solutions. You will focus on solving complex problems, developing, testing, automating, and collaborating with the Software Engineering team to deploy features in a production environment. Additionally, you will be responsible for designing and implementing managed cloud services based on given requirements. We expect you to have excellent coding skills and to take a lead role in designing and implementing managed cloud services. Prior experience with filesystems would be an added advantage, and you should have the ability to quickly learn existing code and architecture.

Job Requirements:
- Lead feature delivery, including participating in the full software development lifecycle.
- Participate in product design, development, verification, troubleshooting, and delivery of a system or major subsystems, including authoring project specifications.
- Excellent problem solver, proficient coder, and designer.
- Thorough understanding of and extensive experience with block/file technologies, with hands-on experience designing and developing software solutions.
- Proficient in any of C, C++, or Golang; experience with Python or Java/C# is an added advantage.
- Thorough understanding of Linux or other Unix-like operating systems.
- Strong in data structures and algorithms.
- Expertise in REST API design and implementation.
- Prior experience with filesystem development and distributed system design is desirable.
- Understanding of container-based technologies, preferably Kubernetes and Docker, and experience with cloud service APIs (e.g., AWS, Azure, or GCP) is desirable.
- Continuously monitor, analyze, and measure system health, availability, and latency using Google-native tooling.
- Develop and implement steps to improve system and application performance, availability, and reliability.
- Knowledge of infrastructure such as hypervisors and cloud storage, and experience with cloud services including databases, caching, object and block storage, scaling, monitoring, load balancers, networking, etc., is an added advantage.
- Mentor junior members, participate in interviews, and contribute to building high-performance teams.
- Work on development, bug fixes/updates, spec updates, customer RCAs, and automation.
- Strong oral and written communication skills.
- Engage in incident management processes, including 24x7 on-call rotations (per the follow-the-sun model), to resolve production issues within agreed SLAs/SLOs.

Education: B.E/B.Tech or M.S in Computer Science or a related technical field, with 8 to 12 years of experience; must be hands-on with coding.

Posted 2 weeks ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Chennai

Work from Office

Job Summary: Synechron is seeking an experienced Data Processing Engineer to lead the development of large-scale data processing solutions using Java, Apache Flink/Storm/Beam, and Google Cloud Platform (GCP). In this role, you will collaborate across teams to design, develop, and optimize data-intensive applications that support strategic business objectives. Your expertise will help evolve our data architecture, improve processing efficiency, and ensure the delivery of reliable, scalable solutions in an Agile environment.

Software Requirements
Required:
- Java (version 8 or higher)
- Apache Flink, Storm, or Beam for streaming data processing
- Google Cloud Platform (GCP) services, especially BigQuery and related data tools
- Experience with databases such as BigQuery, Oracle, or equivalent
- Familiarity with version control tools such as Git
Preferred:
- Cloud deployment experience, with GCP in particular
- Additional familiarity with containerization (Docker/Kubernetes)
- Knowledge of CI/CD pipelines and DevOps practices

Overall Responsibilities
- Collaborate closely with cross-functional teams to understand data and system requirements, then design scalable solutions aligned with business needs.
- Develop detailed technical specifications, implementation plans, and documentation for new features and enhancements.
- Implement, test, and deploy data processing applications using Java and Apache Flink/Storm/Beam within GCP environments.
- Conduct code reviews to ensure quality, security, and maintainability, supporting team members' growth and best practices.
- Troubleshoot technical issues, resolve bottlenecks, and optimize application performance and resource utilization.
- Stay current with advancements in data processing, cloud technology, and Java development to continuously improve solutions.
- Support testing teams to verify data workflows and validation processes, ensuring reliability and accuracy.
- Participate in Agile ceremonies, including sprint planning, stand-ups, and retrospectives, to ensure continuous delivery and process improvement.

Technical Skills (by category)
- Programming languages - required: Java (8+); preferred: Python, Scala, or Node.js for scripting or auxiliary processing
- Databases/data management: experience with BigQuery, Oracle, or similar relational data stores
- Cloud technologies: GCP (BigQuery, Cloud Storage, Dataflow, etc.) with hands-on experience in cloud data solutions
- Frameworks and libraries: Apache Flink, Storm, or Beam for stream processing; Java SDKs, APIs, and data integration libraries
- Development tools and methodologies: Git, Jenkins, JIRA, and Agile/Scrum practices; familiarity with containerization (Docker, Kubernetes) is a plus
- Security and compliance: understanding of data security principles in cloud environments

Experience Requirements
- 4+ years of experience in software development, with a focus on data processing and Java-based backend development
- Proven experience working with Apache Flink, Storm, or Beam in production environments
- Strong background in managing large data workflows and pipeline optimization
- Experience with GCP data services and cloud-native development
- Demonstrated success in Agile projects, including collaboration with cross-functional teams
- Previous leadership or mentorship experience is a plus

Day-to-Day Activities
- Design, develop, and deploy scalable data processing applications in Java using Flink/Storm/Beam on GCP
- Collaborate with data engineers, analysts, and architects to translate business needs into technical solutions
- Conduct code reviews, optimize data pipelines, and troubleshoot system issues swiftly
- Document technical specifications, data schemas, and process workflows
- Participate actively in Agile ceremonies, provide updates on task progress, and suggest process improvements
- Support continuous integration and deployment of data applications
- Mentor junior team members, sharing best practices and technical insights

Qualifications
- Bachelor's or Master's degree in Computer Science, Information Technology, or equivalent
- Relevant certifications in cloud technologies or data processing (preferred)
- Evidence of continuous professional development and staying current with industry trends

Professional Competencies
- Strong analytical and problem-solving skills focused on data processing challenges
- Leadership abilities to guide, mentor, and develop team members
- Excellent communication skills for technical documentation and stakeholder engagement
- Adaptability to rapidly changing technologies and project priorities
- Capacity to prioritize tasks and manage time efficiently under tight deadlines
- Innovative mindset to leverage new tools and techniques for performance improvements

Posted 2 weeks ago

Apply

5.0 - 10.0 years

6 - 16 Lacs

Kolkata, Bengaluru, Mumbai (All Areas)

Work from Office

Responsibilities
A day in the life of an Infoscion:
• As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions, and facilitate deployment resulting in client delight.
• You will develop a proposal by owning parts of the proposal document and by giving inputs in solution design based on areas of expertise.
• You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots, and assist in resolving any queries related to requirements and solution design.
• You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates that suit the customer's budgetary requirements and are in line with the organization's financial guidelines.
• You will actively lead small projects and contribute to unit-level and organizational initiatives, with the objective of providing high-quality, value-adding solutions to customers.
If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
• Ability to develop value-creating strategies and models that enable clients to innovate, drive growth, and increase their business profitability
• Good knowledge of software configuration management systems
• Awareness of the latest technologies and industry trends
• Logical thinking and problem-solving skills, along with an ability to collaborate
• Understanding of the financial processes for various types of projects and the various pricing models available
• Ability to assess current processes, identify improvement areas, and suggest technology solutions
• Knowledge of one or two industry domains
• Client interfacing skills
• Project and team management

Technical and Professional Requirements:
• Technology->Cloud Platform->GCP Data Analytics->Looker
• Technology->Cloud Platform->GCP Database->Google BigQuery

Preferred Skills:
• Technology->Cloud Platform->Google Big Data
• Technology->Cloud Platform->GCP Data Analytics
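For context on the Google BigQuery skill this listing calls out, here is a minimal sketch of querying BigQuery from Python with the google-cloud-bigquery client. It is an illustration only, not part of the original listing: the project, dataset, table, and column names are hypothetical placeholders.

import google.cloud.bigquery as bigquery

# Uses application-default credentials and the default project.
client = bigquery.Client()

# Placeholder query against a hypothetical analytics dataset.
sql = """
    SELECT customer_id, COUNT(*) AS order_count
    FROM `my-project.analytics.orders`
    GROUP BY customer_id
    ORDER BY order_count DESC
    LIMIT 10
"""

# query() submits the job; result() blocks until rows are available.
for row in client.query(sql).result():
    print(row.customer_id, row.order_count)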

Posted 2 weeks ago

Apply

1.0 - 5.0 years

0 Lacs

Karnataka

On-site

Capgemini Invent is the digital innovation, consulting, and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science, and creative design to help CxOs envision and build what's next for their businesses.

In this role, you should have developed or worked on at least one Gen AI project and have experience implementing data pipelines with cloud providers such as AWS, Azure, or GCP. You should also be familiar with cloud storage, cloud databases, cloud data warehousing, and data lake solutions such as Snowflake, BigQuery, AWS Redshift, ADLS, and S3. A good understanding of cloud compute services, load balancing, identity management, authentication, and authorization in the cloud is essential.

Your profile should include good knowledge of infrastructure capacity sizing and the costing of cloud services, to drive optimized solution architecture that balances infrastructure investment against performance and scaling. You should be able to contribute to architectural choices across various cloud services and solution methodologies. Proficiency in programming with Python is required, along with expertise in cloud DevOps practices such as infrastructure as code, CI/CD components, and automated deployments on the cloud. An understanding of networking, security, design principles, and best practices in the cloud is also important.

At Capgemini, we value flexible work arrangements to support a healthy work-life balance. You will have opportunities for career growth through various career growth programs and diverse professions tailored to support you in exploring a world of opportunities. You can also equip yourself with valuable certifications in the latest technologies, such as Generative AI.

Capgemini is a global business and technology transformation partner with a rich heritage of over 55 years. We have a diverse team of 340,000 members in more than 50 countries, working together to accelerate the dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. Trusted by clients to unlock the value of technology, we deliver end-to-end services and solutions leveraging strengths from strategy and design to engineering, fueled by market-leading capabilities in AI, cloud, and data, combined with deep industry expertise and a partner ecosystem. Our global revenues in 2023 were reported at €22.5 billion.
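To make the data-pipeline expectation above concrete, here is a minimal sketch of a cloud warehouse load in Python using the google-cloud-bigquery client: loading a CSV from Cloud Storage into a BigQuery table. The project, bucket, and table names are hypothetical placeholders, and the same pattern applies analogously to Snowflake or Redshift loads.

from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project ID

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,   # skip the CSV header row
    autodetect=True,       # infer the table schema from the file
)

load_job = client.load_table_from_uri(
    "gs://my-bucket/raw/orders.csv",   # placeholder source object
    "my-project.analytics.orders",     # placeholder destination table
    job_config=job_config,
)
load_job.result()  # block until the load job completes

table = client.get_table("my-project.analytics.orders")
print(f"Destination table now has {table.num_rows} rows")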

Posted 2 weeks ago

Apply

4.0 - 9.0 years

12 - 17 Lacs

Chennai

Work from Office

Job Summary
Synechron is seeking an experienced Data Processing Engineer to lead the development of large-scale data processing solutions using Java, Apache Flink/Storm/Beam, and Google Cloud Platform (GCP). In this role, you will collaborate across teams to design, develop, and optimize data-intensive applications that support strategic business objectives. Your expertise will help evolve our data architecture, improve processing efficiency, and ensure the delivery of reliable, scalable solutions in an Agile environment.

Software Requirements
Required:
• Java (version 8 or higher)
• Apache Flink, Storm, or Beam for streaming data processing
• Google Cloud Platform (GCP) services, especially BigQuery and related data tools
• Experience with databases such as BigQuery, Oracle, or equivalent
• Familiarity with version control tools such as Git
Preferred:
• Cloud deployment experience, with GCP in particular
• Familiarity with containerization (Docker/Kubernetes)
• Knowledge of CI/CD pipelines and DevOps practices

Overall Responsibilities
• Collaborate closely with cross-functional teams to understand data and system requirements, then design scalable solutions aligned with business needs.
• Develop detailed technical specifications, implementation plans, and documentation for new features and enhancements.
• Implement, test, and deploy data processing applications using Java and Apache Flink/Storm/Beam within GCP environments.
• Conduct code reviews to ensure quality, security, and maintainability, supporting team members' growth and best practices.
• Troubleshoot technical issues, resolve bottlenecks, and optimize application performance and resource utilization.
• Stay current with advancements in data processing, cloud technology, and Java development to continuously improve solutions.
• Support testing teams in verifying data workflows and validation processes, ensuring reliability and accuracy.
• Participate in Agile ceremonies, including sprint planning, stand-ups, and retrospectives, to ensure continuous delivery and process improvement.

Technical Skills (By Category)
Programming Languages:
• Required: Java (8+)
• Preferred: Python, Scala, or Node.js for scripting or auxiliary processing
Databases/Data Management:
• Experience with BigQuery, Oracle, or similar relational data stores
Cloud Technologies:
• GCP (BigQuery, Cloud Storage, Dataflow, etc.), with hands-on experience in cloud data solutions
Frameworks and Libraries:
• Apache Flink, Storm, or Beam for stream processing
• Java SDKs, APIs, and data integration libraries
Development Tools and Methodologies:
• Git, Jenkins, JIRA, and Agile/Scrum practices
• Familiarity with containerization (Docker, Kubernetes) is a plus
Security and Compliance:
• Understanding of data security principles in cloud environments

Experience Requirements
• 4+ years of experience in software development, with a focus on data processing and Java-based backend development
• Proven experience working with Apache Flink, Storm, or Beam in production environments
• Strong background in managing large data workflows and pipeline optimization
• Experience with GCP data services and cloud-native development
• Demonstrated success in Agile projects, including collaboration with cross-functional teams
• Previous leadership or mentorship experience is a plus

Day-to-Day Activities
• Design, develop, and deploy scalable data processing applications in Java using Flink/Storm/Beam on GCP
• Collaborate with data engineers, analysts, and architects to translate business needs into technical solutions
• Conduct code reviews, optimize data pipelines, and troubleshoot system issues swiftly
• Document technical specifications, data schemas, and process workflows
• Participate actively in Agile ceremonies, provide updates on task progress, and suggest process improvements
• Support continuous integration and deployment of data applications
• Mentor junior team members, sharing best practices and technical insights

Qualifications
• Bachelor's or Master's degree in Computer Science, Information Technology, or equivalent
• Relevant certifications in cloud technologies or data processing (preferred)
• Evidence of continuous professional development and staying current with industry trends

Professional Competencies
• Strong analytical and problem-solving skills focused on data processing challenges
• Leadership abilities to guide, mentor, and develop team members
• Excellent communication skills for technical documentation and stakeholder engagement
• Adaptability to rapidly changing technologies and project priorities
• Capacity to prioritize tasks and manage time efficiently under tight deadlines
• Innovative mindset to leverage new tools and techniques for performance improvements

SYNECHRON'S DIVERSITY & INCLUSION STATEMENT
Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative "Same Difference" is committed to fostering an inclusive culture promoting equality, diversity, and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company. We encourage applicants from across diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.
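As a rough illustration of the streaming stack this role names, here is a minimal Apache Beam sketch, shown in Beam's Python SDK for brevity even though the role itself centers on Java: it reads messages from a Pub/Sub topic and appends them to a BigQuery table. The topic, table, and schema names are hypothetical placeholders; on GCP a pipeline like this would typically run on the DataflowRunner.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions


def run():
    # Enable streaming mode; runner and project would come from pipeline args.
    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True

    with beam.Pipeline(options=options) as p:
        (
            p
            # Hypothetical Pub/Sub topic; messages arrive as raw bytes.
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/events")
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            # Hypothetical destination table and schema.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                schema="event_id:STRING,ts:TIMESTAMP,payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
        )


if __name__ == "__main__":
    run()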

Posted 2 weeks ago

Apply

Featured Companies