8.0 - 13.0 years
20 - 30 Lacs
Gurugram
Work from Office
Hi, Greetings from GSN! Pleasure connecting with you. We have been in corporate search services for the last 20 years, identifying and placing talented professionals with our reputed IT and non-IT clients in India. At present, GSN is hiring a GCP ENGINEER for one of our leading MNC clients. Please find the details below:
1. Work location: Gurugram
2. Job role: GCP Engineer
3. Experience: 8+ yrs
4. CTC range: Rs. 20 LPA to Rs. 30 LPA
5. Work type: WFO (Hybrid)
****** Looking for IMMEDIATE JOINER ******
Who are we looking for? An MLOps engineer with AWS experience.
Required skills:
- GCP Architect certification
- Terraform
- GitLab
- Shell scripting
- GCP services: Compute Engine, Cloud Storage, Dataflow, BigQuery, IAM
Best regards,
Kaviya | GSN | Kaviya@gsnhr.net | 9150016092 | Google review: https://g.co/kgs/UAsF9W
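For a flavor of the GCP services listed above (Cloud Storage in particular), here is a minimal sketch using the google-cloud-storage Python client; the bucket and file names are hypothetical placeholders, not anything specified by the posting.

```python
# Minimal sketch: upload a file to a GCS bucket with the official Python client.
# Bucket and object names are hypothetical placeholders.
from google.cloud import storage

def upload_blob(bucket_name: str, source_file: str, dest_blob: str) -> None:
    client = storage.Client()  # uses Application Default Credentials
    bucket = client.bucket(bucket_name)
    blob = bucket.blob(dest_blob)
    blob.upload_from_filename(source_file)
    print(f"Uploaded {source_file} to gs://{bucket_name}/{dest_blob}")

if __name__ == "__main__":
    upload_blob("example-bucket", "report.csv", "reports/report.csv")
```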
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
hyderabad, telangana
On-site
You will be responsible for leading the migration of on-premises applications to AWS/Azure, optimizing cloud infrastructure, and ensuring seamless transitions. Your main duties will include planning and executing migrations, developing migration tools for large-scale applications, designing and implementing automated application migrations, and collaborating with cross-functional teams to troubleshoot and resolve migration issues. To excel in this role, you should have at least 8 years of experience in AWS/Azure cloud migrations. You must be proficient in cloud compute (EC2, EKS, Azure VM, AKS) and storage (S3, EBS, EFS, Azure Blob, Azure Managed Disks, Azure Files). Additionally, you should have strong knowledge of AWS and Azure cloud services and migration tools, as well as expertise in Terraform. AWS/Azure certification is preferred but not mandatory. If you are a skilled cloud engineer with expertise in AWS cloud migrations and are looking for an opportunity to lead migration projects, optimize cloud infrastructure, and ensure successful transitions to the cloud, we encourage you to apply for this role.
Posted 2 weeks ago
2.0 - 5.0 years
4 - 7 Lacs
Mumbai
Work from Office
Role not for you, but know the perfect person for it? Refer a friend and make Rs 10K if successfully placed :) Refer & Earn!

About the brand: A fast-growing D2C consumer brand redefining comfort and wellness in the sleep space. With a mission to blend thoughtful design, cutting-edge technology, and superior customer experience, we are changing how India sleeps, one mattress at a time. Our content-first approach helps us connect deeply with a modern, digital-first audience across platforms.

Key responsibilities:
- Lead the post-production process for both short-form and long-form videos, from raw footage to final output.
- Collaborate with the content, performance, and brand teams to bring compelling visual narratives to life.
- Develop performance marketing videos, thumb-stopper content, and motion graphic-heavy reels optimized for Instagram, YouTube, and other digital platforms.
- Maintain visual and tonal consistency across all video assets, aligned with brand guidelines.
- Own end-to-end video projects, from conceptualization, scripting assistance, editing, and sound design to final delivery.
- Incorporate modern editing trends, transitions, sound cues, typography, and visual effects that engage and convert.
- Handle cloud-based editing workflows and collaborate with internal/external stakeholders for feedback and revisions.
- Regularly research and adopt trending formats and storytelling techniques relevant to D2C and e-commerce audiences.

Must-have skills:
- Proficiency in Adobe Premiere Pro, After Effects, DaVinci Resolve, and Photoshop.
- Hands-on experience with motion graphics, text animation, and sound design.
- Strong storytelling ability with a keen eye for pacing, transitions, and emotional flow.
- Familiarity with cloud storage solutions (e.g., Drive, Frame.io, Dropbox) for collaborative editing workflows.
Posted 2 weeks ago
10.0 - 15.0 years
35 - 40 Lacs
Bengaluru
Work from Office
- Proficiency in Google Cloud Platform (GCP) services, including Dataflow, Datastream, Dataproc, BigQuery, and Cloud Storage.
- Strong experience with Apache Spark and Apache Flink for distributed data processing.
- Knowledge of real-time data streaming technologies (e.g., Apache Kafka, Pub/Sub).
- Familiarity with data orchestration tools like Apache Airflow or Cloud Composer.
- Expertise in Infrastructure as Code (IaC) tools like Terraform or Cloud Deployment Manager.
- Experience with CI/CD tools like Jenkins, GitLab CI/CD, or Cloud Build.
- Knowledge of containerization and orchestration tools like Docker and Kubernetes.
- Strong scripting skills for automation (e.g., Bash, Python).
- Experience with monitoring tools like Cloud Monitoring, Prometheus, and Grafana.
- Familiarity with logging tools like Cloud Logging or the ELK Stack.
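To make the Dataflow/Apache Beam requirement concrete, here is a minimal batch pipeline sketch in Python: read CSV from Cloud Storage, parse, and write to BigQuery. The project, bucket, table, and schema are hypothetical, and the options shown are illustrative rather than a prescribed setup.

```python
# Minimal Apache Beam batch pipeline of the kind this role describes:
# read from Cloud Storage, transform, write to BigQuery. Project, bucket,
# and table names are hypothetical; runner options are illustrative.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def run():
    options = PipelineOptions(
        runner="DataflowRunner",      # or "DirectRunner" for local testing
        project="example-project",
        region="us-central1",
        temp_location="gs://example-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/events/*.csv")
            | "Parse" >> beam.Map(lambda line: dict(zip(["user", "event"], line.split(","))))
            | "Write" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                schema="user:STRING,event:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )

if __name__ == "__main__":
    run()
```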
Posted 2 weeks ago
3.0 - 8.0 years
5 - 15 Lacs
Chennai
Work from Office
Role & responsibilities:
- Assist in executing cloud migration tasks, including VM migrations, database transfers, and application re-platforming.
- Perform GCP resource provisioning using Deployment Manager or Terraform (see the sketch below).
- Collaborate with senior engineers on lift-and-shift or re-architecture engagements.
- Troubleshoot basic networking, IAM, and storage issues in GCP.

Preferred candidate profile:
- GCP Associate Cloud Engineer certification (mandatory)
- Hands-on experience in at least one GCP migration project (even as a support resource)
- Strong understanding of GCP core services: Compute Engine, Cloud Storage, VPC, IAM
- Familiarity with CI/CD tools and scripting (Bash, Python)

Nice-to-have skills:
- Exposure to Kubernetes (GKE) or Docker
- Familiarity with hybrid/multi-cloud tools

Interested candidates, please share your resume with valarmathi.venkatesan@securekloud.com
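The posting asks for provisioning with Deployment Manager or Terraform; as a rough illustration of the same provisioning idea, and deliberately using a different tool than the posting names, here is a sketch with the google-cloud-compute Python client. The project, zone, and machine settings are hypothetical.

```python
# Illustrative GCP provisioning sketch (Python client, not Terraform):
# create a small Compute Engine VM. All names are hypothetical.
from google.cloud import compute_v1

def create_vm(project: str, zone: str, name: str) -> None:
    instance = compute_v1.Instance(
        name=name,
        machine_type=f"zones/{zone}/machineTypes/e2-medium",
        disks=[compute_v1.AttachedDisk(
            boot=True,
            auto_delete=True,
            initialize_params=compute_v1.AttachedDiskInitializeParams(
                source_image="projects/debian-cloud/global/images/family/debian-12",
            ),
        )],
        network_interfaces=[compute_v1.NetworkInterface(network="global/networks/default")],
    )
    client = compute_v1.InstancesClient()
    operation = client.insert(project=project, zone=zone, instance_resource=instance)
    operation.result()  # block until the create operation completes

if __name__ == "__main__":
    create_vm("example-project", "us-central1-a", "migration-test-vm")
```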
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Data Engineer, your primary responsibility will be to design and develop robust ETL pipelines using Python, PySpark, and various Google Cloud Platform (GCP) services. You will build and optimize data models and queries in BigQuery to support analytics and reporting needs, and you will ingest, transform, and load structured and semi-structured data from diverse sources. Collaboration with data analysts, scientists, and business teams is essential to understand and address data requirements effectively. Ensuring data quality, integrity, and security across cloud-based data platforms will be a key part of your role, as will monitoring and troubleshooting data workflows and performance issues. Automating data validation and transformation processes using scripting and orchestration tools will be a significant part of your day-to-day tasks. Hands-on experience with Google Cloud Platform (GCP), particularly BigQuery, is crucial. Proficiency in Python and/or PySpark, along with experience designing and implementing ETL workflows and data pipelines, is required, as is a strong command of SQL and data modeling for analytics. Familiarity with GCP services like Cloud Storage, Dataflow, Pub/Sub, and Composer will be beneficial, and an understanding of data governance, security, and compliance in cloud environments is expected. Experience with version control using Git and agile development practices will be advantageous for this role.
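For a sense of the ETL work described, here is a minimal PySpark sketch: read semi-structured JSON from Cloud Storage, apply a basic quality filter, and load into BigQuery. It assumes the spark-bigquery connector is on the classpath (standard on Dataproc); the bucket, dataset, and column names are invented placeholders.

```python
# Minimal PySpark ETL sketch matching the responsibilities above: read
# semi-structured JSON from Cloud Storage, transform, load into BigQuery.
# Assumes the spark-bigquery connector is available; all names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("gcs-to-bigquery-etl").getOrCreate()

raw = spark.read.json("gs://example-bucket/raw/orders/*.json")

cleaned = (
    raw.filter(F.col("order_id").isNotNull())          # basic data-quality gate
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .select("order_id", "customer_id", "amount", "order_ts")
)

(cleaned.write.format("bigquery")
    .option("table", "example-project.analytics.orders")
    .option("temporaryGcsBucket", "example-bucket-tmp")
    .mode("append")
    .save())
```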
Posted 2 weeks ago
10.0 - 20.0 years
18 - 33 Lacs
Bengaluru
Remote
- Minimum 3-4 years of hands-on experience designing, deploying, and managing solutions on Google Cloud.
- Proven understanding and practical application of core GCP services, including Virtual Private Cloud (VPC), Identity and Access Management (IAM), Cloud Operations (formerly Stackdriver) for monitoring and logging, and GCP security best practices.
- Demonstrated experience working as a cloud engineer within a financial institution, understanding its specific regulatory and security demands.
- Solid experience in deploying, managing, and scaling containerized applications using Google Kubernetes Engine.
- Proficiency with tools commonly used with Kubernetes, such as Helm for package management and Istio for service mesh capabilities.
- Extensive experience with Terraform for provisioning and managing cloud infrastructure; experience with Terraform Sentinel for policy-as-code and governance is a significant advantage.
- Proven experience in building and maintaining CI/CD pipelines using GitHub Actions.
- Deep understanding of cloud security principles and experience implementing security controls within a GCP environment, particularly relevant to financial services.
- Experience in monitoring, troubleshooting, and optimizing cloud environments for performance, reliability, and cost-effectiveness.
- Ability to work effectively within a team, communicate technical concepts clearly, and collaborate with developers and other stakeholders.
- Track record of identifying and resolving complex technical issues in cloud environments, especially within the constraints of a regulated financial industry.
Posted 2 weeks ago
16.0 - 25.0 years
32 - 47 Lacs
Pune
Work from Office
MS Dynamics 365 CRM Architect
Function: Software Engineering - Backend Development, Big Data / DWH / ETL, Solution Architecture / Presales - Microsoft CRM Architecture

As a Dynamics 365 CRM Solution Architect, you are considered a solution specialist with in-depth knowledge of D365 CRM/CE and the Power Platform. You are energetic and passionate about Dynamics 365 CE and the Power Platform and how they can be used to solve complex business problems. You have in-depth knowledge of Dynamics 365 CE apps, including Field Service. You will help define the technical solution, providing expert services to architect large, complex projects. You will provide leadership and direction to the delivery team, along with architecture guidance drawing on your deep knowledge of Dynamics 365 CE and the Power Platform. You are also seen as a technology leader within the Dynamics 365 community, and you routinely share your knowledge on leading-edge topics at community and internal events.

Responsibilities:
- Pre-sales, i.e., responding to RFIs/RFPs, proposal writing, and man-day estimates.
- As CRM Application Lead, lead the delivery of our Microsoft Dynamics 365 solutions.
- Minimum of 10+ years of proven experience implementing end-to-end, large-scale solutions involving Dynamics CE (CRM), with multi-country rollouts of at least 7-8 full-scale industry solutions using Dynamics CE (CRM), integrations, and the Power Platform.
- Be a champion/SME (subject matter expert) for both technical and functional solutions across the Microsoft platform, driving the adoption of new features and technology.
- Possess a strong understanding of the Dynamics 365 and Power Platform stack and stay current with new features.
- Design and develop aviation domain-specific implementations; influence usage of out-of-the-box Dynamics 365 functionality, giving NCS a competitive advantage.
- Meet and exceed customer expectations on business knowledge, skills, and behaviour.
- Identify business/project risk and mitigate or communicate as necessary; ensure progress updates are communicated to relevant parties both formally and informally.
- In-depth knowledge of all modules of Dynamics 365 CE, including appropriate architect-level certifications.
- Relevant experience with Dynamics 365 Sales, Marketing, and Customer Insights.
- Strong implementation experience with the Power Platform (Power Apps, Power Automate, Power Virtual Agents, Power BI, RPA, etc.), including appropriate architect-level certification.
- Broad understanding of the software development lifecycle, development management, release management, data migration, cut-over planning, and go-live support.
- Exposure to cloud storage services (Azure SQL, Blob Storage, etc.).
- Experience with the Azure platform and integration services, including Logic Apps, Functions, Azure Service Bus, etc.
- Experience working with onsite-offshore teams.
- Experience with Agile/Waterfall methodologies.
- Ability to architect solutions that involve multi-region implementations.
- Strong technical leadership for engagements and the ability to present to C-suite stakeholders.
- Ability to handle large, distributed teams and deliver quality solutions.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Data Engineer, you will be responsible for designing and developing robust ETL pipelines using Python, PySpark, and Google Cloud Platform (GCP) services. Your role will involve building and optimizing data models and queries in BigQuery for analytics and reporting purposes. You will also be responsible for ingesting, transforming, and loading structured and semi-structured data from various sources. Collaboration with data analysts, scientists, and business teams to understand data requirements will be a key aspect of your job. Ensuring data quality, integrity, and security across cloud-based data platforms is crucial, as is monitoring and troubleshooting data workflows and performance issues. Automating data validation and transformation processes using scripting and orchestration tools will be an essential part of your role. You are required to have hands-on experience with Google Cloud Platform (GCP), especially BigQuery. Strong programming skills in Python and/or PySpark are necessary for this position, and your experience in designing and implementing ETL workflows and data pipelines will be valuable. Proficiency in SQL and data modeling for analytics is required. Familiarity with GCP services such as Cloud Storage, Dataflow, Pub/Sub, and Composer is preferred, along with an understanding of data governance, security, and compliance in cloud environments. Experience with version control tools like Git and agile development practices will be beneficial. If you are looking for a challenging opportunity to work on cutting-edge data engineering projects, this position is ideal for you.
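Since Composer is named among the preferred services, here is a minimal Cloud Composer (Airflow) DAG sketch showing the orchestration pattern this role touches on: a scheduled BigQuery transformation. The DAG id, SQL, and table names are hypothetical.

```python
# Illustrative Cloud Composer (Airflow) DAG: run a daily BigQuery
# transformation. SQL, dataset, and DAG id are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_orders_transform",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    transform = BigQueryInsertJobOperator(
        task_id="aggregate_orders",
        configuration={
            "query": {
                "query": """
                    SELECT customer_id, SUM(amount) AS total
                    FROM `example-project.analytics.orders`
                    GROUP BY customer_id
                """,
                "useLegacySql": False,
            }
        },
    )
```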
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
hyderabad, telangana
On-site
As a skilled cloud engineer with expertise in AWS cloud migrations, you will play a crucial role in leading the migration of on-premises applications to AWS/Azure. Your primary responsibilities will include planning and executing migrations, optimizing cloud infrastructure, and ensuring seamless transitions for large-scale application migrations. You will be expected to utilize or develop migration tools, design and implement automated application migrations, and collaborate effectively with cross-functional teams to troubleshoot and resolve migration issues. Your 8+ years of experience in AWS/Azure cloud migration, proficiency in cloud compute and storage services, strong knowledge of AWS and Azure cloud services and migration tools, and expertise in Terraform will be key assets in fulfilling these responsibilities. AWS/Azure certification is considered a strong advantage. This role offers an exciting opportunity to make significant contributions to the cloud migration process and to play a vital role in optimizing cloud infrastructure for enhanced performance and efficiency.
Posted 2 weeks ago
3.0 - 8.0 years
10 - 15 Lacs
Navi Mumbai
Work from Office
Roles & responsibilities:
- Front-end development: Design and develop user interfaces using HTML, CSS, JavaScript, and frameworks like React or Angular.
- Back-end development: Build and maintain server-side applications using languages like Node.js, Python, or Java.
- Database management: Design and manage databases, ensuring data integrity and security.
- API development: Develop and integrate RESTful APIs for seamless communication between front-end and back-end systems.
- Cloud integration: Implement cloud solutions using Google Cloud Platform (GCP) services such as Cloud Functions, Cloud Storage, and BigQuery.
- UI/UX design: Ensure a user-friendly and visually appealing design for all applications.
- Collaboration: Work closely with data engineers, data scientists, and other stakeholders to understand requirements and deliver solutions.
- Testing and debugging: Perform thorough testing and debugging to ensure high-quality code.

Technical skills required:
- Front-end technologies: Proficiency in HTML, CSS, JavaScript, and front-end frameworks like React or Angular.
- Back-end technologies: Experience with server-side languages like Node.js, Python, or Java.
- Database management: Knowledge of SQL and NoSQL databases such as MySQL, PostgreSQL, MongoDB, or Firebase.
- API development: Skills in developing and integrating RESTful APIs.
- Cloud services: Familiarity with Google Cloud Platform (GCP) services like Cloud Functions, Cloud Storage, and BigQuery.
- UI/UX design: Understanding of UI/UX principles and best practices.
- Version control: Proficiency in version control systems like Git.

Qualifications, experience, or prerequisites:
- Education: Bachelor's degree in Computer Science, Software Engineering, or a related field.
- Experience: 3+ years of experience in full stack development.
- Projects: Proven experience in developing and deploying full stack applications.
- Certifications: Relevant certifications in software development or cloud computing are a plus.
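As an illustration of the Cloud Functions item above, here is a minimal HTTP function sketch using the Functions Framework for Python; the endpoint and its behavior are invented for illustration.

```python
# Minimal Cloud Functions sketch: an HTTP handler using the Functions
# Framework for Python. The endpoint behavior and field names are hypothetical.
import functions_framework

@functions_framework.http
def hello(request):
    """Return a JSON greeting; expects an optional ?name= query parameter."""
    name = request.args.get("name", "world")
    return {"message": f"Hello, {name}!"}  # dict responses are serialized as JSON
```

During development, such a handler can be run locally with `functions-framework --target hello` before being deployed behind an HTTPS trigger.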
Posted 2 weeks ago
12.0 - 20.0 years
5 - 9 Lacs
Pune
Work from Office
The IBM Storage Protect support team (Spectrum Protect, erstwhile TSM) supports complex integrated storage products end to end, including Spectrum Protect, Spectrum Protect Plus, and Copy Data Management. This position involves working remotely with our IBM customers, who include some of the world's top research, automotive, banking, health care, and technology providers. Candidates must be able to assist with operating systems (AIX, Linux, Unix, Windows), SAN, network protocols, clouds, and storage devices. They will work in a virtual environment with colleagues around the globe and will be exposed to many different types of technologies.

Responsibilities include, but are not limited to:
- Provide remote troubleshooting and analysis assistance for usage and configuration questions.
- Review diagnostic information to assist in isolating a problem's cause (which could include assistance interpreting traces and dumps).
- Identify known defects and fixes to resolve problems.
- Develop best-practice articles and support utilities to improve support quality and productivity.
- Respond to escalated customer calls, complaints, and queries.
- The job requires a flexible schedule to ensure 24x7 support operations and weekend on-call coverage, including extending/taking shifts to cover North America working hours.

Required education: Bachelor's degree
Preferred education: Bachelor's degree

Required technical and professional expertise:
- 12-20 years of experience in data protection or storage software as an administrator, solution architect, or in client-server technologies.
- Debugging and analysis are performed over the telephone as well as electronically, so candidates must possess strong customer interaction skills and be able to clearly articulate solutions and options.
- Familiarity with, and the ability to interpret, complex software problems that span multiple client and server platforms, including UNIX, Linux, AIX, and Windows.
- A focus on storage area networks (SAN), network protocols, cloud, and storage devices is preferred; hands-on experience with storage virtualization is a plus.
- Candidates must be flexible in schedule and availability; second-shift and weekend scheduling will be required.

Preferred technical and professional experience:
- At least 15 years of in-depth experience with Spectrum Protect (Storage Protect) or competing products in the data protection domain.
- Working knowledge of RedHat, OpenShift, or Ansible administration is preferred.
- Good networking and troubleshooting skills.
- Cloud certification is an added advantage; knowledge of object storage and cloud storage is preferred.
Posted 2 weeks ago
8.0 - 13.0 years
5 - 9 Lacs
Pune
Work from Office
The IBM Storage Protect support team (Spectrum Protect, erstwhile TSM) supports complex integrated storage products end to end, including Spectrum Protect, Spectrum Protect Plus, and Copy Data Management. This position involves working remotely with our IBM customers, who include some of the world's top research, automotive, banking, health care, and technology providers. Candidates must be able to assist with operating systems (AIX, Linux, Unix, Windows), SAN, network protocols, clouds, and storage devices. They will work in a virtual environment with colleagues around the globe and will be exposed to many different types of technologies.

Responsibilities include, but are not limited to:
- Provide remote troubleshooting and analysis assistance for usage and configuration questions.
- Review diagnostic information to assist in isolating a problem's cause (which could include assistance interpreting traces and dumps).
- Identify known defects and fixes to resolve problems.
- Develop best-practice articles and support utilities to improve support quality and productivity.
- Respond to escalated customer calls, complaints, and queries.
- The job requires a flexible schedule to ensure 24x7 support operations and weekend on-call coverage, including extending/taking shifts to cover North America working hours.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- 8-15 years of experience in data protection or storage software as an administrator, solution architect, or in client-server technologies.
- Debugging and analysis are performed over the telephone as well as electronically, so candidates must possess strong customer interaction skills and be able to clearly articulate solutions and options.
- Familiarity with, and the ability to interpret, complex software problems that span multiple client and server platforms, including UNIX, Linux, AIX, and Windows.
- A focus on storage area networks (SAN), network protocols, cloud, and storage devices is preferred; hands-on experience with storage virtualization is a plus.
- Candidates must be flexible in schedule and availability; second-shift and weekend scheduling will be required.

Preferred technical and professional experience:
- Excellent communication skills, both verbal and written.
- At least 5-10 years of in-depth experience with Spectrum Protect (Storage Protect) or competing products in the data protection domain.
- Working knowledge of RedHat, OpenShift, or Ansible administration is preferred.
- Good networking and troubleshooting skills.
- Cloud certification is an added advantage; knowledge of object storage and cloud storage is preferred.
Posted 2 weeks ago
2.0 - 3.0 years
3 - 3 Lacs
Chennai
Work from Office
Role & responsibilities:
- Software technical knowledge: Python, HTML, etc.
- Daily price monitoring and price updates, as per the HOD's instructions.
- Follow up and coordinate with API partners.
- Maintain portal security, cloud storage, and server-related operations.
- Troubleshoot and follow up on payment gateway issues and technical glitches.

Preferred candidate profile:
- 2 to 3 years of experience
- Experience in Python and HTML
- Experience maintaining cloud storage
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
You should have a strong understanding of the tech stack, including GCP services such as BigQuery, Cloud Dataflow, Pub/Sub, Dataproc, and Cloud Storage. Experience with data processing tools like Apache Beam (batch/stream), Apache Kafka, and Cloud Dataprep is crucial. Proficiency in programming languages like Python, Java/Scala, and SQL is required. Your expertise should extend to orchestration tools like Apache Airflow (Cloud Composer) and Terraform, and to security aspects including IAM, Cloud Identity, and Cloud Security Command Center. Knowledge of containerization using Docker and Kubernetes (GKE) is essential. Familiarity with machine learning platforms such as Google AI Platform, TensorFlow, and AutoML is expected. Candidates with certifications like Google Cloud Data Engineer and Cloud Architect are preferred. You should have a proven track record of designing scalable AI/ML systems in production, focusing on high-performance and cost-effective solutions. Strong experience with cloud platforms (Google Cloud, AWS, Azure) and cloud-native AI/ML services like Vertex AI and SageMaker is important. Your role will involve implementing MLOps practices, including model deployment, monitoring, retraining, and version control. Leadership skills are key to guiding teams, mentoring engineers, and collaborating effectively with cross-functional teams to achieve business objectives. A deep understanding of frameworks like TensorFlow, PyTorch, and Scikit-learn for designing, training, and deploying models is necessary, as is experience with data engineering principles, scalable pipelines, and distributed systems (e.g., Apache Kafka, Spark, Kubernetes). Nice-to-have qualities include strong leadership and mentorship capabilities to guide teams towards best practices and high-quality deliverables, excellent problem-solving skills focused on designing efficient, high-performance systems, and effective project management abilities to handle multiple initiatives and ensure timely delivery. Collaboration and teamwork are emphasized to foster a positive and productive work environment.
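To ground the Pub/Sub item in the stack above, here is a minimal publish-and-subscribe sketch with the google-cloud-pubsub client; the project, topic, and subscription ids are hypothetical and assumed to already exist.

```python
# Minimal Pub/Sub sketch: publish one message, then pull it with a
# streaming subscriber. Project, topic, and subscription ids are hypothetical.
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

project = "example-project"

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(project, "events")
future = publisher.publish(topic_path, b"payload", origin="sketch")
print("published message id:", future.result())

subscriber = pubsub_v1.SubscriberClient()
sub_path = subscriber.subscription_path(project, "events-sub")

def callback(message):
    print("received:", message.data)
    message.ack()

streaming_pull = subscriber.subscribe(sub_path, callback=callback)
try:
    streaming_pull.result(timeout=30)  # block briefly for demo purposes
except TimeoutError:
    streaming_pull.cancel()
```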
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
About NetApp
NetApp is the intelligent data infrastructure company, turning a world of disruption into opportunity for every customer. No matter the data type, workload, or environment, NetApp helps customers identify and realize new business possibilities. The company values its people and believes in embracing diversity and openness. NetApp encourages fresh ideas and a collaborative approach to challenges, fostering a culture of belonging and innovation.

Job summary
As the engineering team manager for the GCNV service utilizing NetApp ONTAP hardware in Bangalore, you will oversee all software development aspects, operational metrics, and production support. Your responsibilities include defining, managing, and enhancing software and product development processes in close collaboration with high-performance engineers. Key aspects of this role are leading the day-to-day activities of the engineering team, ensuring project plans are met, coaching and mentoring engineers for growth, and developing innovative technical designs aligned with product goals. Additional essential responsibilities are driving operational excellence, fostering a culture of learning and experimentation, and actively building talent and teams through hiring, coaching, and mentoring. Collaboration with cross-functional teams to create customer-centric software and drive innovation is a crucial part of this role.

Job requirements
The ideal candidate should have experience in software development, with hands-on experience in object-oriented design, coding, and developing highly scalable web services. Previous experience managing software engineering teams, experience with cloud-native architectures, and proficiency in Go, C++, C#, Python, or Java are required. Strong knowledge of infrastructure components, cloud services, and best practices for the software development life cycle is necessary. The candidate should be experienced in designing fault-tolerant, high-scale distributed architectures on major cloud providers and in building CD/CI infrastructure.

Education
A Bachelor of Science degree in Computer Science, a master's degree, or equivalent experience is required, along with a minimum of 8 years of relevant experience, including technical leadership and people management. A demonstrated ability to manage multiple critical projects is essential.

At NetApp, a hybrid working environment is promoted to enhance connection, collaboration, and culture among employees. Most roles will have some in-office and/or in-person expectations, which will be communicated during the recruitment process.

Equal opportunity employer
NetApp encourages all qualified candidates to apply, regardless of whether they meet 100% of the qualifications. The company values diversity and inclusion and looks forward to hearing from all applicants.

Why NetApp
NetApp is dedicated to helping customers turn challenges into business opportunities through innovative solutions and fresh thinking. The company offers a healthy work-life balance, volunteer time off, comprehensive benefits, and support for professional growth. NetApp provides access to various discounts and perks to enhance employees' quality of life. If you are passionate about building knowledge and solving significant problems, NetApp welcomes you to join the team.
Posted 2 weeks ago
7.0 - 12.0 years
9 - 14 Lacs
Bengaluru
Work from Office
Responsible for developing and maintaining scalable backend systems, ensuring optimal performance, and contributing to a seamless user experience.

Responsibilities:
- Design, develop, and maintain backend services and APIs using Java and Spring Boot.
- Implement and maintain GCP services such as Cloud SQL, Cloud Storage, Cloud Run, and Cloud Pub/Sub.
- Collaborate with frontend developers and other team members to integrate user-facing elements with server-side logic.
- Optimize application performance, scalability, and security.
- Write clean, maintainable, and efficient code.
- Participate in code reviews.

Work experience:
- 7+ years of experience in backend development using Java and Spring Boot.
- Strong understanding of GCP services such as Cloud SQL, Cloud Storage, Cloud Pub/Sub, and Cloud Functions.
- Experience with RESTful API design and development.
- Knowledge of database systems and SQL.
- Familiarity with version control systems (e.g., Git, GitHub Actions).
- Familiarity with Terraform.
- Excellent problem-solving and communication skills.
Posted 2 weeks ago
8.0 - 11.0 years
35 - 37 Lacs
Kolkata, Ahmedabad, Bengaluru
Work from Office
Dear Candidate, We are hiring a FinOps Analyst to bring financial accountability to our cloud operations. Key Responsibilities: Track cloud expenditures and identify savings. Collaborate with engineering and finance teams. Build cloud cost dashboards and forecasts. Required Skills & Qualifications: Familiarity with cloud billing tools and reports. Strong Excel or BI reporting skills. Cloud certification is a plus. Soft Skills: Strong troubleshooting and problem-solving skills. Ability to work independently and in a team. Excellent communication and documentation skills. Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you. Kandi Srinivasa Reddy Delivery Manager Integra Technologies
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
karnataka
On-site
As a Product Owner for the GCP Data Migration Project at Clairvoyant, you will play a crucial role in leading the initiative and ensuring successful delivery of data migration solutions on Google Cloud Platform. With your deep understanding of cloud platforms, data migration processes, and Agile methodologies, you will collaborate with cross-functional teams to define the product vision, gather requirements, and prioritize backlogs to align with business objectives and user needs. Your key responsibilities will include defining and communicating the product vision and strategy, leading requirement-gathering sessions with stakeholders, collaborating with business leaders and technical teams to gather and prioritize requirements, creating user stories and acceptance criteria, participating in sprint planning, establishing key performance indicators, identifying and mitigating risks, and fostering a culture of continuous improvement through feedback collection and iteration on product features and processes. To be successful in this role, you should have 10-12 years of experience in product management or product ownership, particularly in data migration or cloud projects. You must possess a strong understanding of Google Cloud Platform (GCP) services such as BigQuery, Cloud Storage, and Data Transfer Services, as well as experience with data migration strategies and tools, including ETL processes and data integration methodologies. Proficiency in Agile methodologies, excellent analytical and problem-solving skills, strong communication skills, and a bachelor's degree in Computer Science, Information Technology, Business, or a related field are essential qualifications. Additionally, experience with data governance and compliance in cloud environments, familiarity with project management and collaboration tools like JIRA and Confluence, an understanding of data architecture and database management, and Google Cloud certifications such as Professional Cloud Architect and Professional Data Engineer are considered good-to-have qualifications. At Clairvoyant, we provide opportunities for engineers to develop and grow, work with a team of hardworking and dedicated peers, and offer growth and mentorship opportunities. We value diversity and encourage individuals with varying skills and qualities to apply, as we believe there might be a suitable role for you in the future. Join us in driving innovation and growth in the technology consulting and services industry!
Posted 3 weeks ago
8.0 - 10.0 years
6 - 9 Lacs
Hyderabad
Work from Office
Detailed job description / skill set:
- Technically strong and hands-on
- Self-driven
- Good client communication skills
- Able to work independently and a good team player
- Flexible to work PST hours (overlap for some hours)
- Past development experience for the Cisco client is preferred
Posted 3 weeks ago
4.0 - 7.0 years
5 - 9 Lacs
Pune
Work from Office
":" Job Description We\u2019re looking for a Frontend Engineer to translate polished UI designs into a fast, responsive React/TypeScript interface for our internal support portal. You\u2019ll collaborate with backend, UX, and operations teams to deliver seamless data experiences for support agents. Responsibilities Convert Figma and Weave-style design prototypes into production-ready, responsive React/TypeScript components. Integrate a real-time chatbot widget for automated user interactions. Build UI workflows for role and permission management, including Directory Sync status views. Embed feature flags in the client to control new features. Configure static asset hosting via cloud storage and CDN. Write unit and integration tests; participate in code reviews. Work in an agile team to ship features to production. Requirements Minimum Qualifications 4\u20137 years of front-end development experience. Expert in React and TypeScript, with strong HTML5, CSS3 (SASS/LESS), and modern JavaScript. Proven ability to implement responsive, data-driven UIs from Figma and Weave-style designs. Familiarity with cloud-hosted asset delivery (S3 & CDN) and CI/CD basics. Solid understanding of RESTful APIs and JSON data handling. Preferred Qualifications Experience embedding serverless calls (AWS Lambda) in the client. Prior work on admin or support-portal UIs. Familiarity with design systems and translating Figma or similar prototypes. Strong debugging skills across full-stack environments. Education Bachelor\u2019s or Master\u2019s Degree in Computer Science, Information Systems, or equivalent experience. Benefits Opportunity to work with a dynamic and fast-paced IT organization. Make a real impact on the companys success by shaping a positive and engaging work culture. Work with a talented and collaborative team. Be part of a company that is passionate about making a difference through technology. \u200b ","
Posted 3 weeks ago
3.0 - 7.0 years
6 - 16 Lacs
Chennai
Hybrid
Greetings from Getronics! We have permanent opportunities for GCP Data Engineers in Chennai.

Hope you are doing well! This is Jogeshwari from the Getronics talent acquisition team. We have multiple opportunities for GCP Data Engineers. Please find the company profile and job description below. If interested, please share your updated resume, a recent professional photograph, and Aadhaar proof at the earliest to jogeshwari.k@getronics.com.

Company: Getronics (permanent role)
Client: Automobile industry
Experience required: 3+ years in IT and a minimum of 2+ years in GCP data engineering
Location: Chennai

Skills required:
- GCP data engineering: Hadoop, Spark/PySpark, and Google Cloud Platform services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine.
- 6+ years of professional experience in data engineering, data product development, and software product launches.
- 4+ years of cloud data engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using: data warehouses like Google BigQuery; workflow orchestration tools like Airflow; relational database management systems like MySQL, PostgreSQL, and SQL Server; and real-time data streaming platforms like Apache Kafka and GCP Pub/Sub.

LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY.

Regards,
Jogeshwari
Senior Specialist
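As a small taste of the BigQuery work this role involves, here is a sketch using the google-cloud-bigquery client to run a query and read rows; the project, dataset, and columns are invented placeholders.

```python
# Quick BigQuery sketch: run a query with the google-cloud-bigquery client
# and iterate the result rows. Project and table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")
query = """
    SELECT station_id, COUNT(*) AS trips
    FROM `example-project.mobility.trips`
    GROUP BY station_id
    ORDER BY trips DESC
    LIMIT 10
"""
for row in client.query(query).result():
    print(row.station_id, row.trips)
```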
Posted 3 weeks ago
3.0 - 5.0 years
13 - 17 Lacs
Gurugram
Work from Office
Senior Analyst, GCP Data Engineer: Elevate Your Impact Through Innovation and Learning

Evalueserve is a global leader in delivering innovative and sustainable solutions to a diverse range of clients, including over 30% of Fortune 500 companies. With a presence in more than 45 countries across five continents, we excel in leveraging state-of-the-art technology, artificial intelligence, and unparalleled subject matter expertise to elevate our clients' business impact and strategic decision-making. Our team of over 4,500 talented professionals operates in countries such as India, China, Chile, Romania, the US, and Canada. Our global network also extends to emerging markets like Colombia, the Middle East, and the rest of Asia-Pacific. Recognized by Great Place to Work in India, Chile, Romania, the US, and the UK in 2022, we offer a dynamic, growth-oriented, meritocracy-based culture that prioritizes continuous learning, skill development, and work-life balance.

About Data Analytics (DA): Data Analytics is one of the highest-growth practices within Evalueserve, providing rewarding career opportunities. Established in 2014, the global DA team extends beyond 1000+ (and growing) data science professionals across data engineering, business intelligence, digital marketing, advanced analytics, technology, and product engineering. Our more tenured teammates, some of whom have been with Evalueserve since it started more than 20 years ago, have enjoyed leadership opportunities in different regions of the world across our seven business lines.

What you will be doing at Evalueserve:
- Data pipeline development: Design and implement scalable ETL (Extract, Transform, Load) pipelines using tools like Cloud Dataflow, Apache Beam or Spark, and BigQuery.
- Data integration: Integrate various data sources into unified data warehouses or lakes, ensuring seamless data flow.
- Data transformation: Transform raw data into analyzable formats using tools like dbt (data build tool) and Dataflow.
- Performance optimization: Continuously monitor and optimize data pipelines for speed, scalability, and cost-efficiency.
- Data governance: Implement data quality standards, validation checks, and anomaly detection mechanisms (a small illustrative sketch follows this posting).
- Collaboration: Work closely with data scientists, analysts, and business stakeholders to align data solutions with organizational goals.
- Documentation: Maintain detailed documentation of workflows and adhere to coding standards.

What we're looking for:
- Proficiency in Python/PySpark and SQL for data processing and querying.
- Expertise in GCP services like BigQuery, Cloud Storage, Pub/Sub, Cloud Composer, and Dataflow.
- Familiarity with data warehouse and lakehouse principles and distributed data architectures.
- Strong problem-solving skills and the ability to handle complex projects under tight deadlines.
- Knowledge of data security and compliance best practices.
- Certification: GCP Professional Data Engineer.

Follow us on https://www.linkedin.com/company/evalueserve/. Click here to learn more about what our leaders are saying about achievements: an AI-powered supply chain optimization solution built on Google Cloud; how Evalueserve is now leveraging NVIDIA NIM to enhance our AI and digital transformation solutions and to accelerate AI capabilities; and how Evalueserve has climbed 16 places on the 50 Best Firms for Data Scientists in 2024! Want to learn more about our culture and what it's like to work with us?
Write to us at: careers@evalueserve.com

Disclaimer: The following job description serves as an informative reference for the tasks you may be required to perform. However, it does not constitute an integral component of your employment agreement and is subject to periodic modifications to align with evolving circumstances.

Please note: We appreciate the accuracy and authenticity of the information you provide, as it plays a key role in your candidacy. As part of the background verification process, we verify your employment, education, and personal details. Please ensure all information is factual and submitted on time. For any assistance, your TA SPOC is available to support you.
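Purely as an illustration of the data-governance bullet in the posting above (quality standards, validation checks, anomaly detection), here is a minimal, hypothetical pre-load validation gate in Python; the column names, thresholds, and sample batch are invented.

```python
# Illustrative data-quality gate: simple validation checks on a batch
# before it is loaded. Column names and sample data are hypothetical.
import pandas as pd

def validate(df: pd.DataFrame) -> list[str]:
    issues = []
    if df["order_id"].isnull().any():
        issues.append("null order_id values found")
    if df["order_id"].duplicated().any():
        issues.append("duplicate order_id values found")
    if (df["amount"] < 0).any():
        issues.append("negative amounts found")
    return issues

batch = pd.DataFrame({"order_id": [1, 2, 2], "amount": [10.0, -5.0, 3.0]})
for problem in validate(batch):
    print("data-quality check failed:", problem)
```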
Posted 3 weeks ago
5.0 - 9.0 years
9 - 18 Lacs
Bengaluru
Hybrid
Job description:
- 5+ years of IT experience.
- Good understanding of analytics tools for effective analysis of data.
- Should be able to lead teams.
- Should have been part of production deployment and production support teams.
- Experience with big data tools: Hadoop, Spark, Apache Beam, Kafka, etc.
- Experience with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc.
- Experience with DW tools like BigQuery, Redshift, Synapse, or Snowflake.
- Experience in ETL and data warehousing.
- Experience and a firm understanding of relational and non-relational databases like MySQL, MS SQL Server, Postgres, MongoDB, Cassandra, etc.
- Experience with cloud platforms like AWS, GCP, and Azure.
- Experience with workflow management using tools like Apache Airflow.

Roles & responsibilities:
- Develop high-performance, scalable solutions using GCP that extract, transform, and load big data.
- Design and build production-grade data solutions, from ingestion to consumption, using Java/Python.
- Design and optimize data models on GCP using data stores such as BigQuery.
- Handle the deployment process.
- Optimize data pipelines for performance and cost for large-scale data lakes.
- Write complex, highly optimized queries across large data sets and create data processing layers.
- Interact closely with data engineers to identify the right tools to deliver product features by performing POCs.
- Be a collaborative team player who interacts with business, BAs, and other data/ML engineers.
- Research new use cases for existing data.

Preferred:
- Awareness of design best practices for OLTP and OLAP systems.
- Experience as part of a team designing the DB and pipeline.
- Exposure to load testing methodologies, debugging pipelines, and delta load handling.
- Experience with heterogeneous migration projects.
Posted 3 weeks ago
10.0 - 14.0 years
0 Lacs
karnataka
On-site
As a Data Engineer (ETL, Big Data, Hadoop, Spark, GCP) at Assistant Vice President level, located in Pune, India, you will be responsible for developing and delivering engineering solutions to achieve business objectives. You are expected to have a strong understanding of crucial engineering principles within the bank, and be skilled in root cause analysis through addressing enhancements and fixes in product reliability and resiliency. Working independently on medium to large projects with strict deadlines, you will collaborate in a cross-application technical environment, demonstrating a solid hands-on development track record within an agile methodology. Furthermore, this role involves collaborating with a globally dispersed team and is integral to the development of the Compliance tech internal team in India, delivering enhancements in compliance tech capabilities to meet regulatory commitments. Your key responsibilities will include analyzing data sets, designing and coding stable and scalable data ingestion workflows, integrating them with existing workflows, and developing analytics algorithms on ingested data. You will also be working on data sourcing in Hadoop and GCP, owning unit testing, UAT deployment, end-user sign-off, and production go-live. Root cause analysis skills will be essential for identifying bugs and issues, and supporting production support and release management teams. You will operate in an agile scrum team and ensure that new code is thoroughly tested at both unit and system levels. To excel in this role, you should have over 10 years of coding experience with reputable organizations, hands-on experience in Bitbucket and CI/CD pipelines, and proficiency in Hadoop, Python, Spark, SQL, Unix, and Hive. A basic understanding of on-prem and GCP data security, as well as hands-on development experience with large ETL/big data systems (with GCP experience being a plus), are required. Familiarity with cloud services such as cloud build, artifact registry, cloud DNS, and cloud load balancing, along with data flow, cloud composer, cloud storage, and data proc, is essential. Additionally, knowledge of data quality dimensions and data visualization is beneficial. You will receive comprehensive support, including training and development opportunities, coaching from experts in your team, and a culture of continuous learning to facilitate your career progression. The company fosters a collaborative and inclusive work environment, empowering employees to excel together every day. As part of Deutsche Bank Group, we encourage applications from all individuals and promote a positive and fair workplace culture. For further details about our company and teams, please visit our website: https://www.db.com/company/company.htm.,
Posted 3 weeks ago