5.0 - 10.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Senior Fullstack Engineer at Fractal, you will be a vital part of our dynamic team dedicated to accelerating growth and delivering exceptional AI & analytics solutions. You will have the opportunity to work in a hybrid mode at one of our various locations: Bangalore, Mumbai, Pune, Gurgaon, Noida, Chennai, Hyderabad, or Coimbatore. Your role will involve collaborating with a diverse team comprising Scrum Masters, Cloud Engineers, AI/ML Engineers, and UI/UX Engineers to develop end-to-end Data to Decision Systems.

Reporting to a Lead Engineer, you will be responsible for managing, developing, and maintaining both backend and frontend components of Data to Decision projects for our Fortune 500 clients. Your tasks will include integrating algorithmic outputs from backend REST APIs, creating dynamic infographics with user-friendly controls, participating in UAT, troubleshooting bugs and application integration issues, and documenting the developed processes and applications.

To excel in this role, you should have 5-10 years of demonstrable experience designing and building enterprise web applications as a Fullstack Engineer. Your expertise should cover JavaScript (ES6), HTML5, CSS, ReactJS or VueJS, Node.js, Python, and Django or Flask, as well as familiarity with common databases and REST concepts. Additionally, you should have hands-on experience with test-driven development, visualization libraries, core backend concepts, and performance optimization. Familiarity with Microsoft Azure Cloud Services, AWS, or GCP Services, and experience working with UX designers, microservices, messaging brokers, reverse proxy engines, and CI/CD tools would be advantageous. A B.E/B.Tech, BCA, or MCA degree or equivalent is required for this role.
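As a rough illustration of the REST-API-to-infographic handoff this role describes, here is a minimal stdlib-only Python sketch that reshapes a backend payload into a chart-ready series. The field names ("records", "week", "forecast") are invented for illustration, not Fractal's actual API contract.

```python
import json

def to_chart_series(api_payload: str) -> dict:
    """Convert a backend forecast payload into a chart-ready series.

    Hypothetical field names -- real payloads follow whatever contract
    the backend REST API defines.
    """
    records = json.loads(api_payload)["records"]
    return {
        "labels": [r["week"] for r in records],
        "values": [round(r["forecast"], 2) for r in records],
    }

# Simulated response body from an algorithmic backend endpoint
payload = json.dumps({"records": [
    {"week": "W1", "forecast": 120.456},
    {"week": "W2", "forecast": 98.732},
]})
series = to_chart_series(payload)
print(series)  # {'labels': ['W1', 'W2'], 'values': [120.46, 98.73]}
```

In practice a frontend charting library (or a Flask/Django view feeding one) would consume a structure like `series` directly.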
If you are a problem-solver with a passion for technology and enjoy working in a collaborative, high-performing environment, Fractal is the perfect place for you to unleash your potential and embark on a rewarding career journey.
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
You are a highly skilled Cloud Architect with expertise in Java/Python and Google Cloud Platform (GCP). Your main responsibility will be to design, implement, and oversee robust, scalable, and secure cloud solutions. You should have strong hands-on experience in cloud architecture, application development, and deployment, along with a deep understanding of GCP services and cloud best practices.

Your key responsibilities will include:
- Designing and implementing scalable and secure cloud architectures using GCP
- Developing cloud-native applications and microservices using Java and/or Python
- Collaborating with cross-functional teams to gather requirements and deliver solutions
- Leading cloud migration efforts to GCP
- Optimizing cloud resources for performance and cost-efficiency
- Establishing and enforcing cloud governance and security standards
- Creating detailed documentation of cloud architecture and processes
- Mentoring development teams on cloud best practices and emerging technologies

To excel in this role, you must have proven experience as a Cloud Architect focusing on GCP; strong programming skills in Java and/or Python; expertise in GCP services such as Compute Engine, App Engine, BigQuery, Cloud Storage, Pub/Sub, and Cloud Functions; hands-on experience with Kubernetes, Docker, and other containerization tools; proficiency in designing and implementing CI/CD pipelines; strong knowledge of cloud security, IAM, and network architecture; excellent problem-solving skills; and the ability to work in a fast-paced environment. You should also possess strong interpersonal and communication skills to interact effectively with stakeholders.

Preferred qualifications for this role include GCP certifications, such as Professional Cloud Architect. This job description is sourced from hirist.tech.
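One of the "cloud best practices" this role leans on is designing for transient failure: Pub/Sub publishes, Cloud Storage calls, and service-to-service requests can all fail intermittently. The sketch below shows a generic retry-with-exponential-backoff wrapper in plain Python; it is illustrative only, since real GCP client libraries ship their own configurable retry policies.

```python
import time

def with_retries(call, max_attempts=3, base_delay=0.01):
    """Retry a flaky callable with exponential backoff.

    Generic sketch of the transient-failure pattern; production code
    would normally use the client library's built-in retry settings.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return call()
        except ConnectionError:
            if attempt == max_attempts:
                raise  # exhausted retries: surface the error
            time.sleep(base_delay * 2 ** (attempt - 1))

# Simulate a call that fails twice, then succeeds on the third attempt
attempts = {"n": 0}
def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("transient")
    return "ok"

print(with_retries(flaky))  # ok
```

The same shape applies whether the wrapped call is a hypothetical `publisher.publish(...)` or any other idempotent cloud operation.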
Posted 4 days ago
4.0 - 8.0 years
0 Lacs
Karnataka
On-site
As a Fullstack Engineer at Fractal, a leading AI & analytics organization, you will be an integral part of a dynamic team dedicated to building end-to-end Data to Decision Systems for Fortune 500 clients. Reporting to a Senior Fullstack Engineer, your responsibilities will include managing, developing, and maintaining both the backend and frontend components of various projects, collaborating with cross-functional teams such as Scrum Masters, Cloud Engineers, AI/ML Engineers, and UI/UX Engineers. Your role will involve working closely with the data science & engineering team to integrate algorithmic outputs from backend REST APIs, as well as partnering with business and product owners to create dynamic infographics with intuitive user controls.

The ideal candidate should have a minimum of 4 years of demonstrable experience designing and building enterprise web applications as a Fullstack Engineer. Key qualifications for this role include expert-level proficiency with Angular, ReactJS or VueJS, Node.js, MongoDB, and data warehousing concepts (OLAP, OLTP). Additionally, a solid understanding of REST concepts, cross-browser compatibility, and responsive web design, plus familiarity with code versioning tools such as GitHub, are essential. Preferred qualifications include familiarity with Microsoft Azure Cloud Services, AWS, GCP Services, GitHub Actions, or other CI/CD tools like Jenkins.

This position offers the opportunity to work in diverse locations such as BLR, MUM, GUR, PUNE, and CHENNAI. If you thrive in a collaborative, high-performing environment and are passionate about leveraging technology to drive innovation, Fractal is the perfect place for you to grow and excel in your career. Join us if you are ready for wild growth and enjoy working with a team of enthusiastic over-achievers!
Posted 6 days ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
We are seeking a skilled Frontend Developer with expertise in Angular and proficiency in Google Cloud Platform (GCP) CI/CD pipelines. Your primary responsibility will be building high-quality, scalable, and performant web applications that integrate seamlessly with backend systems and cloud infrastructure.

As a Frontend Developer, you will design and develop responsive web applications using Angular (v12+). Collaboration with UX/UI designers to implement visually appealing user interfaces will be crucial, and you will integrate frontend components with RESTful APIs and backend services effectively. In addition, you will be responsible for setting up and managing CI/CD pipelines using Google Cloud Build, Cloud Source Repositories, and other GCP DevOps tools. Writing clean, maintainable, and testable code following best practices is an essential part of this role, as are monitoring application performance, troubleshooting production issues, participating in code reviews, and potentially mentoring junior developers.

To excel in this role, you should have at least 3 years of experience in frontend development using Angular, with strong knowledge of TypeScript, JavaScript, HTML5, CSS3, and SCSS. Hands-on experience with GCP services related to CI/CD, such as Cloud Build, Artifact Registry, Cloud Functions, and IAM, is essential, and experience with Git-based workflows, automated testing, and code quality tools is highly valued. Familiarity with containerization (Docker) and orchestration (Kubernetes/GKE) would be advantageous, as would experience in Agile/Scrum environments, excellent communication skills, and strong problem-solving abilities. Holding a GCP certification, such as Associate Cloud Engineer or higher, is an added advantage. Stay updated with emerging frontend and cloud technologies to contribute effectively to our projects.
Posted 1 week ago
5.0 - 7.0 years
5 - 14 Lacs
Pune, Gurugram, Bengaluru
Work from Office
- Hands-on experience in object-oriented programming using Python, PySpark, APIs, SQL, BigQuery, and GCP
- Building data pipelines for huge volumes of data
- Dataflow, Dataproc, and BigQuery
- Deep understanding of ETL concepts
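For orientation, the extract-transform-load shape these requirements describe can be sketched in pure Python. This is a stand-in only: a real pipeline at this scale would express the transform with PySpark or Dataflow and load the result into BigQuery, and the column names here are invented.

```python
from collections import defaultdict

# Extract: pretend these rows came from a source table
raw_rows = [
    {"region": "south", "amount": 120.0},
    {"region": "north", "amount": 80.0},
    {"region": "south", "amount": 40.0},
]

# Transform: aggregate amount per region
totals = defaultdict(float)
for row in raw_rows:
    totals[row["region"]] += row["amount"]

# Load: here just a sorted materialisation; a real job would write
# the aggregate to a destination table (e.g. BigQuery)
loaded = sorted(totals.items())
print(loaded)  # [('north', 80.0), ('south', 160.0)]
```

The PySpark equivalent of the transform would be a `groupBy("region").sum("amount")`, but the stage boundaries are the same.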
Posted 1 week ago
10.0 - 17.0 years
10 - 20 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
- Google Cloud Certification: Associate Cloud Engineer or Professional Cloud Architect/Engineer
- Hands-on experience with GCP services (Compute Engine, GKE, Cloud SQL, BigQuery, etc.)
- Strong command of Linux, shell scripting, and networking fundamentals
- Proficiency in Terraform, Cloud Build, Cloud Functions, or other GCP-native tools
- Experience with containers and orchestration: Docker, Kubernetes (GKE)
- Familiarity with monitoring/logging: Cloud Monitoring, Prometheus, Grafana
- Understanding of IAM, VPCs, firewall rules, service accounts, and Cloud Identity
- Excellent written and verbal communication skills
Posted 1 week ago
10.0 - 15.0 years
10 - 19 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
- Google Cloud Certification: Associate Cloud Engineer or Professional Cloud Architect/Engineer
- Hands-on experience with GCP services (Compute Engine, GKE, Cloud SQL, BigQuery, etc.)
- Strong command of Linux, shell scripting, and networking fundamentals
- Proficiency in Terraform, Cloud Build, Cloud Functions, or other GCP-native tools
- Experience with containers and orchestration: Docker, Kubernetes (GKE)
- Familiarity with monitoring/logging: Cloud Monitoring, Prometheus, Grafana
- Understanding of IAM, VPCs, firewall rules, service accounts, and Cloud Identity
- Excellent written and verbal communication skills
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
As a Cloud Engineer at AVP level in Bangalore, India, you will be responsible for designing, implementing, and managing cloud infrastructure and services on Google Cloud Platform (GCP). Your key responsibilities will include:
- Designing, deploying, and managing scalable, secure, and cost-effective cloud environments on GCP
- Developing Infrastructure as Code (IaC) using tools like Terraform
- Ensuring security best practices, IAM policies, and compliance with organizational and regulatory standards
- Configuring and managing VPCs, subnets, firewalls, VPNs, and interconnects for secure cloud networking
- Setting up CI/CD pipelines for automated deployments
- Implementing monitoring and alerting using tools like Stackdriver
- Optimizing cloud spending and designing disaster recovery and backup strategies
- Deploying and managing GCP databases, and managing containerized applications using GKE and Cloud Run

You will be part of the Platform Engineering Team, which builds and maintains the foundational infrastructure, tooling, and automation that enable efficient, secure, and scalable software development and deployment. The team focuses on creating a self-service platform for developers and operational teams, ensuring reliability, security, and compliance while improving developer productivity.

To excel in this role, you should have strong experience with GCP services; proficiency in scripting and Infrastructure as Code; knowledge of DevOps practices and CI/CD tools; an understanding of security, IAM, networking, and compliance in cloud environments; experience with monitoring tools; and strong problem-solving skills. Google Cloud certifications would be a plus. You will receive training, development, coaching, and support to help you excel in your career, along with a culture of continuous learning and a range of flexible benefits tailored to suit your needs.
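One concrete example of the "security best practices" and "firewall rules" work described here is auditing for over-permissive ingress. The sketch below flags rules open to the whole IPv4 internet using only the standard library; the rule dicts are hypothetical, not the actual GCP firewall API schema, and a real audit would read rules via the Compute API or Terraform state.

```python
import ipaddress

def overly_open(rules):
    """Flag ingress rules whose source range covers all of IPv4.

    Illustrative only: the dict layout is invented for this sketch.
    """
    flagged = []
    for rule in rules:
        net = ipaddress.ip_network(rule["source_range"])
        # 0.0.0.0/0 contains 2**32 addresses -- the entire IPv4 space
        if rule["direction"] == "INGRESS" and net.num_addresses >= 2**32:
            flagged.append(rule["name"])
    return flagged

rules = [
    {"name": "allow-ssh-anywhere", "direction": "INGRESS",
     "source_range": "0.0.0.0/0"},
    {"name": "allow-internal", "direction": "INGRESS",
     "source_range": "10.0.0.0/8"},
]
print(overly_open(rules))  # ['allow-ssh-anywhere']
```

Checks like this are typically wired into a CI/CD pipeline so non-compliant infrastructure changes fail before deployment.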
The company strives for a positive, fair, and inclusive work environment where employees are empowered to excel together every day. For further information about the company and its teams, please visit the company website: https://www.db.com/company/company.htm. The Deutsche Bank Group welcomes applications from all individuals and promotes a culture of shared successes and collaboration.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
Pune, Maharashtra
On-site
We are looking for a Java Developer to produce scalable software solutions on distributed systems like Hadoop using the Spark Framework. You will be part of a cross-functional team responsible for the full software development life cycle, from conception to deployment. As a Developer, you should be comfortable with back-end coding, development frameworks, third-party libraries, and the Spark APIs required for application development on distributed platforms like Hadoop. Being a team player with a knack for visual design and utility is essential, and familiarity with Agile methodologies will be an added advantage. A large part of the workloads and applications will be cloud-based, so knowledge and experience with Google Cloud Platform (GCP) will be handy.

As part of our flexible scheme, here are some of the benefits you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender-neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those aged 35 and above

Your key responsibilities will include working with development teams and product managers to ideate software solutions, designing client-side and server-side architecture, building features and applications capable of running on distributed platforms and/or the cloud, developing and managing well-functioning applications supporting a microservices architecture, testing software for responsiveness and efficiency, troubleshooting, debugging, and upgrading software, creating security and data protection settings, and writing technical and design documentation. Additionally, you will be responsible for writing effective APIs (REST & SOAP).

To be successful in this role, you should have:
- Proven experience as a Java Developer or in a similar role, as an individual contributor or development lead
- Familiarity with common stacks
- Strong knowledge and working experience of Core Java, Spring Boot, REST APIs, and the Spark API
- Knowledge of the React framework and UI experience
- Knowledge of JUnit, Mockito, or other testing frameworks
- Familiarity with GCP services, design/architecture, and security frameworks
- Experience with databases (e.g., Oracle, PostgreSQL, BigQuery)
- Familiarity with developing on distributed application platforms like Hadoop with Spark
- Excellent communication and teamwork skills, organizational skills, and an analytical mind
- A degree in Computer Science, Statistics, or a relevant field, and experience working in Agile environments

Good-to-have skills include knowledge of JavaScript frameworks (e.g., Angular, React, Node.js) and UI/UX design, knowledge of Python, and knowledge of NoSQL databases like HBase and MongoDB. You should have 4-7 years of prior working experience in a global banking/insurance/financial organization.

You will receive training and development to help you excel in your career, coaching and support from experts in your team, and a culture of continuous learning to aid progression. We strive for a culture in which we are empowered to excel together every day, acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. We welcome applications from all people and promote a positive, fair, and inclusive work environment.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Machine Learning Engineer, your primary responsibility will be to design and implement end-to-end ML pipelines using GCP services and ETL tools. You will be deploying and maintaining predictive ML models through containerization (Docker) and orchestration (Kubernetes). It is essential for you to implement MLOps practices including CI/CD pipelines using Git. Additionally, you will create and maintain technical architecture diagrams for GCP service implementations and develop frontend applications that integrate with ML and AI backend services. Your expertise will be crucial in optimizing big data processing workflows and improving ML model performance. To excel in this role, you must possess a strong background in Machine Learning and Deep Learning concepts. Extensive experience with GCP services, particularly AI/ML offerings, is a must. Proficiency in MLOps practices and tools is highly desirable. Hands-on experience with Docker, Kubernetes, and Git is required for this position. Your familiarity with frontend development and API integration will be beneficial. Knowledge of big data processing and analytics is essential. Furthermore, your ability to create clear technical documentation and architecture diagrams will be a significant asset to the team. Preferred qualifications for this role include Google Cloud certifications, experience with AI and GenAI models and applications, background in full-stack development, understanding of microservices architecture, experience with data pipeline optimization, as well as knowledge of Python, HTML, and Javascript. Join our team and contribute to cutting-edge projects that leverage machine learning technologies in a cloud-based environment.,
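To make "end-to-end ML pipelines" concrete, here is a toy pipeline expressed as composed stages in plain Python. Everything here is invented for illustration: the stage names, the data, and the trivial "model" (which just averages the target). In production, stages like these would be wired into managed orchestration on GCP rather than a simple loop.

```python
# Toy end-to-end pipeline: each stage consumes the previous stage's output.
def load(_):
    # Stand-in for an extract step; (feature, target) pairs
    return [(1.0, 2.1), (2.0, 3.9), (3.0, 6.1)]

def normalise(rows):
    # Min-max scale the feature column
    xs = [x for x, _ in rows]
    lo, hi = min(xs), max(xs)
    return [((x - lo) / (hi - lo), y) for x, y in rows]

def fit_mean_target(rows):
    # Trivial "model": predict the mean target (a baseline, not real ML)
    return sum(y for _, y in rows) / len(rows)

pipeline = [load, normalise, fit_mean_target]
result = None
for step in pipeline:
    result = step(result)
print(round(result, 2))  # 4.03
```

The value of the pattern is that each stage is independently testable and replaceable, which is exactly what CI/CD for ML pipelines relies on.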
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Faridabad, Haryana
On-site
The role of a seasoned Full Stack Engineer with 3-4 years of experience is crucial in a startup environment, and you will contribute significantly to our scaling efforts. As a Full Stack Engineer, you will work across the stack, from creating intuitive user interfaces to designing robust backend systems and integrating advanced data solutions. Your influence on key architectural decisions, optimization of performance, and use of AI-driven approaches to solve complex problems will be pivotal. If you thrive in a fast-paced setting and are passionate about building scalable products, we are interested in hearing from you.

Success in this role will be determined by your ability to deliver high-quality, maintainable, and scalable products capable of handling rapid growth. You will play a key role in ensuring seamless user experiences, solid backend performance, and secure data management. By proactively tackling technical challenges, enhancing code quality, and mentoring junior engineers, you will have a direct impact on both our product offering and the overall team's efficiency.

Collaboration is essential: you will work closely with product managers, designers, DevOps engineers, and data analysts to develop features that address real customer needs. Your work will directly influence product evolution, positioning us for long-term success as we expand into new markets, scale existing solutions, and incorporate cutting-edge AI into our applications.

**Responsibilities:**

**Frontend:**
- Develop responsive, intuitive interfaces using HTML, CSS (SASS), React, and Vanilla JS.
- Implement real-time features using sockets for dynamic, interactive user experiences.
- Collaborate with designers to ensure consistent UI/UX patterns and deliver visually compelling products.

**Backend:**
- Design, implement, and maintain APIs using Python (FastAPI).
- Integrate AI-driven features to enhance user experience and streamline processes.
- Ensure code adherence to best practices in performance, scalability, and security.
- Troubleshoot and resolve production issues to minimize downtime and enhance reliability.

**Database & Data Management:**
- Work with PostgreSQL for relational data, focusing on optimal queries and indexing.
- Utilize ClickHouse or MongoDB for specific data workloads and analytics needs.
- Contribute to the development of dashboards and tools for analytics and reporting.
- Apply AI/ML concepts to derive insights from data and improve system performance.

**General:**
- Utilize Git for version control; conduct code reviews, ensure a clean commit history, and maintain robust documentation.
- Collaborate with cross-functional teams to deliver features aligned with business goals.
- Stay updated with industry trends, especially in AI and emerging frameworks, to enhance the platform.
- Mentor junior engineers and contribute to continuous improvement in team processes and code quality.

**Qualifications:**

**Required:**
- 3-4 years of full-stack development experience in a startup or scaling environment.
- Proficiency in frontend technologies: HTML, CSS (SASS), React, Vanilla JS.
- Strong backend experience with Python (FastAPI).
- Solid understanding of relational databases (PostgreSQL) and performance optimization.
- Experience with sockets for real-time applications.
- Familiarity with integrating AI- or ML-powered features.
- Strong problem-solving abilities, attention to detail, and effective communication skills.

**Ideal:**
- Exposure to Webpack, Handlebars, and GCP services.
- Experience in building dashboards and analytics tools.
- Knowledge of ClickHouse and MongoDB for specialized workloads.
- Prior experience with video calls, AI chatbots, or widgets.
- Understanding of cloud environments, deployment strategies, and CI/CD pipelines.
- Ability to leverage AI/ML frameworks and tools (e.g., TensorFlow, PyTorch) to improve product features.

**Preferred but Not Mandatory:**
- Advanced experience in AI-driven optimizations like personalized user experiences, predictive analytics, and automated decision-making.
- Familiarity with analytics and monitoring tools for performance tracking.
- Prior exposure to a high-growth startup environment, meeting rapid iteration and scaling demands.

**Our Process:**
- Upon shortlisting, you will receive a project assignment with a one-week deadline.
- Successful projects will proceed with two rounds of interviews.
- If not shortlisted, feedback will be provided to ensure transparency and respect for your time.

**Why Join Us?**
- Work with cutting-edge technologies, including AI-driven solutions, in a rapidly scaling environment.
- Be part of a collaborative and inclusive team valuing impact, ownership, and growth.
- Continuous learning and professional development opportunities.
- Competitive compensation and benefits aligned with your experience and contributions.

If you are passionate about technology, enjoy solving complex problems, and are eager to contribute to the next phase of a scaling product, apply now and be part of our journey!
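As a small illustration of the posting's call for "optimal queries and indexing" on relational data, the sketch below creates an index and verifies via the query planner that a filtered query actually uses it. It uses stdlib `sqlite3` purely so it is runnable anywhere; the idea (index the column you filter on, then confirm with the planner, e.g. `EXPLAIN ANALYZE` on PostgreSQL) carries over, and the table and column names are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, kind TEXT)")
conn.executemany("INSERT INTO events VALUES (?, ?)",
                 [(i % 5, "click") for i in range(100)])
# Index the column used in the WHERE clause below
conn.execute("CREATE INDEX idx_events_user ON events (user_id)")

# Ask the planner how it will execute the filtered query
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT count(*) FROM events WHERE user_id = 3"
).fetchall()
uses_index = any("idx_events_user" in row[-1] for row in plan)

count = conn.execute(
    "SELECT count(*) FROM events WHERE user_id = 3").fetchone()[0]
print(uses_index, count)  # True 20
```

Checking the plan, not just the result, is the habit that matters: an unindexed filter returns the same rows but degrades to a full table scan as data grows.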
Posted 1 week ago
8.0 - 13.0 years
20 - 30 Lacs
Gurugram
Work from Office
Hi, wishes from GSN! Pleasure connecting with you!

We have been in corporate search services, identifying and bringing in stellar, talented professionals for our reputed IT and non-IT clients in India, and have been successfully meeting our clients' needs for the last 20 years. At present, GSN is hiring a GCP ENGINEER for one of our leading MNC clients. Please find the details below:

1. Work location: Gurugram
2. Job role: GCP Engineer
3. Experience: 8+ years
4. CTC range: Rs. 20 LPA to Rs. 30 LPA
5. Work type: WFO (Hybrid)

****** Looking for IMMEDIATE JOINER ******

Who are we looking for? An MLOps Engineer with AWS experience.

Required skills:
- GCP Architect Certification
- Terraform
- GitLab
- Shell scripting
- GCP services: Compute Engine, Cloud Storage, Dataflow, BigQuery, IAM

Best regards,
Kaviya | GSN | Kaviya@gsnhr.net | 9150016092 | Google review: https://g.co/kgs/UAsF9W
Posted 2 weeks ago
10.0 - 17.0 years
10 - 20 Lacs
Pune, Chennai, Bengaluru
Hybrid
Job Description:

Cloud Infrastructure & Deployment
- Design and implement secure, scalable, and highly available cloud infrastructure on GCP.
- Provision and manage compute, storage, network, and database services.
- Automate infrastructure using Infrastructure as Code (IaC) tools such as Terraform or Deployment Manager.

Architecture & Design
- Translate business requirements into scalable cloud solutions.
- Recommend GCP services aligned with application needs and cost optimization.
- Participate in high-level architecture and solution design discussions.

DevOps & Automation
- Build and maintain CI/CD pipelines (e.g., using Cloud Build, Jenkins, GitLab CI).
- Integrate monitoring, logging, and alerting (e.g., Stackdriver / Cloud Operations Suite).
- Enable autoscaling, load balancing, and zero-downtime deployments.

Security & Compliance
- Ensure compliance with security standards and best practices.

Migration & Optimization
- Support cloud migration projects from on-premise or other cloud providers to GCP.
- Optimize performance, reliability, and cost of GCP workloads.

Documentation & Support
- Maintain technical documentation and architecture diagrams.
- Provide L2/L3 support for GCP-based services and incidents.

Required Skills and Qualifications:
- Google Cloud Certification: Associate Cloud Engineer or Professional Cloud Architect/Engineer
- Hands-on experience with GCP services (Compute Engine, GKE, Cloud SQL, BigQuery, etc.)
- Strong command of Linux, shell scripting, and networking fundamentals
- Proficiency in Terraform, Cloud Build, Cloud Functions, or other GCP-native tools
- Experience with containers and orchestration: Docker, Kubernetes (GKE)
- Familiarity with monitoring/logging: Cloud Monitoring, Prometheus, Grafana
- Understanding of IAM, VPCs, firewall rules, service accounts, and Cloud Identity
- Excellent written and verbal communication skills
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Tech Lead, Software Development Engineering at Fiserv, you will play a crucial role in leading a team to design, develop, and maintain high-quality software applications that align with our business goals and client needs. Your responsibilities will include collaborating with cross-functional teams to gather requirements, translating them into technical specifications, coding, testing, and debugging applications to ensure optimal performance, and maintaining and improving existing codebases. Additionally, you will lead peer review processes, mentor junior developers, and provide technical guidance to enhance team capabilities.

To excel in this role, you will need a minimum of 4 years of experience in software development, with expertise in Java, C, C#, C++, or similar programming languages, SQL databases, AWS, Azure, or GCP services, CI/CD pipelines, DevOps practices, and Agile methodologies. An equivalent combination of educational background, related experience, and/or military experience will also be considered. Experience in the financial services industry would be advantageous.

If you are passionate about innovation and transforming financial services technology, Fiserv welcomes your application. Please apply using your legal name, complete the step-by-step profile, and attach your resume to be considered for this exciting opportunity. Fiserv is committed to diversity and inclusion, and we do not accept resume submissions from agencies outside of existing agreements. Be cautious of fraudulent job postings not affiliated with Fiserv, as we prioritize the security of your personal information and financial data. All communications from Fiserv representatives will originate from legitimate Fiserv email addresses.
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
Karnataka
On-site
As a Full-Stack AI App Developer at EMO Energy, you will play a key role in reimagining urban mobility, energy, and fleet operations through our AI-driven super app, with the opportunity to take full ownership of building and deploying software at a cutting-edge energy infrastructure startup in India. Your responsibilities will include architecting and developing a full-stack AI-enabled application, designing modular frontend views using React.js or React Native, creating intelligent agent interfaces, building secure backend APIs for managing energy and fleet operations, integrating real-time data workflows, implementing fleet tracking dashboards, and optimizing performance across various platforms. Collaboration with the founding, ops, and hardware teams will be essential to iterate fast and solve real-world logistics problems.

The ideal candidate for this role has a strong command of front-end frameworks such as React.js; experience with back-end technologies like FastAPI, Node.js, or Django; proficiency in TypeScript or Python; familiarity with GCP services, Docker, and GitHub Actions; and experience with mobile integrations and AI APIs. End-to-end ownership of previous applications, strong UI/UX product sensibility, and experience building dashboards or internal tools will be valuable assets. Additionally, the ability to handle ambiguity, the skill to communicate technical decisions to non-engineers, and a passion for clean code and impactful work are crucial for success in this role.

If you are a highly motivated individual with a passion for AI-driven applications and a desire to lead the development of a cutting-edge fleet/energy platform, then this role at EMO Energy is the perfect opportunity for you. Join us in revolutionizing the future of urban mobility and energy infrastructure in India.
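As a flavour of the fleet-dashboard aggregation such an app might perform, here is a small stdlib-only sketch that reduces a telemetry stream to a dashboard summary. The telemetry fields (`vehicle_id`, `battery_pct`) and the 20% low-battery threshold are invented for illustration, not EMO Energy's actual schema.

```python
from statistics import mean

# Simulated telemetry stream, in arrival order
telemetry = [
    {"vehicle_id": "EV-1", "battery_pct": 80},
    {"vehicle_id": "EV-2", "battery_pct": 15},
    {"vehicle_id": "EV-1", "battery_pct": 76},
    {"vehicle_id": "EV-3", "battery_pct": 9},
]

# Keep only the latest reading per vehicle
latest = {}
for reading in telemetry:
    latest[reading["vehicle_id"]] = reading["battery_pct"]

summary = {
    "fleet_avg_battery": round(mean(latest.values()), 1),
    "low_battery": sorted(v for v, pct in latest.items() if pct < 20),
}
print(summary)
# {'fleet_avg_battery': 33.3, 'low_battery': ['EV-2', 'EV-3']}
```

In the real app, a summary like this would be served from a backend API and rendered in a React dashboard view, with the raw stream arriving over a real-time workflow rather than a list literal.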
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Java Developer at our Pune, India location, you will be responsible for producing scalable software solutions on distributed systems like Hadoop using the Spark Framework. You will work within a cross-functional team involved in the full software development life cycle, from conception to deployment. Your role will require expertise in back-end coding, development frameworks, third-party libraries, and the Spark APIs essential for application development on distributed platforms like Hadoop. Being a team player with a flair for visual design and utility is crucial, along with familiarity with Agile methodologies. Given that a significant portion of the workloads and applications will be cloud-based, knowledge and experience with Google Cloud Platform (GCP) will be beneficial.

Your responsibilities will include collaborating with development teams and product managers to brainstorm software solutions, designing client-side and server-side architecture, building features and applications capable of running on distributed platforms or the cloud, managing well-functioning applications supporting a microservices architecture, testing software for responsiveness and efficiency, troubleshooting, debugging, and upgrading software, establishing security and data protection settings, writing technical and design documentation, and creating effective APIs (REST & SOAP).

To be successful in this role, you should have proven experience as a Java Developer or in a similar role, familiarity with common stacks, and strong knowledge and working experience of Core Java, Spring Boot, REST APIs, and the Spark API. Knowledge of the React framework and UI experience would be advantageous. Proficiency in JUnit, Mockito, or other testing frameworks is necessary, while familiarity with GCP services, design/architecture, and security frameworks is an added advantage. Experience with databases like Oracle, PostgreSQL, and BigQuery, as well as developing on distributed application platforms like Hadoop with Spark, is expected. Excellent communication, teamwork, organizational, and analytical skills are essential, along with a degree in Computer Science, Statistics, or a relevant field and experience working in Agile environments. It would be beneficial to have knowledge of JavaScript frameworks (e.g., Angular, React, Node.js) and UI/UX design, Python, and NoSQL databases like HBase and MongoDB. The ideal candidate should have 4-7 years of prior working experience in a global banking/insurance/financial organization.

We offer a supportive environment with training and development opportunities, coaching from experts in the team, a culture of continuous learning, and a range of flexible benefits tailored to suit your needs. If you are looking to excel in your career and contribute to a collaborative and inclusive work environment, we invite you to apply for the Java Developer position at our organization. For further information about our company and teams, please visit our company website at https://www.db.com/company/company.htm. We strive to create a culture where every individual is empowered to excel together, act responsibly, think commercially, take initiative, and work collaboratively. We celebrate the successes of our people and promote a positive, fair, and inclusive work environment. Join us at Deutsche Bank Group and be part of a team where together, we achieve excellence every day.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
Faridabad, Haryana
On-site
As a seasoned Full Stack Engineer with 3-4 years of experience, you will excel in a startup environment and play a significant role in our scaling efforts. Working across the stack, you will be involved in crafting intuitive user interfaces, designing robust backend systems, and integrating advanced data solutions. Your responsibilities will include influencing key architectural decisions, optimizing performance, and leveraging AI-driven approaches to solve complex problems. If you thrive in a fast-paced setting and are passionate about building scalable products, we are excited to hear from you. Your success will be measured by your ability to deliver high-quality, maintainable, and scalable products capable of handling rapid growth. You will play a crucial role in ensuring seamless user experiences, solid backend performance, and secure data management. By proactively addressing technical challenges, improving code quality, and mentoring junior engineers, you will have a direct impact on both our product offering and the efficiency of the broader team. Collaborating closely with product managers, designers, DevOps engineers, and data analysts, you will create features that address real customer needs. Your work will directly influence product evolution and position us for long-term success as we enter new markets, scale existing solutions, and incorporate cutting-edge AI into our applications.

**Responsibilities**

**Frontend:**
- Develop responsive, intuitive interfaces using HTML, CSS (SASS), React, and vanilla JS.
- Implement real-time features using sockets for dynamic, interactive user experiences.
- Collaborate with designers to ensure consistent UI/UX patterns and deliver visually compelling products.

**Backend:**
- Design, implement, and maintain APIs using Python (FastAPI).
- Integrate AI-driven features to enhance user experience and streamline processes.
- Ensure the code adheres to best practices in performance, scalability, and security.
- Troubleshoot and resolve production issues, minimizing downtime and improving reliability.

**Database & Data Management:**
- Work with PostgreSQL for relational data, ensuring optimal queries and indexing.
- Utilize ClickHouse or MongoDB where appropriate to handle specific data workloads and analytics needs.
- Contribute to building dashboards and tools for analytics and reporting.
- Leverage AI/ML concepts to derive insights from data and improve system performance.

**General:**
- Use Git for version control; conduct code reviews, ensure a clean commit history, and maintain robust documentation.
- Collaborate with cross-functional teams to deliver features that align with business goals.
- Stay updated on industry trends, particularly in AI and emerging frameworks, and apply them to enhance our platform.
- Mentor junior engineers and contribute to continuous improvement in team processes and code quality.

**Qualifications**

**Required:**
- 3-4 years of full-stack development experience in a startup or scaling environment.
- Proficiency in frontend technologies: HTML, CSS (SASS), React, vanilla JS.
- Strong backend experience with Python (FastAPI).
- Solid understanding of relational databases (PostgreSQL) and performance optimization.
- Experience with sockets for real-time applications.
- Familiarity with integrating AI- or ML-powered features.
- Strong problem-solving abilities, attention to detail, and effective communication skills.

**Ideal:**
- Exposure to Webpack, Handlebars, and GCP services.
- Experience in building dashboards and analytics tools.
- Knowledge of ClickHouse and MongoDB for specialized workloads.
- Prior experience with video calls, AI chatbots, or widgets.
- Understanding of cloud environments, deployment strategies, and CI/CD pipelines.
- Ability to leverage AI/ML frameworks and tools (e.g., TensorFlow, PyTorch) to improve product features.
**Preferred but Not Mandatory:**
- Advanced experience in AI-driven optimizations, such as personalized user experiences, predictive analytics, and automated decision-making.
- Familiarity with analytics and monitoring tools for performance tracking.
- Prior exposure to a high-growth startup environment, meeting rapid iteration and scaling demands.

We follow a structured process: upon shortlisting, you will receive a project assignment with a one-week deadline. Successful candidates proceed to two rounds of interviews, while those not shortlisted receive feedback, ensuring transparency and respect for your time. Join us to work with cutting-edge technologies, including AI-driven solutions, in a rapidly scaling environment. Be part of a collaborative and inclusive team that values impact, ownership, and growth. Enjoy continuous learning and professional development opportunities, along with competitive compensation and benefits aligned with your experience and contributions. If you are passionate about technology, enjoy solving complex problems, and are eager to shape the next phase of a scaling product, apply now and join our journey!
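The real-time socket features this posting calls for follow a simple send/receive loop at heart. A self-contained sketch using the standard library's `socketpair` as a stand-in for a client/server connection (a production app would more likely use WebSockets through a library such as Socket.IO — an assumption beyond this posting's text):

```python
import socket
import threading

def echo_server(conn: socket.socket) -> None:
    # Echo every message back until the peer closes the connection.
    while True:
        data = conn.recv(1024)
        if not data:
            break
        conn.sendall(data)
    conn.close()

# socketpair() returns two already-connected sockets in one process,
# a convenient stand-in for a client talking to a remote server.
server_end, client_end = socket.socketpair()
threading.Thread(target=echo_server, args=(server_end,), daemon=True).start()

client_end.sendall(b"ping")
reply = client_end.recv(1024)  # the echoed message
client_end.close()
```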
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Machine Learning Engineer, you will design and implement end-to-end ML pipelines using GCP services and ETL tools. Your role will involve deploying and maintaining predictive ML models through containerization (Docker) and orchestration (Kubernetes). Implementing MLOps practices, including CI/CD pipelines managed with Git, is essential to ensure seamless operations. Additionally, creating and maintaining technical architecture diagrams for GCP service implementations will be part of your routine tasks. Your expertise in Machine Learning and Deep Learning concepts will be crucial for this role. Extensive experience with GCP services, particularly its AI/ML offerings, is required to excel in this position, along with proficiency in MLOps practices and tools and hands-on experience with Docker, Kubernetes, and Git. Furthermore, experience in frontend development and API integration will be beneficial for building frontend applications that integrate with ML and AI backend services. In this role, you will also be responsible for optimizing big data processing workflows and ML model performance. Knowledge of big data processing and analytics is expected, along with the ability to create clear technical documentation and architecture diagrams to ensure effective communication within the team. Preferred qualifications for this position include Google Cloud certifications, experience with AI and GenAI models and applications, and a background in full-stack development. An understanding of microservices architecture, experience with data pipeline optimization, and proficiency in Python, HTML, and JavaScript will further strengthen your candidacy for this role.
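The end-to-end pipelines described here chain ingestion, cleaning, training, and prediction stages. As a deliberately minimal, dependency-free stand-in (a production pipeline would use GCP services such as Vertex AI or Dataflow — named here as assumptions, not from the posting), this sketches the shape of such a flow with a one-variable least-squares model:

```python
def fit_line(xs, ys):
    """Closed-form least squares for y = slope * x + intercept."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

# Pipeline stages: ingest raw records, clean them, train, then predict.
raw = [("1", "2.0"), ("2", "4.1"), ("3", "5.9"), ("bad", "row")]
clean = [(float(x), float(y)) for x, y in raw
         if x.replace(".", "").isdigit()]          # drop malformed rows
xs, ys = zip(*clean)
slope, intercept = fit_line(xs, ys)
prediction = slope * 4 + intercept                 # model applied to new input
```

Each stage here maps onto a pipeline step that would, in practice, be containerized and orchestrated separately.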
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As a Technical Integration Lead for Google Cloud application migration, you will oversee the end-to-end migration and modernization of applications from existing data centers to Google Cloud. The role requires an in-depth understanding of GCP services, software development, deployments, and the transition to operations, and involves leading cross-functional teams, managing project timelines, and ensuring a successful and efficient migration.
Posted 3 weeks ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
The FinOps Application Engineer & ITAO, AVP position, based in Pune, India, involves driving financial accountability for cloud consumption within the DB Cloud FinOps function. The role requires managing the FinOps application platform on the Google Cloud Platform (GCP) environment, which includes the GCP landing zone and the FinOps Looker instance. Responsibilities encompass end-to-end platform management, creation and maintenance of BigQuery datasets and views, administration of billing exports, deployment of Cloud Functions, API integrations, and compliance adherence. You are expected to ensure that the FinOps application complies with the Bank's IT Security Risk, Audit, and Compliance requirements. Provisioning and maintenance of GCP billing, recommender, and database services, along with serverless GCP services for automating key FinOps insights, are crucial aspects of the role. Additionally, managing FinOps GCP IAM roles, permissions, and GitHub repositories, and supporting the integration of the FinOps platform into other bank tools such as Jira and Looker, are essential responsibilities.

Key Skills and Experience:
- 7+ years of infrastructure and cloud technology industry experience
- Strong understanding of Software Development Lifecycle methodology
- 3+ years of proficiency in Terraform, Python, and GitHub
- Proficient in GCP infrastructure and services
- Proficient in data analysis tools such as Excel, Power Query, and SQL
- Strong analytical and problem-solving skills
- Experience in FinOps is preferred

Benefits offered include a best-in-class leave policy, gender-neutral parental leave, a childcare assistance benefit, sponsorship for industry certifications, an Employee Assistance Program, comprehensive insurance coverage, and health screening for individuals above 35 years. Training, coaching, and a culture of continuous learning are provided to support career growth and development.
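The BigQuery views this role maintains roll billing-export rows up into per-project costs for showback. A minimal pure-Python sketch of that aggregation over sample rows (the field names loosely mirror GCP's billing export schema, and the data is invented for illustration):

```python
from collections import defaultdict

# Sample rows shaped loosely like GCP billing-export records (invented data).
billing_rows = [
    {"project": "analytics", "service": "BigQuery",       "cost": 120.50},
    {"project": "analytics", "service": "Cloud Storage",  "cost": 30.25},
    {"project": "webapp",    "service": "Compute Engine", "cost": 410.00},
]

# Roll up cost per project, the core of a FinOps showback view.
cost_by_project = defaultdict(float)
for row in billing_rows:
    cost_by_project[row["project"]] += row["cost"]
```

In BigQuery this would be a `GROUP BY project` over the billing export table; the Python version just makes the aggregation explicit.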
The company, Deutsche Bank Group, promotes a positive, fair, and inclusive work environment where employees are empowered to excel together. The culture emphasizes responsibility, commercial thinking, initiative, and collaboration. Applications from all individuals are welcome to contribute to the diverse and successful teams within the organization. For further information, visit the company website at https://www.db.com/company/company.htm.,
Posted 3 weeks ago
6.0 - 11.0 years
20 - 35 Lacs
Hyderabad, Gurugram, Bengaluru
Hybrid
Greetings from BCforward INDIA TECHNOLOGIES PRIVATE LIMITED.

Contract To Hire (C2H) Role
Location: Gurgaon/Bengaluru/Hyderabad
Payroll: BCforward
Work Mode: Hybrid

JD Preferred Skills: 6+ years of relevant experience in Big Data; ETL - Big Data / Data Warehousing; GCP; Java
Skills: Big Data; ETL - Big Data / Data Warehousing; GCP; Java

We are looking for a highly skilled engineer with solid experience building Big Data and GCP cloud-based real-time data pipelines and REST APIs with Java frameworks. The engineer will play a crucial role in designing, implementing, and optimizing data solutions to support our organization's data-driven initiatives. This role requires expertise in data engineering, strong problem-solving abilities, and a collaborative mindset to work effectively with various stakeholders. The role will focus on delivering innovative solutions to satisfy the needs of our business. As an agile team, we work closely with our business partners to understand what they require, and we strive to continuously improve as a team.

Technical Skills

1. Core Data Engineering Skills
- Proficiency with GCP's big data tools, including:
  - BigQuery: for data warehousing and SQL analytics.
  - Dataproc: for running Spark and Hadoop clusters.
  - GCP Dataflow: for stream and batch data processing (high-level understanding).
  - GCP Pub/Sub: for real-time messaging and event ingestion (high-level understanding).
- Expertise in building automated, scalable, and reliable pipelines using custom Python/Scala solutions or Cloud Data Functions.

2. Programming and Scripting
- Strong coding skills in SQL and Java.
- Familiarity with APIs and SDKs for GCP services to build custom data solutions.

3. Cloud Infrastructure
- Understanding of GCP services such as Cloud Storage, Compute Engine, and Cloud Functions.
- Familiarity with Kubernetes (GKE) and containerization for deploying data pipelines (optional but good to have).

4. DevOps and CI/CD
- Experience setting up CI/CD pipelines using Cloud Build, GitHub Actions, or other tools.
- Monitoring and logging tools such as Cloud Monitoring and Cloud Logging for production workflows.

5. Backend Development (Spring Boot & Java)
- Design and develop RESTful APIs and microservices using Spring Boot.
- Implement business logic, security, authentication (JWT/OAuth), and database operations.
- Work with databases such as MySQL, PostgreSQL, MongoDB, and Cloud SQL.
- Optimize backend performance, scalability, and maintainability.
- Implement unit testing and integration testing.

Soft Skills

1. Innovation and Problem-Solving
- Ability to think creatively and design innovative solutions for complex data challenges.
- Experience prototyping and experimenting with cutting-edge GCP tools or third-party integrations.
- Strong analytical mindset to transform raw data into actionable insights.

2. Collaboration
- Teamwork: ability to collaborate effectively with data analysts and business stakeholders.
- Communication: strong verbal and written communication skills to explain technical concepts to non-technical audiences.

3. Adaptability and Continuous Learning
- Open to exploring new GCP features and rapidly adapting to changes in cloud technology.

Please share your updated resume, a soft copy of your PAN card, a passport-size photo, and your UAN history. Interested applicants can send an updated resume to g.sreekanth@bcforward.com. Note: we are looking for candidates who can join immediately or within 15 days at most. All the best!
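The backend requirements above mention JWT-based authentication. A minimal HS256 sign-and-verify sketch using only the Python standard library (a real Spring Boot or Python service would use a vetted library such as jjwt or PyJWT; the secret below is a placeholder, not a recommendation):

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # JWT uses URL-safe base64 with the padding stripped.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload: dict, secret: bytes) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    sig = b64url(hmac.new(secret, f"{header}.{body}".encode(),
                          hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify_jwt(token: str, secret: bytes) -> bool:
    # Recompute the signature over header.body and compare in constant time.
    header, body, sig = token.split(".")
    expected = b64url(hmac.new(secret, f"{header}.{body}".encode(),
                               hashlib.sha256).digest())
    return hmac.compare_digest(sig, expected)

secret = b"demo-secret"  # placeholder; load from a secret manager in practice
token = sign_jwt({"sub": "user-42"}, secret)
```

Verification with the right secret succeeds and with any other secret fails, which is the property the middleware relies on.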
Posted 3 weeks ago
8.0 - 13.0 years
15 - 20 Lacs
Bengaluru
Hybrid
Hello, we are hiring a "GCP Cloud Engineer" for our Bangalore location.

Experience: 8+ years
Location: Bangalore (Hybrid)
Shift timings: 7:30 AM to 4:00 PM
Notice period: immediate joiners only (notice period served or currently serving). Please apply only if this applies to you.

Job Description: 3 years of experience building modern applications utilizing GCP services such as Cloud Build, Cloud Functions/Cloud Run, GKE, Logging, GCS, Cloud SQL, and IAM. Primary proficiency in Python and experience with a secondary language such as Golang or Java. In-depth knowledge of and hands-on experience with GKE/Kubernetes. You place a high emphasis on software engineering fundamentals such as code and configuration management, CI/CD/automation, and automated testing. You will work with operations, security, compliance, and architecture groups to develop secure, scalable, and supportable solutions.
Posted 3 weeks ago
7.0 - 12.0 years
20 - 30 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
(Immediate Joiners Preferred)

What's this role about?
We are looking for a self-driven individual with a strong problem-solving quotient, technical zeal for learning new areas, and experience in a high-demand work environment. The candidate will be involved in various projects requiring a GCP technical background with extensive experience in data engineering services, and should have knowledge of, or hands-on experience with, GCP alongside Python/Java/React.js and Airflow ETL skills, covering GCP services such as BigQuery, Dataflow, Cloud SQL, and Cloud Functions.

Here's how you'll contribute:
- Work with the team in the capacity of GCP Data Engineer on day-to-day activities.
- Solve the problems at hand with utmost clarity and speed.
- Work with data analysts and architects to help them resolve specific issues with tooling and processes.
- Design, build, and operationalize large-scale enterprise data solutions and applications using one or more GCP data and analytics services in combination with third-party tools: Python/Java/React.js, Airflow ETL skills, and GCP services (BigQuery, Dataflow, Cloud SQL, Cloud Functions, Data Lake).
- Design and build production data pipelines from ingestion to consumption within a big data architecture.
- Apply GCP BigQuery modeling and performance-tuning techniques.
- Bring RDBMS and NoSQL database experience.
- Demonstrate knowledge of orchestrating workloads on the cloud.

Skills required to contribute (must have): 6+ years of working experience with Python, PySpark, and Airflow ETL skills; GCP services (BigQuery, Dataflow, Cloud SQL, Cloud Functions); ETL.
Secondary skills: Java/React.js, Google Cloud Composer, Google Cloud Storage, SQL (MS SQL Server).
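Production pipelines of the kind described are declared as DAGs in Airflow, and the scheduler's core job is resolving task dependencies into a runnable order. A dependency-free sketch of that idea using the standard library's `graphlib` (task names are invented for illustration):

```python
from graphlib import TopologicalSorter

# Dependencies of a simple ingestion-to-consumption pipeline:
# each task maps to the set of tasks that must finish before it.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

# static_order() yields tasks in an order that respects every dependency,
# which is the ordering an orchestrator like Airflow computes before scheduling.
run_order = list(TopologicalSorter(dag).static_order())
```

Airflow expresses the same structure with operators and `>>` chaining; the topological sort underneath is the same idea.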
Posted 1 month ago
4.0 - 9.0 years
8 - 15 Lacs
Hyderabad, Pune
Hybrid
Looking for candidates with any GCP certification. Expertise with GCP services, including GKE, Cloud Functions, Cloud Storage, BigQuery, and Cloud Pub/Sub. Proficiency in DevOps tools such as Terraform, Ansible, Stratus, GitLab CI/CD, or Cloud Build.
Posted 1 month ago
8.0 - 12.0 years
3 - 8 Lacs
Bengaluru
Remote
- Analyze the current GCP setup and DevOps workflows
- Propose and implement improvements in infrastructure, security, and automation
- Build and maintain scalable, secure GKE clusters and CI/CD pipelines
Posted 1 month ago