Home
Jobs

15888 GCP Jobs - Page 27

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

2.0 - 4.0 years

7 - 8 Lacs

Gurgaon

On-site


At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.

How will you make an impact in this role? The GTLS MIS & Analytics team (within GS MIS) provides analytical and reporting support to T&LS stakeholders in the US and 22 international markets. The candidate will be part of the Enablement team within GTLS MIS & Analytics. The main objective is to support automation and platform enablement, and to manage platform and data governance, ETL builds, data quality, and BI and downstream data enablement in collaboration with the tech organization.

Key responsibilities:
- Understand business use cases and convert them into technical designs.
- Work as part of a cross-disciplinary team, collaborating closely with other data engineers, software engineers, data scientists, data managers, and business partners.
- Design scalable, testable, and maintainable data pipelines.
- Identify areas for data governance improvement and help resolve data quality problems through the appropriate choice of error detection and correction, process control and improvement, or process design changes.
- Develop metrics to measure effectiveness and drive adoption of data governance policies and standards applied to mitigate identified risks across the data lifecycle (e.g., capture/production, aggregation/processing, reporting/consumption).
- Continuously monitor, troubleshoot, and improve data pipelines and workflows to ensure optimal performance and cost-effectiveness.
- Review architecture and design for scalability, security, design patterns, user experience, and non-functional requirements, and ensure all relevant best practices are followed.

Key skills required:
- 2-4 years of experience in data engineering roles.
- Advanced SQL skills with a focus on optimisation techniques.
- Big data and Hadoop experience, with a focus on Spark, Hive (or other query engines), and big data storage formats such as Parquet, ORC, and Avro.
- Cloud experience (GCP preferred) with solutions designed and implemented at production scale.
- Strong understanding of key GCP services, especially those related to batch and real-time data processing: BigQuery, Cloud Scheduler, Airflow, Cloud Logging and Monitoring.
- Hands-on experience with Git, advanced automation capabilities, and shell scripting.
- Experience in the design, development, and implementation of data pipelines for data warehousing applications.
- Hands-on experience in performance tuning and debugging ETL jobs.

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
- Competitive base salaries
- Bonus incentives
- Support for financial well-being and retirement
- Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
- Flexible working model with hybrid, onsite, or virtual arrangements depending on role and business need
- Generous paid parental leave policies (depending on your location)
- Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
- Free and confidential counseling support through our Healthy Minds program
- Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
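The data-quality responsibilities this posting describes (error detection on pipeline output before downstream consumption) can be sketched as a simple validation gate; the table schema, column names, and thresholds below are hypothetical, not from the posting:

```python
# Minimal data-quality gate for a pipeline batch: flag null keys, duplicate
# keys, and out-of-range values before loading downstream. Schema hypothetical.
def check_batch(rows, key="txn_id", amount="amount", max_amount=1_000_000):
    issues = {"null_key": 0, "duplicate_key": 0, "amount_out_of_range": 0}
    seen = set()
    for row in rows:
        k = row.get(key)
        if k is None:
            issues["null_key"] += 1
        elif k in seen:
            issues["duplicate_key"] += 1
        else:
            seen.add(k)
        amt = row.get(amount)
        if amt is None or not (0 <= amt <= max_amount):
            issues["amount_out_of_range"] += 1
    passed = all(v == 0 for v in issues.values())
    return passed, issues
```

In a real pipeline a gate like this would run as a task between extract and load (for example, an Airflow task), failing the run or routing bad rows to a quarantine table.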

Posted 1 day ago

Apply

1.0 years

0 Lacs

Haryana

On-site


As a Business Analyst & Data Annotator, you will play a crucial role in gathering and analyzing business requirements, acting as a bridge between stakeholder needs and technical teams. You will also handle the data annotation process, ensuring the production of high-quality, accurately labeled datasets necessary for training machine learning models. This role involves close collaboration with ML engineers, data scientists, and business teams to ensure that data aligns with project goals. Your work will center on translating complex business needs and technical specifications into clear instructions, managing data labeling workflows, and maintaining data quality standards. A junior-level candidate with strong English skills (B2 or higher, ideally C1), attention to detail, and a good understanding of business and technical concepts can be successful in this role, especially when working with reports containing specialized terminology.

Responsibilities:
- Develop and implement detailed guidelines and instructions for data labeling and annotation to ensure consistency and accuracy across datasets;
- Review and validate labeled data, providing constructive feedback to annotation teams to improve data quality and adherence to project standards;
- Collaborate with data scientists and ML engineers to prepare, organize, and support the creation of high-quality annotated datasets for model training;
- Manage the annotation workflow, prioritize tasks, and track progress to ensure timely delivery of labeled data;
- Maintain high standards of data privacy, security, and compliance throughout all annotation processes;
- Gather and analyze business requirements, workflows, and terminology to understand data needs and improve annotation processes;
- Facilitate communication between technical teams and stakeholders by translating complex technical or domain-specific language into clear, accessible instructions and explanations;
- Offer insights into business processes that could benefit from automation or ML solutions, supporting the design and implementation of such projects;
- Support continuous improvement of data annotation guidelines, workflows, and overall business analysis practices to enhance efficiency and data quality.

Requirements:
- At least 1 year of experience in a relevant role;
- Excellent English language skills (B2 level or higher, ideally C1), especially when working with reports containing complex terminology;
- Strong analytical skills and an understanding of business workflows;
- Attention to detail and the ability to create clear instructions and guidelines for annotation teams;
- Understanding of data privacy, security standards, and compliance requirements.

Nice to Have:
- Basic knowledge of machine learning concepts and data management principles;
- Familiarity with ML workflows, data pipelines, and MLOps tools;
- Experience with cloud platforms such as AWS, GCP, or Azure;
- Experience with data labeling or annotation;
- Experience in creating markups for AI;
- Insurance industry background.
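Reviewing and validating labeled data, as this role describes, is often quantified with an inter-annotator agreement score. A minimal sketch of Cohen's kappa in Python; the label values are hypothetical and a production workflow would use an established library:

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Agreement between two annotators, corrected for chance (Cohen's kappa)."""
    assert len(labels_a) == len(labels_b) and labels_a
    n = len(labels_a)
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    # Chance agreement: probability both annotators pick the same label at random.
    expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in freq_a)
    if expected == 1.0:  # both annotators used a single identical label
        return 1.0
    return (observed - expected) / (1 - expected)
```

Values near 1.0 indicate strong agreement; values near 0 mean the annotators agree no more than chance would predict, which usually signals ambiguous guidelines.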

Posted 1 day ago

Apply

5.0 years

0 Lacs

Delhi

Remote


All roles at JumpCloud are Remote unless otherwise specified in the Job Description.

About JumpCloud: JumpCloud® delivers a unified open directory platform that makes it easy to securely manage identities, devices, and access across your organization. With JumpCloud, IT teams and MSPs enable users to work securely from anywhere and manage their Windows, Apple, Linux, and Android devices from a single platform. JumpCloud is IT Simplified.

Do you enjoy solving challenging problems using the latest technologies within a great team? Is knowing your work will be highly visible and mission critical a key component for the next step in your career? At JumpCloud, we're looking for best-in-class talent to help define the future of modern identity and device management from the ground up.

About the role: JumpCloud is looking for an experienced Software Engineer to join an engineering team focusing on various applications and services running on Windows, Mac, or Linux machines/servers, their interaction with the OS/kernel, and the back-end services these applications interact with. Device Management services are key parts of the entire JumpCloud product portfolio. Along with our Identity and Directory services, Device Management provides the foundation for our solutions, both cloud and device based. This team's work will make using JumpCloud easier and frictionless for managing a fleet of devices while providing a very high level of security.

What you'll be doing:
- Primarily working with Go, along with Swift, C#, C++, and Node.js, for cross-platform applications on Windows, macOS, and Linux.
- Gaining or utilizing expertise in areas like Windows services, kernels, Event Loggers, Mac launch daemons, and macOS internals.
- Collaborating with architects, UX designers, and DevOps to ensure our systems are highly available, scalable, and deliver exceptional user experiences.
- Working within a Scrum framework to drive agile development.
- Learning and working with mTLS protocols and related security concepts; prior experience in these areas is a plus.
- Using OAuth/OIDC flows for secure user authentication and service access.
- Writing unit, functional, and acceptance tests, and automating these test cases.
- Contributing to the future of our Device Management services by participating in strategic planning and scoping sessions with product managers.
- Embodying our core values: building strong connections, thinking big, and striving to improve by 1% every day.

We're looking for:
- 5-10 years of experience developing Mac, Windows, or Linux applications (including integration with third-party applications) in a variety of programming languages such as Swift, Node.js, C#, C++, and Go; experience in one of them is a must.
- Experience using one of the public cloud providers (AWS, GCP, or Azure) with CI/CD pipelines (preferably GitHub Actions) to build, test, and deploy.
- Willingness to mentor junior members of the team.
- Bonus points if you have experience with services, event loggers, or the kernel on Windows, and/or launch daemons and app hosting on macOS.

#LI-MS1

Where you'll be working/Location: JumpCloud is committed to being Remote First, meaning that you are able to work remotely within the country noted in the Job Description. You must be located in and authorized to work in the country noted in the job description to be considered for this role. Please note: There is an expectation that our engineers participate in on-call shifts. You will be expected to commit to being ready and able to respond during your assigned shift, so that alerts don't go unaddressed.

Language: JumpCloud has teams in 15+ countries around the world and conducts our internal business in English. The interview and any additional screening process will take place primarily in English. To be considered for a role at JumpCloud, you will be required to speak and write in English fluently. Any additional language requirements will be included in the details of the job description.

Why JumpCloud? If you thrive working in a fast, SaaS-based environment and you are passionate about solving challenging technical problems, we look forward to hearing from you! JumpCloud is an incredible place to share and grow your expertise! You'll work with amazing talent across each department who are passionate about our mission. We're out-of-the-box thinkers, so your unique ideas and approaches for conceiving a product and/or feature will be welcome. You'll have a voice in the organization as you work with a seasoned executive team, a supportive board, and in a proven market that our customers are excited about.

One of JumpCloud's three core values is to "Build Connections." To us that means creating "human connection with each other regardless of our backgrounds, orientations, geographies, religions, languages, gender, race, etc. We care deeply about the people that we work with and want to see everyone succeed." - Rajat Bhargava, CEO

Please submit your résumé and a brief explanation about yourself and why you would be a good fit for JumpCloud. Please note JumpCloud is not accepting third-party resumes at this time. JumpCloud is an equal opportunity employer. All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran or disability status.

Scam Notice: Please be aware that there are individuals and organizations that may attempt to scam job seekers by offering fraudulent employment opportunities in the name of JumpCloud. These scams may involve fake job postings, unsolicited emails, or messages claiming to be from our recruiters or hiring managers. Please note that JumpCloud will never ask for any personal account information, such as credit card details or bank account numbers, during the recruitment process. Additionally, JumpCloud will never send you a check for any equipment prior to employment. All communication related to interviews and offers from our recruiters and hiring managers will come from official company email addresses (@jumpcloud.com) and will never ask for any payment, fee to be paid, or purchases to be made by the job seeker. If you are contacted by anyone claiming to represent JumpCloud and you are unsure of their authenticity, please do not provide any personal/financial information and contact us immediately at recruiting@jumpcloud.com with the subject line "Scam Notice"

#LI-Remote #BI-Remote
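The OAuth/OIDC flows this role mentions typically use PKCE (RFC 7636) for native and device apps. A minimal Python sketch of deriving the code verifier/challenge pair and building the authorization request; the endpoint, client ID, and redirect URI are hypothetical placeholders:

```python
import base64
import hashlib
import secrets
from urllib.parse import urlencode

def make_pkce_pair():
    """Generate a PKCE code_verifier and its S256 code_challenge (RFC 7636)."""
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
    return verifier, challenge

def authorization_url(base, client_id, redirect_uri, challenge):
    """Build the OIDC authorization request URL; endpoint/client are hypothetical."""
    params = {
        "response_type": "code",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": "openid profile",
        "code_challenge": challenge,
        "code_challenge_method": "S256",
    }
    return f"{base}?{urlencode(params)}"
```

The client keeps the verifier secret and sends it with the later token request; the server recomputes the challenge from it, so an intercepted authorization code alone is useless.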

Posted 1 day ago

Apply

3.0 years

0 Lacs

Kochi, Kerala, India

On-site


About KeyValue: KeyValue is a trusted product engineering partner for start-ups and scale-ups - unlocking their passion, developing ideas, and creating abundant value for all stakeholders in the ecosystem. We have ideated, conceived, strategized, and built some of the globe's most innovative Fintech, Payments, Financial Services, Digital Commerce, Madtech, Edtech, Socialtech, Logistics, High Technology, Blockchain, Crypto, NFT, and Healthcare companies, helping them conceive, scale, pivot, and enhance their businesses. KeyValue's mission is to be the world's most trusted product development hub - delivering high-value outcomes for start-ups and scale-ups - with a talented, skilled team - in a thriving and inclusive culture. Our inclusive culture is engaging and experiential, creating an environment to learn and collaborate with the freedom to think, create, explore, grow, and thrive. An ownership mindset with growth orientation forms the bedrock of exceptional client success!

We are looking for an experienced and passionate Blockchain Engineer with 3+ years of experience to join our team. The ideal candidate is highly proactive, a problem-solver, and has a strong interest in learning and implementing AI and ML alongside blockchain technologies.

What you will do:
- Design, develop, and maintain blockchain-based applications and smart contracts.
- Implement security protocols, cryptography, and consensus algorithms.
- Collaborate with cross-functional teams to integrate blockchain solutions into existing systems.
- Optimise blockchain-based systems for performance, scalability, and reliability.
- Conduct research to explore and propose improvements to existing blockchain frameworks.
- Stay updated on the latest blockchain trends and technologies.

What makes you a great fit:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 3+ years of experience in blockchain development, including hands-on experience with frameworks like Ethereum, Hyperledger, or Solana.
- Proficiency in programming languages such as Solidity, Rust, JavaScript, or Python.
- Experience with Web3.js, Express.js, Truffle, or other blockchain development tools.
- Strong understanding of smart contracts, consensus algorithms, and decentralised applications (dApps).
- Knowledge of cryptography, security protocols, and data structures.
- Excellent problem-solving skills and a proactive approach to challenges.
- Strong written and verbal communication skills for team collaboration and technical documentation.
- Familiarity with machine learning and AI concepts, and a willingness to integrate these technologies with blockchain solutions.
- Knowledge of cloud platforms like AWS, Azure, or GCP for deploying blockchain solutions.
- Experience in developing or integrating NFTs, DeFi, or other blockchain-based use cases.

At KeyValue, you'll be part of a dynamic team working on innovative projects that leverage cutting-edge blockchain technology. We provide an environment that encourages creativity, growth, and collaboration, helping you stay at the forefront of tech advancements.
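The cryptography and data-structure knowledge this role asks for is often exercised through constructions like Merkle trees, which blockchains use to commit to a set of transactions. A minimal Python sketch using SHA-256, with the Bitcoin-style convention of duplicating the last node at odd levels (the transaction bytes are hypothetical):

```python
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Hash the leaves, then fold pairwise up to a single root.

    Odd-sized levels duplicate their last node, as in Bitcoin-style trees.
    """
    if not leaves:
        raise ValueError("empty tree")
    level = [sha256(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:  # odd count: duplicate the last node
            level.append(level[-1])
        level = [sha256(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]
```

Changing any transaction changes the root, so a light client can verify membership of one transaction with only log(n) sibling hashes instead of the whole set.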

Posted 1 day ago

Apply

3.0 years

0 Lacs

India

Remote


Job Title: Software Development Engineer (Backend Expert)
Location: Remote
Experience: 1–3 Years
Compensation: ₹3.5 – ₹5 LPA

About Recrivio: We're building systems that run lean, fast, and reliably at scale. As a Software Development Engineer (Backend Expert), you'll work on the brains of our products - designing backend services, APIs, and infrastructure that power everything we build. We want strong engineers with backend mastery and a deep understanding of how software works under the hood.

Responsibilities:
- Design and implement scalable backend services using Node.js, NestJS, or Golang
- Develop RESTful and GraphQL APIs for our SaaS platforms
- Own database design, optimization, and performance (PostgreSQL, MongoDB, Redis)
- Implement secure and scalable authentication/authorization (JWT, OAuth2)
- Collaborate with frontend and DevOps teams for seamless integrations
- Optimize services for speed, uptime, and observability
- Maintain high standards of code quality, documentation, and modular design

Requirements:
- 1–3 years of hands-on experience in backend development
- Strong understanding of Computer Science fundamentals: DSA, OOP, databases, operating systems, networking
- Expertise in Node.js, Golang, or similar backend frameworks
- Experience working with relational and NoSQL databases
- Comfortable with Git, REST APIs, and Linux-based environments

Nice to Have:
- Exposure to cloud infrastructure (AWS preferred; GCP/Azure welcome)
- Familiarity with Docker, CI/CD pipelines, and deployment practices
- Experience with message brokers (Kafka, RabbitMQ)
- Knowledge of observability stacks (Prometheus, Grafana, ELK)
- Contributions to backend projects or open-source work
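The JWT-based authentication listed in the responsibilities can be illustrated end to end with the HS256 signing scheme. A stdlib-only Python sketch; the secret and claims are hypothetical, and a production service would use a maintained JWT library rather than hand-rolling this:

```python
import base64
import hashlib
import hmac
import json

def _b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def jwt_encode(claims, secret: bytes) -> str:
    """Sign claims as an HS256 JWT: b64url(header).b64url(payload).b64url(sig)."""
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = _b64url(json.dumps(claims).encode())
    signing_input = f"{header}.{payload}".encode()
    sig = hmac.new(secret, signing_input, hashlib.sha256).digest()
    return f"{header}.{payload}.{_b64url(sig)}"

def jwt_verify(token: str, secret: bytes):
    """Return the claims if the signature checks out, else None."""
    header, payload, sig = token.split(".")
    expected = hmac.new(secret, f"{header}.{payload}".encode(), hashlib.sha256).digest()
    if not hmac.compare_digest(_b64url(expected), sig):
        return None
    padded = payload + "=" * (-len(payload) % 4)
    return json.loads(base64.urlsafe_b64decode(padded))
```

The constant-time comparison (`hmac.compare_digest`) matters: a naive `==` on signatures can leak timing information to an attacker probing the verifier.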

Posted 1 day ago

Apply

3.0 years

0 Lacs

India

Remote


Job Title: Software Development Engineer (Frontend Expert)
Location: Remote
Experience: 1–3 Years
Compensation: ₹3.5 – ₹5 LPA

About Recrivio: Recrivio is building modern SaaS products to automate and scale critical workflows in HRTech, staffing, and workforce operations. We are hiring a Software Development Engineer with a strong command of frontend technologies and core computer science fundamentals. If you believe that great UI is not just about design but also about architecture, scalability, and performance, you'll thrive with us.

Responsibilities:
- Own the end-to-end development of responsive, high-performance web applications
- Build and maintain scalable UIs using React.js, Next.js, and TailwindCSS
- Translate high-fidelity designs (Figma) into production-grade frontend code
- Collaborate with backend, product, and design teams to create seamless user experiences
- Optimize performance, loading speed, SEO, and web accessibility
- Write reusable components and establish frontend coding standards
- Participate in system design, architecture discussions, and code reviews

Requirements:
- 1–3 years of software engineering experience with a frontend focus
- Strong understanding of Computer Science fundamentals: DSA, OOP, operating systems, networking, and system design
- Proficiency in JavaScript, React.js, Next.js, and CSS frameworks (Tailwind, SASS)
- Clear understanding of SSR, SSG, and performance optimization
- Working knowledge of REST API integration, Git workflows, and frontend testing

Nice to Have:
- Experience with animation libraries (Framer Motion, GSAP)
- Familiarity with accessibility standards (WCAG)
- Exposure to cloud platforms (AWS preferred, or GCP/Azure)
- CI/CD experience for frontend deployments (GitHub Actions, Vercel)
- Personal or open-source projects showcasing frontend depth
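Server-side rendering (SSR), called out in the requirements, boils down to producing the final HTML string on the server instead of in the browser. A framework-agnostic Python sketch of rendering a small component tree to HTML; the `h`, `text`, and `render_page` helpers are hypothetical stand-ins for what Next.js does in this stack:

```python
from html import escape

def text(s):
    """Escape a user-supplied text node."""
    return escape(s)

def h(tag, attrs=None, *children):
    """Render an element; children must already be rendered via h() or text()."""
    attr_str = "".join(f' {k}="{escape(str(v))}"' for k, v in (attrs or {}).items())
    return f"<{tag}{attr_str}>{''.join(children)}</{tag}>"

def render_page(title, items):
    """An 'SSR' response: the server ships final HTML. Page shape hypothetical."""
    return h("html", None,
             h("head", None, h("title", None, text(title))),
             h("body", None,
               h("ul", {"class": "items"},
                 *[h("li", None, text(i)) for i in items])))
```

SSG is the same render run once at build time and cached; hydration then attaches client-side behavior to this already-delivered markup, which is why SSR pages paint (and get indexed) before any JavaScript runs.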

Posted 1 day ago

Apply

5.0 - 9.0 years

6 - 9 Lacs

Ahmedabad

On-site


About the Role: Grade Level (for internal use): 10. Title: Senior Database Application Developer.

Team: The S&P Global Marketplace technology team consists of geographically diversified software engineers responsible for developing scalable solutions by working directly with the product development team. Our team culture is oriented towards equality in the realm of software engineering irrespective of hierarchy, promoting innovation. One should feel empowered to iterate over ideas and experiment without being afraid of failure.

Impact: You will enable the S&P business to showcase our proprietary S&P Global data, combine it with "curated" alternative data, further enrich it with value-add services from Kensho and others, and deliver it via the clients' channel of choice to help them make better investment and business decisions, with confidence.

What you can expect: An unmatched experience in handling huge volumes of data, analytics, visualization, and services over cloud technologies, along with an appreciation of the product development life cycle needed to convert an idea into a revenue-generating stream.

Responsibilities: We are looking for a self-motivated, enthusiastic, and passionate software engineer to develop technology solutions for the S&P Global Xpressfeed product. The ideal candidate thrives in a highly technical role and will design and develop software using cutting-edge technologies consisting of Java, data pipelines, big data, machine learning, and multi-cloud. The development is already underway, so the candidate would be expected to get up to speed very quickly and start contributing.
- Actively participate in all scrum ceremonies and follow Agile best practices effectively.
- Play a key role in the development team to build high-quality, high-performance, scalable code.
- Produce technical design documents and conduct technical walkthroughs.
- Document and demonstrate solutions using technical design docs, diagrams, and stubbed code.
- Collaborate effectively with technical and non-technical stakeholders.
- Respond to and resolve production issues.

What we are looking for:
- A minimum of 5 to 9 years of significant experience in application development.
- Proficiency with software development lifecycle (SDLC) methodologies like Agile and test-driven development.
- Experience working with high-volume data and computationally intensive systems.
- Garbage-collection-friendly programming experience; tuning Java garbage collection and performance is a must.
- Proficiency in the development environment, including IDE, Git, Autosys, continuous integration, unit-testing tools, and defect management tools.
- Domain knowledge in the financial industry and capital markets is a plus.
- Excellent communication skills, with strong verbal and writing proficiency.
- Ability to mentor teams, innovate and experiment, give shape to business ideas, and present to key stakeholders.

Required technical skills:
- Excellent skills developing solutions involving relational database technologies on SQL Server and/or Oracle platforms.
- Build data pipelines.
- Utilize platforms like Snowflake, Databricks, GCP Fabric, BigQuery, etc.
- Utilize cloud managed services like AWS Step Functions, AWS Lambda, and AWS DynamoDB.
- Develop custom solutions using Apache NiFi, Airflow, Spark, Kafka, Hive, and/or Spring Cloud Data Flow.
- Develop federated data services to provide scalable and performant data APIs: REST, Java, Scala.
- Write infrastructure as code to develop sandbox environments.
- Provide analytical capabilities using BI tools like Tableau, Power BI, etc.
- Feed data at scale to clients that are geographically distributed.

Desirable technical skills: React, HTML/CSS, API development, microservices patterns, cloud technologies and managed services (preferably AWS), big data and analytics, relational databases (preferably PostgreSQL), NoSQL databases, NiFi, Airflow, Spark, Hive, Spring Cloud, Spring Cloud Data Flow, Netty, Akka, Esper, Redis, Google protobuf, Google Guice, Google Guava.

About S&P Global Market Intelligence: At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep, and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What's In It For You?

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology - the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: We're more than 35,000 strong worldwide - so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it.
We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. 
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com . S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here . ----------------------------------------------------------- Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. 
Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ----------------------------------------------------------- 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 316188 Posted On: 2025-06-18 Location: Hyderabad, Telangana, India
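The relational-database skills this posting emphasizes (SQL Server/Oracle solutions, performance tuning) often come down to reading query plans. A small Python sketch using stdlib SQLite as a stand-in for those engines, showing how an index turns a full-table scan into an index search; the table and index names are hypothetical:

```python
import sqlite3

# SQLite stands in for SQL Server/Oracle here; the schema is hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, symbol TEXT, qty INTEGER)")
conn.executemany("INSERT INTO trades (symbol, qty) VALUES (?, ?)",
                 [("AAPL", i) for i in range(1000)])

def plan(sql):
    """Return the EXPLAIN QUERY PLAN detail strings for a statement."""
    return [row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql)]

# Before indexing: the WHERE clause forces a scan of every row.
before = plan("SELECT qty FROM trades WHERE symbol = 'AAPL'")
conn.execute("CREATE INDEX idx_trades_symbol ON trades (symbol)")
# After indexing: the same query becomes an index search.
after = plan("SELECT qty FROM trades WHERE symbol = 'AAPL'")
```

The same discipline (inspect the plan, add or adjust an index, re-inspect) carries over to SQL Server's execution plans and Oracle's `EXPLAIN PLAN`, though the plan formats differ.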

Posted 1 day ago

Apply

3.0 years

0 - 0 Lacs

India

On-site


Job Opening: Senior Full-Stack Engineer Company: DevForge Technology Private Limited Location: On-site Experience Required: 3 to 7 Years Employment Type: Full-time About Us At DevForge Technology, we specialize in building world-class web and mobile solutions. We're a product-first, JavaScript-focused company driven by innovation, performance, and user experience. Our team thrives in a fast-paced, problem-solving environment and is looking for passionate engineers who want to create impact. Role Overview We are looking for a Senior Full-Stack Engineer who can independently lead dedicated client projects with clarity, confidence, and excellent communication. You’ll be responsible for end-to-end ownership, from architecting solutions to delivering scalable, performant, and maintainable code. What You’ll Do Lead full-stack development across web and mobile platforms Collaborate with clients to understand project goals and translate them into technical execution Manage the entire SDLC, from architecture decisions to production deployment Build scalable backend systems with Node.js, MongoDB, and MySQL Develop intuitive frontend interfaces using React, Next.js, Tailwind CSS, and TypeScript Contribute to and maintain React Native applications Utilize AWS or GCP for scalable infrastructure and deployment pipelines Implement job schedulers and background task processing Optimize performance, identify bottlenecks, and deliver clean, reusable code Communicate project updates, risks, and blockers clearly with stakeholders Tech Stack Frontend: React, Next.js, Tailwind CSS, React Native, TypeScript Backend: Node.js, Express.js Databases: MongoDB, MySQL Cloud & DevOps: AWS, GCP Others: Job Scheduling (cron, queues), Git, REST/GraphQL APIs What We’re Looking For 3–7 years of strong full-stack development experience Excellent problem-solving skills and logical thinking Proactive communication and ability to work directly with clients Experience managing deployments and debugging in
production. Bonus: Experience with Docker, CI/CD, and scalable architecture patterns Why Join Us? Work on impactful global projects Opportunity to take technical ownership Friendly work culture Supportive team of driven engineers Room to innovate and grow fast Ready to Forge the Future with Us? Apply at: connect@trydevforge.com or +91 93277 80842 Job Type: Full-time Pay: ₹25,000.00 - ₹50,000.00 per month Schedule: Day shift Work Location: In person
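The "job schedulers and background task processing" item above can be sketched in miniature. The snippet below is a hypothetical illustration (shown in Python rather than the posting's Node.js stack): a worker thread drains an in-process queue, standing in for cron plus a message broker; all names are invented for the example.

```python
import queue
import threading

# Hypothetical sketch of background task processing: jobs go into a queue,
# a worker consumes them off the main thread, and a None sentinel stops it.
jobs: "queue.Queue" = queue.Queue()
results = []

def worker():
    while True:
        job = jobs.get()
        if job is None:          # sentinel: shut the worker down
            jobs.task_done()
            break
        results.append(job * 2)  # the "background task" itself
        jobs.task_done()

t = threading.Thread(target=worker)
t.start()
for n in [1, 2, 3]:
    jobs.put(n)
jobs.put(None)
jobs.join()                      # wait until every job is processed
t.join()
print(results)  # [2, 4, 6]
```

In a production Node.js system the same shape would typically be a cron trigger feeding a queue such as BullMQ or SQS, with workers in separate processes.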

Posted 1 day ago

Apply

5.0 years

0 - 0 Lacs

Surat

On-site

GlassDoor logo

Fullstack Developer Job Description: We are looking for a highly motivated Full Stack Developer with at least 5 years of experience in designing, developing, and maintaining scalable applications. The ideal candidate should have hands-on experience with Java, Spring Boot, React.js, and AWS/GCP cloud platforms. Key Responsibilities: Develop and maintain end-to-end web applications using Java, Spring Boot, and React.js. Design, develop, and integrate RESTful APIs and microservices. Deploy, monitor, and maintain applications on cloud platforms. Write clean, efficient, and scalable code while following best practices. Collaborate with cross-functional teams, including designers, developers, and DevOps engineers. Optimize applications for performance, scalability, and security. Participate in code reviews and contribute to the continuous improvement of development processes. (Optional) Work on Mendix low-code development for rapid application delivery. Required Skills: Frontend: React.js, JavaScript/TypeScript, HTML, CSS. Backend: Java, Spring Boot, REST APIs, Microservices. Database: MySQL, PostgreSQL, MongoDB. Cloud: AWS or GCP (basic deployment and infrastructure knowledge). Version Control: Git, GitHub/GitLab. CI/CD: Exposure to Jenkins, Docker, or Kubernetes is a plus. Good to Have: Mendix low-code platform experience. Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field. 5+ years of hands-on experience in full-stack development. Strong problem-solving skills and the ability to work independently and collaboratively. Experience with Agile development methodologies. Location: Surat Job Type: Full-time Pay: ₹16,073.17 - ₹72,430.84 per month Schedule: Day shift Work Location: In person

Posted 1 day ago

Apply

10.0 years

0 Lacs

Andhra Pradesh

On-site

GlassDoor logo

Principal Lead - Identity Management and Governance The Principal Lead - Identity Management and Governance is a high-visibility role responsible for leading Privileged Account Management (PAM) and governance initiatives at an Independent Contributor level, with a strong focus on cloud governance, including securing credential/key management tools. This individual will drive governance activities that ensure appropriate access controls and safeguard the confidentiality, integrity, and availability of enterprise systems and data through effective security controls. The role also includes validating compliance with information security policies and standards and raising awareness among stakeholders to help maintain a secure application and infrastructure environment. Key Responsibilities: Major areas of accountability: Information Security Governance; Identity & Access Governance; Privileged Account Governance; Policy/Procedure Management and Enforcement; Reporting/Metrics; Incident Management; Education of Security Standards. Provide timely and effective governance for the firm's information security tools, processes and practices in the Identity space. Use standard technology monitoring tools to monitor assigned environments and/or technical assets and identify/detect behavior outside of established standards. Escalate key security issues to the appropriate team to be addressed. Assist with security assurance testing activities. Monitor compliance with information security and identity policies and practices and any applicable laws. Assist with internal and external security risk assessments, risk analysis and application or system-level access reviews and attestations. Coordinate and facilitate reviews for different platforms across the enterprise on a periodic basis.
Assist with the research, development, continuous improvement and implementation of identity policies, procedures, standards, and processes based on compliance requirements and industry best practices. Document the identity governance requirements, processes and procedures with a focus on continuous improvement using automation (script- or process-based). Enforce information security and identity policies and procedures by reviewing violation reports, investigating possible exceptions, and documenting controls. Prepare status reports on identity and access matters that are used for a variety of purposes - tracking and monitoring security breaches, investigative activities, remediation plan management and risk management & compliance reporting. Location: Noida/Hyderabad Shift Timings: 2:00 - 10:30 pm Cab Provided: Yes Required Qualifications: Bachelor’s degree in Computer Science, Management Information Systems, or related technical field; or equivalent work experience. 10+ years of experience in Information Security Services or a related technical field. Work experience that spans the Identity & Access Management or Governance, Risk, and Compliance security domains. Working knowledge of information security and computer network/system access technologies. Experience working in the financial services industry or other highly regulated/compliance-oriented environments. Effective verbal and written communication skills that include the ability to describe highly technical concepts in non-technical terms. Very good understanding of security controls, monitoring systems and regulatory/business drivers that impact security policies and practices. Working with business users on platform-related questions/issues. The successful candidate will need to demonstrate proficiency in at least one of the below verticals: Privileged Account Management capabilities, services and processes using tools such as CyberArk and/or competitor tools like Delinea, Arcon, BeyondTrust, HashiCorp.
At least one of the following Cloud Governance technologies: AWS, Azure, GCP, with experience in securing key services such as AWS Secrets Manager or Azure Key Vault. In addition, the successful candidate will need to meet the below requirements: Interested in gaining broad experience in Information Security Services [must have] First-level knowledge and/or demonstrated technical ability to understand code and technology infrastructure in multiple environments, with experience in the below languages [PowerShell, Python, regular-expression-based programming] Demonstrated basic understanding of the Software Development Lifecycle (SDLC) and programming/development procedures. Effective oral and written communication skills along with logical, analytical, and abstract thinking skills. Strong attention to detail, follow-through, and time management skills. Demonstrated aptitude to quickly learn and apply new tools and processes Defining business, user, and systems requirements Developing user acceptance test plans Developing, documenting, testing and modifying new and existing code Developing working knowledge of systems and processes Business Analysis Building Process Flows Presentations (Creating and Delivering) Risk Identification and Remediation Project Management Project Coordination Reporting (SQL queries to databases) / Correlation ITIL (Change, Problem, Incident, Configuration) Management Preferred Qualifications: Basic knowledge and experience with: Operating Systems (Windows, UNIX, Mainframe, etc.) Directories/LDAP Constructs (Active Directory, Oracle, etc.) Databases/RDBMS Constructs (Oracle, SQL, DB2, MS SQL Server, etc.) Authentication / Authorization Constructs (Directory, Hybrid, Native Source) Data Formats (XML, CSV, etc.)
Identity & Access Governance capabilities: Role-Based Access Controls (RBAC), Provisioning / De-Provisioning, Access Request. Privileged Access/Credential Management. Privileged Access Management suites: CyberArk. Development / Programming / Scripting: SQL for Oracle or MS SQL, Java EE. Compliance types (GLBA, HIPAA, IT Compliance, NERC, PCI, SOX, etc.) Service Organization Controls (SOC1, SOC2) About Our Company Ameriprise India LLP has been providing client-based financial solutions to help clients plan and achieve their financial objectives for 125 years. We are a U.S.-based financial planning company headquartered in Minneapolis with a global presence. The firm’s focus areas include Asset Management and Advice, Retirement Planning and Insurance Protection. Be part of an inclusive, collaborative culture that rewards you for your contributions and work with other talented individuals who share your passion for doing great work. You’ll also have plenty of opportunities to make your mark at the office and a difference in your community. So if you're talented, driven and want to work for a strong ethical company that cares, take the next step and create a career at Ameriprise India LLP. Ameriprise India LLP is an equal opportunity employer. We consider all qualified applicants without regard to race, color, religion, sex, genetic information, age, sexual orientation, gender identity, disability, veteran status, marital status, family status or any other basis prohibited by law. Full-Time/Part-Time: Full time Timings: 4:45p-1:15a India Business Unit: AWMPO AWMP&S President's Office Job Family Group: Technology

Posted 1 day ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Role: Java Developer Experience: 4-12 Years Location: Hyderabad This is a WFO (Work from Office) role. Mandatory Skills: Java Development, Core Java, Spring, Hibernate, Microservices, RESTful APIs Must have finance background experience Interview Process: First round - Online test Second round - Virtual technical discussion Manager/HR round - Virtual discussion Company Overview It is a globally recognized leader in the fintech industry, delivering cutting-edge trading solutions for professional traders worldwide. With over 15 years of excellence, a robust international presence, and a team of 300+ skilled professionals, we continually push the boundaries of technology to remain at the forefront of financial innovation. Committed to fostering a collaborative and dynamic environment, our company prioritizes technical excellence, innovation, and continuous growth for our team. Join our agile-based team to contribute to the development of advanced trading platforms in a rapidly evolving industry. About the Role: We are looking for a skilled and motivated Java Developer to join our development team. You will be responsible for designing, implementing, and maintaining high-performance Java applications, while collaborating with cross-functional teams to deliver robust, scalable solutions. Job Responsibilities Develop, test, and maintain Java-based applications and services. Participate in software design and architecture discussions. Collaborate with front-end developers, product managers, and QA teams. Write clean, scalable, and well-documented code following best practices. Optimize application performance and troubleshoot issues as they arise. Participate in code reviews and provide constructive feedback. Maintain version control using tools like Git. Job Requirements Strong proficiency in Core Java, Spring/Spring Boot, and Hibernate. Solid understanding of OOP, design patterns, and software development best practices.
Experience with RESTful APIs, microservices, and web services. Familiarity with relational databases such as MySQL, PostgreSQL, or Oracle. Experience with build tools like Maven or Gradle. Knowledge of Git, JIRA, and CI/CD pipelines. Strong problem-solving skills and ability to work independently or as part of a team. Bachelor’s degree in Computer Science, Engineering, or related field. Experience in any cloud platform (AWS, Azure, GCP, etc.). Why Join? Exceptional team building and corporate celebrations Be part of a high-growth, fast-paced fintech environment. Flexible working arrangements and supportive culture. Opportunities to lead innovation in the online trading space. Skills: microservices, git, gcp, postgresql, restful apis, spring, design patterns, java, oop, gradle, java development, springboot, ci/cd, oracle, maven, cloud platforms, software development best practices, mysql, azure, core java, aws, relational databases, hibernate, jira, build tools

Posted 1 day ago

Apply

2.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

What You’ll Do Works in Data and Analytics under the close supervision of the team manager or senior associates Leverage coding best practices to ensure efficient execution of code against large datasets Run standard processes to ensure metrics, reports and insights are delivered consistently to stakeholders Leverage knowledge of data structures to prepare data for ingestion and analysis, assembling data from disparate data sources for the creation of insights Integrate Equifax, customer and third-party data to solve basic internal or customer analytical problems and report findings to managers and internal stakeholders Review output of code for anomalies and perform analysis to determine cause, and work with Data, Analytics, Product and Technology counterparts to implement corrective measures Support discussions on the impact and importance of findings on the business (either Equifax or external customer) Ensure proper use of Equifax data assets by working closely with data governance and compliance professionals What Experience You Need 2 years of proven experience as a Data Analyst or Data Scientist Familiarity with the BFSI or marketing analytics landscape Experience working with Python (mandatory), R, and SQL Experience using business intelligence tools (e.g. Tableau) and data frameworks (e.g. Hadoop, BigQuery) Analytical mind and business acumen Strong math skills (e.g. statistics, algebra) Problem-solving aptitude Excellent communication and presentation skills BSc/BA/BTech in Computer Science, Engineering or relevant field; graduate degree in Data Science or other quantitative STEM streams What Could Set You Apart Cloud certification such as GCP strongly preferred Self-starter Excellent communicator / client-facing Ability to work in a fast-paced environment Flexibility to work across A/NZ time zones based on project needs
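The "assembling data from disparate data sources for the creation of insights" duty above can be illustrated with a minimal, purely hypothetical pandas sketch (the tables and column names are invented): aggregate one source, join it onto another, and produce a segment-level metric ready for a BI tool such as Tableau.

```python
import pandas as pd

# Hypothetical example: a customer master table and a transactions table
# standing in for two disparate data sources.
customers = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "segment": ["retail", "retail", "corporate"],
})
transactions = pd.DataFrame({
    "customer_id": [1, 1, 2, 3, 3, 3],
    "amount": [120.0, 80.0, 300.0, 50.0, 75.0, 25.0],
})

# Aggregate transactions per customer, then join onto the customer master.
spend = transactions.groupby("customer_id", as_index=False)["amount"].sum()
report = customers.merge(spend, on="customer_id", how="left")

# Segment-level metric ready for reporting.
segment_spend = report.groupby("segment")["amount"].sum().to_dict()
print(segment_spend)  # {'corporate': 150.0, 'retail': 500.0}
```

A `how="left"` merge keeps customers with no transactions in the report, which is usually what an analyst wants when checking for anomalies.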

Posted 1 day ago

Apply

5.0 - 7.0 years

20 - 22 Lacs

Hyderabad, Bengaluru

Work from Office

Naukri logo

We are looking for an experienced and highly analytical Senior Data Analyst to join our team. In this role, you will leverage your expertise to lead complex data analysis projects, deliver actionable insights, and support strategic decision-making across the organization. You will collaborate with cross-functional teams and mentor junior analysts to drive a data-driven culture and business outcomes. Required skillsets: Experience with cloud data platforms (e.g., AWS, Azure, GCP). Familiarity with data warehousing concepts and tools. Knowledge of business intelligence (BI) best practices. Exposure to machine learning concepts and predictive analytics. Experience in [industry-specific experience, if relevant]. Responsibilities: Lead the design, implementation, and delivery of advanced data analyses and reporting solutions. Partner with business stakeholders to identify opportunities, define metrics, and translate business requirements into analytical solutions. Develop, maintain, and optimize dashboards, reports, and data visualizations for various audiences. Perform deep-dive analyses to uncover trends, patterns, and root causes in large, complex datasets. Present findings and recommendations to senior management and non-technical stakeholders. Ensure data quality, integrity, and governance across all analytics initiatives. Mentor and provide guidance to junior analysts and team members. Collaborate with data engineering and IT teams to improve data infrastructure and processes. Must Have: SQL, Databricks. Good to Have: AWS. Skills: Senior Data Analyst, AWS, Azure, GCP, SQL, Databricks, business intelligence.
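The must-have SQL skill above typically includes analytic (window) functions for deep-dive analyses. The hypothetical sketch below (invented table and data, run against SQLite purely for portability) shows the kind of ranking query an analyst might write on Databricks or a warehouse.

```python
import sqlite3

# Hypothetical trend query: rank products by revenue within each region
# using a window function.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, product TEXT, revenue REAL);
INSERT INTO sales VALUES
  ('north', 'A', 100), ('north', 'B', 250),
  ('south', 'A', 300), ('south', 'B', 150);
""")

rows = conn.execute("""
SELECT region, product, revenue,
       RANK() OVER (PARTITION BY region ORDER BY revenue DESC) AS rnk
FROM sales
ORDER BY region, rnk
""").fetchall()

for r in rows:
    print(r)
# The rank-1 rows are each region's top seller.
```

`PARTITION BY` restarts the ranking per region, which avoids the self-join a pre-window-function query would need.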

Posted 1 day ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Job Title: Data Engineer – Cloud-Agnostic, Data, Analytics & AI Product Team Location: Hyderabad Employment Type: Full-time Why this role matters Our analytics and AI products are only as good as the data they run on. You will design and operate the pipelines and micro-services that transform multi-structured data into reliable, governed, and instantly consumable assets—regardless of which cloud the customer chooses. Core Skills & Knowledge Programming: Python 3.10+, Pandas or Polars, SQL (ANSI, window functions, CTEs), basic bash. Databases & Warehouses: PostgreSQL, Snowflake (stages, tasks, streams), parquet/Delta-Lake tables on S3/ADLS/GCS. APIs & Services: FastAPI, Pydantic models, OpenAPI specs, JWT/OAuth authentication. Orchestration & Scheduling: Apache Airflow, Dagster, or Prefect; familiarity with event-driven triggers via cloud queues (SQS, Pub/Sub). Cloud Foundations: Hands-on with at least one major cloud (AWS, Azure, GCP) and willingness to write cloud-agnostic code, with a cost-aware development approach. Testing & CI/CD: pytest, GitHub Actions / Azure Pipelines; Docker-first local dev; semantic versioning. Data Governance: Basic understanding of GDPR/PII handling, role-based access, and encryption-at-rest/in-flight. Nice-to-Have / Stretch Skills Streaming ingestion with Kafka / Kinesis / Event Hub and PySpark Structured Streaming. Great Expectations, Soda, or Monte Carlo for data quality monitoring. Graph or time-series stores (Neo4j, TimescaleDB). Experience & Education 6-8 years of overall IT experience with over 4 years of relevant experience building data pipelines or back-end services in production, ideally supporting analytics or ML use-cases. Bachelor’s in Computer Science, Data Engineering, or demonstrably equivalent experience.
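The core of the role described above is turning multi-structured data into "reliable, governed" assets. A minimal, cloud-agnostic sketch of one such transform step is shown below; the record schema, masking rule, and function names are all invented for illustration, and a real pipeline would use the posting's tooling (Pydantic validation, encryption) rather than this toy masking.

```python
from dataclasses import dataclass

# Hypothetical governed output schema; PII (email) is masked per the
# posting's GDPR/PII handling expectations.
@dataclass
class GovernedRow:
    user_id: int
    email_masked: str
    amount: float

def transform(raw: list) -> tuple:
    """Validate raw records; emit governed rows plus a quarantine list."""
    good, rejects = [], []
    for rec in raw:
        try:
            local, _, domain = str(rec["email"]).partition("@")
            if not domain:
                raise ValueError("malformed email")
            good.append(GovernedRow(
                user_id=int(rec["user_id"]),
                email_masked=local[0] + "***@" + domain,  # real pipelines would hash/encrypt
                amount=float(rec["amount"]),
            ))
        except (KeyError, ValueError, TypeError):
            rejects.append(rec)  # quarantined for data-quality review
    return good, rejects

good, rejects = transform([
    {"user_id": 1, "email": "ada@example.com", "amount": "9.5"},
    {"user_id": "x", "email": "bad", "amount": 1},
])
print(len(good), len(rejects))  # 1 1
```

Keeping rejects instead of dropping them is what makes the pipeline auditable: data-quality tools like Great Expectations formalize exactly this split.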

Posted 1 day ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Company: Qualcomm India Private Limited Job Area: Engineering Group, Engineering Group > Software Engineering General Summary As a Staff Engineer – Car to Cloud test engineer, you will be responsible for designing, implementing, and executing test plans to ensure the quality and performance of cloud-interactive applications and services. Your role involves collaborating with development teams to identify and resolve issues, automating test processes, and continuously challenging and improving testing coverage and strategies. Basic Qualifications Bachelor’s degree in Engineering, Information Systems, Computer Science, or related field and 10+ years of Systems Test Engineering or related work experience. Extensive experience in end-to-end product and system testing with scalable cloud and device-based systems, including IoT and vehicle connected services. Demonstrated leadership in managing a test team, monitoring test metrics, and test reporting. Demonstrated expertise in test planning and test execution of complex technical features. Experience in Load, Performance, Scalability, and Backwards Compatibility testing. Hands-on experience with AWS, Azure, GCP or equivalent cloud systems and services. Ability to troubleshoot complex technical issues independently and identify solutions. Knowledge of configuring and managing embedded devices on Linux, Android, or QNX. Understanding and practice of Scrum and Agile methodology. Excellent communication skills and experience interacting with external customers.
Additional Qualifications The following would be considered a bonus and are not required to be eligible for interviews: Excellent programming skills in one or more programming languages (Python, Java) Deep understanding of automation testing and writing automation scripts Test experience in embedded software and operating systems like Linux/Android/QNX Master’s degree in Engineering, Information Systems, Computer Science, or related field and 7+ years of Software Test Engineering or related work experience. Minimum Qualifications Bachelor's degree in Engineering, Information Systems, Computer Science, or related field and 4+ years of Software Engineering or related work experience. OR Master's degree in Engineering, Information Systems, Computer Science, or related field and 3+ years of Software Engineering or related work experience. OR PhD in Engineering, Information Systems, Computer Science, or related field and 2+ years of Software Engineering or related work experience. 2+ years of work experience with a programming language such as C, C++, Java, Python, etc. Applicants: Qualcomm is an equal opportunity employer. If you are an individual with a disability and need an accommodation during the application/hiring process, rest assured that Qualcomm is committed to providing an accessible process. You may e-mail disability-accomodations@qualcomm.com or call Qualcomm's toll-free number found here. Upon request, Qualcomm will provide reasonable accommodations to support individuals with disabilities to be able to participate in the hiring process. Qualcomm is also committed to making our workplace accessible for individuals with disabilities. (Keep in mind that this email address is used to provide reasonable accommodations for individuals with disabilities. We will not respond here to requests for updates on applications or resume inquiries).
Qualcomm expects its employees to abide by all applicable policies and procedures, including but not limited to security and other requirements regarding protection of Company confidential information and other confidential and/or proprietary information, to the extent those requirements are permissible under applicable law. To all Staffing and Recruiting Agencies: Our Careers Site is only for individuals seeking a job at Qualcomm. Staffing and recruiting agencies and individuals being represented by an agency are not authorized to use this site or to submit profiles, applications or resumes, and any such submissions will be considered unsolicited. Qualcomm does not accept unsolicited resumes or applications from agencies. Please do not forward resumes to our jobs alias, Qualcomm employees or any other company location. Qualcomm is not responsible for any fees related to unsolicited resumes/applications. If you would like more information about this role, please contact Qualcomm Careers. 3074434

Posted 1 day ago

Apply

8.0 years

0 Lacs

Kochi, Kerala, India

On-site

Linkedin logo

Introduction Software Developers at IBM are the backbone of our strategic initiatives to design, code, test, and provide industry-leading solutions that make the world run today - planes and trains take off on time, bank transactions complete in the blink of an eye and the world remains safe because of the work our software developers do. Whether you are working on projects internally or for a client, software development is critical to the success of IBM and our clients worldwide. At IBM, you will use the latest software development tools, techniques and approaches and work with leading minds in the industry to build solutions you can be proud of. Your Role And Responsibilities The candidate is responsible for DB2 installation and configuration in the below environments: On-prem Multi-cloud Red Hat OpenShift cluster HADR Non-DPF and DPF Migration of other databases to Db2 (e.g., Teradata, Snowflake, SAP, or Cloudera to Db2 migration) Create high-level designs and detail-level designs, maintaining product roadmaps that include both modernization and leveraging cloud solutions Design scalable, performant, and cost-effective data architectures within the Lakehouse to support diverse workloads, including reporting, analytics, data science, and AI/ML. Perform health checks of the databases, make recommendations and deliver tuning at the database and system level. Deploy DB2 databases as containers within Red Hat OpenShift clusters Configure containerized database instances, persistent storage, and network settings to optimize performance and reliability. Lead the architectural design and implementation of solutions on IBM watsonx.data, ensuring alignment with overall enterprise data strategy and business objectives.
Define and optimize the watsonx.data ecosystem, including integration with other IBM watsonx components (watsonx.ai, watsonx.governance) and existing data infrastructure (DB2, Netezza, cloud data sources) Establish best practices for data modeling, schema evolution, and data organization within the watsonx.data lakehouse Act as a subject matter expert on Lakehouse architecture, providing technical leadership and guidance to data engineering, analytics, and development teams. Mentor junior architects and engineers, fostering their growth and knowledge in modern data platforms. Participate in the development of architecture governance processes and promote best practices across the organization. Communicate complex technical concepts to both technical and non-technical stakeholders. Required Technical And Professional Expertise 8+ years of experience in data architecture, data engineering, or a similar role, with significant hands-on experience in cloud data platforms Strong proficiency in DB2, SQL and Python. Strong understanding of: Database design and modelling (dimensional, normalized, NoSQL schemas) Normalization and indexing Data warehousing and ETL processes Cloud platforms (AWS, Azure, GCP) Big data technologies (e.g., Hadoop, Spark) Database migration project experience from one database to another database (target database Db2). Experience in deployment of DB2 databases as containers within Red Hat OpenShift clusters, and configuring containerized database instances, persistent storage, and network settings to optimize performance and reliability.
Excellent communication, collaboration, problem-solving, and leadership skills Preferred Technical And Professional Experience Experience with machine learning environments and LLMs Certification in IBM watsonx.data or related IBM data and AI technologies Hands-on experience with a Lakehouse platform (e.g., Databricks, Snowflake) Exposure to implementing, or an understanding of, DB replication processes Experience with integrating watsonx.data with GenAI or LLM initiatives (e.g., RAG architectures). Experience with NoSQL databases (e.g., MongoDB, Cassandra).

Posted 1 day ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Hyderabad

Work from Office

Naukri logo

Connects - We are hiring a Site Reliability Engineer with 5+ years of experience to join our team immediately! Positions available in Hyderabad (Nanakramguda). Skills: GCP - GKE (Google Kubernetes Engine) Terraform Datadog, Dynatrace or similar tools Python or any scripting language If interested in the above requirement, please reply with the below requested details at the earliest. Total Exp - Official Notice Period - Last working date (if any) - Current CTC - Expected CTC - Offers holding (if any) - Current Location - Preferred Location - Interested in 4 days Work from Office? - Date of Birth (DOB) - F2F Availability - Alternate Mobile No - Any Gap in Career / Education - Interested in (2 PM - 10 PM shift) - It's a fantastic opportunity to work with a great team. Showcase your skills and experience now! Apply Now! Regards, Deepan - TA deepankumar.j@htcinc.com

Posted 1 day ago

Apply

3.0 years

0 Lacs

Andhra Pradesh, India

On-site

Linkedin logo

Experience Good experience in API design, development, and implementation 3 years of experience with cloud platform services (preferably GCP) Hands-on experience in designing, implementing, and maintaining APIs that meet the highest standards of performance, security, and scalability. Hands-on experience designing, developing, and implementing microservices architectures and solutions using industry best practices and design patterns. Hands-on experience with cloud computing and services. Hands-on experience and proficiency in programming languages like Java, Python, JavaScript, etc. Hands-on experience with API gateway and management tools like Apigee, Kong, API Gateway. Hands-on experience with integrating APIs with a variety of systems/applications/microservices and infrastructure. Deployment experience in a cloud environment (preferably GCP) Experience in TDD/DDD and unit testing. Hands-on CI/CD experience in automating the build, test, and deployment processes to ensure rapid and reliable delivery of API updates. Technical Skills Programming & Languages: Java, GraphQL, SQL; API gateway and management tools: Apigee, API Gateway Database Tech: Oracle, Spanner, BigQuery, Cloud Storage Operating Systems: Linux Expert in API design principles, specifications and architectural styles like REST, GraphQL, and gRPC; proficiency in API lifecycle management, advanced security measures, and performance optimization. Good knowledge of security best practices and compliance awareness. Good knowledge of messaging patterns and distributed systems. Well-versed with protocols and data formats. Strong development knowledge in microservice design, architectural patterns, frameworks and libraries. Knowledge of SQL and NoSQL databases, and how to interact with them through APIs. Good-to-have knowledge of data modeling and database management: design database schemas that efficiently store and retrieve data. Scripting and configuration (e.g., YAML) knowledge.
Strong testing and debugging skills: writing unit tests and familiarity with the tools and techniques to fix issues. DevOps knowledge: CI/CD practices and tools. Familiarity with monitoring and observability platforms for real-time insights into application performance. Understanding of version control systems like Git. Familiarity with API documentation standards such as OpenAPI. Problem-solving skills and ability to work independently in a fast-paced environment. Effective communication: negotiate and communicate effectively with stakeholders to ensure API solutions meet the needs of both technical and non-technical stakeholders.
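One of the REST design principles the posting above emphasizes can be made concrete with a small, framework-free sketch: a paginated collection response with self-describing links. The endpoint path, envelope shape, and field names below are invented for illustration, not any particular API's contract.

```python
import json

# Hypothetical REST pagination envelope: data window, metadata, and
# HATEOAS-style links so clients never compute offsets themselves.
def paginated_response(items, page, per_page, base_url="/v1/widgets"):
    start = (page - 1) * per_page
    window = items[start:start + per_page]
    total = len(items)
    last_page = max(1, -(-total // per_page))  # ceiling division
    body = {
        "data": window,
        "meta": {"page": page, "per_page": per_page, "total": total},
        "links": {
            "self": f"{base_url}?page={page}",
            "next": f"{base_url}?page={page + 1}" if page < last_page else None,
        },
    }
    return json.dumps(body)

resp = json.loads(paginated_response(list(range(7)), page=2, per_page=3))
print(resp["data"], resp["links"]["next"])  # [3, 4, 5] /v1/widgets?page=3
```

In a gateway-fronted setup (Apigee, Kong) the same envelope would typically be described in an OpenAPI schema so consumers can validate it.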

Posted 1 day ago

Apply

0 years

0 Lacs

Sion, Maharashtra, India

On-site

Linkedin logo

Job Description Business Advisors shape the vision and strategy with the client, understand the needs of the users/stakeholders, carry out an elicitation of processes, data and capabilities and derive the target processes and the business requirements for the current and future solution. Job Description - Grade Specific Performs analysis of processes, systems, data and business information and research, and builds up domain knowledge. Skills (competencies) Abstract Thinking Active Listening Agile (Software Development Framework) Analytical Thinking Backlog Grooming Business Architecture Modeling Business Process Modeling (e.g. BPMN) Change Management Coaching Collaboration Commercial Acumen Conceptual Data Modeling Conflict Management Confluence Critical Thinking CxO Conversations Data Analysis Data Requirements Management Decision-Making Emotional Intelligence Enterprise Architecture Modelling Facilitation Functional IT Architecture Modelling Giving Feedback Google Cloud Platform (GCP) (Cloud Platform) Influencing Innovation Jira Mediation Mentoring Microsoft Office Motivation Negotiation Networking Power BI Presentation skills Prioritization Problem Solving Project Governance Project Management Project Planning Qlik Relationship-Building Requirements Gathering Risk Management Scope Management SQL Stakeholder Management Story Mapping Storytelling Strategic Management Strategic Thinking SWOT Analysis Systems Requirement Analysis (or Management) Tableau Trusted Advisor UI-Design / Wireframing UML User Journey User Research Verbal Communication Written Communication

Posted 1 day ago

Apply

0 years

0 Lacs

Mysore, Karnataka, India

On-site


Introduction In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. Your Role And Responsibilities As a Data Engineer at IBM, you'll play a vital role in the development, design of application, provide regular support/guidance to project teams on complex coding, issue resolution and execution. Your primary responsibilities include: Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements. Strive for continuous improvements by testing the build solution and working under an agile framework. Discover and implement the latest technologies trends to maximize and build creative solutions Preferred Education Master's Degree Required Technical And Professional Expertise Experience with Apache Spark (PySpark): In-depth knowledge of Spark’s architecture, core APIs, and PySpark for distributed data processing. Big Data Technologies: Familiarity with Hadoop, HDFS, Kafka, and other big data tools. Data Engineering Skills: Strong understanding of ETL pipelines, data modeling, and data warehousing concepts. Strong proficiency in Python: Expertise in Python programming with a focus on data processing and manipulation. Data Processing Frameworks: Knowledge of data processing libraries such as Pandas, NumPy. SQL Proficiency: Experience writing optimized SQL queries for large-scale data analysis and transformation. Cloud Platforms: Experience working with cloud platforms like AWS, Azure, or GCP, including using cloud storage systems Preferred Technical And Professional Experience Define, drive, and implement an architecture strategy and standards for end-to-end monitoring. 
Partner with the rest of the technology teams, including application development, enterprise architecture, testing services, and network engineering. Good to have: experience with detection and prevention tools for company products, the platform, and customer-facing systems.
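The ETL skills the listing above asks for boil down to one core flow: extract raw records, transform and validate them, and load an aggregate into a target store. Below is a minimal, dependency-free sketch of that flow in pure Python (PySpark would distribute the same map/filter/aggregate steps across a cluster); the column names and data are invented for illustration.

```python
import csv
import io
from collections import defaultdict

def run_etl(csv_text):
    """Minimal ETL sketch: extract rows, filter/cast, aggregate by key."""
    # Extract: parse CSV text into a list of dicts
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    # Transform: drop malformed rows, cast amount to float
    clean = []
    for r in rows:
        try:
            clean.append({"region": r["region"], "amount": float(r["amount"])})
        except (KeyError, ValueError):
            continue  # skip bad records rather than failing the whole batch
    # Load: aggregate into a target "table" (a dict) keyed by region
    totals = defaultdict(float)
    for r in clean:
        totals[r["region"]] += r["amount"]
    return dict(totals)

raw = "region,amount\nAPAC,10.5\nEMEA,7.0\nAPAC,4.5\nEMEA,not_a_number\n"
print(run_etl(raw))  # {'APAC': 15.0, 'EMEA': 7.0}
```

The skip-bad-records choice mirrors a common data-quality trade-off in batch pipelines: quarantine malformed rows instead of aborting the run.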

Posted 1 day ago

Apply

8.0 - 13.0 years

27 - 42 Lacs

Kolkata, Hyderabad, Pune

Work from Office


About Client Hiring for One of the Most Prestigious Multinational Corporations Job Title: Senior GCP Data Engineer Experience: 8 to 13 years Key Responsibilities: Design, build, and maintain scalable and reliable data pipelines on Google Cloud Platform (GCP). Develop ETL/ELT workflows using Cloud Dataflow, Apache Beam, Dataproc, BigQuery, and Cloud Composer (Airflow). Optimize performance of data processing and storage solutions (e.g., BigQuery, Cloud Storage). Collaborate with data analysts, data scientists, and business stakeholders to deliver data-driven insights. Design and implement data lake and data warehouse solutions following best practices. Ensure data quality, security, and governance across GCP environments. Implement CI/CD pipelines for data engineering workflows using tools like Cloud Build, GitLab CI, or Jenkins. Monitor and troubleshoot data jobs, ensuring reliability and timeliness of data delivery. Mentor junior engineers and participate in architectural design discussions. Technical Skills: Strong experience in Google Cloud Platform (GCP) data services: BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, Cloud Functions. Proficiency in Python and/or Java for data processing. Strong knowledge of SQL and performance tuning in large-scale environments. Hands-on experience with Apache Beam, Apache Spark, and Airflow. Solid understanding of data modeling, data warehousing, and streaming/batch processing. Experience with CI/CD, Git, and modern DevOps practices for data workflows. Familiarity with data security and compliance in cloud environments. Notice Period: Only immediate and 15-day joiners. Location: Pune, Chennai, Hyderabad, Kolkata Mode of Work: WFO (Work From Office) Thanks & Regards, SWETHA Black and White Business Solutions Pvt. Ltd. Bangalore, Karnataka, INDIA. Contact Number: 8067432433 rathy@blackwhite.in | www.blackwhite.in
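A pattern that underlies much of the BigQuery/Airflow work described above is idempotent, date-partitioned loading: each pipeline run replaces exactly one partition, so a retried task cannot double-count rows. A hedged pure-Python sketch of the idea (the record shape and field names are invented; in practice the partition would be a BigQuery table partition written by a Dataflow or Composer task):

```python
from collections import defaultdict
from datetime import date

def partition_records(records, partition_field="event_date"):
    """Group records into date partitions, mirroring a partitioned-table load."""
    partitions = defaultdict(list)
    for rec in records:
        partitions[rec[partition_field]].append(rec)
    return dict(partitions)

def load_partition(table, partitions, day):
    """Idempotent load: replace the whole partition instead of appending,
    so re-running a failed orchestrator task cannot duplicate rows."""
    table[day] = partitions.get(day, [])
    return len(table[day])

records = [
    {"event_date": date(2024, 6, 1), "amount": 5},
    {"event_date": date(2024, 6, 2), "amount": 7},
    {"event_date": date(2024, 6, 1), "amount": 3},
]
parts = partition_records(records)
table = {}
n = load_partition(table, parts, date(2024, 6, 1))
n2 = load_partition(table, parts, date(2024, 6, 1))  # retry is a safe no-op
print(n, n2)  # 2 2
```

Replace-not-append is the same contract BigQuery's `WRITE_TRUNCATE` disposition gives a partition-scoped load job.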

Posted 1 day ago

Apply

8.0 years

0 Lacs

Greater Delhi Area

Remote


About Tide At Tide, we are building a business management platform designed to save small businesses time and money. We provide our members with business accounts and related banking services, but also a comprehensive set of connected administrative solutions from invoicing to accounting. Launched in 2017, Tide is now used by over 1 million small businesses across the world and is available to UK, Indian and German SMEs. Headquartered in central London, with offices in Sofia, Hyderabad, Delhi, Berlin and Belgrade, Tide employs over 2,000 employees. Tide is rapidly growing, expanding into new products and markets and always looking for passionate and driven people. Join us in our mission to empower small businesses and help them save time and money. About The Role We are looking for a seasoned Lead Data Scientist to build and embed intelligence into the products that significantly reduce fraud and improve the FinCrime efforts at Tide . You will work closely with the FinCrime team to solve business challenges, design data products, and leverage machine learning to detect and mitigate fraud, driving meaningful impact for the business. As The Lead Data Scientist You’ll Be Collaborating with the FinCrime team: Work closely to identify and solve fraud detection problems, using data science to drive business decisions and significantly reduce fraudulent activities. Translating business requirements: Understand business needs and translate them into data products and models that address specific fraud detection challenges. Building and optimizing models: Train machine learning models, optimize hyperparameters, design KPIs, and implement experiments to improve fraud detection accuracy and business outcomes. Productionizing models: Work with machine learning engineers and data engineers to deploy models into production, ensuring they are scalable and optimized for real-time fraud detection. 
Adopting new methodologies: Lead the adoption of innovative methods and technologies, continuously improving fraud detection models and data science practices. Coaching junior data scientists: Mentor and guide junior team members, setting best practices for model development, optimization, and deployment. Being a technical subject matter expert: Serve as a subject matter expert, providing guidance on complex technical concepts related to fraud detection, machine learning, and data science. What We Are Looking For Experience: 8+ years of experience as a Data Scientist, with a proven track record in solving complex problems, particularly in fraud detection or financial crime. Machine Learning Expertise: Extensive experience in designing, developing, and deploying machine learning models to detect and mitigate fraud. You should be comfortable translating business challenges into data-driven solutions. Working with Large-Scale Data: Proficiency in handling large, tabular datasets, and applying robust techniques for data analysis and model training. Advanced Tools and Platforms: Experience with tools such as PySpark, Databricks, AWS, or GCP for processing large datasets, training models, and deploying them at scale. Production-Ready Models: Proven ability to deploy models into production environments, optimizing them for performance and scalability, while ensuring they remain effective over time. Data & Model Observability: Expertise in monitoring and maintaining the health and performance of models post-deployment to ensure continuous improvement and fraud detection accuracy. Fintech & Fraud Detection: Background in the Fintech industry, with specific experience in financial crime and fraud detection, applying data science to solve real-world business problems. Collaboration & Communication: Strong interpersonal skills to collaborate effectively with data engineers, machine learning engineers, and product managers in an agile, iterative environment. 
Ability to communicate complex insights clearly to both technical and non-technical stakeholders. What You’ll Get In Return Make work, work for you! We are embracing new ways of working and support flexible working arrangements. With our Working Out of Office (WOO) policy our colleagues can work remotely from home or anywhere in their assigned Indian state. Additionally, you can work from a different country or Indian state for 90 days of the year. Plus, you’ll get: Competitive salary Self & Family Health Insurance Term & Life Insurance OPD Benefits Mental wellbeing through Plumm Learning & Development Budget WFH Setup allowance 15 days of Privilege leaves 12 days of Casual leaves 12 days of Sick leaves 3 paid days off for volunteering or L&D activities Stock Options Tidean Ways Of Working At Tide, we champion a flexible workplace model that supports both in-person and remote work to cater to the specific needs of our different teams. While remote work is supported, we believe in the power of face-to-face interactions to foster team spirit and collaboration. Our offices are designed as hubs for innovation and team-building, where we encourage regular in-person gatherings to foster a strong sense of community. TIDE IS A PLACE FOR EVERYONE At Tide, we believe that we can only succeed if we let our differences enrich our culture. Our Tideans come from a variety of backgrounds and experience levels. We consider everyone irrespective of their ethnicity, religion, sexual orientation, gender identity, family or parental status, national origin, veteran, neurodiversity or differently-abled status. We celebrate diversity in our workforce as a cornerstone of our success. Our commitment to a broad spectrum of ideas and backgrounds is what enables us to build products that resonate with our members’ diverse needs and lives. We are One Team and foster a transparent and inclusive environment, where everyone’s voice is heard. 
Your personal data will be processed by Tide for recruitment purposes and in accordance with Tide's Recruitment Privacy Notice.
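The "design KPIs" responsibility in the role above usually starts with precision and recall at a chosen fraud-score threshold: precision limits false alerts, recall limits missed fraud. A minimal, framework-free sketch (the scores and labels are made up for illustration):

```python
def precision_recall(scores, labels, threshold):
    """Compute precision and recall for a fraud-score threshold.

    scores: model outputs in [0, 1]; labels: 1 = fraud, 0 = legitimate.
    """
    tp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 1)
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 0)
    fn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 1)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

scores = [0.95, 0.80, 0.40, 0.30, 0.90, 0.10]
labels = [1, 1, 1, 0, 0, 0]
p, r = precision_recall(scores, labels, threshold=0.5)
print(round(p, 2), round(r, 2))  # 0.67 0.67
```

Sweeping the threshold over this function is how a team trades alert volume (precision) against fraud caught (recall) when tuning a production model.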

Posted 1 day ago

Apply

8.0 years

0 Lacs

Gurgaon Rural, Haryana, India

On-site


Job Title: Senior Java Fullstack Developer Location: Gurgaon (Cybercity) Contract Duration: 3+ Months Experience: 8+ Years Work Hours: IST Shift Overview We are seeking a Senior Java Fullstack Developer with over 8 years of experience to join our team on a short-term contract basis. The ideal candidate will bring hands-on expertise in Java (11+), Spring Boot, Microservices, and cloud platforms (preferably AWS) , along with strong knowledge of front-end technologies such as React or Angular . This is a great opportunity to work on enterprise-grade solutions in a fast-paced Agile environment. Experience in the hospitality domain is a plus. Key Responsibilities Design, develop, and maintain scalable backend systems using Java 11+ and Spring Boot Build and consume RESTful APIs and develop secure, high-performance microservices Implement and manage CI/CD pipelines using Jenkins, GitLab CI, or GitHub Actions Work with AWS (preferred), Azure , or GCP for cloud-based deployment and integrations Collaborate with front-end developers to build responsive UIs using React or Angular Apply modern software engineering practices including Agile, DDD, and BFF patterns Optimize application performance, scalability, and reliability Utilize Kubernetes for container orchestration (EKS, AKS, or GKE) Integrate systems using Kafka , MQ , or other event-driven platforms Ensure code quality through automated tests, peer reviews, and version control (Git) Required Skills and Experience 8+ years of hands-on experience in Core Java development Strong expertise in Spring Boot , Spring Framework , and REST API development Experience designing and maintaining microservices-based architecture Proficiency with JPA , Hibernate , and relational databases (MS-SQL, PostgreSQL) Solid understanding of Java 11+ features such as streams, lambdas, and functional programming Experience with CI/CD pipelines and version control using Git Working knowledge of cloud platforms , especially AWS (preferred) 
Front-end development skills in React or Angular , along with HTML, CSS3/Tailwind Familiarity with Kubernetes and containerized deployment workflows Experience working with event-driven architectures and messaging systems like Kafka or MQ Strong communication, problem-solving, and teamwork skills Experience in Agile development environments Hospitality services domain knowledge is a bonus

Posted 1 day ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


🚨 We’re Hiring | GCP Data Engineer (Full-Time) 📍 Location: Chennai (Onsite) 📅 Experience: 8+ Years 💼 Notice period : 0 days 💰 Budget: Based on Experience Are you a data-driven engineer with strong hands-on experience in GCP, Python, and Big Data technologies? We’re looking for seasoned professionals to join our growing team in Chennai! 🔧 Key Skills & Experience Required: ✅ 2+ Years in GCP services: BigQuery, Dataflow, Dataproc, DataPlex, DataFusion, Cloud SQL, Cloud Storage, Redis Memory ✅ 2+ Years in Terraform, Tekton, Data Transfer Utilities ✅ 2+ Years in Git or other version control tools ✅ 2+ Years in Confluent Kafka ✅ 1+ Year in API Development ✅ 2+ Years working in Agile Frameworks ✅ 4+ Years in Python & PySpark development ✅ 4+ Years in Shell Scripting for data import/export 💡 Bonus Skills: Cloud Run DataForm Airflow Agile Software Development Methodologies 🎓 Education: Bachelor’s Degree (required) If you're passionate about data engineering and cloud-native development and ready to work on challenging, large-scale solutions — we want to hear from you! 📩 Apply Here!: rajesh@reveilletechnologies.com

Posted 1 day ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Key Responsibilities Cloud Infrastructure Management: Design, deploy, and maintain scalable, high-performance cloud environments using platforms such as AWS, Azure, or Google Cloud. Cloud Automation & Orchestration: Leverage automation tools like Terraform, Ansible, or CloudFormation to streamline provisioning and management of cloud resources. Cloud Security & Compliance: Ensure cloud environments comply with industry security standards and best practices. Implement security measures for data protection, access control, and disaster recovery. SAP Integration (if applicable): Experience in integrating or migrating SAP workloads to the cloud (AWS, Azure, etc.) or managing hybrid SAP environments. Knowledge of SAP cloud solutions, such as SAP S/4HANA Cloud, is a plus. On-Premises Environment Management: Manage and troubleshoot on-premises infrastructure, ensuring a smooth transition and integration with cloud environments where needed. Collaboration with Cross-Functional Teams: Work closely with software engineers, DevOps, network engineers, and security teams to deliver cloud solutions that meet business needs. Monitoring & Performance Optimization: Implement cloud monitoring solutions to ensure the health and performance of cloud resources. Troubleshoot and resolve issues related to cloud infrastructure. Required Skills And Experience Proven experience with cloud platforms (AWS, Azure, or GCP). Hands-on experience with cloud automation and orchestration tools (Terraform, Ansible, CloudFormation, etc.). Strong understanding of cloud security, networking, and disaster recovery principles. Basic experience with SAP or on-premises systems (SAP NetWeaver, SAP S/4HANA, etc.) is a plus. Familiarity with containerization technologies (Docker, Kubernetes) and CI/CD pipelines. Strong scripting skills (Python, Bash, PowerShell) for automation. Excellent troubleshooting and problem-solving skills. 
Ability to work in a team-oriented environment and communicate effectively with technical and non-technical stakeholders. Preferred Qualifications Experience in migrating SAP environments to the cloud. Certifications in cloud platforms (AWS Certified Solutions Architect, Azure Solutions Architect, Google Cloud Professional Cloud Architect). Knowledge of hybrid cloud architectures and management. Familiarity with infrastructure-as-code practices.
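The automation-scripting skills listed above almost always include retrying transient cloud-API failures with exponential backoff. A stdlib-only sketch, with a deliberately fake flaky call standing in for a real SDK request (the function name and return value are invented for illustration):

```python
import time

def with_backoff(call, max_attempts=5, base_delay=0.01):
    """Retry a transiently failing call with exponential backoff."""
    for attempt in range(max_attempts):
        try:
            return call()
        except ConnectionError:
            if attempt == max_attempts - 1:
                raise  # exhausted retries: surface the error
            time.sleep(base_delay * (2 ** attempt))  # 0.01, 0.02, 0.04, ...

# Fake cloud API call: fails twice, then succeeds.
attempts = {"n": 0}
def flaky_describe_instances():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("transient network error")
    return {"instances": 2}

result = with_backoff(flaky_describe_instances)
print(result, attempts["n"])  # {'instances': 2} 3
```

Production variants typically add jitter to the delay and retry only on error codes the provider documents as transient.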

Posted 1 day ago

Apply

Exploring GCP Jobs in India

The job market for Google Cloud Platform (GCP) professionals in India is rapidly growing as more and more companies are moving towards cloud-based solutions. GCP offers a wide range of services and tools that help businesses in managing their infrastructure, data, and applications in the cloud. This has created a high demand for skilled professionals who can work with GCP effectively.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Hyderabad
  4. Pune
  5. Chennai

Average Salary Range

The average salary range for GCP professionals in India varies based on experience and job role. Entry-level positions can expect a salary range of INR 5-8 lakhs per annum, while experienced professionals can earn anywhere from INR 12-25 lakhs per annum.

Career Path

Typically, a career in GCP progresses from a Junior Developer to a Senior Developer, then to a Tech Lead position. As professionals gain more experience and expertise in GCP, they can move into roles such as Cloud Architect, Cloud Consultant, or Cloud Engineer.

Related Skills

In addition to GCP, professionals in this field are often expected to have skills in:

  • Cloud computing concepts
  • Programming languages such as Python, Java, or Go
  • DevOps tools and practices
  • Networking and security concepts
  • Data analytics and machine learning

Interview Questions

  • What is Google Cloud Platform and its key services? (basic)
  • Explain the difference between Google Cloud Storage and Google Cloud Bigtable. (medium)
  • How would you optimize costs in Google Cloud Platform? (medium)
  • Describe a project where you implemented CI/CD pipelines in GCP. (advanced)
  • How does Google Cloud Pub/Sub work and when would you use it? (medium)
  • What is Cloud Spanner and how is it different from other database services in GCP? (advanced)
  • Explain the concept of IAM and how it is implemented in GCP. (medium)
  • How would you securely transfer data between different regions in GCP? (advanced)
  • What is Google Kubernetes Engine (GKE) and how does it simplify container management? (medium)
  • Describe a scenario where you used Google Cloud Functions in a project. (advanced)
  • How do you monitor performance and troubleshoot issues in GCP? (medium)
  • What is Google Cloud SQL and when would you choose it over other database options? (medium)
  • Explain the concept of VPC (Virtual Private Cloud) in GCP. (basic)
  • How do you ensure data security and compliance in GCP? (medium)
  • Describe a project where you integrated Google Cloud AI services. (advanced)
  • What is the difference between Google Cloud CDN and Google Cloud Load Balancing? (medium)
  • How do you handle disaster recovery and backups in GCP? (medium)
  • Explain the concept of auto-scaling in GCP and when it is useful. (medium)
  • How would you set up a multi-region deployment in GCP for high availability? (advanced)
  • Describe a project where you used Google Cloud Dataflow for data processing. (advanced)
  • What are the best practices for optimizing performance in Google Cloud Platform? (medium)
  • How do you manage access control and permissions in GCP? (medium)
  • Explain the concept of serverless computing and how it is implemented in GCP. (medium)
  • What is the difference between Google Cloud Identity and Access Management (IAM) and AWS IAM? (advanced)
  • How do you ensure data encryption at rest and in transit in GCP? (medium)
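For the Pub/Sub interview question above, the key idea is fan-out: a topic delivers a copy of every published message to each subscription independently. A toy in-memory sketch of that pattern (the real Cloud Pub/Sub adds durable storage, acknowledgements, and push/pull delivery; class and method names here are illustrative):

```python
from collections import defaultdict, deque

class Topic:
    """Toy pub/sub topic: every subscription receives every message."""
    def __init__(self):
        self.subscriptions = defaultdict(deque)

    def subscribe(self, name):
        self.subscriptions[name]  # create an empty queue for this subscriber

    def publish(self, message):
        for queue in self.subscriptions.values():
            queue.append(message)  # fan out a copy to each subscription

    def pull(self, name):
        q = self.subscriptions[name]
        return q.popleft() if q else None

topic = Topic()
topic.subscribe("billing")
topic.subscribe("audit")
topic.publish({"event": "order_created", "id": 42})
print(topic.pull("billing"), topic.pull("audit"), topic.pull("billing"))
# {'event': 'order_created', 'id': 42} {'event': 'order_created', 'id': 42} None
```

Because each subscription has its own queue, one slow consumer (here, "audit") never blocks another — the decoupling that makes pub/sub useful for event-driven pipelines.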

Closing Remark

As the demand for GCP professionals continues to rise in India, now is the perfect time to upskill and pursue a career in this field. By mastering GCP and related skills, you can unlock numerous opportunities and build a successful career in cloud computing. Prepare well, showcase your expertise confidently, and land your dream job in the thriving GCP job market in India.


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies