103 Bigtable Jobs - Page 2

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 6.0 years

20 - 30 Lacs

Bengaluru

Work from Office

Job Summary: We are seeking a Senior AI Engineer with deep expertise in Google Cloud Platform (GCP), particularly Vertex AI APIs, to design, develop, and deploy scalable AI/ML solutions. The ideal candidate is proficient in Python, experienced with FastAPI for building high-performance APIs, and has exposure to React UI for front-end integration.

Key Responsibilities:
- Design, implement, and optimize AI/ML models leveraging GCP Vertex AI services.
- Develop robust RESTful APIs using FastAPI for model serving and integration.
- Deploy and manage AI solutions in production environments with a strong focus on scalability and performance.
- Work closely with data scientists and ML engineers to productionize models.
- Integrate front-end solutions with APIs (React UI experience is a plus).
- Implement best practices for model versioning, monitoring, and MLOps pipelines on GCP.
- Collaborate with cross-functional teams to translate business needs into technical solutions.

Required Skills & Qualifications:
- Mandatory: strong experience with GCP (Vertex AI, BigQuery, Dataflow, Bigtable).
- Expert-level Python programming skills for ML/AI development.
- Proven experience with FastAPI for building and deploying scalable APIs.
- Understanding of MLflow, TensorFlow, PyTorch, or similar frameworks.
- Knowledge of CI/CD pipelines for ML models.
- Familiarity with React UI or similar front-end technologies (preferred, not mandatory).
- Strong problem-solving and system design skills.

Preferred Qualifications:
- Exposure to MLOps practices on cloud.
- Experience integrating AI solutions with enterprise applications.
- Knowledge of containerization (Docker, Kubernetes) for model deployment.

Education: Bachelor's or Master's degree in Computer Science, AI/ML, Data Science, or a related field.
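Since the role centers on serving Vertex AI models through FastAPI, here is a minimal sketch of that pattern. The project, region, and endpoint ID are placeholders, not details from the posting.

```python
# Minimal sketch: a FastAPI service fronting a deployed Vertex AI endpoint.
# All resource names below are illustrative assumptions.
from fastapi import FastAPI
from pydantic import BaseModel
from google.cloud import aiplatform

# Assumes Application Default Credentials are configured.
aiplatform.init(project="my-gcp-project", location="asia-south1")
endpoint = aiplatform.Endpoint("1234567890")  # ID of a deployed endpoint

app = FastAPI()

class PredictRequest(BaseModel):
    instances: list[dict]

@app.post("/predict")
def predict(req: PredictRequest):
    # Forward the payload to Vertex AI and relay its predictions.
    result = endpoint.predict(instances=req.instances)
    return {"predictions": result.predictions}
```

Run locally with `uvicorn main:app` once credentials are in place; in production the same service would typically sit behind Cloud Run or GKE.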

Posted 2 weeks ago

Apply

6.0 - 10.0 years

20 - 35 Lacs

Pune

Work from Office

We are looking for a Technical Lead with strong expertise in ETL development (Informatica), SQL, and Google Cloud Platform (GCP) services. The ideal candidate will lead data integration initiatives, design scalable ETL workflows, and ensure the delivery of high-quality data solutions for analytics and reporting.

Key Responsibilities:
- Lead and mentor a team of data engineers to design, develop, and optimize ETL workflows.
- Architect end-to-end ETL solutions using Informatica and GCP services (BigQuery, Dataflow, Cloud Storage).
- Define and enforce best practices, code standards, and review processes for ETL development.
- Collaborate with business stakeholders and data architects to translate requirements into technical solutions.
- Drive data migration, integration, and transformation for large-scale projects on GCP.
- Ensure data quality, error handling, and robust validation mechanisms across pipelines.
- Oversee code deployment processes, including version control (Git, SVN) and CI/CD integration.
- Maintain clear and detailed documentation for ETL workflows, design, and deployment.

Required Skills & Experience:
- 8+ years of experience in ETL development and data integration.
- Strong hands-on experience with Informatica.
- Expertise in GCP data services such as BigQuery, Dataflow, Cloud Composer, and Cloud Storage.
- Advanced proficiency in SQL (including PL/SQL) for data transformations and logic.
- Knowledge of ETL performance tuning, troubleshooting, and optimization techniques.
- Experience with file processing, shell scripting, and automation for ETL processes.
- Familiarity with version control tools (Git, SVN) and deployment best practices.
- Good understanding of data modeling, data governance, and pipeline architecture.

Good to Have:
- Experience building cloud-native ETL pipelines on GCP.
- Exposure to Informatica Cloud (IICS) migrations to GCP.
- Familiarity with Airflow or other orchestration tools for workflow automation.
- Knowledge of DevOps and CI/CD practices for ETL deployments.

Leadership & Soft Skills:
- Ability to lead, guide, and mentor a team of data engineers.
- Strong communication skills to collaborate with business and technical stakeholders.
- Capability to own delivery timelines and ensure adherence to quality standards.
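The posting names Cloud Composer (managed Airflow) among the orchestration tools; a minimal Airflow DAG sketch of a daily GCS-to-BigQuery load is below. The DAG id, bucket, and table names are illustrative assumptions, not details from the posting.

```python
# Sketch: a daily GCS -> BigQuery load orchestrated from Cloud Composer.
# Bucket, dataset, and table names are placeholders.
from datetime import datetime
from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="daily_sales_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_to_bq = GCSToBigQueryOperator(
        task_id="load_sales_to_bigquery",
        bucket="example-landing-bucket",
        source_objects=["sales/{{ ds }}/*.csv"],  # partitioned by execution date
        destination_project_dataset_table="analytics.sales_staging",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )
```

In an Informatica-centric stack, a DAG like this would typically wrap or sequence the Informatica workflow runs rather than replace them.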

Posted 2 weeks ago

Apply

5.0 - 10.0 years

15 - 20 Lacs

Gurugram

Work from Office

About the Role: We are seeking a Senior Big Data Engineer with deep expertise in distributed systems, batch data processing, and large-scale data pipelines. The ideal candidate has strong hands-on experience with Oozie, Pig, and the Apache Hadoop ecosystem, and programming proficiency in Java (preferred) or Python. This role requires a deep understanding of data structures and algorithms, along with a proven track record of writing production-grade code and building robust data workflows. This is a fully remote position and requires an independent, self-driven engineer who thrives in complex technical environments and communicates effectively across teams.

Key Responsibilities:
- Design and develop scalable batch processing systems using technologies like Hadoop, Oozie, Pig, Hive, MapReduce, and HBase, with hands-on coding in Java or Python (Java is a must).
- Lead Jira epics.
- Write clean, efficient, and production-ready code with a strong focus on data structures and algorithmic problem-solving applied to real-world data engineering tasks.
- Develop, manage, and optimize complex data workflows within the Apache Hadoop ecosystem, with a strong focus on Oozie orchestration and job scheduling.
- Leverage Google Cloud Platform (GCP) tools such as Dataproc, GCS, and Composer to build scalable, cloud-native big data solutions.
- Implement DevOps and automation best practices, including CI/CD pipelines, infrastructure as code (IaC), and performance tuning across distributed systems.
- Collaborate with cross-functional teams to ensure data pipeline reliability, code quality, and operational excellence in a remote-first environment.

Qualifications:
- Bachelor's degree in Computer Science, Software Engineering, or a related field of study.
- Experience with managed cloud services and an understanding of cloud-based batch processing systems are critical.
- Ability to lead Jira epics is a must.
- Proficiency in Oozie, Airflow, MapReduce, and Java is a must.
- Strong programming skills in Java (specifically Spark), Python, Pig, and SQL.
- Expertise in public cloud services, particularly GCP.
- Proficiency in the Apache Hadoop ecosystem, including Oozie, Pig, Hive, and MapReduce.
- Familiarity with Bigtable and Redis.
- Experienced in infrastructure and applied DevOps principles in daily work, using continuous integration and continuous deployment (CI/CD) tooling and Infrastructure as Code (IaC) such as Terraform to automate and improve development and release processes.
- Proven experience in engineering batch processing systems at scale.

Must Have (Important):
- 5+ years of experience in customer-facing software/technology or consulting.
- 5+ years of experience with on-premises-to-cloud migrations or IT transformations.
- 5+ years of experience building and operating solutions on GCP.
- Proficiency in Oozie and Pig.
- Ability to lead Jira epics.
- Proficiency in Java or Python.
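Pig jobs of the kind this role orchestrates are commonly run on Dataproc when the stack moves to GCP. Below is a sketch of submitting one through the GCP Python client (the role prefers Java; Python is used only to keep the sketch short). Project, region, cluster, and bucket names are placeholders.

```python
# Sketch: submit a Pig job to an existing Dataproc cluster.
# Project, region, cluster, and bucket names are illustrative assumptions.
from google.cloud import dataproc_v1

project_id, region, cluster = "my-project", "us-central1", "etl-cluster"

client = dataproc_v1.JobControllerClient(
    client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
)

pig_query = """
raw = LOAD 'gs://example-bucket/events/*' USING PigStorage(',') AS (user:chararray, ts:long);
grouped = GROUP raw BY user;
counts = FOREACH grouped GENERATE group, COUNT(raw);
STORE counts INTO 'gs://example-bucket/output/user_counts';
"""

job = {
    "placement": {"cluster_name": cluster},
    "pig_job": {"query_list": {"queries": [pig_query]}},
}
result = client.submit_job(request={"project_id": project_id, "region": region, "job": job})
print(f"Submitted job {result.reference.job_id}")
```

On-premises, the equivalent Pig script would be scheduled as an Oozie workflow action rather than submitted through a cloud API.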

Posted 2 weeks ago

Apply

5.0 - 10.0 years

3 - 6 Lacs

Hyderabad

Hybrid

Employment type: Contractual. Job Description: GCP Data Engineer with very strong Python skills; for the Dollar General account, the candidate should also be well versed in streaming platforms like Kafka and Pub/Sub. Mandatory skills: GCP data engineering and Python. Nice-to-have skills: Kubernetes, containers. Additional information: Notice period of immediate to 15 days.
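As a sketch of the Pub/Sub streaming work mentioned here, the snippet below runs a streaming-pull subscriber with the Python client. The project and subscription names are assumed placeholders.

```python
# Sketch: a streaming-pull Pub/Sub subscriber; names are placeholders.
from concurrent.futures import TimeoutError
from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path("my-project", "orders-sub")

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    # Process the payload, then acknowledge so it is not redelivered.
    print(f"Received: {message.data.decode('utf-8')}")
    message.ack()

streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
with subscriber:
    try:
        streaming_pull.result(timeout=30)  # block for 30s in this demo
    except TimeoutError:
        streaming_pull.cancel()
        streaming_pull.result()
```

The Kafka side follows the same consume-process-commit shape, just with a Kafka consumer client instead of the Pub/Sub one.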

Posted 2 weeks ago

Apply

9.0 - 14.0 years

35 - 50 Lacs

Noida, Bengaluru, Delhi / NCR

Work from Office

Design and optimize BigQuery, Dataflow, Dataproc, Pub/Sub, and Composer workloads, covering ETL/ELT, governance, security, and migration in SQL/Python. 8+ years of experience in data engineering, data architecture, or analytics, including at least 3 years in a data or solutions architect role.

Posted 2 weeks ago

Apply

5.0 - 7.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Educational Requirements: Master of Science (Technology), Master of Computer Applications, Master of Engineering, Master of Technology, Bachelor of Computer Applications, Bachelor of Science (Tech), Bachelor of Engineering, or Bachelor of Technology.

Service Line: Application Development and Maintenance.

Responsibilities: A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions, and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs on solution design based on your areas of expertise. You will plan configuration activities, configure the product as per the design, conduct conference room pilots, and assist in resolving queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates which suit the customer's budgetary requirements and are in line with the organization's financial guidelines. You will actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Ability to develop value-creating strategies and models that enable clients to innovate, drive growth, and increase business profitability.
- Good knowledge of software configuration management systems.
- Awareness of the latest technologies and industry trends.
- Logical thinking and problem-solving skills, along with an ability to collaborate.
- Understanding of financial processes for various types of projects and the various pricing models available.
- Ability to assess current processes, identify improvement areas, and suggest technology solutions.
- Knowledge of one or two industry domains.
- Client interfacing skills.
- Project and team management.

Technical and Professional Requirements:
Primary skills: Technology->Big Data->Big Table; Technology->Cloud Integration->Azure Data Factory (ADF); Technology->Data on Cloud - Platform->AWS.
Preferred skills: Technology->Data on Cloud - Platform->AWS; Technology->Cloud Integration->Azure Data Factory (ADF); Technology->Cloud Platform->Google Big Data->GCP.

Posted 2 weeks ago

Apply

8.0 - 15.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Data Architecture professional with 8-15 years of experience, your primary responsibility will be to design and implement data-centric solutions on Google Cloud Platform (GCP). You will utilize GCP tools such as BigQuery, Google Cloud Storage, Cloud SQL, Memorystore, Dataflow, Dataproc, Artifact Registry, Cloud Build, Cloud Run, Vertex AI, Pub/Sub, and GCP APIs to create efficient and scalable solutions.

Your role will involve building ETL pipelines to ingest data from diverse sources into our system and developing data processing pipelines in languages like Java and Python for extraction, transformation, and loading (ETL). You will create and maintain data models to ensure efficient storage, retrieval, and analysis of large datasets, and deploy and manage both SQL and NoSQL databases such as Bigtable, Firestore, or Cloud SQL based on project requirements. Your expertise will be crucial in optimizing data workflows for performance, reliability, and cost-effectiveness on GCP infrastructure.

You will implement version control and CI/CD practices for data engineering workflows to ensure reliable and efficient deployments, and leverage GCP monitoring and logging tools to proactively identify and address performance bottlenecks and system failures. Troubleshooting and resolving issues related to data processing, storage, and retrieval will be part of your daily tasks, along with addressing code quality issues throughout the development lifecycle using tools like SonarQube, Checkmarx, Fossa, and Cycode. Implementing security measures and data governance policies to maintain the integrity and confidentiality of data will be a critical aspect of your role.

Collaboration with stakeholders to gather and define data requirements aligned with business objectives is key to success. You will develop and maintain documentation for data engineering processes to facilitate knowledge transfer and system maintenance, participate in on-call rotations to address critical issues and ensure the reliability of data engineering systems, and provide mentorship and guidance to junior team members to foster a collaborative, knowledge-sharing environment.
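Since the role covers deploying NoSQL stores like Bigtable, here is a minimal write-and-read sketch with the Bigtable Python client. The project, instance, table, and row-key scheme are invented for illustration.

```python
# Sketch: basic Bigtable write and point read; all ids are placeholders.
from google.cloud import bigtable

client = bigtable.Client(project="my-project", admin=False)
table = client.instance("events-instance").table("user_events")

# Write one cell into column family "cf1" under a composite row key.
row = table.direct_row(b"user#1001#20240101")
row.set_cell("cf1", b"event_type", b"page_view")
row.commit()

# Point read of the same row key (returns None if the row is absent).
result = table.read_row(b"user#1001#20240101")
cell = result.cells["cf1"][b"event_type"][0]
print(cell.value.decode())
```

Row-key design (here, user id plus date) is the main schema decision in Bigtable, since all reads are keyed or range scans.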

Posted 2 weeks ago

Apply

3.0 - 5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

Your Role:
- Very good understanding of current work and the tools and technologies being used.
- Comprehensive knowledge and clarity on BigQuery, ETL, GCS, Airflow/Composer, SQL, and Python.
- Experience working with fact and dimension tables and SCD.
- Minimum 3 years' experience in GCP data engineering.
- Java/Python/Spark on GCP; programming experience in at least one of Python, Java, or PySpark, plus SQL.
- GCS (Cloud Storage), Composer (Airflow), and BigQuery experience.
- Should have worked on handling big data.

Your Profile:
- Strong data engineering experience using Java or Python or Spark on Google Cloud.
- Pipeline development experience using Dataflow or Dataproc (Apache Beam, etc.).
- Experience with other GCP services or databases like Datastore, Bigtable, Spanner, Cloud Run, Cloud Functions, etc.
- Proven analytical skills and a problem-solving attitude.
- Excellent communication skills.

What you'll love about working here: You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group, and you will get personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.

About Capgemini: Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over-55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud and data, combined with its deep industry expertise and partner ecosystem.
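Since the role calls out fact/dimension tables and slowly changing dimensions (SCD), here is a minimal sketch of an SCD Type 2 style MERGE run through the BigQuery Python client. The dataset, table, and column names are illustrative assumptions, not details from the posting.

```python
# Sketch: SCD Type 2 style MERGE into a customer dimension in BigQuery.
# Dataset/table/column names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

merge_sql = """
MERGE `analytics.dim_customer` d
USING `staging.customer_updates` s
ON d.customer_id = s.customer_id AND d.is_current = TRUE
WHEN MATCHED AND d.address != s.address THEN
  UPDATE SET d.is_current = FALSE, d.valid_to = CURRENT_DATE()
WHEN NOT MATCHED THEN
  INSERT (customer_id, address, valid_from, valid_to, is_current)
  VALUES (s.customer_id, s.address, CURRENT_DATE(), DATE '9999-12-31', TRUE)
"""
client.query(merge_sql).result()  # blocks until the job finishes
```

Note this shows only the expire-old-and-insert-new shape; a complete SCD2 load also inserts the fresh version of each changed row, typically via a second MERGE or a staged union.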

Posted 3 weeks ago

Apply

7.0 - 9.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job description: Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Senior Consultant Specialist.

In this role, you will:
- Design, develop, and deploy Java-based data pipelines using GCP technologies like Dataflow, Pub/Sub, BigQuery, and Bigtable.
- Collaborate with cross-functional pods to gather requirements and implement data solutions.
- Ensure the performance, scalability, and reliability of data pipelines handling heavy loads.
- Apply industry development tools, principles, and best practices to ensure high-quality code.
- Debug and troubleshoot issues in data pipelines and provide timely resolutions.
- Support testing efforts and ensure the accuracy and integrity of data in the pipelines.
- Stay up to date with the latest trends and technologies in data engineering and cloud platforms.
- Document processes, procedures, and code for future reference.

Requirements: To be successful in this role, you should meet the following requirements:
- Bachelor's degree in Computer Science or a related discipline.
- 7 or more years of hands-on development experience as a Java Data Engineer or in a similar role, with a focus on GCP, Dataflow, Pub/Sub, BigQuery, and Bigtable.
- Strong knowledge of Java programming and experience working on data pipelines with heavy loads.
- Ability to work with geographically distributed and cross-functional teams.
- Working knowledge of industry development tools and principles.
- Familiarity with banking or relevant industry processes and regulations is a plus.
- Knowledge of developing applications hosted on cloud platforms like GCP or any relevant cloud platform.
- Excellent problem-solving and communication skills.
- Ability to work independently and in a team, with a flexible and adaptable mindset.
- Strong attention to detail and ability to prioritize tasks effectively.

You'll achieve more when you join HSBC. www.hsbc.com/careers HSBC is committed to building a culture where all employees are valued and respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by HSBC Software Development India.
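The pipelines described here follow the classic read-from-Pub/Sub, transform, write-to-BigQuery topology. Below is a minimal Apache Beam sketch of that shape; the role specifies Java, but the Python SDK is used purely to keep the sketch compact, and the project, subscription, and table names are placeholders.

```python
# Sketch: streaming Beam pipeline, Pub/Sub -> parse -> BigQuery.
# Resource names are illustrative; a real run would add Dataflow runner options.
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

opts = PipelineOptions(streaming=True)

with beam.Pipeline(options=opts) as p:
    (
        p
        | "Read" >> beam.io.ReadFromPubSub(
            subscription="projects/my-project/subscriptions/txns-sub")
        | "Parse" >> beam.Map(lambda b: json.loads(b.decode("utf-8")))
        | "Write" >> beam.io.WriteToBigQuery(
            "my-project:payments.transactions",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )
```

The Java SDK version is structurally identical: a PubsubIO read, a ParDo transform, and a BigQueryIO write.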

Posted 3 weeks ago

Apply

9.0 - 14.0 years

30 - 45 Lacs

Noida, Bengaluru, Delhi / NCR

Work from Office

8+ years of extensive experience in data engineering, data architecture, or analytics. GCP experience using BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, and Composer; expertise in SQL, Python, ETL/ELT, IaC, security, governance, and migration.

Posted 3 weeks ago

Apply

5.0 - 8.0 years

15 - 20 Lacs

Kolkata, Pune, Delhi / NCR

Hybrid

We have an opening for Java + GCP at Pune only. Please let us know if any of these locations works for you, and we will process your profile immediately. Experience: 5-8 years. Notice period: 0-30 days. Mandatory skills: Java (Spring Boot), GCP Pub/Sub, Eventos, big data, Bigtable, BigQuery, Composer/Airflow.

Posted 3 weeks ago

Apply

3.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Your Role:
- Very good understanding of current work and the tools and technologies being used.
- Comprehensive knowledge and clarity on BigQuery, ETL, GCS, Airflow/Composer, SQL, and Python.
- Experience working with fact and dimension tables and SCD.
- Minimum 3 years' experience in GCP data engineering.
- Java/Python/Spark on GCP; programming experience in at least one of Python, Java, or PySpark, plus SQL.
- GCS (Cloud Storage), Composer (Airflow), and BigQuery experience.
- Should have worked on handling big data.

Your Profile:
- Strong data engineering experience using Java or Python or Spark on Google Cloud.
- Pipeline development experience using Dataflow or Dataproc (Apache Beam, etc.).
- Experience with other GCP services or databases like Datastore, Bigtable, Spanner, Cloud Run, Cloud Functions, etc.
- Proven analytical skills and a problem-solving attitude.
- Excellent communication skills.

What you'll love about working here: You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group, and you will get personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.

About Capgemini: Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over-55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud and data, combined with its deep industry expertise and partner ecosystem.

Posted 3 weeks ago

Apply

7.0 - 12.0 years

12 - 16 Lacs

Hyderabad

Work from Office

Engineer with 3+ years of experience working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).
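The SQL skills this listing (and the near-identical ones below) asks for look like the sketch that follows: a CTE feeding a window function, run through the BigQuery Python client. The dataset and column names are invented for illustration.

```python
# Sketch: CTE + window function over an events table in BigQuery.
# Project, dataset, and column names are placeholders.
from google.cloud import bigquery

sql = """
WITH daily AS (                                  -- CTE: per-user daily counts
  SELECT user_id, DATE(event_ts) AS day, COUNT(*) AS events
  FROM `my-project.analytics.events`
  GROUP BY user_id, day
)
SELECT
  user_id,
  day,
  events,
  SUM(events) OVER (PARTITION BY user_id ORDER BY day) AS running_events
FROM daily
ORDER BY user_id, day
"""

for row in bigquery.Client(project="my-project").query(sql).result():
    print(row.user_id, row.day, row.running_events)
```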

Posted 3 weeks ago

Apply

3.0 - 6.0 years

6 - 8 Lacs

Noida

Work from Office

Engineer with 3+ years of experience working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).

Posted 3 weeks ago

Apply

4.0 - 8.0 years

22 - 25 Lacs

Bengaluru

Work from Office

Engineer with 3+ years of experience working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).

Posted 3 weeks ago

Apply

4.0 - 8.0 years

22 - 25 Lacs

Chennai

Work from Office

Engineer with 3+ years of experience working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).

Posted 3 weeks ago

Apply

4.0 - 8.0 years

22 - 25 Lacs

Hyderabad

Work from Office

Engineer with 3+ years of experience working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).

Posted 3 weeks ago

Apply

4.0 - 8.0 years

15 - 25 Lacs

Chennai

Work from Office

Role Description: Provides leadership for the overall architecture, design, development, and deployment of a full-stack, cloud-native data analytics platform.

Responsibilities:
- Design and augment solution architecture for data ingestion, data preparation, data transformation, data load, ML & simulation modelling, Java BE & FE, state machine, API management, and intelligence consumption using data products, on cloud.
- Understand business requirements and help develop high-level and low-level data engineering and data processing documentation for the cloud-native architecture.
- Develop conceptual, logical, and physical target-state architecture, engineering, and operational specs.
- Work with the customer, users, technical architects, and application designers to define the solution requirements and structure for the platform.
- Model and design the application data structure, storage, and integration.
- Lead the database analysis, design, and build effort.
- Work with the application architects and designers to design the integration solution.
- Ensure that the database designs fulfill the requirements, including data volume, frequency needs, and long-term data growth.
- Perform data engineering tasks using Spark.
- Develop efficient frameworks for development and testing using Sqoop/NiFi/Kafka/Spark/Streaming/WebHDFS/Python to enable seamless data ingestion onto the Hadoop/BigQuery platforms.
- Enable data governance and data discovery.
- Exposure to job monitoring frameworks along with validation automation.
- Exposure to handling structured, unstructured, and streaming data.

Technical Skills:
- Experience building data platforms on cloud (data lake, data warehouse environment, Databricks).
- Strong technical understanding of data modeling, design, and architecture principles and techniques across master data, transaction data, and derived/analytic data.
- Proven background designing and implementing architectural solutions which solve strategic and tactical business needs.
- Deep knowledge of best practices through relevant experience across data-related disciplines and technologies, particularly for enterprise-wide data architectures, data management, data governance, and data warehousing.
- Highly competent in database design and data modeling.
- Strong data warehousing and business intelligence skills, including handling ELT and scalability issues for enterprise-level data warehouses and creating ETLs/ELTs to handle data from various sources and formats.
- Strong hands-on experience with programming languages like Python and Scala, with Spark and Beam.
- Solid hands-on and solution-architecting experience in cloud technologies AWS, Azure, and GCP (GCP preferred).
- Hands-on experience with data processing at scale using event-driven systems and message queues (Kafka/Flink/Spark Streaming).
- Hands-on experience with GCP services like BigQuery, Dataproc, Pub/Sub, Dataflow, Cloud Composer, API Gateway, data lake, Bigtable, Spark, Apache Beam, and feature engineering/data processing for model development.
- Experience gathering and processing raw data at scale (including writing scripts, web scraping, calling APIs, writing SQL queries, etc.).
- Experience building data pipelines for structured/unstructured, real-time/batch, and event/synchronous/asynchronous data using MQ, Kafka, and stream processing.
- Hands-on experience analyzing source-system data and data flows, working with structured and unstructured data.
- Must be very strong in writing Spark SQL queries.
- Strong organizational skills, with the ability to work autonomously as well as lead a team.
- Pleasant personality and strong communication and interpersonal skills.

Qualifications: A bachelor's degree in computer science, computer engineering, or a related discipline is required to work as a technical lead. Certification in GCP would be a big plus. Individuals in this field can further display their leadership skills by completing the Project Management Professional certification offered by the Project Management Institute.
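Given the emphasis on Spark SQL, here is a minimal PySpark sketch of registering a DataFrame as a temp view and querying it. The bucket path, file schema, and column names are illustrative assumptions.

```python
# Sketch: Spark SQL over a CSV source; path and columns are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("spark-sql-sketch").getOrCreate()

orders = spark.read.option("header", True).csv("gs://example-bucket/orders/*.csv")
orders.createOrReplaceTempView("orders")  # expose the DataFrame to SQL

top_products = spark.sql("""
    SELECT product_id, COUNT(*) AS order_count
    FROM orders
    GROUP BY product_id
    ORDER BY order_count DESC
    LIMIT 10
""")
top_products.show()
```

On GCP this would typically run on Dataproc, with the same code unchanged apart from cluster submission.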

Posted 4 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Karnataka

On-site

You should have at least 6+ years of experience in Java, Spring Boot, microservices, ReactJS, and product development and sustenance. Troubleshooting and debugging existing code will be part of your responsibilities. It is essential to be proficient in code quality, security compliance, and application performance management. Your role will also involve participation in the agile planning process and estimation of planned tasks. Good verbal and written communication skills are necessary, along with expertise in unit testing (JUnit).

As part of your key responsibilities and deliverables, you will be responsible for feature implementation and delivering production-ready code. Technical documentation and system diagrams, debugging reports and fixes, as well as performance optimizations, will also be expected from you.

Qualifications and Experience:
- 6+ years of experience in developing and designing software applications using Java
- Expert understanding of core computer science fundamentals such as data structures, algorithms, and concurrent programming
- Experience in analyzing, designing, implementing, and troubleshooting software solutions for highly transactional systems
- Proficiency in OOAD and design principles, implementing microservices architecture using technologies including JEE, Spring, Spring Boot, Spring Cloud, Hibernate, Oracle, Cloud SQL PostgreSQL, Bigtable, BigQuery, NoSQL, Git, IntelliJ IDEA, Pub/Sub, Dataflow
- Experience working in native and hybrid cloud environments
- Familiarity with Agile development methodology
- Strong collaboration and communication skills to work effectively across product and technology teams
- Ability to translate strategic priorities into scalable and user-centric solutions
- Detail-oriented problem solver with excellent communication skills and a can-do attitude
- Experience with Java, Java IDEs like Eclipse or IntelliJ, Java EE application servers, object-oriented design, Git, Maven, scripting languages, JSON, XML, YAML, Terraform, etc.

Preferred Skills/Experience:
- Experience with Agile Scrum methodologies, continuous integration systems like Jenkins or GitHub CI, and SAFe methodologies
- Deep knowledge of creating secure solutions by design, multi-threaded backend environments, and tools/languages like Ruby, Python, Perl, Node.js, bash scripting, Spring, Spring Boot, C, C++, Docker, Kubernetes, Oracle, etc.

Working with GlobalLogic offers a culture of caring, learning and development opportunities, interesting and meaningful work, balance and flexibility, and a high-trust organization. You'll have the chance to collaborate with innovative clients and work on cutting-edge solutions that shape the world today. GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner known for creating impactful digital products and experiences, collaborating with clients to transform businesses through intelligent products and services.

Posted 1 month ago

Apply

6.0 - 10.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Java Developer with over 6 years of experience in Java, Spring Boot, microservices, and ReactJS, you will be responsible for troubleshooting and debugging existing code as necessary. Your proficiency in ensuring code quality, security compliance, and application performance management will be crucial to the success of the projects. You will actively participate in the agile planning process and estimate planned tasks while possessing good verbal and written communication skills. Additionally, your expertise in unit testing, particularly with JUnit, will be essential in ensuring the overall quality of the software.

Your key responsibilities will include feature implementation and delivering production-ready code, along with creating technical documentation and system diagrams. You will also be tasked with generating debugging reports, implementing fixes, and optimizing performance to enhance the overall efficiency of the systems.

To excel in this role, you should have a solid foundation in core computer science fundamentals, including data structures, algorithms, and concurrent programming. Your experience should demonstrate a deep understanding of software design principles, microservices architecture, and technologies such as JEE, Spring, Hibernate, Oracle, Cloud SQL PostgreSQL, Bigtable, BigQuery, NoSQL, Git, IntelliJ IDEA, Pub/Sub, and Dataflow. Experience with native and hybrid cloud environments, Agile development methodologies, and proficiency in programming languages like Python and Java will be beneficial.

You are expected to collaborate effectively with the product and technology teams, translating strategic priorities into scalable and user-centric solutions. Your attention to detail and problem-solving skills will be critical in addressing complex issues and delivering effective solutions. Strong communication skills and a proactive, team-oriented attitude are essential for success in this role.

Preferred skills and experience include familiarity with Agile Scrum methodologies, continuous integration systems like Jenkins or GitHub CI, SAFe methodologies, and creating secure solutions by design. Experience with multi-threaded backend environments, Docker, Kubernetes, and scripting languages like Ruby, Python, Perl, Node.js, and bash will be advantageous.

At GlobalLogic, we value a culture of caring, continuous learning and development, meaningful work, balance, flexibility, and integrity. As part of our team, you will have the opportunity to work on impactful projects, grow personally and professionally, and collaborate with forward-thinking clients on cutting-edge solutions that shape the world today. Join us and be a part of our commitment to engineering impact and transforming businesses through intelligent digital products and services.

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

Sabre is a technology company that powers the global travel industry. By leveraging next-generation technology, we create global technology solutions that take on the biggest opportunities and solve the most complex challenges in travel. Positioned at the center of the travel industry, we shape the future by offering innovative advancements that pave the way for a more connected and seamless ecosystem. Our solutions power mobile apps, online travel sites, airline and hotel reservation networks, travel agent terminals, and many other platforms, connecting people with moments that matter.

Sabre is seeking a talented senior software engineer to join the SabreMosaic team as a Senior Data Science Engineer. In this role, you will plan, design, develop, and test data science and data engineering software systems or applications for software enhancements and new products based on cloud-based solutions.

Role and Responsibilities:
- Develop, code, test, and debug new complex data-driven software solutions or enhancements to existing products.
- Design, plan, develop, and improve applications using advanced cloud-native technology.
- Work on issues requiring in-depth knowledge of organizational objectives and implement strategic policies in selecting methods and techniques.
- Encourage high coding standards, best practices, and high-quality output.
- Interact regularly with subordinate supervisors, architects, product managers, HR, and others on project or team performance matters.
- Provide technical mentorship and cultural/competency-based guidance to teams.
- Offer larger business/product context and mentor on specific tech stacks/technologies.

Qualifications and Education Requirements:
- Minimum 4-6 years of related experience as a full-stack developer.
- Expertise in Data Engineering/DW projects with Google Cloud-based solutions.
- Experience designing and developing enterprise data solutions on the GCP cloud platform.
- Experience with relational and NoSQL databases like Oracle, Spanner, BigQuery, etc.
- Expert-level SQL skills for data manipulation and validation.
- Experience in designing data models, data warehouses, data lakes, and analytics platforms on GCP.
- Expertise in designing ETL data pipelines and data processing architectures for data warehouses.
- Strong experience in designing star and snowflake schemas and knowledge of dimensional data modeling.
- Collaboration with data scientists, data teams, and engineering teams using the Google Cloud platform for data analysis and data modeling.
- Familiarity with integrating datasets from multiple sources for data modeling for analytical and AI/ML models.
- Understanding of and experience in Pub/Sub, Kafka, Kubernetes, GCP, AWS, Hive, and Docker.
- Expertise in Java Spring Boot, Python, or other programming languages used for data engineering and integration projects.
- Strong problem-solving and analytical skills.
- Exposure to AI/ML, MLOps, and Vertex AI is an advantage.
- Familiarity with DevOps practices like CI/CD pipelines.
- Airline domain experience is a plus.
- Excellent spoken and written communication skills.
- GCP Cloud Data Engineer Professional certification is a plus.

We will carefully consider your application and review your details against the position criteria. Only candidates who meet the minimum criteria for the role will proceed in the selection process.
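Given the emphasis on star/snowflake schemas and dimensional modeling, below is a sketch of the kind of star-schema query such a warehouse serves, run through the BigQuery Python client. The travel_dw dataset and all table and column names are invented for illustration.

```python
# Sketch: star-schema query joining a fact table to two dimensions.
# Dataset, table, and column names are placeholders.
from google.cloud import bigquery

sql = """
SELECT
  d.calendar_month,
  c.segment,
  SUM(f.booking_amount) AS total_bookings
FROM `travel_dw.fact_bookings` f
JOIN `travel_dw.dim_date` d ON f.date_key = d.date_key
JOIN `travel_dw.dim_customer` c ON f.customer_key = c.customer_key
GROUP BY d.calendar_month, c.segment
ORDER BY d.calendar_month
"""
for row in bigquery.Client(project="my-project").query(sql).result():
    print(row.calendar_month, row.segment, row.total_bookings)
```

The narrow fact table keyed to wide dimensions is exactly the shape star-schema design optimizes for: aggregations stay on the fact table while filters and labels come from the dimensions.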

Posted 1 month ago

Apply

5.0 - 10.0 years

10 - 16 Lacs

Bengaluru

Work from Office

Description: 5+ years of experience in Java, Spring Boot, microservices, ReactJS, and product development and sustenance. Troubleshooting and debugging of existing code when required. Proficient in code quality, security compliance, and application performance management. Participation in the agile planning process and estimation of planned tasks. Good verbal and written communication skills. Good expertise in unit testing (JUnit).

Requirements: Qualifications & Experience
• 5+ years of experience developing and designing software applications using Java
• Expert understanding of core computer science fundamentals including data structures, algorithms, and concurrent programming
• Expert in analyzing, designing, implementing, and troubleshooting software solutions for highly transactional systems
• Expert in OOAD and design principles, implementing microservices architecture using JEE, Spring, Spring Boot, Spring Cloud, Hibernate, Oracle, Cloud SQL PostgreSQL, Bigtable, BigQuery, NoSQL, Git, IntelliJ IDEA, Pub/Sub, Dataflow
• Experience working in native and hybrid cloud environments
• Experience with Agile development methodology
• Proficiency in agile software development including technical skillsets such as programming (e.g., Python, Java), multi-tenant cloud technologies, and product management tools (e.g., Jira)
• Strong collaboration and communication skills to work effectively across the product team with product and technology team members and clearly articulate technical ideas
• Ability to translate strategic priorities, as features and user stories, into scalable solutions that are structured, efficient, and user-centric
• Detail-oriented problem solver who can break down complex issues to deliver effectively
• Excellent communicator and team player with a can-do attitude
• Ability to analyze user and business requirements to create technical design requirements and software architecture
• Experience must also include: Java; a Java IDE like Eclipse or IntelliJ; Java EE application servers like Apache Tomcat; object-oriented design, Git, Maven, and a popular scripting language; JSON, XML, YAML, and Terraform scripting

Preferred Skills/Experience:
• Champion of Agile Scrum methodologies
• Experience with continuous integration systems like Jenkins or GitHub CI
• Experience with SAFe methodologies
• Deep knowledge and understanding of how to create secure solutions by design
• Multi-threaded backend environments with concurrent users
• Experience with tools or languages like Ruby, Python, Perl, Node.js and bash scripting; Spring, Spring Boot; C, C++, Java and Java EE; Oracle; Docker; Kubernetes

Job Responsibilities: Key Responsibilities & Deliverables
• Feature implementation and production-ready code
• Technical documentation and system diagrams
• Debugging reports and fixes
• Performance optimizations

What We Offer:
Exciting Projects: We focus on industries like high-tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers or client facilities!
Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toastmasters), stress management programs, professional certifications, and technical and soft-skill trainings.
Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.
Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidized rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks, and the GL Club, where you can enjoy coffee or tea with colleagues over a game of table tennis, plus discounts at popular stores and restaurants!

Posted 1 month ago

Apply

5.0 - 7.0 years

5 - 14 Lacs

Pune, Gurugram, Bengaluru

Work from Office

• Hands-on experience in object-oriented programming using Python, PySpark, APIs, SQL, BigQuery, and GCP • Building data pipelines for huge volumes of data • Dataflow, Dataproc, and BigQuery • Deep understanding of ETL concepts

Posted 1 month ago

Apply

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

As a GCP Senior Data Engineer/Architect, you will play a crucial role in our team by designing, developing, and implementing robust and scalable data solutions on the Google Cloud Platform (GCP). Collaborating closely with architects and business analysts, especially for our US clients, you will translate data requirements into effective technical solutions.

Your responsibilities will include designing and implementing scalable data warehouse and data lake solutions, orchestrating complex data pipelines, leading cloud data lake implementation projects, participating in cloud migration projects, developing containerized applications, optimizing SQL queries, writing automation scripts in Python, and utilizing GCP data services such as BigQuery, Bigtable, and Cloud SQL.

Your expertise in data warehouse and data lake design and implementation, experience in data pipeline development and tuning, hands-on involvement in cloud migration and data lake projects, proficiency in Docker and GKE, strong SQL and Python scripting skills, and familiarity with GCP services like BigQuery, Cloud SQL, Dataflow, and Composer will be essential for this role. Additionally, knowledge of data governance principles, experience with dbt, and the ability to work effectively within a team and adapt to project needs are highly valued. Strong communication skills, willingness to work UK shift timings, and openness to giving and receiving feedback are important traits that will contribute to your success in this role.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Thiruvananthapuram, Kerala

On-site

You will be part of a dynamic team at Equifax, where we are seeking creative, high-energy, and driven software engineers with hands-on development skills to contribute to various significant projects. As a software engineer at Equifax, you will have the opportunity to work with cutting-edge technology alongside a talented group of engineers. This role is perfect for you if you are a forward-thinking, committed, and enthusiastic individual who is passionate about technology.

Your responsibilities will include designing, developing, and operating high-scale applications across the entire engineering stack. You will be involved in all aspects of software development, from design and testing to deployment, maintenance, and continuous improvement. By utilizing modern software development practices such as serverless computing, microservices architecture, CI/CD, and infrastructure-as-code, you will contribute to the integration of our systems with existing internal systems and tools. Additionally, you will participate in technology roadmap discussions and architecture planning to translate business requirements and vision into actionable solutions.

Working within a closely-knit, globally distributed engineering team, you will be responsible for triaging product or system issues and resolving them efficiently to ensure the smooth operation and quality of our services. Managing project priorities, deadlines, and deliverables will be a key part of your role, along with researching, creating, and enhancing software applications to advance Equifax solutions.

To excel in this position, you should have a Bachelor's degree or equivalent experience, along with at least 7 years of software engineering experience. Proficiency in mainstream Java, Spring Boot, and TypeScript/JavaScript, as well as hands-on experience with cloud technologies such as GCP, AWS, or Azure, is essential. You should also have a solid background in designing and developing cloud-native solutions and microservices using Java, Spring Boot, GCP SDKs, and GKE/Kubernetes. Experience in deploying and releasing software using Jenkins CI/CD pipelines, infrastructure-as-code concepts, Helm charts, and Terraform constructs is highly valued. Moreover, being a self-starter who can adapt to changing priorities with minimal supervision could set you apart in this role.

Additional advantageous skills include designing big data processing solutions, UI development, backend technologies like Java/J2EE and Spring Boot, source code control management systems, build tools, working in Agile environments, relational databases, and automated testing. If you are ready to take on this exciting opportunity and contribute to Equifax's innovative projects, apply now and be part of our team of forward-thinking software engineers.

Posted 1 month ago

Apply