
103 Bigtable Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

7.0 - 12.0 years

7 - 17 Lacs

Pune, Chennai, Bengaluru

Work from Office

• Hands-on experience in object-oriented programming using Python, PySpark, APIs, SQL, BigQuery, GCP
• Building data pipelines for huge volumes of data
• Dataflow, Dataproc, and BigQuery
• Deep understanding of ETL concepts
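For a posting like this, the core skill is wiring Dataflow pipelines into BigQuery. As a minimal sketch only (not this employer's codebase): an Apache Beam pipeline in Python that reads newline-delimited JSON from Cloud Storage and appends it to a BigQuery table. The bucket, project, table, and schema names are invented placeholders.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Hypothetical locations -- substitute real project/bucket/table names.
SOURCE = "gs://example-bucket/events/*.json"
TABLE = "example-project:analytics.events"

def run():
    options = PipelineOptions(
        runner="DataflowRunner",   # or "DirectRunner" for local testing
        project="example-project",
        region="asia-south1",
        temp_location="gs://example-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText(SOURCE)       # one JSON object per line
            | "Parse" >> beam.Map(json.loads)
            | "Write" >> beam.io.WriteToBigQuery(
                TABLE,
                schema="event_id:STRING,ts:TIMESTAMP,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )

if __name__ == "__main__":
    run()
```

Submitted with the DataflowRunner the same code scales out on Dataflow; with the DirectRunner it runs locally for testing.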

Posted 1 day ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

You will have the opportunity to work at Capgemini, a company that empowers you to shape your career according to your preferences. You will be part of a collaborative community of colleagues worldwide, where you can reimagine what is achievable and contribute to unlocking the value of technology so that leading organizations can build a more sustainable and inclusive world.

Your Role:
- A very good understanding of the current work, tools, and technologies being used.
- Comprehensive knowledge and clarity on BigQuery, ETL, GCS, Airflow/Composer, SQL, and Python.
- Experience with Fact and Dimension tables and SCD.
- Minimum 3 years of experience in GCP Data Engineering.
- Java/Python/Spark on GCP, with programming experience in Python, Java, or PySpark, plus SQL.
- Hands-on experience with GCS (Cloud Storage), Composer (Airflow), and BigQuery.
- Ability to handle big data efficiently.

Your Profile:
- Strong data engineering experience using Java or Python, or Spark on Google Cloud.
- Experience in pipeline development using Dataflow or Dataproc (Apache Beam, etc.).
- Familiarity with other GCP services and databases such as Datastore, Bigtable, Spanner, Cloud Run, and Cloud Functions.
- Proven analytical skills and a problem-solving attitude.
- Excellent communication skills.

What you'll love about working here:
- You can shape your career with a range of career paths and internal opportunities within the Capgemini group.
- Access to comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work.
- The opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.

About Capgemini:
Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. With a diverse team of over 340,000 members in more than 50 countries, Capgemini leverages its over 55-year heritage to unlock the value of technology for clients across the entire breadth of their business needs. The company delivers end-to-end services and solutions, combining strengths from strategy and design to engineering, fueled by market-leading capabilities in AI, generative AI, cloud, and data, along with deep industry expertise and a strong partner ecosystem.
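To ground the GCS-plus-BigQuery requirement, here is a hedged sketch of a batch load using the google-cloud-bigquery client; the project, bucket, and dataset names are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

# Load partitioned Parquet exports from Cloud Storage into a BigQuery table,
# replacing the previous contents.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)
load_job = client.load_table_from_uri(
    "gs://example-bucket/exports/orders/*.parquet",
    "example-project.analytics.orders",
    job_config=job_config,
)
load_job.result()  # blocks until the load job finishes

table = client.get_table("example-project.analytics.orders")
print(f"Loaded {table.num_rows} rows")
```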

Posted 2 days ago

Apply

4.0 - 8.0 years

0 Lacs

Maharashtra

On-site

As a talented Full Stack Developer with expertise in Node.js and React.js, you will develop and maintain scalable web applications, ensuring high performance across both frontend and backend.

**Key Responsibilities:**
- Develop and maintain scalable, high-performance web applications using React.js for the frontend and Node.js (Express/Next.js) for the backend.
- Design, develop, and manage RESTful APIs and integrate third-party services.
- Implement authentication and authorization (OAuth, JWT, Firebase Auth, etc.).
- Optimize applications for speed and scalability.
- Deploy and manage applications on Google Cloud Platform (GCP) services like Cloud Functions, App Engine, Pub/Sub, and Firestore.
- Write clean, maintainable, and well-documented code following best practices.
- Work with CI/CD pipelines and containerization tools (Docker, Kubernetes).
- Debug, troubleshoot, and enhance application performance.
- Collaborate with front-end developers, DevOps, and other stakeholders.

**Qualifications Required:**
- Strong proficiency in JavaScript/TypeScript.
- Hands-on experience with React.js and Next.js for frontend development.
- Experience with RESTful APIs, GraphQL, and WebSockets.
- Hands-on experience with relational (PostgreSQL, MySQL) and NoSQL (MongoDB, Bigtable, Firestore) databases.
- Familiarity with Google Cloud Platform (GCP) services like Cloud Functions, Cloud Run, Firestore, and Pub/Sub.
- Knowledge of CI/CD pipelines (GitHub Actions, GitLab CI/CD, Jenkins, etc.).
- Strong understanding of security best practices (data encryption, authentication, authorization).

In addition, familiarity with serverless computing (Cloud Functions, AWS Lambda, Firebase), knowledge of performance tuning and application monitoring tools like Stackdriver, Prometheus, or Datadog, and exposure to Bun as an alternative runtime are desirable. Looking forward to receiving your updated resume at ruchita.parsekar@e-stonetech.com.

Posted 3 days ago

Apply

7.0 - 12.0 years

25 - 27 Lacs

Hyderabad

Work from Office

3+ years of experience as an engineer working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in the SQL language (CTEs, window functions, aggregate functions, etc.).

Keywords: dataproc, pyspark, data flow, kafka, cloud storage, terraform, oops, cloud spanner, hadoop, java, hive, spark, mapreduce, big data, gcp, aws, javascript, mysql, postgresql, sql server, oracle, bigtable, software development, sql*, python development*, python*, bigquery*, pandas*
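The SQL window-function and PySpark requirements describe the same idea in two dialects. As an illustrative sketch (paths and column names invented), deduplicating to the latest record per key with a PySpark window, the direct analogue of a SQL ROW_NUMBER() CTE:

```python
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("window-demo").getOrCreate()

# Hypothetical input: one row per order event, possibly with duplicates.
orders = spark.read.parquet("gs://example-bucket/orders/")

# Equivalent of SQL ROW_NUMBER() OVER (PARTITION BY order_id ORDER BY updated_at DESC):
# keep only the latest event per order_id.
w = Window.partitionBy("order_id").orderBy(F.col("updated_at").desc())
latest = (
    orders
    .withColumn("rn", F.row_number().over(w))
    .filter(F.col("rn") == 1)
    .drop("rn")
)

# Aggregate-function example: total amount per day.
daily = latest.groupBy(F.to_date("updated_at").alias("day")) \
              .agg(F.sum("amount").alias("total_amount"))
daily.show()
```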

Posted 4 days ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Bengaluru

Work from Office

About the Team
When 5% of Indian households shop with us, it's important to build resilient systems to manage millions of orders every day. We've done this with zero downtime! Sounds impossible? Well, that's the kind of engineering muscle that has helped Meesho become the e-commerce giant it is today. We value speed over perfection and see failures as opportunities to become better. We've taken steps to inculcate a strong Founder's Mindset across our engineering teams, helping us grow and move fast. We place special emphasis on the continuous growth of each team member, with regular 1-1s and open communication. As a Database Engineer II, you will be part of a team of self-starters who thrive on teamwork and constructive feedback. We know how to party as hard as we work! If we aren't building unparalleled tech solutions, you can find us debating the plot points of our favorite books and games or gossiping over chai. So, if a day filled with building impactful solutions with a fun team sounds appealing to you, join us.

About the Role
As a Database Engineer II, you'll proactively establish and implement best NoSQL database engineering practices. You'll work with different NoSQL technologies at large scale and collaborate closely with other engineering teams, building seamless collaborations across the organization. Proficiency in emerging technologies and the ability to work successfully with a team are key to success in this role.

What you will do
- Manage, maintain, and monitor a multitude of relational/NoSQL database clusters, ensuring obligations to SLAs.
- Manage both in-house and SaaS solutions in the public cloud (or third party).
- Diagnose, mitigate, and communicate database-related issues to relevant stakeholders.
- Design and implement best practices for planning, provisioning, tuning, upgrading, and decommissioning database clusters.
- Understand the cost-optimization aspects of such tools/software and implement cost-control mechanisms with continuous improvement.
- Advise and support product, engineering, and operations teams.
- Maintain general backup/recovery/DR of data solutions.
- Work with the engineering and operations teams to automate new approaches for scalability, reliability, and performance.
- Perform R&D on new features and innovative solutions.
- Participate in on-call rotations.

What you will need
- 5+ years of experience provisioning and managing relational/NoSQL databases.
- Proficiency in two or more of: MySQL, PostgreSQL, Bigtable, Elasticsearch, MongoDB, Redis, ScyllaDB.
- Proficiency in the Python programming language.
- Experience with deployment orchestration, automation, and security configuration management (Jenkins, Terraform, Ansible).
- Hands-on experience with Amazon Web Services (AWS) / Google Cloud Platform (GCP).
- Comfortable working in Linux/Unix environments.
- Knowledge of the TCP/IP stack, load balancers, and networking.
- Proven ability to drive projects to completion.
- A degree in computer science, software engineering, information technology, or a related field is an advantage.
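On the monitoring side of a role like this, a rough Python sketch (endpoints and thresholds invented for illustration) that pings a fleet of Redis nodes and flags slow or unreachable ones:

```python
import time

import redis  # redis-py

# Hypothetical fleet of Redis endpoints to probe.
NODES = [("cache-a.internal", 6379), ("cache-b.internal", 6379)]
LATENCY_BUDGET_MS = 5.0

def check_node(host: str, port: int) -> None:
    client = redis.Redis(host=host, port=port, socket_timeout=2)
    start = time.perf_counter()
    client.ping()
    elapsed_ms = (time.perf_counter() - start) * 1000
    used = client.info("memory")["used_memory_human"]
    status = "OK" if elapsed_ms <= LATENCY_BUDGET_MS else "SLOW"
    print(f"{host}:{port} {status} ping={elapsed_ms:.2f}ms mem={used}")

for host, port in NODES:
    try:
        check_node(host, port)
    except redis.RedisError as exc:
        print(f"{host}:{port} DOWN ({exc})")
```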

Posted 5 days ago

Apply

3.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Choosing Capgemini means choosing a company where you will be empowered to shape your career the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

Job Role
- Very good understanding of the current work and the tools and technologies being used.
- Comprehensive knowledge and clarity on BigQuery, ETL, GCS, Airflow/Composer, SQL, and Python.
- Experience working with Fact and Dimension tables and SCD.
- Minimum 3 years' experience in GCP Data Engineering.
- Java/Python/Spark on GCP, with programming experience in at least one of Python, Java, or PySpark, plus SQL.
- GCS (Cloud Storage), Composer (Airflow), and BigQuery experience.
- Should have worked on handling big data.

Your Profile
- Strong data engineering experience using Java or Python, or Spark on Google Cloud.
- Pipeline development experience using Dataflow or Dataproc (Apache Beam, etc.).
- Any other GCP services or databases such as Datastore, Bigtable, Spanner, Cloud Run, Cloud Functions, etc.
- Proven analytical skills and a problem-solving attitude.
- Excellent communication skills.

What you'll love about working here
- You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group, along with personalized career guidance from our leaders.
- Comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work.
- The opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.

About Capgemini
Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud, and data, combined with its deep industry expertise and partner ecosystem.

Posted 5 days ago

Apply

3.0 - 6.0 years

6 - 8 Lacs

Noida

Work from Office

3+ years of experience as an engineer working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in the SQL language (CTEs, window functions, aggregate functions, etc.).

Keywords: Python Development, Python, BigQuery, Pandas, Dataproc, PySpark, Data Flow, Kafka, Cloud Storage, Terraform, Oops, Cloud Spanner, Hadoop, Java, Hive, Spark, MapReduce, Big Data, GCP, AWS, JavaScript, MySQL, PostgreSQL, SQL Server, Oracle, Bigtable, Software Development, SQL*

Posted 5 days ago

Apply

4.0 - 8.0 years

22 - 25 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

3+ years of experience as an engineer working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in the SQL language (CTEs, window functions, aggregate functions, etc.).

Keywords: dataproc, pyspark, data flow, kafka, cloud storage, terraform, oops, cloud spanner, hadoop, java, hive, spark, mapreduce, big data, gcp, aws, javascript, mysql, postgresql, sql server, oracle, bigtable, software development, sql*, python development*, python*, bigquery*, pandas*

Posted 5 days ago

Apply

10.0 - 14.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

Role Overview:
You are required to work as a GCP Data Architect with 12+ years of total experience, of which at least 10 years must be relevant to the engagement. Your primary responsibilities include maintaining architecture principles, guidelines, and standards; data warehousing; programming in Python/Java; and working with Big Data, Data Analytics, and GCP services. You will design and implement solutions across technology domains related to Google Cloud Platform data components such as BigQuery, Bigtable, Cloud SQL, Dataproc, Dataflow, and Data Fusion.

Key Responsibilities:
- Maintain architecture principles, guidelines, and standards.
- Work on data warehousing projects.
- Program in Python and Java for various data-related tasks.
- Utilize Big Data technologies for data processing and analysis.
- Implement solutions using GCP services such as BigQuery, Bigtable, Cloud SQL, Dataproc, Dataflow, and Data Fusion.

Qualifications Required:
- Strong experience in Big Data, including data modeling, design, architecting, and solutioning.
- Proficiency in programming languages such as SQL, Python, and R/Scala.
- Good Python skills, with experience in data visualization tools such as Google Data Studio or Power BI.
- Knowledge of A/B testing, statistics, Google Cloud Platform, Google BigQuery, Agile development, DevOps, data engineering, and ETL data processing.
- Migration experience of a production Hadoop cluster to Google Cloud is an added advantage.

Additional Company Details:
The company is looking for experts in BigQuery, Dataproc, Data Fusion, Dataflow, Bigtable, Firestore, Cloud SQL, Cloud Spanner, Google Cloud Storage, Cloud Composer, and Cloud Interconnect. Relevant certifications such as Google Professional Cloud Architect are preferred.
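To make the Dataproc-plus-BigQuery combination concrete, a hedged sketch of a PySpark job using the spark-bigquery connector; it assumes the connector jar is available on the Dataproc cluster, and all table and bucket names are placeholders:

```python
from pyspark.sql import SparkSession

# Assumes submission to Dataproc with the spark-bigquery connector on the
# classpath (e.g. via --jars or a connector-enabled image version).
spark = SparkSession.builder.appName("bq-from-dataproc").getOrCreate()

df = (
    spark.read.format("bigquery")
    .option("table", "example-project.analytics.events")  # hypothetical table
    .load()
)

summary = df.groupBy("event_type").count()

# Writing back to BigQuery needs a temporary GCS bucket for the connector.
(
    summary.write.format("bigquery")
    .option("table", "example-project.analytics.event_counts")
    .option("temporaryGcsBucket", "example-temp-bucket")
    .mode("overwrite")
    .save()
)
```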

Posted 5 days ago

Apply

5.0 - 7.0 years

13 - 17 Lacs

Bengaluru

Work from Office

Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer, etc. Must have Python and SQL work experience; proactive and collaborative, with the ability to respond to critical situations. Able to analyse data for functional business requirements and to front-face the customer.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.

Preferred technical and professional experience:
- Intuitive individual with an ability to manage change and proven time management.
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed.
- Keeps technical knowledge up to date by attending educational workshops and reviewing publications.
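Pub/Sub is the messaging backbone among the services listed. A small sketch, with project, topic, and subscription names invented, of publishing and synchronously pulling messages with the google-cloud-pubsub client:

```python
from google.cloud import pubsub_v1

PROJECT = "example-project"        # hypothetical identifiers
TOPIC = "orders"
SUBSCRIPTION = "orders-analytics"

# Publish a message with one attribute.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT, TOPIC)
future = publisher.publish(topic_path, b'{"order_id": 42}', source="web")
print("published message id:", future.result())

# Pull up to 10 messages synchronously and acknowledge them.
subscriber = pubsub_v1.SubscriberClient()
sub_path = subscriber.subscription_path(PROJECT, SUBSCRIPTION)
response = subscriber.pull(request={"subscription": sub_path, "max_messages": 10})
for msg in response.received_messages:
    print("got:", msg.message.data)
if response.received_messages:
    subscriber.acknowledge(
        request={
            "subscription": sub_path,
            "ack_ids": [m.ack_id for m in response.received_messages],
        }
    )
```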

Posted 6 days ago

Apply

5.0 - 10.0 years

10 - 15 Lacs

Bengaluru, Secunderabad

Work from Office

Description: Join GlobalLogic to be a valued part of the team working on a huge software project for a world-class company providing M2M / IoT 4G/5G modules to, for example, the automotive, healthcare, and logistics industries. Through our engagement, we contribute to developing the end-user modules' firmware, implementing new features, maintaining compatibility with the newest telecommunication and industry standards, and performing analysis and estimation of customer requirements.

Requirements: Qualifications & Experience
• 6+ years of experience developing and designing software applications using Java
• Expert understanding of core computer science fundamentals, including data structures, algorithms, and concurrent programming
• Expert in analyzing, designing, implementing, and troubleshooting software solutions for highly transactional systems
• Expert in OOAD and design principles, implementing microservices architecture using JEE, Spring, Spring Boot, Spring Cloud, Hibernate, Oracle, Cloud SQL PostgreSQL, Bigtable, BigQuery, NoSQL, Git, IntelliJ IDEA, Pub/Sub, and Dataflow
• Experience working in native and hybrid cloud environments
• Experience with Agile development methodology
• Proficiency in agile software development, including technical skill sets such as programming (e.g., Python, Java), multi-tenant cloud technologies, and product management tools (e.g., Jira)
• Strong collaboration and communication skills to work effectively across the product team and clearly articulate technical ideas
• Ability to translate strategic priorities as features and user stories into scalable solutions that are structured, efficient, and user-centric
• Detail-oriented problem solver who can break down complex issues to deliver effectively
• Excellent communicator and team player with a can-do attitude
• Ability to analyze user and business requirements to create technical design requirements and software architecture
• Experience must also include: Java; a Java IDE like Eclipse or IntelliJ; Java EE application servers like Apache Tomcat; object-oriented design, Git, Maven, and a popular scripting language; JSON, XML, YAML, and Terraform scripting

Preferred Skills/Experience:
• Champion of Agile Scrum methodologies
• Experience with continuous integration systems like Jenkins or GitHub CI
• Experience with SAFe methodologies
• Deep knowledge and understanding of creating secure solutions by design
• Multi-threaded backend environments with concurrent users
• Experience with tools or languages like: Ruby, Python, Perl, Node.js, and bash scripting; Spring and Spring Boot; C, C++, Java, and Java EE; Oracle; Docker; Kubernetes

Job Responsibilities: Key Responsibilities & Deliverables
• Feature implementation and production-ready code
• Technical documentation and system diagrams
• Debugging reports and fixes
• Performance optimizations

What We Offer:
Exciting Projects: We focus on industries like High-Tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers or client facilities!
Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toastmasters), stress management programs, professional certifications, and technical and soft-skill trainings.
Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.
Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidised rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks, and a GL Club where you can have coffee or tea with colleagues over a game of table tennis, plus discounts at popular stores and restaurants!

Posted 6 days ago

Apply

10.0 - 15.0 years

20 - 27 Lacs

bengaluru, secunderabad

Work from Office

Description: Join GlobalLogic to be a valued part of the team working on a huge software project for a world-class company providing M2M / IoT 4G/5G modules to, for example, the automotive, healthcare, and logistics industries. Through our engagement, we contribute to developing the end-user modules' firmware, implementing new features, maintaining compatibility with the newest telecommunication and industry standards, and performing analysis and estimation of customer requirements.

Requirements: Qualifications & Experience
• 8+ years of experience developing and designing software applications using Java
• Expert understanding of core computer science fundamentals, including data structures, algorithms, and concurrent programming
• Expert in analyzing, designing, implementing, and troubleshooting software solutions for highly transactional systems
• Expert in OOAD and design principles, implementing microservices architecture using JEE, Spring, Spring Boot, Spring Cloud, Hibernate, Oracle, Cloud SQL PostgreSQL, Bigtable, BigQuery, NoSQL, Git, IntelliJ IDEA, Pub/Sub, and Dataflow
• Experience working in native and hybrid cloud environments
• Experience with Agile development methodology
• Proficiency in agile software development, including technical skill sets such as programming (e.g., Python, Java), multi-tenant cloud technologies, and product management tools (e.g., Jira)
• Strong collaboration and communication skills to work effectively across the product team and clearly articulate technical ideas
• Ability to translate strategic priorities as features and user stories into scalable solutions that are structured, efficient, and user-centric
• Detail-oriented problem solver who can break down complex issues to deliver effectively
• Excellent communicator and team player with a can-do attitude
• Ability to analyze user and business requirements to create technical design requirements and software architecture
• Experience must also include: Java; a Java IDE like Eclipse or IntelliJ; Java EE application servers like Apache Tomcat; object-oriented design, Git, Maven, and a popular scripting language; JSON, XML, YAML, and Terraform scripting

Preferred Skills/Experience:
• Champion of Agile Scrum methodologies
• Experience with continuous integration systems like Jenkins or GitHub CI
• Experience with SAFe methodologies
• Deep knowledge and understanding of creating secure solutions by design
• Multi-threaded backend environments with concurrent users
• Experience with tools or languages like: Ruby, Python, Perl, Node.js, and bash scripting; Spring and Spring Boot; C, C++, Java, and Java EE; Oracle; Docker; Kubernetes

Job Responsibilities:
• 8+ years of experience in software engineering and enterprise architecture
• Proven leadership in guiding high-performing engineering teams
• Deep expertise in designing and managing API-first platforms and developer ecosystems
• Experience with fintech and banking standards, including ISO 20022, Open Banking, and FDX
• Strong understanding of cloud-native technologies such as GCP, AWS, Kubernetes, containers, and service mesh
• Proficient in distributed systems, multi-tenancy, and event-driven architecture
• Hands-on experience integrating AI/ML into product and platform architecture
• Knowledge of DevOps practices, including CI/CD tools like GitHub, Jenkins, and Google Cloud Build
• Strong collaboration and communication skills to lead cross-functional initiatives
• Automation-first mindset with a drive to optimize delivery pipelines and eliminate manual processes

Key Responsibilities & Deliverables
• Feature implementation and production-ready code
• Technical documentation and system diagrams
• Debugging reports and fixes
• Performance optimizations

What We Offer:
Exciting Projects: We focus on industries like High-Tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers or client facilities!
Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toastmasters), stress management programs, professional certifications, and technical and soft-skill trainings.
Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.
Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidised rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks, and a GL Club where you can have coffee or tea with colleagues over a game of table tennis, plus discounts at popular stores and restaurants!

Posted 6 days ago

Apply

2.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Senior Data Engineer with 5-8 years of IT experience, including 2-3 years focused on GCP data services, you will be a valuable addition to our dynamic data and analytics team. Your primary responsibility will be to design, develop, and implement robust and insightful data-intensive solutions using GCP Cloud services. The role requires a deep understanding of data engineering, proficiency in SQL, and extensive experience with GCP services such as BigQuery, Dataflow, Datastream, Pub/Sub, Dataproc, Cloud Storage, and other key GCP services for data pipeline orchestration. You will be instrumental in the construction of a GCP-native cloud data platform.

Key Responsibilities:
- Lead and contribute to the development, deployment, and lifecycle management of applications on GCP, utilizing services like Compute Engine, Kubernetes Engine (GKE), Cloud Functions, Cloud Run, Pub/Sub, BigQuery, Cloud SQL, Cloud Storage, and more.

Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Technology, Data Analytics, or a related field.
- 5-8 years of overall IT experience, with hands-on experience designing and developing data applications on GCP.
- In-depth expertise in GCP services and architectures, including compute, storage and databases, data and analytics, and operations and monitoring.
- Proven ability to translate business requirements into technical solutions.
- Strong analytical, problem-solving, and critical thinking skills.
- Effective communication and interpersonal skills for collaboration with technical and non-technical stakeholders.
- Experience in Agile development methodology.
- Ability to work independently, manage multiple priorities, and meet deadlines.

Preferred Skills (Nice to Have):
- Experience with other hyperscalers.
- Proficiency in Python or other scripting languages for data manipulation and automation.

If you are a highly skilled and experienced Data Engineer with a passion for leveraging GCP data services to drive innovation, we invite you to apply for this exciting opportunity in Gurugram or Hyderabad.
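Since Cloud Storage sits under most of the listed pipeline services, a brief hedged sketch using the google-cloud-storage client; the bucket and object names are placeholders:

```python
from google.cloud import storage

client = storage.Client(project="example-project")  # hypothetical project
bucket = client.bucket("example-bucket")

# Upload a local file as an object.
blob = bucket.blob("raw/2024-06-01/events.json")
blob.upload_from_filename("events.json")

# List everything under the raw/ prefix.
for obj in client.list_blobs("example-bucket", prefix="raw/"):
    print(obj.name, obj.size)

# Download the object back to disk.
bucket.blob("raw/2024-06-01/events.json").download_to_filename("events_copy.json")
```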

Posted 6 days ago

Apply

0.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Title: Python Support Engineer

Must-have skills:
- Monitor and maintain the availability of GKE-based applications in a high-pressure production environment.
- Respond to and resolve incidents and service requests related to application functionality and performance.
- Collaborate with development teams to troubleshoot and resolve technical issues in a timely manner.
- Document support processes, procedures, and troubleshooting steps for future reference.
- Participate in the on-call rotation, including off-hours, to provide after-hours support as needed.
- Communicate effectively with stakeholders to provide updates on issue resolution and status.
- Experience with monitoring tools and incident management systems.
- Ability to analyze logs, identify patterns, and trace system failures.
- Solid experience in SQL and database querying for debugging and reporting.
- Experience with monitoring/alerting tools on GCP.

Good to have:
- Strong in Python, with production-level experience.
- Strong in FastAPI development and deployment practices.
- Experience with Google Kubernetes Engine (GKE), including workload deployment, autoscaling, and tuning.
- GCP experience with Cloud Functions, Pub/Sub, Dataflow, Composer, Bigtable, and BigQuery.
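Because the role pairs FastAPI with GKE workload tuning, here is a minimal hedged sketch of a FastAPI service exposing the liveness and readiness endpoints a GKE deployment would probe; the endpoint names and startup check are illustrative, not this team's actual service:

```python
import logging

from fastapi import FastAPI, Response, status

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("support-api")

app = FastAPI()
ready = {"ok": False}  # flipped once dependencies are reachable

@app.on_event("startup")
async def warm_up() -> None:
    # A real service would verify DB / Pub/Sub connectivity here.
    ready["ok"] = True
    log.info("service ready")

@app.get("/healthz")  # liveness probe target
async def healthz() -> dict:
    return {"status": "alive"}

@app.get("/readyz")   # readiness probe target
async def readyz(response: Response) -> dict:
    if not ready["ok"]:
        response.status_code = status.HTTP_503_SERVICE_UNAVAILABLE
    return {"ready": ready["ok"]}

# Run locally with: uvicorn main:app --host 0.0.0.0 --port 8080
```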

Posted 6 days ago

Apply

2.0 - 6.0 years

0 Lacs

Maharashtra

On-site

As a passionate Software Engineer with a proven track record of solving complex problems and driving innovation, you will have the opportunity to work at ReliaQuest and be part of a team that is reshaping the future of threat detection and response. Your role will involve developing cutting-edge security technology, creating REST APIs, and integrating various products to enhance our customers' threat detection capabilities. By working closely with talented individuals, you will contribute directly to the growth and success of our organization.

Your responsibilities will include researching and developing solutions across a range of advanced technologies, managing deployment processes, conducting code reviews, and automating software development lifecycle stages. Collaboration with internal and external stakeholders will be crucial to ensure the effective utilization of our products. Additionally, you will support your team members and foster a culture of continuous collaboration.

To excel in this role, you should have 2-4 years of software development experience in languages and technologies such as Python, JavaScript, React, Angular, Java, C#, MySQL, Elasticsearch, or equivalent. Proficiency in both written and verbal English is essential for effective communication. What sets you apart is hands-on experience with technologies like Elasticsearch, Kafka, Apache Spark, Logstash, Hadoop/Hive, TensorFlow, Kibana, Athena/Presto/Bigtable, Angular, and React. Familiarity with cloud platforms such as AWS, GCP, or Azure, as well as knowledge of unit testing, continuous integration, deployment practices, and Agile methodology, will be advantageous. Higher education or relevant certifications will further distinguish you as a candidate.

Posted 1 week ago

Apply

5.0 - 7.0 years

13 - 17 Lacs

Chennai

Work from Office

Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer, etc. Must have Python and SQL work experience; proactive and collaborative, with the ability to respond to critical situations. Able to analyse data for functional business requirements and to front-face the customer.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Log Explorer.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.

Preferred technical and professional experience:
- Intuitive individual with an ability to manage change and proven time management.
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed.
- Keeps technical knowledge up to date by attending educational workshops and reviewing publications.

Posted 1 week ago

Apply

3.0 - 5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Software Engineer III at JPMorgan Chase within the Consumer and Community Banking - Banking and Wealth Management Team, you serve as a seasoned member of an agile team to design and deliver trusted market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Job responsibilities:
- Executes software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems.
- Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems.
- Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development.
- Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems.
- Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture.
- Contributes to software engineering communities of practice and events that explore new and emerging technologies.
- Adds to a team culture of diversity, opportunity, inclusion, and respect.

Required qualifications, capabilities, and skills:
- Formal training or certification on software engineering concepts and 3+ years of applied experience.
- Hands-on experience with cloud-based applications, technologies, and tools, including deployment, monitoring, and operations (e.g., Kubernetes, Prometheus, FluentD, Slack, Elasticsearch, Grafana, Kibana).
- Experience developing and managing relational and NoSQL database operations, leveraging key event streaming, messaging, and DB services such as Cassandra, MQ/JMS/Kafka, Aurora, RDS, Cloud SQL, Bigtable, DynamoDB, MongoDB, Cloud Spanner, Kinesis, and Cloud Pub/Sub.
- Networking (security, load balancing, network routing protocols, etc.).
- Demonstrated experience in the fields of production engineering and automation.
- Strong understanding of cloud technology standards and practices.
- Proficiency in utilizing tools for monitoring, analysis, and troubleshooting, including Splunk, Dynatrace, Datadog, or equivalent.

Preferred qualifications, capabilities, and skills:
- Ability to conduct detailed analysis of incidents to identify patterns and trends, thereby enhancing operational stability and efficiency.
- Familiarity with digital certificate management and automation tools.
- Knowledge of frameworks such as CI/CD pipelines.
- Excellent communication and collaboration skills.
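Given the Prometheus/Grafana-style observability stack named above, here is a hedged sketch of instrumenting a Python process with prometheus_client; the metric names and port are invented:

```python
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

# Hypothetical metrics for a request-serving process.
REQUESTS = Counter("app_requests_total", "Total requests", ["outcome"])
LATENCY = Histogram("app_request_seconds", "Request latency in seconds")

@LATENCY.time()
def handle_request() -> None:
    time.sleep(random.uniform(0.01, 0.1))  # stand-in for real work
    outcome = "ok" if random.random() > 0.05 else "error"
    REQUESTS.labels(outcome=outcome).inc()

if __name__ == "__main__":
    start_http_server(9100)  # Prometheus scrapes http://host:9100/metrics
    while True:
        handle_request()
```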

Posted 1 week ago

Apply

7.0 - 12.0 years

3 - 5 Lacs

Gurgaon, Haryana, India

On-site

Experienced Data Engineer with hands-on experience with GCP offerings, particularly BigQuery, Bigtable, and PySpark. Has worked on prior data engineering projects leveraging GCP product offerings. Strong SQL background. Prior Amex experience is a big plus.
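Since this page aggregates Bigtable roles, a compact hedged sketch of the google-cloud-bigtable client writing and reading a cell; the project, instance, and table identifiers are placeholders:

```python
from google.cloud import bigtable
from google.cloud.bigtable import row_filters

# Hypothetical instance/table identifiers.
client = bigtable.Client(project="example-project", admin=False)
table = client.instance("example-instance").table("events")

# Write one cell. (Entity-prefixed row keys are a common Bigtable schema
# pattern; kept simple here.)
row = table.direct_row(b"user#42")
row.set_cell("stats", b"clicks", b"17")
row.commit()

# Read it back, keeping only the latest version of each cell.
got = table.read_row(b"user#42", filter_=row_filters.CellsColumnLimitFilter(1))
if got is not None:
    print(got.cells["stats"][b"clicks"][0].value)
```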

Posted 1 week ago

Apply

10.0 - 12.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job description
Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Senior Software Engineer.

In this role, you will:
- Work with a high-performance decisioning data store for aggregations, custom views, and orchestration of data on the cloud.
- Automate the continuous integration / continuous delivery pipeline within a DevOps product/service team, driving a culture of continuous improvement.
- Ensure service resilience, service sustainability, and recovery time objectives are met for all the software solutions delivered.
- Collaborate with central teams (architecture, security, engineering, networks).
- Implement DevOps/automation on GCP according to system requirements.
- Resolve all technical and process blockers for cloud migration/adoption.
- Ensure deployments, patch management, and repaving of infrastructure are all under control.
- Maintain SLAs, troubleshoot the workflow, and ensure the system is up and running at all times.

Requirements
To be successful in this role, you should meet the following requirements:
- Minimum experience of at least 10+ years.
- Good experience in the production support domain, with the flexibility to work on production issues, even at odd hours.
- Capable of taking on leadership roles such as ITSO.
- Solid cloud knowledge, especially of a distributed tech stack - Java, Unix, Windows, SQL/Oracle.
- Experience working on large-scale, complex global programmes.
- Coding experience in GCP programming (Python preferred) and BQ SQL.
- Good knowledge of containers, Docker, and Kubernetes (preferred).
- Knowledge of DevOps processes and automation (e.g. Jenkins pipelines).
- Knowledge of GCP services like BigQuery, Bigtable, Dataproc, Dataflow, Pub/Sub, etc. preferred.
- Experience with the ITIL framework (incident, problem, and change management know-how).
- Able to work in a team located across multiple countries/regions.
- Willingness to adapt and learn new things; takes ownership of tasks.

You'll achieve more when you join HSBC. www.hsbc.com/careers HSBC is committed to building a culture where all employees are valued and respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by - HSBC Software Development India

Posted 1 week ago

Apply

5.0 - 7.0 years

13 - 17 Lacs

Bengaluru

Work from Office

Skilled Multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error reporting, Log explorer etc. Must have Python and SQL work experience & Proactive, collaborative and ability to respond to critical situation Ability to analyse data for functional business requirements & front face customer Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise 5 to 7 years of relevant experience working as technical analyst with Big Query on GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error reporting, Log explorer You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting edge technologies Ambitious individual who can work under their own direction towards agreed targets/goals and with creative approach to work Preferred technical and professional experience Create up to 3 bullets maxitive individual with an ability to manage change and proven time management Proven interpersonal skills while contributing to team effort by accomplishing related results as needed Up-to-date technical knowledge by attending educational workshops, reviewing publications (encouraging then to focus on required skills)

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

This role is for a GCP Data Engineer who can build cloud analytics platforms to meet expanding business requirements with speed and quality using lean Agile practices. You will analyse and manipulate large datasets supporting the enterprise by activating data assets to support Enabling Platforms and Analytics in GCP. You will be responsible for designing the transformation and modernization on GCP. Experience with large-scale solutions and operationalizing data warehouses, data lakes, and analytics platforms on Google Cloud Platform or another cloud environment is a must. We are looking for candidates with a broad set of technology skills across these areas who can demonstrate an ability to design the right solutions with the appropriate combination of GCP and third-party technologies for deployment on Google Cloud Platform.

Responsibilities:
- Develop technical solutions for Data Engineering and work between 1 PM and 10 PM IST to enable more overlap time with European and North American counterparts. This role works closely with teams in the US and Europe to ensure robust, integrated migration aligned with Global Data Engineering patterns and standards.
- Design and deploy data pipelines with automated data lineage.
- Develop reusable Data Engineering patterns.
- Design and build production data engineering solutions to deliver pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine.
- Ensure timely migration of the Ford Credit Europe (FCE) Teradata warehouse to GCP, enabling Teradata platform decommissioning by end of 2025, with a strong focus on continued, robust, and accurate regulatory reporting capability.

Position Opportunities:
The Data Engineer role within FC Data Engineering supports the following opportunities for successful individuals:
- Key player in a high-priority program to unlock the potential of Data Engineering products and services and secure operational resilience for Ford Credit Europe.
- Explore and implement leading-edge technologies, tooling, and software development best practices.
- Experience of managing data warehousing and product delivery within a financially regulated environment.
- Experience of collaborative development practices within an open-plan, team-designed environment.
- Experience of working with third-party suppliers / supplier management.
- Continued personal and professional development, with support and encouragement for further certification.

Qualifications
Essential:
- 5+ years of experience in data engineering, with a focus on data warehousing and ETL development (including data modelling, ETL processes, and data warehousing principles).
- 5+ years of SQL development experience.
- 3+ years of cloud experience (GCP preferred) with solutions designed and implemented at production scale.
- Strong understanding of key GCP services, especially those related to data processing (batch/real time) leveraging Terraform, BigQuery, Dataflow, Data Fusion, Dataproc, Cloud Build, Airflow, and Pub/Sub, alongside storage including Cloud Storage, Bigtable, and Cloud Spanner.
- Excellent problem-solving skills, with the ability to design and optimize complex data pipelines.
- Strong communication and collaboration skills, capable of working effectively with both technical and non-technical stakeholders as part of a large, global, diverse team.
- Experience developing with microservice architecture from a container orchestration framework.
- Designing pipelines and architectures for data processing.
- Strong evidence of self-motivation to continuously develop your own engineering skills and those of the team.
- Proven record of working autonomously in areas of high ambiguity, without day-to-day supervisory support.
- Evidence of a proactive mindset to problem solving and a willingness to take the initiative.
- Strong prioritization, coordination, organizational, and communication skills, and a proven ability to balance workload and competing demands to meet deadlines.

Desired:
- Professional certification in GCP (e.g., Professional Data Engineer).
- Data engineering or development experience gained in a regulated, financial environment.
- Experience with Teradata-to-GCP migrations is a plus.
- Strong expertise in SQL and experience with programming languages such as Python, Java, and/or Apache Beam.
- Experience coaching and mentoring Data Engineers.
- Experience with data security, governance, and compliance best practices in the cloud.
- An understanding of current architecture standards and digital platform services strategy.
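Cloud Composer (Airflow) recurs throughout these requirements. A minimal hedged DAG sketch, with the DAG id, bucket, and table names invented, that loads partitioned Parquet from GCS into BigQuery each day:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="daily_orders_load",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load = GCSToBigQueryOperator(
        task_id="gcs_to_bq",
        bucket="example-bucket",
        source_objects=["orders/{{ ds }}/*.parquet"],
        source_format="PARQUET",
        destination_project_dataset_table="example-project.analytics.orders",
        write_disposition="WRITE_APPEND",
    )
    notify = BashOperator(
        task_id="notify",
        bash_command="echo 'orders load finished for {{ ds }}'",
    )
    load >> notify
```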

Posted 1 week ago

Apply

5.0 - 7.0 years

20 - 25 Lacs

Chennai

Work from Office

Position Description:
Representing the Ford Credit (FC) Data Engineering Organization as a Google Cloud Platform (GCP) Data Engineer specializing in migration and transformation, you will be a developer on a global team building a complex data warehouse on Google Cloud Platform. This role involves designing, implementing, and optimizing data pipelines, ensuring data integrity during migration, and leveraging GCP services to enhance data transformation processes for scalability and efficiency. This role is for a GCP Data Engineer who can build cloud analytics platforms to meet expanding business requirements with speed and quality using lean Agile practices. You will analyze and manipulate large datasets supporting the enterprise by activating data assets to support Enabling Platforms and Analytics in GCP. You will be responsible for designing the transformation and modernization on GCP. Experience with large-scale solutions and operationalizing data warehouses, data lakes, and analytics platforms on Google Cloud Platform or another cloud environment is a must. We are looking for candidates with a broad set of technology skills across these areas who can demonstrate an ability to design the right solutions with the appropriate combination of GCP and third-party technologies for deployment on Google Cloud Platform.

Experience Required:
• 5+ years of experience in data engineering, with a focus on data warehousing and ETL development (including data modelling, ETL processes, and data warehousing principles).
• 5+ years of SQL development experience.
• 3+ years of cloud experience (GCP preferred) with solutions designed and implemented at production scale.
• Strong understanding and experience of key GCP services, especially those related to data processing (batch/real time) leveraging Terraform, BigQuery, Dataflow, Data Fusion, Dataproc, Cloud Build, Airflow, and Pub/Sub, alongside storage including Cloud Storage, Bigtable, and Cloud Spanner.
• Experience developing with microservice architecture from a container orchestration framework.
• Designing pipelines and architectures for data processing.
• Excellent problem-solving skills, with the ability to design and optimize complex data pipelines.
• Strong communication and collaboration skills, capable of working effectively with both technical and non-technical stakeholders as part of a large, global, diverse team.
• Strong evidence of self-motivation to continuously develop your own engineering skills and those of the team.
• Proven record of working autonomously in areas of high ambiguity, without day-to-day supervisory support.
• Evidence of a proactive mindset to problem solving and a willingness to take the initiative.
• Strong prioritization, coordination, organizational, and communication skills, and a proven ability to balance workload and competing demands to meet deadlines.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

9 - 16 Lacs

Gurugram, Chennai, Bengaluru

Work from Office

Qualifications: The candidate should have extensive production experience (3-4 years) in GCP; other cloud experience would be a strong bonus. Exposure to enterprise application development is a must.

Roles & Responsibilities:
- 3-6 years of IT experience is preferred.
- Able to effectively use GCP managed services, e.g., Dataproc, Dataflow, Pub/Sub, Cloud Functions, BigQuery, GCS - at least 4 of these services.
- Good to have knowledge of Cloud Composer, Cloud SQL, Bigtable, and Cloud Functions.
- Strong experience in Big Data technologies - Hadoop, Sqoop, Hive, and Spark - including DevOps.
- Good hands-on expertise in either Python or Java programming.
- Good understanding of GCP core services like Google Cloud Storage, Google Compute Engine, Cloud SQL, and Cloud IAM.
- Good to have knowledge of GCP services like App Engine, GKE, Cloud Run, Cloud Build, and Anthos.
- Ability to drive the deployment of customers' workloads into GCP, providing guidance, a cloud adoption model, service integrations, appropriate recommendations to overcome blockers, and technical roadmaps for GCP cloud implementations.
- Experience with technical solutions based on industry standards using GCP IaaS, PaaS, and SaaS capabilities.
- Extensive, real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technologies.
- Act as a subject-matter expert or developer around GCP and become a trusted advisor to multiple teams.
- Technical ability to become certified in required GCP technical certifications.
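Cloud Functions is one of the managed services named above; here is a hedged sketch of an HTTP-triggered function using functions-framework and the BigQuery client, with the default table name invented:

```python
import functions_framework
from google.cloud import bigquery

client = bigquery.Client()

# HTTP-triggered Cloud Function: returns the row count of a table passed
# as a query parameter. Illustration only -- real code should validate
# the table name rather than interpolating it into SQL.
@functions_framework.http
def row_count(request):
    table = request.args.get("table", "example-project.analytics.orders")
    job = client.query(f"SELECT COUNT(*) AS n FROM `{table}`")
    n = list(job.result())[0]["n"]
    return {"table": table, "rows": int(n)}

# Deployed (as an assumption about the setup) with something like:
#   gcloud functions deploy row_count --runtime=python311 --trigger-http
```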

Posted 2 weeks ago

Apply

2.0 - 4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Associate

Job Description & Summary:
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC:
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes, and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences, or status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities: Cloud Data Engineer (AWS/Azure/Databricks/GCP)
Experience: 2-4 years in Data Engineering

Job Description:
We are seeking skilled and dynamic Cloud Data Engineers specializing in AWS, Azure, Databricks, and GCP. The ideal candidate will have a strong background in data engineering, with a focus on data ingestion, transformation, and warehousing. They should also possess excellent knowledge of PySpark or Spark, and a proven ability to optimize performance in Spark job executions.
- Design, build, and maintain scalable data pipelines for a variety of cloud platforms including AWS, Azure, Databricks, and GCP.
- Implement data ingestion and transformation processes to facilitate efficient data warehousing.
- Utilize cloud services to enhance data processing capabilities:
  - AWS: Glue, Athena, Lambda, Redshift, Step Functions, DynamoDB, SNS.
  - Azure: Data Factory, Synapse Analytics, Functions, Cosmos DB, Event Grid, Logic Apps, Service Bus.
  - GCP: Dataflow, BigQuery, Dataproc, Cloud Functions, Bigtable, Pub/Sub, Data Fusion.
- Optimize Spark job performance to ensure high efficiency and reliability.
- Stay proactive in learning and implementing new technologies to improve data processing frameworks.
- Collaborate with cross-functional teams to deliver robust data solutions.
- Work on Spark Streaming for real-time data processing as necessary.

Qualifications:
- 2-4 years of experience in data engineering with a strong focus on cloud environments.
- Proficiency in PySpark or Spark is mandatory.
- Proven experience with data ingestion, transformation, and data warehousing.
- In-depth knowledge and hands-on experience with cloud services (AWS/Azure/GCP).
- Demonstrated ability in performance optimization of Spark jobs.
- Strong problem-solving skills and the ability to work independently as well as in a team.
- Cloud certification (AWS, Azure, or GCP) is a plus.
- Familiarity with Spark Streaming is a bonus.

Mandatory skill sets: Python, PySpark, SQL with (AWS or Azure or GCP)
Preferred skill sets: Python, PySpark, SQL with (AWS or Azure or GCP)
Years of experience required: 2-4 years
Education qualification: BE/BTech, ME/MTech, MBA, MCA
Degrees/Field of Study required: Bachelor of Engineering, Master of Business Administration, Master of Engineering, Bachelor of Technology
Required Skills: PySpark, Python (Programming Language), Structured Query Language (SQL)
Optional Skills: Accepting Feedback, Active Listening, Artificial Intelligence, Big Data, C++ Programming Language, Communication, Complex Data Analysis, Data-Driven Decision Making (DIDM), Data Engineering, Data Lake, Data Mining, Data Modeling, Data Pipeline, Data Quality, Data Science, Data Science Algorithms, Data Science Troubleshooting, Data Science Workflows, Deep Learning, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Machine Learning + 12 more
Travel Requirements: Not Specified
Available for Work Visa Sponsorship: No
Government Clearance Required: No
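The Spark-optimization emphasis in this listing usually comes down to a few levers. A hedged PySpark sketch, with paths and columns invented, showing a broadcast join, caching a reused DataFrame, and repartitioning before write:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("spark-tuning-demo").getOrCreate()

# Hypothetical inputs: a large fact table and a small dimension table.
events = spark.read.parquet("gs://example-bucket/events/")
countries = spark.read.parquet("gs://example-bucket/dim_country/")

# 1. Broadcast the small side to avoid a shuffle-heavy sort-merge join.
joined = events.join(F.broadcast(countries), on="country_code")

# 2. Cache a DataFrame that several downstream aggregations reuse.
joined.cache()

by_country = joined.groupBy("country_name").agg(F.count("*").alias("events"))
by_day = joined.groupBy(F.to_date("ts").alias("day")).agg(F.count("*").alias("events"))

# 3. Repartition before writing to control output file counts/sizes.
by_country.repartition(8).write.mode("overwrite").parquet(
    "gs://example-bucket/out/by_country/"
)
by_day.show()
```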

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

As a Senior Software Engineer - DevOps at INVIDI Technologies Corporation in Bangalore, India, you will be part of a globally acclaimed software development company that is revolutionizing television broadcasting. Our Emmy Award-winning technology is utilized by leading cable, satellite, and telco operators worldwide, delivering targeted ads seamlessly across various devices and platforms. INVIDI's innovative solutions have played a pivotal role in shaping the addressable television industry, with clients including major operators, networks, advertising agencies, and prominent brands.

In this dynamic and fast-paced environment, you will be at the forefront of commercial television innovation, contributing to the development of a unified video ad tech platform. Your role as a DevOps Engineer is essential to supporting and enhancing a remote product development team. Operating within a modern agile product organization, you will maintain and deploy scalable, performant backend services in Java and Kotlin, ensuring high availability and operational efficiency. Collaborating closely with peers and product owners, you will play a key role in evolving deployment pipelines, troubleshooting issues, and mentoring team members. Your responsibilities will also include active participation in on-call rotations, responding to alarms and maintaining critical services as needed. With a focus on simplicity, elegance, and continuous learning, you will be expected to excel in a collaborative and agile work environment.

To excel in this role, you should have a Master's degree in computer science or equivalent, along with at least 4 years of industry experience. Strong development skills, experience with high-volume systems, and proficiency in technologies such as Dropwizard, Kafka, Google Cloud, Terraform, and Docker are highly desirable. Additionally, expertise in infrastructure maintenance, CI/CD tools, and cloud services like GCP and AWS will be advantageous. INVIDI offers a supportive and organized office environment where your contributions will be valued and recognized. If you are a proactive and motivated DevOps Engineer with a passion for innovation and a commitment to excellence, we invite you to apply and be a part of our talented team at INVIDI Technologies Corporation.

Posted 2 weeks ago

Apply
Page 1 of 5

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies