
103 Bigtable Jobs - Page 3

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

66degrees is a leading consulting and professional services company specializing in AI-focused, data-led solutions that leverage the latest advancements in cloud technology. With unmatched engineering capabilities and vast industry experience, we help the world's leading brands transform their business challenges into opportunities and shape the future of work.

For this role, 66degrees is seeking a senior contractor for a 2.5-month remote assignment with the potential to extend. Candidates with the required skills and the ability to work independently as well as within a team environment are encouraged to apply.

Responsibilities:
- Facilitate, guide, and influence the client and teams towards an effective architectural pattern; serve as an interface between business leadership, technology leadership, and the delivery teams.
- Perform Migration Assessments and produce Migration Plans that include Total Cost of Ownership (TCO), Migration Architecture, Migration Timelines, and Application Waves.
- Design a solution architecture on Google Cloud to support critical workloads, including heterogeneous Oracle migrations to Postgres or Spanner.
- Design a migration path that accounts for the conversion of application dependencies, database objects, data, data pipelines, orchestration, users, and security.
- Oversee migration activities and provide troubleshooting support, including translation of DDL and DML, executing data transfers using native Google Cloud and third-party tools, and setting up and configuring the relevant Google Cloud components (see the sketch below).
- Engage with customer teams as a Google Cloud expert to provide education workshops, architectural recommendations, and technology reviews and recommendations.

Qualifications:
- 5+ years of experience with data engineering, cloud architecture, or data infrastructure.
- 5+ years of Oracle database management and IT experience.
- Experience with Oracle-adjacent products such as GoldenGate and Data Guard.
- 3+ years of PostgreSQL experience.
- Proven experience performing performance testing and applying remediations to address performance issues.
- Experience designing data models.
- Proficiency in Python and SQL.
- Advanced SQL skills, including the ability to write, tune, and interpret SQL queries; tool-specific experience with the database platforms listed above is ideal.
- Proven experience migrating to and/or implementing cloud databases such as Cloud SQL, Spanner, and Bigtable.

Desired Skills:
- Google Cloud Professional Architect and/or Data Engineer Certification is preferred.

66degrees is committed to protecting your privacy. Your personal information is collected, used, and shared in accordance with the California Consumer Privacy Act (CCPA).
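
The DDL/DML translation and data-transfer work described above lends itself to scripted validation. A minimal sketch, assuming hypothetical connection details and table names, that checks row-count parity between the Oracle source and the PostgreSQL target after a migration wave:

```python
# Hypothetical post-migration check: compare row counts between an Oracle
# source and a PostgreSQL target for a list of migrated tables.
# Connection details and table names are illustrative placeholders.
import oracledb   # pip install oracledb
import psycopg2   # pip install psycopg2-binary

TABLES = ["customers", "orders", "payments"]  # hypothetical migrated tables

ora = oracledb.connect(user="app", password="***", dsn="orahost/ORCLPDB1")
pg = psycopg2.connect("host=pghost dbname=app user=app password=***")

for table in TABLES:
    with ora.cursor() as oc, pg.cursor() as pc:
        oc.execute(f"SELECT COUNT(*) FROM {table}")
        pc.execute(f"SELECT COUNT(*) FROM {table}")
        src, dst = oc.fetchone()[0], pc.fetchone()[0]
        print(f"{table}: oracle={src} postgres={dst} "
              f"{'OK' if src == dst else 'MISMATCH'}")

ora.close()
pg.close()
```

In practice you would extend this with checksums or sampled value comparisons, since matching counts alone can hide data drift.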

Posted 1 month ago


4.0 - 8.0 years

10 - 14 Lacs

Chennai

Work from Office

Role Description
Provides leadership for the overall architecture, design, development, and deployment of a full-stack cloud-native data analytics platform.
- Design and augment solution architecture for data ingestion, data preparation, data transformation, data load, ML & simulation modelling, Java BE & FE, state machine, API management, and intelligence consumption using data products on cloud.
- Understand business requirements and help develop high-level and low-level data engineering and data processing documentation for the cloud-native architecture.
- Develop conceptual, logical, and physical target-state architecture, engineering, and operational specs.
- Work with the customer, users, technical architects, and application designers to define the solution requirements and structure for the platform.
- Model and design the application data structure, storage, and integration.
- Lead the database analysis, design, and build effort.
- Work with the application architects and designers to design the integration solution.
- Ensure that the database designs fulfill the requirements, including data volume, frequency needs, and long-term data growth.
- Able to perform data engineering tasks using Spark.
- Knowledge of developing efficient frameworks for development and testing using Sqoop/NiFi/Kafka/Spark/Streaming/WebHDFS/Python to enable seamless data ingestion onto the Hadoop/BigQuery platforms.
- Enable data governance and data discovery.
- Exposure to job monitoring frameworks along with validation automation.
- Exposure to handling structured, unstructured, and streaming data.

Technical Skills
- Experience building data platforms on cloud (data lake, data warehouse environment, Databricks).
- Strong technical understanding of data modeling, design, and architecture principles and techniques across master data, transaction data, and derived/analytic data.
- Proven background in designing and implementing architectural solutions that solve strategic and tactical business needs.
- Deep knowledge of best practices through relevant experience across data-related disciplines and technologies, particularly for enterprise-wide data architectures, data management, data governance, and data warehousing.
- Highly competent in database design and data modeling.
- Strong data warehousing and business intelligence skills, including handling ELT and scalability issues for enterprise-level data warehouses, and creating ETLs/ELTs to handle data from various data sources and formats.
- Strong hands-on experience with programming languages such as Python and Scala, with Spark and Beam.
- Solid hands-on and solution-architecting experience in cloud technologies: AWS, Azure, and GCP (GCP preferred).
- Hands-on experience with data processing at scale using event-driven systems and message queues (Kafka/Flink/Spark Streaming).
- Hands-on experience with GCP services such as BigQuery, Dataproc, Pub/Sub, Dataflow, Cloud Composer, API Gateway, Data Lake, Bigtable, Spark, Apache Beam, and feature engineering/data processing for model development.
- Experience gathering and processing raw data at scale (including writing scripts, web scraping, calling APIs, writing SQL queries, etc.).
- Experience building data pipelines for structured/unstructured, real-time/batch, events/synchronous/asynchronous data using MQ, Kafka, and stream processing.
- Hands-on experience analyzing source-system data and data flows, working with structured and unstructured data.
- Must be very strong in writing SparkSQL queries (a small SparkSQL sketch follows this listing).
- Strong organizational skills, with the ability to work autonomously as well as lead a team.
- Pleasant personality and strong communication and interpersonal skills.

Qualifications
- A bachelor's degree in computer science, computer engineering, or a related discipline is required to work as a technical lead.
- Certification in GCP would be a big plus.
- Individuals in this field can further display their leadership skills by completing the Project Management Professional certification offered by the Project Management Institute.
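
Since the role leans heavily on SparkSQL, here is a small illustrative sketch (placeholder paths and made-up columns) of registering ingested data as a view and aggregating it with SparkSQL:

```python
# A minimal SparkSQL aggregation over ingested data. The bucket path and
# column names are made up for illustration.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("ingest-demo").getOrCreate()

# In practice the input could come from Kafka, NiFi, or WebHDFS landings;
# here we read a placeholder Parquet path.
events = spark.read.parquet("gs://example-bucket/events/")
events.createOrReplaceTempView("events")

daily = spark.sql("""
    SELECT event_date, event_type, COUNT(*) AS event_count
    FROM events
    GROUP BY event_date, event_type
""")
daily.write.mode("overwrite").parquet("gs://example-bucket/daily_counts/")
```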

Posted 1 month ago


3.0 - 7.0 years

0 Lacs

Maharashtra

On-site

As a member of the JM Financial team, you will be part of a culture that values recognition and rewards for the hard work and dedication of its employees. We believe that a motivated workforce is essential for the growth of our organization. Our management team acknowledges and appreciates the efforts of our personnel through promotions, bonuses, awards, and public recognition. By fostering an atmosphere of success, we celebrate achievements such as successful deals, good client ratings, and customer reviews.

Nurturing talent is a key focus at JM Financial. We aim to prepare our employees for future leadership roles by creating succession plans and encouraging direct interactions with clients. Knowledge sharing and cross-functional interactions are integral to our business environment, fostering inclusivity and growth opportunities for our team members.

Attracting and managing top talent is a priority for JM Financial. We have successfully built a diverse talent pool with expertise, new perspectives, and enthusiasm. Our strong brand presence in the market enables us to leverage the expertise of our business partners to attract the best talent. Trust is fundamental to our organization, binding our programs, people, and clients together. We prioritize transparency, two-way communication, and trust across all levels of the organization.

Opportunities for growth and development are abundant at JM Financial. We believe in growing alongside our employees and providing them with opportunities to advance their careers. Our commitment to nurturing talent has led to the appointment of promising employees to leadership positions within the organization. With a focus on employee retention and a supportive environment for skill development, we aim to create a strong future leadership team. Emphasizing teamwork, we value both individual performance and collaborative group efforts. In a fast-paced corporate environment, teamwork is essential for achieving our common vision. By fostering open communication channels and facilitating information sharing, we ensure that every member of our team contributes to delivering value to our clients.

As a Java Developer at JM Financial, your responsibilities will include designing, modeling, and building services to support new features and products. You will work on an integrated central platform that powers various web applications, developing a robust backend framework and implementing features across different products using a combination of technologies. Researching and implementing new technologies to enhance our services will be a key part of your role.

To excel in this position, you should have a BTech degree in Computer Science or equivalent experience, with at least 3 years of experience building Java-based web applications in Linux/Unix environments. Proficiency in scripting languages such as JavaScript, Ruby, or Python, along with compiled languages like Java or C/C++, is required. Experience with Google Cloud Platform services, knowledge of design methodologies for backend services, and experience building scalable infrastructure are essential skills for this role.

Our technology stack includes JavaScript, Angular, React, NextJS, HTML5/CSS3/Bootstrap, Windows/Linux/OSX Bash, Kookoo telephony, SMS Gupshup, Sendgrid, Optimizely, Mixpanel, Google Analytics, Firebase, Git, Bash, NPM, Browser Dev Console, NoSQL, Google Cloud Datastore, and Google Cloud Platform (App Engine, Pub/Sub, Cloud Functions, Bigtable, Cloud Endpoints).

If you are passionate about technology and innovation, and thrive in a collaborative environment, we welcome you to join our team at JM Financial.

Posted 1 month ago


5.0 - 8.0 years

15 - 20 Lacs

Pune

Hybrid

We have an opening for Java GCP in Pune only. Please let me know if the location works for you, and we will process your profile immediately.
Experience: 5-8 years
Notice Period: 0-30 days
Mandatory skills: Java (Spring Boot), GCP Pub/Sub, Eventos, big data, Bigtable, BigQuery, Composer/Airflow
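
The mandatory skills centre on Spring Boot and GCP Pub/Sub. The publish flow is the same in any client library; a minimal sketch using the Python client for brevity (not the posting's Java stack), with a hypothetical project and topic:

```python
# The posting's stack is Java/Spring Boot; the Pub/Sub publish flow is the
# same in any client. Shown with the Python client for brevity; the project
# and topic names are hypothetical.
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("example-project", "order-events")

# Payload is bytes; extra keyword arguments become message attributes.
future = publisher.publish(topic_path, b'{"orderId": 123}', source="demo")
print("published message id:", future.result())
```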

Posted 1 month ago


8.0 - 12.0 years

0 Lacs

Karnataka

On-site

As a member of the Flipkart team focused on GenZ, you will be at the forefront of the company's strategic growth bet on Video & Live Commerce. The core pillars of this capability are enhancing user experience, empowering creators, and encouraging seller/brand participation. Your primary goal will be to videofy the Flipkart app across various discovery points such as the homepage, S&B, and the Product Page, while also creating a dedicated discovery destination where users can explore inspirational content akin to TikTok or Instagram reels.

You will be instrumental in developing a next-generation live streaming experience that supports concurrent livestreams for millions of users. Additionally, your role will involve leading the development of cutting-edge systems aimed at enhancing personalization through relevant product discovery for each user. Leveraging GenAI technology, you will drive automated quality control of images, videos, creators, and content to deliver a more personalized shopping experience.

Your responsibilities will include driving hyper-personalization of the user experience using machine learning and data science techniques at various stages of the funnel. By utilizing data-driven insights and a growth mindset, you will continuously strive to enhance user experience at scale, ensuring the delivery of video reels with minimal size compression and latency.

From a technical perspective, you will work with a cutting-edge tech stack that includes technologies and frameworks like Kafka, Zookeeper, Apache Pulsar, Spark, Bigtable, HBase, Redis, MongoDB, Elasticsearch, Docker, Kubernetes, and various video technologies such as OBS, RTMP, Jitsi, and Transcoder. Your role will involve collaborating with diverse stakeholders to deliver scalable, quality technology solutions, while also facilitating platform solutions that extend beyond your team to the wider ecosystem.

As an Engineering Manager (EM), you will lead a team of engineers across different levels, guiding them towards realizing Flipkart's vision. You will be responsible for setting the direction and long-term vision for the team, partnering with product, business, and other stakeholders to bring this vision to life. Your role will involve providing technical leadership, creating clear career paths for team members, attracting and retaining top talent, driving strategy and vision, and fostering a strong team culture of responsiveness and agility in execution.

Overall, as a key member of the Flipkart team, you will play a crucial role in driving innovation, personalization, and growth in the realm of Video & Live Commerce, while also contributing to the technical excellence and strategic direction of the organization.

Posted 2 months ago


7.0 - 12.0 years

20 - 30 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Work Location: Bangalore/Pune/Hyderabad/NCR
Experience: 5-12 years

Required Skills:
- Proven experience as a Data Engineer with expertise in GCP.
- Strong understanding of data warehousing concepts and ETL processes.
- Experience with BigQuery, Dataflow, and other GCP data services (a minimal Dataflow/Beam sketch follows this listing).

Responsibilities:
- Design, develop, and maintain data pipelines on GCP.
- Implement data storage solutions and optimize data processing workflows.
- Ensure data quality and integrity throughout the data lifecycle.
- Collaborate with data scientists and analysts to understand data requirements.
- Monitor and maintain the health of the data infrastructure.
- Troubleshoot and resolve data-related issues.

Thanks & Regards,
Suganya R
Suganya@spstaffing.in
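
Pipelines on Dataflow are typically written with Apache Beam. A minimal, illustrative Beam pipeline of the read-transform-write shape this role describes, with placeholder paths:

```python
# A minimal read-transform-write Apache Beam pipeline. Paths are
# placeholders; on GCP this would run with runner="DataflowRunner".
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(runner="DirectRunner")  # local test runner

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/raw/part-*.csv")
        | "Parse" >> beam.Map(lambda line: line.split(","))
        | "KeepValid" >> beam.Filter(lambda cols: len(cols) == 3)
        | "Format" >> beam.Map(",".join)
        | "Write" >> beam.io.WriteToText("gs://example-bucket/clean/out")
    )
```

Swapping DirectRunner for DataflowRunner (plus project, region, and temp-location options) is what moves the same pipeline onto GCP.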

Posted 2 months ago


3.0 - 7.0 years

0 Lacs

Karnataka

On-site

66degrees is a leading consulting and professional services company specializing in AI-focused, data-led solutions that leverage the latest advancements in cloud technology. With unmatched engineering capabilities and vast industry experience, we help the world's leading brands transform their business challenges into opportunities and shape the future of work.

In this role, you will be a senior contractor engaged on a 2.5-month remote assignment with the potential to extend. We are looking for candidates with the required skills who can work independently as well as within a team environment.

Your responsibilities will include facilitating, guiding, and influencing the client and teams towards an effective architectural pattern, and acting as an interface between business leadership, technology leadership, and the delivery teams. You will perform Migration Assessments and produce Migration Plans that encompass Total Cost of Ownership (TCO), Migration Architecture, Migration Timelines, and Application Waves; design solution architecture on Google Cloud to support critical workloads; and handle heterogeneous Oracle migrations to Postgres or Spanner. You will design a migration path that accounts for the conversion of application dependencies, database objects, data, data pipelines, orchestration, users, and security. Your role will also involve overseeing migration activities and providing troubleshooting support, including translation of DDL and DML, executing data transfers using native Google Cloud and third-party tools, and setting up and configuring the relevant Google Cloud components. Furthermore, you will engage with customer teams as a Google Cloud expert to provide education workshops, architectural recommendations, and technology reviews and recommendations.

Qualifications:
- 5+ years of experience with data engineering, cloud architecture, or data infrastructure.
- 5+ years of Oracle database management and IT experience.
- Experience with Oracle-adjacent products such as GoldenGate and Data Guard.
- 3+ years of PostgreSQL experience.
- Proven experience performing performance testing and applying remediations to address performance issues.
- Experience designing data models.
- Proficiency in Python and SQL.
- Advanced SQL skills, including the ability to write, tune, and interpret SQL queries; tool-specific experience with the database platforms listed above is ideal.
- Proven experience migrating to and/or implementing cloud databases such as Cloud SQL, Spanner, and Bigtable.

Desired Skills:
- Google Cloud Professional Architect and/or Data Engineer Certification is preferred.

66degrees is committed to protecting your privacy and handles personal information in accordance with the California Consumer Privacy Act (CCPA).

Posted 2 months ago


1.0 - 3.0 years

3 - 6 Lacs

Mumbai, Mangaluru

Hybrid

- 6 months to 3 years of IT experience.
- Knowledge of BigQuery, SQL, or similar tools.
- Aware of ETL and data warehouse concepts.
- Good oral and written communication skills.
- Great team player, able to work efficiently with minimal supervision.
- Should have good knowledge of Java or Python to conduct data cleansing.

Preferred:
- Good communication and problem-solving skills.
- Experience with Spring Boot would be an added advantage.
- Apache Beam development with Google Cloud Bigtable and Google BigQuery is desirable (see the sketch below).
- Experience with Google Cloud Platform (GCP).
- Skills in writing batch and stream processing jobs using the Apache Beam framework (Dataflow).
- Knowledge of microservices, Pub/Sub, Cloud Run, Cloud Functions.

Roles and Responsibilities:
- Develop high-performance, scalable solutions on GCP that extract, transform, and load big data.
- Design and build production-grade data solutions, from ingestion to consumption, using Java/Python.
- Design and optimize data models on GCP using data stores such as BigQuery.
- Optimize data pipelines for performance and cost in large-scale data lakes.
- Write complex, highly optimized queries across large data sets and create data processing layers.
- Interact closely with data engineers to identify the right tools to deliver product features by performing POCs.
- Collaborative team player who interacts with business, BAs, and other data/ML engineers.
- Research new use cases for existing data.
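
For the Bigtable side of this stack, a small sketch using the google-cloud-bigtable Python client; the project, instance, table, column family, and row-key scheme are assumptions for illustration:

```python
# A single-row write with the google-cloud-bigtable client. The project,
# instance, table, and column-family names are placeholders.
from google.cloud import bigtable

client = bigtable.Client(project="example-project")
table = client.instance("example-instance").table("events")

# Bigtable row keys are designed around scan patterns, e.g. user + timestamp.
row = table.direct_row(b"user#123#2024-01-01T00:00:00")
row.set_cell("cf1", b"event_type", b"page_view")
row.commit()
```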

Posted 2 months ago


6.0 - 11.0 years

15 - 30 Lacs

Pune

Hybrid

Software Engineer - Specialist

What you'll do:
- Demonstrate a deep understanding of cloud-native, distributed, microservice-based architectures.
- Deliver solutions for complex business problems through standard Software Development Life Cycle (SDLC) practices.
- Build strong relationships with both internal and external stakeholders, including product, business, and sales partners.
- Demonstrate excellent communication skills, with the ability to simplify complex problems and also dive deeper when needed.
- Lead strong technical teams that deliver complex software solutions that scale.
- Work across teams to integrate our systems with existing internal systems.
- Participate in a tight-knit, globally distributed engineering team.
- Provide deep troubleshooting skills, with the ability to lead and solve production and customer issues under pressure.
- Leverage strong experience in full-stack software development and public cloud platforms like GCP and AWS.
- Mentor, coach, and develop junior and senior software, quality, and reliability engineers.
- Ensure compliance with secure software development guidelines and best practices.
- Define, maintain, and report SLAs, SLOs, and SLIs meeting EFX engineering standards, in partnership with the product, engineering, and architecture teams.
- Collaborate with architects, SRE leads, and other technical leadership on strategic technical direction, guidelines, and best practices.
- Drive up-to-date technical documentation, including support and end-user documentation and run books.
- Own implementation architecture decisions for product features/stories and refactoring work.
- Create and deliver technical presentations to internal and external technical and non-technical stakeholders, communicating with clarity and precision and presenting complex information in a concise, audience-appropriate format.

What experience you need:
- Bachelor's degree or equivalent experience.
- 5+ years of software engineering experience.
- 5+ years of experience writing, debugging, and troubleshooting code in mainstream Java and Spring Boot.
- 5+ years of experience with cloud technology: GCP, AWS, or Azure.
- 5+ years of experience designing and developing cloud-native solutions.
- 5+ years of experience designing and developing microservices using Java, Spring Boot, GCP SDKs, and GKE/Kubernetes.
- 5+ years of experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others.
- 5+ years of experience deploying and releasing software using Jenkins CI/CD pipelines, with an understanding of infrastructure-as-code concepts, Helm charts, and Terraform constructs.

What could set you apart:
- Self-starter who identifies and responds to priority shifts with minimal supervision.
- Strong communication and presentation skills.
- Strong leadership qualities.
- Demonstrated problem-solving skills and the ability to resolve conflicts.
- Experience creating and maintaining product and software roadmaps.
- Experience working in a highly regulated environment.
- Experience on GCP in big data and distributed systems: Dataflow, Apache Beam, Pub/Sub, Bigtable, BigQuery, GCS.
- Experience with backend technologies such as Java/J2EE, Spring Boot, Golang, gRPC, SOA, and microservices.
- Source code control management systems (e.g., SVN/Git, GitHub, GitLab), build tools like Maven and Gradle, and CI/CD like Jenkins or GitLab.
- Agile environments (e.g., Scrum, XP).
- Relational databases (e.g., SQL Server, MySQL).
- Atlassian tooling (e.g., JIRA, Confluence) and GitHub.
- Developing with a modern JDK (v1.7+).
- Automated testing: JUnit, Selenium, LoadRunner, SoapUI.

Posted 2 months ago


11.0 - 17.0 years

45 - 50 Lacs

Pune

Work from Office

Job Title: Fintech Product Engineering Lead
Corporate Title: VP
Location: Pune, India

Role Description
The engineer is responsible for managing or performing work across multiple areas of the bank's overall IT platform/infrastructure, including analysis, development, and administration. It may also involve taking functional oversight of engineering delivery for specific departments. Work includes:
- Planning and developing entire engineering solutions to accomplish business goals.
- Building reliability and resiliency into solutions, with appropriate testing and reviewing throughout the delivery lifecycle.
- Ensuring maintainability and reusability of engineering solutions.
- Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow.
- Reviewing engineering plans and quality to drive re-use and improve engineering capability.
- Participating in industry forums to drive adoption of innovative technologies, tools, and solutions in the Bank.

Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance, and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel.

You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion, and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.

What we'll offer you:
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your Key Responsibilities:
- Navigate a strong sense of urgency while maintaining focus and clarity.
- Solve complex design challenges independently, without needing oversight.
- Quickly deliver high-quality code and features, with a proven track record.
- Inspire and energise teams through urgency, ownership, and technical excellence.
- Do whatever it takes to ensure product success, from strategy to hands-on execution.
- Architect scalable systems (HLD and LLD) in fast-paced environments.
- Lead through ambiguity, change, and high-growth pressure.
- Balance speed with engineering quality and operational readiness.
- Communicate strongly: align teams, resolve conflicts, and drive decisions fast.
- Act with a true builder mindset: ownership, speed, and high accountability.

Your Skills and Experience:
- Hands-on experience building responsive UIs with React and JavaScript.
- Hands-on knowledge of Go (Golang)/Java and the Gin/Spring Boot frameworks for backend development.
- Proficient in HTML, CSS, and styling tools like Tailwind.
- Proficient in RESTful, GraphQL, and gRPC for building scalable, high-performance APIs.
- Experience with GCP/AWS for building scalable, resilient microservice-based architectures.
- Experience with relational and NoSQL databases (e.g., PostgreSQL, MySQL, Firestore, Bigtable).
- Experience with logging, monitoring, and alerting (e.g., Grafana, Prometheus, ELK).
- Familiarity with CI/CD pipelines, automated testing, and deployment strategies, with detailed knowledge of Terraform.
- Knowledge of best practices for building secure applications (e.g., mTLS, encryption, OAuth, JWT, and data compliance).
- Knowledge of disaster recovery, zero-downtime deploys, and backup strategies.

How we'll support you

Posted 2 months ago


0.0 - 3.0 years

6 - 8 Lacs

Noida

Work from Office

- 3+ years of experience as an engineer working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner, streaming, etc.).
- 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.).
- 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.); a short PySpark window-function sketch follows this listing.
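
The SQL-side skills above (CTEs, window functions) translate directly into PySpark. An illustrative sketch with made-up column names:

```python
# The listed SQL skills (CTEs, window functions) map directly onto PySpark.
# Column names are made up: pick the latest order per customer.
from pyspark.sql import SparkSession, Window, functions as F

spark = SparkSession.builder.appName("window-demo").getOrCreate()
orders = spark.read.parquet("gs://example-bucket/orders/")  # placeholder

w = Window.partitionBy("customer_id").orderBy(F.col("order_ts").desc())
latest = (
    orders.withColumn("rn", F.row_number().over(w))
          .where("rn = 1")   # equivalent to a ROW_NUMBER() CTE filter in SQL
          .drop("rn")
)
latest.show()
```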

Posted 2 months ago


4.0 - 8.0 years

22 - 25 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

- 3+ years of experience as an engineer working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner, streaming, etc.).
- 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.).
- 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).

Posted 2 months ago


7.0 - 12.0 years

25 - 27 Lacs

Hyderabad

Work from Office

- 3+ years of experience as an engineer working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner, streaming, etc.).
- 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.).
- 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).

Posted 2 months ago


3.0 - 5.0 years

5 - 7 Lacs

Bengaluru

Work from Office

Key Responsibilities:
- Design, develop, and maintain backend services and APIs using Python (Flask/Django/FastAPI); a minimal FastAPI sketch follows this listing.
- Develop scalable and secure microservices for data processing, analytics, and APIs.
- Manage and optimize data storage with SQL (PostgreSQL/MySQL) and NoSQL databases (MongoDB/Firestore/Bigtable).
- Design and implement CI/CD pipelines and automate cloud deployments on GCP (App Engine, Cloud Run, Cloud Functions, GKE).
- Collaborate with front-end developers, product owners, and other stakeholders to integrate backend services with business logic and UI.
- Optimize application performance and troubleshoot issues across backend systems.
- Implement best practices in code quality, testing (unit/integration), security, and scalability.

Qualifications:
- Bachelor's or master's degree in computer science, data science, or a related field.
- Must have 3+ years of relevant IT experience.
- Strong hands-on programming experience in Python.
- Experience with one or more Python frameworks: Flask, Django, or FastAPI.
- Deep understanding of RESTful API design and development.
- Proficient in working with relational databases (PostgreSQL, MySQL) and NoSQL databases (MongoDB, Firestore, BigQuery).
- Solid understanding of and experience with GCP services.
- Familiarity with Git and CI/CD tools (e.g., Cloud Build, Jenkins, GitHub Actions).
- Strong debugging, problem-solving, and performance-tuning skills.
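
A minimal sketch of the backend-service work described above, using FastAPI with an in-memory store standing in for PostgreSQL/Firestore; the model and routes are hypothetical:

```python
# A minimal FastAPI service with an in-memory dict standing in for
# PostgreSQL/Firestore. Model and routes are hypothetical.
# Run with: uvicorn app:app --reload
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class Item(BaseModel):
    name: str
    price: float

ITEMS: dict[int, Item] = {}

@app.post("/items/{item_id}")
def create_item(item_id: int, item: Item) -> dict:
    ITEMS[item_id] = item
    return {"id": item_id, **item.model_dump()}  # pydantic v2

@app.get("/items/{item_id}")
def read_item(item_id: int) -> Item:
    if item_id not in ITEMS:
        raise HTTPException(status_code=404, detail="item not found")
    return ITEMS[item_id]
```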

Posted 2 months ago


5.0 - 10.0 years

18 - 25 Lacs

Sholinganallur

Hybrid

Skills Required: BigQuery, Bigtable, Dataflow, Pub/Sub, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, App Engine, Airflow, Cloud Storage, Cloud Spanner
Skills Preferred: ETL

Experience Required:
- 5+ years of experience in data engineering, with a focus on data warehousing and ETL development (including data modelling, ETL processes, and data warehousing principles).
- 5+ years of SQL development experience.
- 3+ years of cloud experience (GCP preferred), with solutions designed and implemented at production scale.
- Strong understanding and experience of key GCP services, especially those related to data processing (batch/real-time), leveraging Terraform, BigQuery, Dataflow, Data Fusion, Dataproc, Cloud Build, Airflow, and Pub/Sub, alongside storage including Cloud Storage, Bigtable, and Cloud Spanner (a skeletal Composer/Airflow DAG is sketched below).
- Experience developing with microservice architecture on a container orchestration framework.
- Designing pipelines and architectures for data processing.
- Excellent problem-solving skills, with the ability to design and optimize complex data pipelines.
- Strong communication and collaboration skills, capable of working effectively with both technical and non-technical stakeholders as part of a large, global, and diverse team.
- Strong evidence of self-motivation to continuously develop your own engineering skills and those of the team.
- Proven record of working autonomously in areas of high ambiguity, without day-to-day supervisory support.
- A proactive mindset toward problem-solving and a willingness to take the initiative.
- Strong prioritization, coordination, organizational, and communication skills, and a proven ability to balance workload and competing demands to meet deadlines.

Thanks & Regards,
Varalakshmi V
9019163564
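
Cloud Composer orchestration is expressed as Airflow DAGs. A skeletal sketch with a placeholder BigQuery rollup; the DAG id, schedule, and SQL are assumptions, not details from the posting:

```python
# A skeletal Cloud Composer / Airflow DAG. The DAG id, schedule, and the
# BigQuery SQL are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_sales_rollup",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_sales",
        configuration={
            "query": {
                "query": (
                    "SELECT sale_date, SUM(amount) AS total "
                    "FROM `example-project.sales.orders` GROUP BY sale_date"
                ),
                "useLegacySql": False,
            }
        },
    )
```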

Posted 2 months ago


5.0 - 10.0 years

15 - 30 Lacs

Bengaluru

Work from Office

About the Team
When 5% of Indian households shop with us, it's important to build resilient systems to manage millions of orders every day. We've done this with zero downtime! Sounds impossible? Well, that's the kind of engineering muscle that has helped Meesho become the e-commerce giant that it is today. We value speed over perfection and see failures as opportunities to become better. We've taken steps to inculcate a strong Founder's Mindset across our engineering teams, making us grow and move fast. We place special emphasis on the continuous growth of each team member, and we do this with regular 1-1s and open communication. As a Database Engineer II, you will be part of a team of self-starters who thrive on teamwork and constructive feedback. We know how to party as hard as we work! If we aren't building unparalleled tech solutions, you can find us debating the plot points of our favorite books and games, or even gossiping over chai. So, if a day filled with building impactful solutions with a fun team sounds appealing to you, join us.

About the Role
As a Database Engineer II, you'll establish and implement the best NoSQL database engineering practices proactively. You'll have opportunities to work on different NoSQL technologies at a large scale. You'll also work closely with other engineering teams and establish seamless collaborations within the organization. Proficiency in emerging technologies and the ability to work successfully with a team are key to success in this role.

What you will do
- Manage, maintain, and monitor a multitude of relational/NoSQL database clusters, ensuring obligations to SLAs.
- Manage both in-house and SaaS solutions in the public cloud (or 3rd party).
- Diagnose, mitigate, and communicate database-related issues to relevant stakeholders.
- Design and implement best practices for planning, provisioning, tuning, upgrading, and decommissioning database clusters.
- Understand the cost-optimization aspects of such tools/software and implement cost-control mechanisms along with continuous improvement.
- Advise and support product, engineering, and operations teams.
- Maintain general backup/recovery/DR of data solutions.
- Work with the engineering and operations teams to automate new approaches for scalability, reliability, and performance.
- Perform R&D on new features and innovative solutions.
- Participate in on-call rotations.

What you will need
- 5+ years of experience provisioning and managing relational/NoSQL databases.
- Proficiency in two or more of: MySQL, PostgreSQL, Bigtable, Elasticsearch, MongoDB, Redis, ScyllaDB.
- Proficiency in the Python programming language.
- Experience with deployment orchestration, automation, and security configuration management (Jenkins, Terraform, Ansible).
- Hands-on experience with Amazon Web Services (AWS)/Google Cloud Platform (GCP).
- Comfortable working in Linux/Unix environments.
- Knowledge of the TCP/IP stack, load balancers, and networking.
- Proven ability to drive projects to completion.
- A degree in computer science, software engineering, information technology, or related fields will be an advantage.

Posted 2 months ago


4.0 - 9.0 years

10 - 15 Lacs

Pune

Work from Office

MS Azure Infra (must); PaaS will be a plus. Ensure solutions meet regulatory standards and manage risk effectively. Hands-on experience using Terraform to design and deploy solutions (at least 5+ years), adhering to best practices to minimize risk and ensure compliance with regulatory requirements.

Primary Skill
- AWS Infra along with PaaS will be an added advantage.
- Certification in Terraform is an added advantage.
- Certification in Azure and AWS is an added advantage.
- Can handle large audiences to present HLD, LLD, and ERC.
- Able to drive solutions/projects independently and lead projects with a focus on risk management and regulatory compliance.

Secondary Skills
Amazon Elastic File System (EFS), Amazon Redshift, Amazon S3, Apache Spark, Ataccama DQ Analyzer, AWS Apache Airflow, AWS Athena, Azure Data Factory, Azure Data Lake Storage Gen2 (ADLS), Azure Databricks, Azure Event Hub, Azure Stream Analytics, Azure Synapse Analytics, BigID, C++, Cloud Storage, Collibra Data Governance (DG), Collibra Data Quality (DQ), Data Lake Storage, Data Vault Modeling, Databricks, Dataproc, DDI, Dimensional Data Modeling, EDC AXON, Electronic Medical Record (EMR), Extract, Transform & Load (ETL), Financial Services Logical Data Model (FSLDM), Google Cloud Platform (GCP) BigQuery, Google Cloud Platform (GCP) Bigtable, Google Cloud Platform (GCP) Dataproc, HQL, IBM InfoSphere Information Analyzer, IBM Master Data Management (MDM), Informatica Data Explorer, Informatica Data Quality (IDQ), Informatica Intelligent Data Management Cloud (IDMC), Informatica Intelligent MDM SaaS, Inmon methodology, Java, Kimball Methodology, Metadata Encoding & Transmission Standards (METS), Metasploit, Microsoft Excel, Microsoft Power BI, NewSQL, NoSQL, OpenRefine, OpenVAS, Performance Tuning, Python, R, RDD Optimization, SAS, SQL, Tableau, Tenable Nessus, TIBCO Clarity

Posted 2 months ago


10.0 - 15.0 years

30 - 40 Lacs

Noida, Pune, Bengaluru

Hybrid

- Strong experience in big data: data modelling, design, architecting, and solutioning.
- Understands programming languages like SQL, Python, R, and Scala.
- Good Python skills; experience with data visualisation tools such as Google Data Studio or Power BI.
- Knowledge of A/B testing, statistics, Google Cloud Platform, Google BigQuery, agile development, DevOps, data engineering, and ETL data processing.
- Strong experience migrating production Hadoop clusters to Google Cloud.

Good to have:
- Expertise in BigQuery, Dataproc, Data Fusion, Dataflow, Bigtable, Firestore, Cloud SQL, Cloud Spanner, Google Cloud Storage, Cloud Composer, Cloud Interconnect, etc.

Posted 2 months ago


5.0 - 7.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Educational Qualifications: Bachelor of Engineering, Bachelor of Technology, Master of Engineering, Master of Technology, Integrated BCA+MCA, Master of Science (Technology), Bachelor of Science (Tech), Bachelor of Computer Applications

Service Line: Application Development and Maintenance

Responsibilities
A day in the life of an Infoscion:
- As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions, and facilitate deployment resulting in client delight.
- You will develop a proposal by owning parts of the proposal document and by giving inputs in solution design based on areas of expertise.
- You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots, and assist in resolving any queries related to requirements and solution design.
- You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates that suit the customer's budgetary requirements and are in line with the organization's financial guidelines.
- Actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers.

If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Ability to develop value-creating strategies and models that enable clients to innovate, drive growth, and increase their business profitability.
- Good knowledge of software configuration management systems.
- Awareness of the latest technologies and industry trends.
- Logical thinking and problem-solving skills, along with an ability to collaborate.
- Understanding of the financial processes for various types of projects and the various pricing models available.
- Ability to assess current processes, identify improvement areas, and suggest technology solutions.
- Knowledge of one or two industry domains.
- Client interfacing skills.
- Project and team management.

Technical and Professional Skills
Primary skills: Technology - Big Data - Bigtable; Technology - Cloud Integration - Azure Data Factory (ADF); Technology - Data on Cloud Platform - AWS
Preferred skills: Technology - Big Data - Bigtable - GCP; Technology - Data on Cloud Platform - AWS; Technology - Cloud Integration - Azure Data Factory (ADF)

Posted 2 months ago


10.0 - 15.0 years

12 - 16 Lacs

Pune

Work from Office

To be successful in this role, you should meet the following requirements (must have):
- Payments and banking experience is a must.
- Experience implementing and monitoring data governance using standard methodology throughout the data life cycle, within a large organisation.
- Up-to-date knowledge of data governance theory, standard methodology, and the practical considerations.
- Knowledge of data governance industry standards and tools.
- Overall experience of 10+ years in data governance, encompassing data quality management, master data management, data privacy and compliance, data cataloguing and metadata management, data security, maturity, and lineage.
- Prior experience implementing an end-to-end data governance framework.
- Experience automating data cataloguing, ensuring accurate, consistent metadata and making data easily discoverable and usable.
- Domain experience across the payments and banking lifecycle.
- An analytical mind and an inclination for problem-solving, with attention to detail.
- Ability to effectively navigate and deliver transformation programmes in large global financial organisations, amidst the challenges posed by bureaucracy, globally distributed teams, and local data regulations.
- Strong communication skills, coupled with the ability to present complex information and data.
- A first-class degree in Engineering or a relevant field, with two or more of the following subjects as a major: Mathematics, Computer Science, Statistics, Economics.

The successful candidate will also meet the following requirements (good to have):
- Database types: relational, NoSQL, DocumentDB. Databases: Oracle, PostgreSQL, BigQuery, Bigtable, MongoDB, Neo4j.
- Experience in conceptual/logical/physical data modeling.
- Experience in agile methodology and leading agile delivery, aligned with organisational needs.
- An effective leader as well as a team player, with a strong commitment to quality and efficiency.

Posted 2 months ago


10.0 - 15.0 years

12 - 16 Lacs

Pune

Work from Office

To be successful in this role, you should meet the following requirements (must have):
- Expertise in conceptual/logical/physical data modeling.
- Payments and banking experience is a must.
- Database design. Database types: relational, NoSQL, DocumentDB. Databases: Oracle, PostgreSQL, BigQuery, Bigtable, MongoDB, Neo4j. Tools: Erwin, Visual Paradigm.
- Solid experience in PL/SQL, Python, Unix shell scripting, and Java.
- Domain experience across the payments and banking lifecycle.
- An analytical mind and an inclination for problem-solving, with attention to detail.
- Sound knowledge of payments workflows and statuses across various systems within a large global bank.
- Experience collecting large data sets and identifying patterns and trends in them.
- Overall experience of 10+ years, with considerable experience in big data and relational databases.
- Prior experience across requirements gathering, build and implementation, stakeholder coordination, release management, and production support.
- Ability to effectively navigate and deliver transformation programmes in large global financial organisations, amidst the challenges posed by bureaucracy, globally distributed teams, and local data regulations.
- Strong communication skills, coupled with the ability to present complex information and data.
- A first-class degree in Engineering or a relevant field, with two or more of the following subjects as a major: Mathematics, Computer Science, Statistics, Economics.

The successful candidate will also meet the following requirements (good to have):
- Understanding of DevOps and CI tools (Jenkins, Git, Grunt, Bamboo, Artifactory) would be an added advantage.
- Experience in agile methodology and leading agile delivery, aligned with organisational needs.
- An effective leader as well as a team player, with a strong commitment to quality and efficiency.

Posted 2 months ago


5.0 - 7.0 years

13 - 17 Lacs

Hyderabad

Work from Office

Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and interface directly with the customer.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.

Preferred technical and professional experience:
- Intuitive individual with an ability to manage change and proven time management.
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed.
- Up-to-date technical knowledge gained by attending educational workshops and reviewing publications.

Posted 2 months ago


5.0 - 7.0 years

13 - 17 Lacs

Bengaluru

Work from Office

Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and interface directly with the customer.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.

Preferred technical and professional experience:
- Intuitive individual with an ability to manage change and proven time management.
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed.
- Up-to-date technical knowledge gained by attending educational workshops and reviewing publications.

Posted 2 months ago


6.0 - 10.0 years

15 - 20 Lacs

Hyderabad

Work from Office

Develop, optimize, and maintain scalable data pipelines using Python and PySpark. Design and implement data processing workflows leveraging GCP services such as BigQuery, Dataflow, Cloud Functions, and Cloud Storage.
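
A small sketch of the BigQuery end of such a pipeline, using the google-cloud-bigquery client; the project, dataset, table, and columns are placeholders:

```python
# Query BigQuery with the google-cloud-bigquery client. Project, dataset,
# table, and column names are illustrative placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

query = """
    SELECT device_id, AVG(reading) AS avg_reading
    FROM `example-project.telemetry.readings`
    WHERE DATE(ts) = CURRENT_DATE()
    GROUP BY device_id
"""
for row in client.query(query).result():
    print(row.device_id, row.avg_reading)
```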

Posted 2 months ago


2.0 - 6.0 years

5 - 9 Lacs

Noida

Work from Office

About Us:
At TELUS Digital, we enable customer experience innovation through spirited teamwork, agile thinking, and a caring culture that puts customers first. TELUS Digital is the global arm of TELUS Corporation, one of the largest telecommunications service providers in Canada. We deliver contact center and business process outsourcing (BPO) solutions to some of the world's largest corporations in the consumer electronics, finance, telecommunications, and utilities sectors. With global call center delivery capabilities, our multi-shore, multi-language programs offer safe, secure infrastructure, value-based pricing, skills-based resources, and exceptional customer service, all backed by TELUS, our multi-billion dollar telecommunications parent.

Required Skills:
- Minimum 6 years of experience in architecting, designing, and building data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms.
- Perform application impact assessments and requirements reviews, and develop work estimates.
- Develop test strategies and site reliability engineering measures for data products and solutions.
- Lead agile development "scrums" and solution reviews.
- Mentor junior Data Engineering Specialists.
- Lead the resolution of critical operations issues, including post-implementation reviews.
- Perform technical data stewardship tasks, including metadata management, security, and privacy by design.
- Demonstrate expertise in SQL and database proficiency in various data engineering tasks.
- Automate complex data workflows by setting up DAGs in tools like Control-M, Apache Airflow, and Prefect.
- Develop and manage Unix scripts for data engineering tasks.
- Intermediate proficiency in infrastructure-as-code tools like Terraform, Puppet, and Ansible to automate infrastructure deployment.
- Proficiency in data modeling to support analytics and business intelligence.
- Working knowledge of MLOps to integrate machine learning workflows with data pipelines.
- Extensive expertise in GCP technologies, including BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion, Dataproc (good to have), and Bigtable.
- Advanced proficiency in programming languages (Python).

Qualifications:
- Bachelor's degree in Software Engineering, Computer Science, Business, Mathematics, or a related field.
- Analytics certification in BI or AI/ML.
- 6+ years of data engineering experience.
- 4 years of data platform solution architecture and design experience.
- GCP Certified Data Engineer (preferred).

Posted 2 months ago
