
962 BigQuery Jobs - Page 31

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

10 - 14 years

25 - 30 Lacs

Pune

Work from Office


Role Description
The Business Architect defines the technical solution design of specific IT platforms and guides squad members in designing, building, testing, and delivering high-quality software solutions. A key element of this is translating functional and non-functional business requirements into an appropriate technical solution design, leveraging best practices and consistent design patterns. The Business Architect collaborates closely with Product Owners, Chapter Leads, and squad members to ensure consistent adherence to the agreed-upon application design, and is responsible for maintaining appropriate technical design documentation. The Business Architect ensures that solution architectures and designs conform to the principles, blueprints, standards, and patterns established by Enterprise Architecture, collaborating closely with the respective Solution Architect to ensure architecture compliance. The Business Architect also actively contributes to defining and enriching design patterns and standards so they can be leveraged across squads and tribes.

Your key responsibilities
- Define the technical architecture of IT solutions in line with functional and non-functional requirements, following consistent design patterns and best practices.
- Ensure the solution design is in sync with WM target architecture blueprints and principles, as well as with overarching DB architecture and security standards.
- Create appropriate technical design documentation and keep it up to date.
- Guide squad members in designing, building, testing, and delivering high-quality software solutions in line with business requirements.
- Own all aspects of the solution architecture (maintainability, scalability, effective integration with other solutions, use of shared solutions and components where possible, optimization of resource consumption, etc.) with the objective of balancing business needs against total cost of ownership.
- Collaborate closely with Enterprise Architecture to ensure architecture compliance, and ensure design options are discussed in a timely manner to allow sufficient time for deliberate decision making.
- Present architecture proposals to the relevant forums, together with the Enterprise Architect, at different levels, and drive the process to gain the necessary architecture approvals.
- Collaborate with relevant technology stakeholders in other squads and across tribes to keep cross-squad and cross-tribe solution architecture synchronized and aligned.
- Contribute to the definition and enrichment of design patterns and standards that can be leveraged across WM squads and tribes.
- Act as counsel to designers and developers, reviewing software designs and high-level and detailed design documentation produced by other squad members.
- Lead technical discussions with CCO, Data Factory, Central Data Quality and Compliance, end-to-end and control functions on technical queries; contribute to peer-level solution architecture reviews, e.g. within a respective chapter.

Your skills and experience
- Experience defining high-level and low-level technical solution designs for complex initiatives; very good analytical skills and the ability to oversee and structure complex tasks.
- Hands-on skills with Google Cloud components such as storage buckets, BigQuery, Dataproc, Cloud Composer, and Cloud Functions, along with PySpark and Scala, are essential. Experience with Cloud SQL, Dataflow, Java, and Unix is good to have.
- Experience implementing a Google Cloud based solution is essential.
- Persuasive power and persistence in driving adherence to the solution design within the squad.
- Ability to apply appropriate architectural patterns considering the relevant functional and non-functional requirements.
- Proven ability to balance business demands and IT capabilities in terms of standardization, reducing risk, and increasing IT flexibility.
- Comfortable working in an open, highly collaborative team; able to work in an agile, dynamic environment and build up knowledge of new technologies and solutions effectively and in a timely manner; able to communicate effectively with other technology stakeholders.
- Feedback: seeks feedback from others, provides feedback to others in support of their development, and is open and honest while dealing constructively with criticism.
- Inclusive leadership: values individuals and embraces diversity by integrating differences and promoting diversity and inclusion across teams and functions.
- Coaching: understands and anticipates people's needs, skills, and abilities in order to coach, motivate, and empower them for success.
- Broad architecture knowledge and application design skills and, depending on the specific squad requirements, in-depth expertise in specific architecture domains (e.g. service and integration architecture, web and mobile front-end architecture, security architecture, infrastructure architecture) and the related technology stacks and design patterns.
- Experience establishing thought leadership in solution architecture practices; ability to lead design and development teams and to define, build, and deliver first-class software solutions.
- Familiar with current and emerging technologies, tools, frameworks, and design patterns.
- Experience collaborating effectively across multiple teams and geographies.
- Ability to appropriately weigh other dimensions (e.g. financials, risk, time to market) on top of the architecture drivers in order to propose balanced architecture solutions.

Experience / Qualifications
- 10+ years of relevant experience as a technology manager within the IT industry; experience in the financial/banking industry preferred.
- Minimum 8 years' experience supporting the Oracle platform in a mid-size to large corporate environment.
- Banking/Wealth Management experience preferred.
- Must have experience working in an agile organization.

Posted 3 months ago

Apply

4 - 6 years

6 - 8 Lacs

Pune

Work from Office


Role Description
Our team is part of Technology, Data, and Innovation (TDI) Private Bank. Within TDI, Partner Data is the central client reference data system in Germany. As a core banking system, it is integrated with many banking processes and applications, communicating via around 2k interfaces. From a technical perspective we focus on the mainframe, but we also build solutions on-premises and in the cloud, including RESTful services and an Angular frontend. Next to maintenance and the implementation of new CTB requirements, the content focus also lies on the regulatory and tax topics surrounding a partner/client. We are looking for a highly motivated candidate for the Big Data and Google Cloud area.

Your key responsibilities
- Implement the new project on GCP (Spark, Dataproc, Dataflow, BigQuery, Terraform, etc.) across the whole SDLC chain (see the sketch below for the flavor of this stack).
- Support the migration of current functionality to Google Cloud.
- Ensure the stability of the application landscape and support software releases.
- Support L3 topics and application governance.
- Code as part of an agile team in the CTM area (Java, Scala, Spring Boot).

Your skills and experience
- Experience with databases (HDFS, BigQuery, etc.) and development, preferably for Big Data and GCP technologies.
- Strong understanding of the Data Mesh approach and integration patterns.
- Understanding of party data and its integration with product data.
- Architectural skills for big data solutions, especially interface architecture, that allow a fast start.
- Experience in at least: Spark, Java, Scala, Python, Maven, Artifactory, the Hadoop ecosystem, GitHub, GitHub Actions, and Terraform scripting.
- Knowledge of customer reference data and customer opening processes, and preferably of regulatory topics around know-your-customer processes.
- Works well in teams as well as independently; constructive and target oriented.
- Good English skills, able to communicate professionally as well as informally in small talk with the team.
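
A minimal sketch of the kind of GCP work this listing describes: reading a BigQuery table into Spark on Dataproc and writing a transformed result back. All project, dataset, bucket, and column names are hypothetical placeholders; this illustrates the stack, not the bank's actual pipeline.

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("partner-data-migration")
    .getOrCreate()
)

# Dataproc images ship with the spark-bigquery connector; "table" takes
# the fully qualified BigQuery table name.
partners = (
    spark.read.format("bigquery")
    .option("table", "my-project.partner_data.clients")  # hypothetical table
    .load()
)

# Example transformation step: keep only active clients for downstream loads.
active = partners.filter(partners.status == "ACTIVE")

# Indirect BigQuery writes stage through a GCS bucket.
(active.write.format("bigquery")
    .option("table", "my-project.partner_data.active_clients")  # hypothetical
    .option("temporaryGcsBucket", "my-staging-bucket")          # hypothetical
    .save())
```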

Posted 3 months ago

Apply

7 - 11 years

15 - 20 Lacs

Pune

Work from Office


Role Description
As one of the world's leading asset management firms, data is at the heart of our operations. To support our growing data governance initiatives, we're seeking a Data Governance Tooling Engineer to help design, implement, and maintain the tools and technologies that form the foundation of our data governance ecosystem. The position requires strong software engineering skills and a deep understanding of building and operating platform services in a complex enterprise environment. The role is part of DWS's Data Platform Engineering organization, which builds and operates our critical enterprise data ecosystem to ensure high-quality, secure, and compliant data flows across the organization.

Your key responsibilities
- Support the deployment and configuration of Collibra and other data governance platforms.
- Develop and customize workflows, dashboards, and integrations with Collibra and other tools (see the sketch below for the general shape of such an integration).
- Configure, implement, and operate tooling for data observability and DQ management.
- Troubleshoot and resolve technical issues related to the data governance technology stack.
- Collaborate with data stewards, data owners, and business stakeholders to understand requirements and deliver technical solutions.
- Identify opportunities to optimize data governance processes using automation and advanced tooling features.

Your skills and experience
- Strong software engineering background, with experience in Python, Java, or similar programming languages.
- Strong software architecture skills.
- Good understanding of fundamental data engineering concepts.
- Hands-on experience with the Collibra suite or similar platforms and their integration into the enterprise ecosystem.
- Proficiency in building and deploying cloud-native applications on Google Cloud Platform; knowledge of IaC (Terraform).
- Knowledge of modern data platforms such as Snowflake or GCP BigQuery.
- Familiarity with data quality measurement and related concepts.
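
Integrations like those mentioned above typically happen over the governance platform's REST API. The Python sketch below shows the general shape of such a call; the instance URL, endpoint path, parameters, and response fields are illustrative assumptions, not a documented contract, so consult the vendor's API reference before relying on any of them.

```python
import requests

BASE_URL = "https://example.collibra.com/rest/2.0"  # hypothetical instance


def fetch_assets(session: requests.Session, name_filter: str) -> list[dict]:
    """Return raw asset records whose name matches the filter (assumed API)."""
    resp = session.get(
        f"{BASE_URL}/assets",                       # assumed endpoint path
        params={"name": name_filter, "limit": 100},  # assumed parameters
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("results", [])


with requests.Session() as s:
    s.auth = ("svc_governance", "***")  # placeholder service credentials
    for asset in fetch_assets(s, "customer"):
        # Reconcile governance assets against an internal inventory here.
        print(asset.get("id"), asset.get("name"))
```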

Posted 3 months ago

Apply

5 - 7 years

10 - 20 Lacs

Chennai, Pune, Delhi NCR

Work from Office


5 to 7 years of experience as a Data Engineer, with hands-on experience working on Google Cloud Platform (GCP). Strong expertise in BigQuery and SQL.
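
For a sense of the core BigQuery + SQL skill set this listing asks for, here is a minimal sketch using the google-cloud-bigquery Python client to run a parameterized query. The project, dataset, and table names are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

query = """
    SELECT order_date, SUM(amount) AS daily_revenue
    FROM `my-project.sales.orders`   -- hypothetical table
    WHERE order_date >= @start_date
    GROUP BY order_date
    ORDER BY order_date
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01"),
    ]
)

# result() blocks until the query finishes and yields Row objects.
for row in client.query(query, job_config=job_config).result():
    print(row.order_date, row.daily_revenue)
```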

Posted 3 months ago

Apply

15 - 20 years

17 - 22 Lacs

Pune

Work from Office


Role Description
The Engineer is responsible for managing or performing work across multiple areas of the bank's overall IT platform/infrastructure, including analysis, development, and administration. It may also involve taking functional oversight of engineering delivery for specific departments. Work includes:
- Planning and developing entire engineering solutions to accomplish business goals
- Building reliability and resiliency into solutions with appropriate testing and reviewing throughout the delivery lifecycle
- Ensuring maintainability and reusability of engineering solutions
- Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow
- Reviewing engineering plans and quality to drive re-use and improve engineering capability
- Participating in industry forums to drive adoption of innovative technologies, tools, and solutions in the Bank

Your Key Responsibilities
- Act as a hands-on engineering lead involved in analysis, design, design/code reviews, coding, and release activities.
- Champion engineering best practices and guide/mentor the team to achieve high performance.
- Work closely with business stakeholders, the Tribe Lead, Product Owner, and Lead Architect to successfully deliver the business outcomes.
- Acquire functional knowledge of the business capability being digitized/re-engineered.
- Demonstrate ownership, inspire others, think innovatively, show a growth mindset, and collaborate for success.

Your Skills & Experience
- Minimum 15 years of IT industry experience in full-stack development
- Expert in Java, Spring Boot, NodeJS, SQL/PLSQL, ReactJS
- Strong experience in big data processing: Apache Spark, Hadoop, BigQuery, Dataproc, Dataflow, etc.
- Strong experience in Kubernetes and the OpenShift container platform
- Experience with databases (Oracle, PostgreSQL, MongoDB, Redis/Hazelcast); should understand data modeling, normalization, and performance optimization
- Experience with message queues (RabbitMQ/IBM MQ, JMS) and data streaming, i.e. Kafka, Pub/Sub, etc.
- Experience working on public cloud (GCP preferred; AWS or Azure)
- Knowledge of various distributed/multi-tiered architecture styles: microservices, data mesh, integration patterns, etc.
- Experience with modern software product delivery practices, processes, and tooling, and BizDevOps skills such as CI/CD pipelines using Jenkins, GitHub Actions, etc.
- Experience designing solutions based on DDD and implementing Clean/Hexagonal Architecture for efficient systems that can handle large-scale operation
- Experience leading teams and mentoring developers
- Focus on quality: experience with TDD, BDD, stress and contract tests
- Proficient in working with APIs (Application Programming Interfaces) and data formats like JSON, XML, YAML, Parquet, etc.

Key Skills: Java, Spring Boot, NodeJS, SQL/PLSQL, ReactJS

Advantageous:
- Prior experience in the Banking/Finance domain
- Experience with hybrid cloud solutions, preferably using GCP
- Experience with product development

How we'll support you:
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

Our values define the working environment we strive to create: diverse, supportive, and welcoming of different views. We embrace a culture reflecting a variety of perspectives, insights, and backgrounds to drive innovation. We build talented and diverse teams to drive business results and encourage our people to develop to their full potential. Talk to us about flexible work arrangements and other initiatives we offer. We promote good working relationships and encourage high standards of conduct and work performance. We welcome applications from talented people from all cultures, countries, races, genders, sexual orientations, disabilities, beliefs, and generations, and are committed to providing a working environment free from harassment, discrimination, and retaliation. Visit us to discover more about the culture of Deutsche Bank, including Diversity, Equity & Inclusion, Leadership, Learning, the Future of Work, and more.

Posted 3 months ago

Apply

7 - 11 years

15 - 20 Lacs

Bengaluru

Work from Office


Role Description
Overview: Our Technology, Data, and Innovation (TDI) strategy is focused on strengthening engineering expertise, introducing an agile delivery model, and modernizing the Bank's Information Technology (IT) infrastructure with long-term investments, taking advantage of cloud computing. We continue to invest in and build a team of visionary tech talent who will ensure we thrive in this period of unprecedented change for the industry, so we are seeking a Lead Engineer to work in the Transaction Monitoring and Data Controls team. You will be a hands-on technical engineer within our delivery pods, delivering software solutions. As lead engineer you will design software architecture and implement complex solutions, driving re-use and best practices. You will contribute to strategic design decisions and define engineering approaches that can be disruptive, with the goals of simplifying architectures and reducing technical debt.

Your key responsibilities
- Leverage best practices and build data-driven decisions.
- Define and build applications for re-platform or re-architect strategies, and implement blueprints and patterns for common application architectures.
- Collaborate across TDI areas such as Cloud Platform, Security, Data, and Risk & Compliance to create optimal solutions for the business, increasing re-use, creating best practice, and sharing knowledge.
- Drive optimizations in the software development life cycle (SDLC) process to provide productivity improvements, including tools and techniques.
- Enable the adoption of practices such as Site Reliability Engineering (SRE) and Dev/SecOps to minimize toil and manual tasks and increase automation and stability.

Your skills and experience
Skills You'll Need
- Full experience of all agile software development frameworks and processes.
- Technical architecture and software design experience as an engineer, focused on building working examples and reference implementations in code.
- Deep professional expertise in Python/PySpark, Docker, Kubernetes, and automated testing for data-driven projects.
- Sound knowledge of big data technologies (Hive, Impala, Spark, BigQuery), with the ability to write high-performing, efficient Structured Query Language (SQL) and optimize/simplify existing queries (see the sketch below for one common optimization).
- Experience implementing applications on cloud platforms (Azure, AWS, or Google Cloud Platform (GCP)) and using their major components (software-defined networks, Identity and Access Management (IAM), compute, storage, etc.) in order to define cloud-native application architectures such as microservices, service mesh, or data streaming applications.

Skills That Will Help You Excel
- Be a team player.
- Adopt an automation-first approach to testing, deployment, security, and compliance of solutions through Infrastructure as Code and automated policy enforcement.
- Enjoy supporting our community of engineers and create opportunities for progression, promoting continuous learning and skills development.
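
One example of the query-optimization skill called out above: when joining a large fact table to a small lookup, broadcasting the small side avoids a full shuffle of the large table. A minimal PySpark sketch, with hypothetical paths and columns:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("tm-query-tuning").getOrCreate()

transactions = spark.read.parquet("gs://bucket/transactions")  # large fact table
countries = spark.read.parquet("gs://bucket/country_risk")     # small lookup

# Broadcasting the small dimension ships it to every executor, so the
# large fact table is never shuffled for the join.
scored = transactions.join(
    broadcast(countries),
    on="country_code",
    how="left",
)
scored.explain()  # verify BroadcastHashJoin appears in the physical plan
```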

Posted 3 months ago

Apply

6 - 10 years

30 - 35 Lacs

Bengaluru

Work from Office


We are seeking an experienced PySpark Developer / Data Engineer to design, develop, and optimize big data processing pipelines using Apache Spark and Python (PySpark). The ideal candidate should have expertise in distributed computing, ETL workflows, data lake architectures, and cloud-based big data solutions.

Key Responsibilities:
- Develop and optimize ETL/ELT data pipelines using PySpark on distributed computing platforms (Hadoop, Databricks, EMR, HDInsight).
- Work with structured and unstructured data to perform data transformation, cleansing, and aggregation.
- Implement data lake and data warehouse solutions on AWS (S3, Glue, Redshift), Azure (ADLS, Synapse), or GCP (BigQuery, Dataflow).
- Optimize PySpark jobs for performance tuning, partitioning, and caching strategies (see the sketch after this listing).
- Design and implement real-time and batch data processing solutions.
- Integrate data pipelines with Kafka, Delta Lake, Iceberg, or Hudi for streaming and incremental updates.
- Ensure data security, governance, and compliance with industry best practices.
- Work with data scientists and analysts to prepare and process large-scale datasets for machine learning models.
- Collaborate with DevOps teams to deploy, monitor, and scale PySpark jobs using CI/CD pipelines, Kubernetes, and containerization.
- Perform unit testing and validation to ensure data integrity and reliability.

Required Skills & Qualifications:
- 6+ years of experience in big data processing, ETL, and data engineering.
- Strong hands-on experience with PySpark (Apache Spark with Python).
- Expertise in SQL, the DataFrame API, and RDD transformations.
- Experience with big data platforms (Hadoop, Hive, HDFS, Spark SQL).
- Knowledge of cloud data processing services (AWS Glue, EMR, Databricks, Azure Synapse, GCP Dataflow).
- Proficiency in writing optimized queries, partitioning, and indexing for performance tuning.
- Experience with workflow orchestration tools like Airflow, Oozie, or Prefect.
- Familiarity with containerization and deployment using Docker, Kubernetes, and CI/CD pipelines.
- Strong understanding of data governance, security, and compliance (GDPR, HIPAA, CCPA, etc.).
- Excellent problem-solving, debugging, and performance optimization skills.
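
As a flavor of the partitioning and caching strategies the responsibilities mention, here is a hedged PySpark sketch: repartitioning on the aggregation key and caching an intermediate that feeds two outputs. Paths, columns, and the partition count are illustrative assumptions.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-tuning").getOrCreate()

events = spark.read.parquet("s3://bucket/raw/events")  # hypothetical path

# Partition by the aggregation key so the groupBy below shuffles evenly;
# cache the cleaned frame because two aggregations reuse it.
cleaned = (
    events.dropna(subset=["user_id"])
    .repartition(200, "user_id")
    .cache()
)

daily = cleaned.groupBy("user_id", F.to_date("ts").alias("day")).count()
totals = cleaned.groupBy("user_id").agg(F.sum("amount").alias("total"))

daily.write.mode("overwrite").parquet("s3://bucket/marts/daily")
totals.write.mode("overwrite").parquet("s3://bucket/marts/totals")
cleaned.unpersist()  # release the cached intermediate once both writes finish
```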

Posted 3 months ago

Apply

4 - 8 years

6 - 10 Lacs

Bengaluru

Work from Office


Who You Are
- Strong sense of ownership, accountability, and business acumen.
- Passion for your team's vision/mission and the entrepreneurial drive to make things happen.
- Mentor the team in growing and reaching their full potential.
- Passion for designing modular systems using reusable components, SDKs, and robust APIs.
- Extensive experience in building tools, frameworks, and CI/CD pipelines.
- Ability to think at scale, bringing a focus on continuous delivery methodologies from design through deployment and operations.
- Collaborate, socialize, and drive cross-team technical initiatives.
- Possess a can-do attitude and strong work ethic; act with empathy and humility.

Must Have Skills
- 4+ years of experience in software development, including 2+ years designing, building, and operating high-scale, mission-critical cloud-based production systems.
- Proficient in Java, GoLang, or Python.
- Solid understanding and development experience with microservices, SOA, REST, HTTP, WebSockets, gRPC, and SSL/TLS.
- Strong coding skills and a solid foundation in data structures and algorithms.
- Delivered ETL/ELT solutions including data extraction, transformation, cleansing, data integration, and data management.
- Implemented batch and near-real-time data ingestion pipelines.
- Working knowledge of databases, data platforms, and developer tools like dbt, Snowflake, BigQuery, Airflow, or other data tools.
- Experience developing and operating large-scale distributed systems with Kubernetes.
- Excellent troubleshooting, analytical, and decision-making skills.
- Disciplined approach to documentation.
- Ability to lead, partner, and collaborate cross-functionally across an engineering organization.
- Bachelor's or Master's degree in Engineering, Computer Science, or equivalent experience.

Nice To Have Skills
- Strong knowledge of databases: SQL, NoSQL, time series, GraphDB, etc.
- Experience with AI infrastructure and technologies.

Responsibilities
- As part of the team, you will design, implement, and scale a massive set of AI backend services (cloud, on-prem, and hybrid).
- Collaborate with your peers in designing real-time systems, streaming pipelines, and low-latency architectures. Everything will be fully automated, highly available, and auto-scalable.
- Mentor, coach, and inspire many engineers across multiple teams and functions.
- Represent Uniphore globally as part of the research community, contributing to open-source repositories and publications.
- Contribute to the intellectual property of Uniphore.

Location preference: India - Bangalore. Uniphore is an equal opportunity employer committed to diversity in the workplace. We evaluate qualified applicants without regard to race, color, religion, sex, sexual orientation, disability, veteran status, and other protected characteristics. For more information on how Uniphore uses AI to unify and humanize every enterprise experience, please visit www.uniphore.com.

Posted 3 months ago

Apply

7 - 11 years

9 - 14 Lacs

Pune

Work from Office


Role Description:
The DB Cloud FinOps function drives financial accountability for cloud consumption, providing distributed teams with insights into their consumption, spend, and optimisation/control options to ensure cloud usage is managed efficiently. We are seeking an individual with a strong background in GCP services, FinOps, and IT application management. The applicant will be the IT Owner of the FinOps application platform on our Google Cloud Platform (GCP) environment. The FinOps application consists of our GCP landing zone (GCP projects, resources, and datasets) and our FinOps Looker instance; together with our team's skillset, this is the strategic cloud cost management function for the bank. The role covers end-to-end platform management, including creation and maintenance of BigQuery datasets and views, administration of billing exports, deployment of Cloud Functions, API integrations, and compliance adherence. (See the billing-export sketch below for a flavor of the insights involved.)

Your key responsibilities
- Ensure the FinOps application is maintained in accordance with the Bank's IT security risk, audit, and compliance requirements.
- Provision and maintain GCP billing, recommender, and database services to support internal and external FinOps capabilities (e.g. BigQuery, GCS buckets, and billing exports).
- Provision serverless GCP services to support the automation of key FinOps insights and optimization/recommendation capabilities (e.g. Pub/Sub topics, Cloud Run, API queries, and Cloud Functions).
- Manage FinOps GCP IAM roles, permissions, and GitHub repositories.
- Support the integration of our FinOps platform into other tools used within the bank (e.g. Jira, Looker).

Your skills and experience
- Infrastructure and cloud technology industry experience (7+ years).
- Strong understanding of software development lifecycle methodology.
- Proficient with Terraform, Python, and GitHub (3+ years).
- Proficient in GCP infrastructure and GCP services.
- Proficient in data analysis tools (e.g. Excel, Power Query, SQL).
- Strong analytical and problem-solving skills.
- Experience in FinOps preferred.
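
For illustration, FinOps insights like those described usually start from the standard GCP billing export in BigQuery. Below is a minimal sketch of a spend-per-service query via the Python client; the project and dataset are placeholders, and the table name follows the standard billing-export naming pattern rather than a real export.

```python
from google.cloud import bigquery

client = bigquery.Client()

# service.description, usage_start_time, and cost are standard fields of the
# GCP billing export schema; the table name below is a placeholder.
query = """
    SELECT
      service.description AS service,
      FORMAT_TIMESTAMP('%Y-%m', usage_start_time) AS month,
      ROUND(SUM(cost), 2) AS spend
    FROM `finops-project.billing.gcp_billing_export_v1_XXXXXX`
    GROUP BY service, month
    ORDER BY month, spend DESC
"""

for row in client.query(query).result():
    print(f"{row.month}  {row.service:<30} {row.spend}")
```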

Posted 3 months ago

Apply

4 - 8 years

20 - 25 Lacs

Pune

Hybrid


Experience: 4-8 years
Job Location: Pune (Hybrid)
Primary Skills: Terraform, Python, Shell, JSON, automation tools
Secondary Skills: ITSM, JIRA, GitHub, CI/CD practices, API invocation

Role purpose:
- Minimum 4 years' experience creating GCP infrastructure with data integration patterns for streaming and batch load processes for large-scale data platforms / data warehouses.
- Good knowledge of and experience with Terraform, Unix, networking basics, Docker, Kubernetes, Google Cloud (VPC, Compute Engine, Load Balancer, Cloud Build), and Helm.
- Good knowledge of and experience using CI/CD pipelines, Jenkins, and Cloud Build.
- Good understanding of the GCP cloud platform; hands-on experience with Terraform.
- Knowledge of and experience working with agile/lean methodologies.
- Preferred: GCP DevOps certification, HashiCorp Terraform certification.

Essential: prior work experience in DWH; good knowledge of Dataflow, BigQuery, Composer, and Stackdriver. Relevant work experience: 3 to 5+ years.
Desired: ITIL; Telecom domain knowledge. Role: GCP Data Engineer.

Posted 3 months ago

Apply

15 - 20 years

20 - 35 Lacs

Pune

Work from Office


Job Title: Lead Engineer (RYR#2025)

Role Description
The Engineer is responsible for managing or performing work across multiple areas of the bank's overall IT platform/infrastructure, including analysis, development, and administration. It may also involve taking functional oversight of engineering delivery for specific departments. Work includes:
- Planning and developing entire engineering solutions to accomplish business goals
- Building reliability and resiliency into solutions with appropriate testing and reviewing throughout the delivery lifecycle
- Ensuring maintainability and reusability of engineering solutions
- Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow
- Reviewing engineering plans and quality to drive re-use and improve engineering capability
- Participating in industry forums to drive adoption of innovative technologies, tools, and solutions in the Bank

What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy.

Your Key Responsibilities
- Act as a hands-on engineering lead involved in analysis, design, design/code reviews, coding, and release activities.
- Champion engineering best practices and guide/mentor the team to achieve high performance.
- Work closely with business stakeholders, the Tribe Lead, Product Owner, and Lead Architect to successfully deliver the business outcomes.
- Acquire functional knowledge of the business capability being digitized/re-engineered.
- Demonstrate ownership, inspire others, think innovatively, show a growth mindset, and collaborate for success.

Your Skills & Experience
- Minimum 15 years of IT industry experience in full-stack development
- Expert in Java, Spring Boot, NodeJS, SQL/PLSQL, ReactJS
- Strong experience in big data processing: Apache Spark, Hadoop, BigQuery, Dataproc, Dataflow, etc.
- Strong experience in Kubernetes and the OpenShift container platform
- Experience with databases (Oracle, PostgreSQL, MongoDB, Redis/Hazelcast); should understand data modeling, normalization, and performance optimization
- Experience with message queues (RabbitMQ/IBM MQ, JMS) and data streaming, i.e. Kafka, Pub/Sub, etc.
- Experience working on public cloud (GCP preferred; AWS or Azure)
- Knowledge of various distributed/multi-tiered architecture styles: microservices, data mesh, integration patterns, etc.
- Experience with modern software product delivery practices, processes, and tooling, and BizDevOps skills such as CI/CD pipelines using Jenkins, GitHub Actions, etc.
- Experience designing solutions based on DDD and implementing Clean / Hexagonal Architecture for efficient systems that can handle large-scale operation
- Experience leading teams and mentoring developers
- Focus on quality: experience with TDD, BDD, stress and contract tests
- Proficient in working with APIs (Application Programming Interfaces) and data formats like JSON, XML, YAML, Parquet, etc.

Key Skills: Java, Spring Boot, NodeJS, SQL/PLSQL, ReactJS

Advantageous:
- Prior experience in the Banking/Finance domain
- Experience with hybrid cloud solutions, preferably using GCP
- Experience with product development

How we'll support you:
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

Posted 3 months ago

Apply

16 - 20 years

25 - 40 Lacs

Pune

Work from Office


Job Title: Senior Manager for Production / Technology Manager
Corporate Title: VP

Role Description
At the heart of Deutsche Bank's client franchise is the Corporate Bank (CB), a market leader in Cash Management, Trade Finance & Lending, Securities Services, and Trust & Agency services. Focusing on the Treasurers and Finance departments of corporate and commercial clients and financial institutions across the globe, our universal expertise and global network allow us to offer truly integrated and effective solutions.

You will be operating within Corporate Bank Production as a Senior Technology Manager for Production in the Client Access & Services subdivision. Client Access & Services serves the critical client-facing applications categorized under Payment Initiation, Account Services, and Client Centric Technology. Our objective at Corporate Bank Production is to consistently strive to make production better, ensuring a promising end-to-end experience for our corporate clients running their daily cash management business through various access channels. We also implement, encourage, and invest in building an engineering culture in our daily activities to achieve the wider objectives. Our strategy aims to reduce the number of issues, provide faster resolution of issues, and safeguard any changes made to our production environment, across all domains at Corporate Bank.

You will be accountable for driving a culture of proactive continual improvement in the production environment through application and user request support, troubleshooting and resolving errors in production; automation of manual work, monitoring improvements, and platform hygiene; supporting the resolution of issues and conflicts; and preparing reports and meetings. The candidate should have experience with all relevant tools used in the Service Operations environment and specialist expertise in one or more technical domains, along with the competency to identify SLOs to measure application services so that all associated Service Operations stakeholders are provided with an optimum level of service.

Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance, and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion, and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.

Your key responsibilities
- Partner with, and influence, stakeholders globally from development, infrastructure, and production on risk identification, remediation solutions, and managing change conflicts to build momentum in optimizing processes and platforms across Production.
- Work as a regional or functional lead for a suite of applications in corporate banking technology.
- Lead the team in driving a culture of proactive continual improvement in the production environment through automation of manual work, monitoring improvements, and platform hygiene.
- Provide thought leadership with a Continuous Service Improvement approach to resolve IT failings, drive efficiencies, and remove repetition to streamline support activities, reduce risk, and improve system availability by understanding emerging trends and proactively addressing them.
- Carry out technical analysis of the production platform to identify and remediate performance and resiliency issues.
- Understand business workflows and make recommendations for improvements (directly in workflows and/or analytics).
- Assist in the development of a long-term organizational strategy to improve production.
- Track progress of strategic goals, monitor key performance indicators (KPIs), and provide periodic updates to senior management.
- Collaborate with internal and external stakeholders to ensure alignment of tactical initiatives in production with business goals.
- Provide data-driven insights and reports to support decision making.
- Promote a culture of continuous improvement and foster innovation within the organization.
- Define and identify SLOs and measure application services; invest in and drive SRE culture.

Your skills and experience
- University degree with a technological or scientific focus, or equivalent working experience, ideally in the Financial Services / Banking industry.
- Extensive working experience (~16+ years) in the financial services industry and a clear understanding of Finance's key processes and systems.
- Leadership and people management experience working in a global matrix structure.
- Highly qualified, hands-on experience with production application support and ITIL practices, with SRE knowledge and mindset.
- Proactive service management of all services provided across businesses and functions, ensuring services are delivered in accordance with the agreed SLA.
- Banking domain knowledge with a deep understanding of application support and/or development and complex IT infrastructure (UNIX, database, middleware, cloud, MQ, etc.).
- Good understanding of the most recent technologies, be it cloud (GCP, AWS, Azure), programming languages (Java, JavaScript, Python), databases (Postgres, BigQuery), or other solutions.
- Experience with application performance monitoring tools (Geneos, Splunk, Grafana, New Relic) and scheduling tools (Control-M).
- Ability to constantly improve processes and mechanisms based on learning and feedback from various stakeholders.
- Excellent partnering and communication skills as well as stakeholder management, combined with the ability to successfully navigate a complex organization, build strong relationships, and work collaboratively with other teams.
- Analytical aptitude and strong attention to detail, combined with a high level of commitment and the ability to deliver high-quality results within tight deadlines.
- Data analysis and visualization experience, with the ability to translate data analysis into meaningful commercial insights and to visualize data in support of decision-making processes.
- Excellent communication and interpersonal skills, together with the ability to explain complex concepts so that non-technical stakeholders understand.
- Strong analytical and problem-solving skills.
- Experience in project management and change management.
- High degree of emotional intelligence and cultural awareness.
- Result oriented with a focus on strategic outcomes.
- Guides and drives customers, suppliers, and partners; makes decisions that influence the success of projects and team objectives; collaborates regularly with team members, users, cross-functional teams, and customers; engages to ensure that customer/client needs are being met throughout.
- Works under general direction within a clear framework of accountability; plans own work to meet given objectives; able to work independently and manage multiple priorities.
- Communicates fluently, orally and in writing, and can present complex information to both technical and non-technical audiences; plans, schedules, and monitors work to meet time and quality targets; facilitates collaboration between stakeholders who share common objectives; fully understands the importance of security to own work and the operation of the organization.

Nice to have:
- Cloud services: GCP
- Experience with automation solutions (Ansible, Jenkins/Groovy, Python, Java)
- DevOps & Continuous Integration / agile orientation

Skills That Will Help You Excel
- Self-motivated with excellent interpersonal, presentation, and communication skills.
- Able to think strategically, with strong analytical and problem-solving skills.
- Proactive, focused, and resilient, with a positive demeanor.
- Good influencing/negotiating skills; able to overcome resistance and reach consensus to attain the required objective.
- Very strong stakeholder management experience with attention to detail.
- Able to handle multiple demands and priorities simultaneously, work under pressure in an organized manner, and work with teams across multiple locations and time zones.
- Able to connect, manage, and influence people from different backgrounds and cultures.
- Growth mindset with a can-do attitude.
- Bachelor's degree or equivalent in an IT-related field. ITIL certification preferred; management certification is an advantage.

Posted 3 months ago

Apply

6 - 10 years

30 - 32 Lacs

Bengaluru

Work from Office


We are seeking an experienced IDMC MDM + CDQ Consultant to design, develop, and manage Informatica Cloud Master Data Management (MDM) and Cloud Data Quality (CDQ) solutions. The ideal candidate will have expertise in data governance, master data modeling, data quality management, and cloud-based MDM implementations. This role involves working with large-scale datasets, ensuring high-quality master data, and integrating data management solutions across cloud and on-premises environments.

Key Responsibilities:
- Design and implement Informatica IDMC MDM and CDQ solutions for managing customer, product, and entity master data.
- Configure MDM data models, hierarchies, business rules, and workflows for data consolidation and governance.
- Implement data matching, survivorship, and golden record creation to ensure high-quality master data (see the illustrative sketch after this listing).
- Develop and maintain data quality rules, data cleansing, and profiling processes using Informatica CDQ.
- Integrate MDM and CDQ with enterprise applications such as ERP, CRM, and cloud data warehouses (Snowflake, Redshift, BigQuery, Synapse, etc.).
- Ensure data lineage, impact analysis, and metadata management for better data governance.
- Optimize MDM/CDQ performance, tuning, and data processing pipelines.
- Ensure compliance with data security, privacy, and regulatory requirements (GDPR, HIPAA, CCPA, etc.).
- Collaborate with data architects, data engineers, and business stakeholders to define MDM and data quality strategies.
- Provide training and support to data stewards, analysts, and business users on MDM/CDQ best practices.

Required Skills & Qualifications:
- 6+ years of experience in Master Data Management (MDM) and Data Quality (DQ) solutions.
- Strong hands-on experience with Informatica IDMC MDM and CDQ.
- Expertise in data modeling, data cleansing, and golden record management.
- Proficiency in SQL, API-based integrations (REST/SOAP), and data transformation.
- Strong understanding of data governance, metadata management, and data lineage.
- Experience with cloud platforms (AWS, Azure, GCP) and cloud-based MDM implementations.
- Knowledge of data profiling, standardization, enrichment, and validation techniques.
- Familiarity with ETL tools, cloud storage solutions, and real-time data processing.
- Strong problem-solving skills and ability to work in Agile/DevOps environments.
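
Informatica IDMC configures matching and survivorship declaratively rather than in code, but the golden-record idea itself is easy to illustrate. The plain-Python sketch below shows trust-based survivorship over already-matched records; the source trust ranking and field names are assumptions for illustration, not the product's API.

```python
from datetime import date

SOURCE_TRUST = {"CRM": 3, "ERP": 2, "WEB": 1}  # hypothetical trust ranking

records = [
    {"source": "WEB", "email": "a@x.com", "phone": None, "updated": date(2024, 5, 1)},
    {"source": "CRM", "email": "a@x.com", "phone": "555-0100", "updated": date(2024, 3, 1)},
    {"source": "ERP", "email": None, "phone": "555-0199", "updated": date(2024, 6, 1)},
]


def golden_record(matched: list[dict]) -> dict:
    """Survivorship: per attribute, take the non-null value from the most
    trusted source, breaking ties by recency."""
    ranked = sorted(
        matched,
        key=lambda r: (SOURCE_TRUST[r["source"]], r["updated"]),
        reverse=True,
    )
    golden = {}
    for attr in ("email", "phone"):
        golden[attr] = next((r[attr] for r in ranked if r[attr] is not None), None)
    return golden


print(golden_record(records))  # {'email': 'a@x.com', 'phone': '555-0100'}
```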

Posted 3 months ago

Apply

5 - 10 years

10 - 20 Lacs

Gurgaon

Work from Office


Data Pipeline Development
- Design and guide developers in developing and maintaining data processing pipelines using PySpark.
- Collaborate with data engineers and analysts to integrate data from various sources.
- Implement data transformations, aggregations, and enrichment processes.

Performance Optimization
- Guide the team in optimizing Spark jobs for performance and scalability.
- Monitor and troubleshoot performance issues in Spark applications.
- Implement best practices for efficient Spark code and resource utilization.

Data Model Design & Development
- Understand the life insurance industry in order to design and establish data models in GCP, adhering to enterprise standards.
- Evaluate data-related tools and technologies and recommend appropriate implementation patterns and standard methodologies to ensure a modern data architecture and data-rule implementation.
- Create and maintain conceptual/logical data models to identify key business entities and their relationships, resulting in efficient, optimized, non-redundant data pipelines.
- Review data models with both technical and business audiences.

Data Quality and Integrity
- Ensure data quality and integrity through rigorous testing and validation.
- Implement data validation and cleansing processes.
- Design and develop automated data quality checks and monitoring systems (see the sketch after this listing).
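
A hedged sketch of the automated data quality checks described above, written as simple PySpark assertions (completeness, uniqueness, validity). Column names, the input path, and the tolerance threshold are illustrative assumptions.

```python
from pyspark.sql import SparkSession, DataFrame
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()
policies = spark.read.parquet("gs://bucket/life/policies")  # hypothetical path


def run_checks(df: DataFrame) -> list[str]:
    """Return a list of human-readable failures; empty means all checks pass."""
    failures = []
    total = df.count()
    # Completeness: the business key must never be null.
    if df.filter(F.col("policy_id").isNull()).count() > 0:
        failures.append("policy_id contains nulls")
    # Uniqueness: exactly one row per policy.
    if df.select("policy_id").distinct().count() != total:
        failures.append("policy_id is not unique")
    # Validity: premiums must be positive, tolerating up to 1% bad rows.
    bad = df.filter(F.col("premium") <= 0).count()
    if bad / max(total, 1) > 0.01:
        failures.append(f"{bad} rows with non-positive premium")
    return failures


problems = run_checks(policies)
if problems:
    raise ValueError("DQ checks failed: " + "; ".join(problems))
```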

Posted 3 months ago

Apply

3 - 7 years

11 - 18 Lacs

Pune, Bengaluru, Hyderabad

Work from Office


Role: GCP Data Engineer
Client: MNC (full time)
Position: Permanent
Exp: 3.5-6 years
Location: PAN India
NP: Immediate / Serving / 30 days
Work mode: Hybrid/WFO
Mandatory Skills: GCP, BigQuery, Dataflow, Dataplex, Pub/Sub, Python & SQL

JD:
- Ensure successful initiation, planning, execution, control, and completion of the project by guiding team members on technical aspects and conducting reviews of technical documents and artefacts.
- Lead project development, production support, and maintenance activities.
- Fill in and ensure timesheets are completed, as is the invoicing process, on or before the deadline.
- Lead the customer interface for the project on an everyday basis, proactively addressing any issues before they are escalated.
- Create functional and technical specification documents.
- Track open tickets/incidents in the queue, allocate tickets to resources, and ensure tickets are closed within the deadlines.
- Ensure analysts adhere to SLAs/KPIs/OLAs.
- Ensure that everyone in the delivery team, including yourself, is constantly thinking of ways to do things faster, better, or more economically.
- Lead and ensure the project is in compliance with software quality processes and within timelines.
- Review functional and technical specification documents.
- Serve as the single point of contact for the team to the project stakeholders.
- Promote teamwork; motivate, mentor, and develop subordinates.

Kindly fill in the details below and share your updated CV with mansoor@burgeonits.com: name as per Aadhaar card, mobile no., alternate no., email ID, alternate email, date of birth, PAN card no. (for client upload, mandatory), total & relevant experience, current company, payroll company name (if any), notice period (if serving, mention last working day), CCTC & ECTC, any offers (yes/no; if yes, offer amount and joining date), current & preferred location, happy to relocate (yes/no), available interview time slots.

Posted 3 months ago

Apply

6 - 10 years

35 - 45 Lacs

Bengaluru

Hybrid


What the job involves
You will be joining a fast-growing team of motivated and talented engineers, helping us build and enhance a suite of innovative products that are transforming the mobile marketing industry. Our solutions enable clients to measure the effectiveness of their data in a completely novel way. This role is fairly independent and offers significant autonomy. You should be comfortable working with a geographically dispersed team and driving your tasks to completion. You will take ownership of your work and may be responsible for supporting one or more projects simultaneously based on business needs. Additionally, you will collaborate closely with our existing team of software engineers and data scientists, contributing to the continuous improvement of our product and solution suite.

Who you are
- Experience & Proficiency: You bring 6-10 years of commercial software engineering experience, with a strong command of backend development using two or more backend languages (GoLang, Python, JavaScript technologies, SQL). You have experience developing and deploying microservices, particularly in cloud environments (AWS or GCP), and developing and managing database interface software libraries.
- Microservices & Performance: You've worked with microservice architectures and appreciate the importance of performance, data security, and quality requirements. Your knowledge of concurrency and multi-threaded code ensures efficient performance and system scalability.
- Backend & Cloud Technologies: You have hands-on experience with various storage technologies, including Amazon Redshift, BigQuery, Spark, and MySQL, and are comfortable working with cloud-native services. You are familiar with Kubernetes and containerization technologies and understand their best practices.
- Problem-Solving & Initiative: You are motivated to solve complex problems and handle technical challenges. You don't need micromanagement; you're proactive, asking for help when needed but working independently to develop solutions.
- Documentation & Support: You are diligent in documenting new solutions and maintaining up-to-date records of existing systems. You also provide on-call support to ensure any issues with the onboarding process are quickly resolved.
- Continuous Learning & Collaboration: You collaborate effectively with cross-functional teams and are committed to continuously broadening your skill set. You embrace new techniques and technologies to stay current with industry standards.

Job Responsibilities
- Research, design, develop, and test ingestion pipelines to ensure high performance and data accuracy.
- Be a core member of the team creating leading-edge big data processing, analytics, and AI tools focused on deriving value from customer data.
- Balance the pace of the delivery schedule with a focus on quality and resilience.
- Maintain and optimize legacy systems while developing new, scalable solutions.
- Profile and optimize CPU usage, memory consumption, and I/O operations to enhance system performance (see the profiling sketch after this listing).
- Document new solutions and keep existing documentation up to date.

Bonus Points
- Strong understanding of software testing methodologies and experience with automated testing frameworks.
- Experience with Kubernetes and cloud-based orchestration systems.
- Hands-on experience deploying AI solutions, particularly Large Language Models (LLMs), in production environments.
- Proficiency with BigQuery, Spark, and Redshift for data processing and analytics.
- Expertise in infrastructure as code (IaC) using Terraform.
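
A small illustration of the profiling work listed in the responsibilities: measuring where an ingestion stage spends CPU time with Python's standard-library profiler. The parse_batch function is a hypothetical stand-in for a real pipeline stage.

```python
import cProfile
import io
import pstats


def parse_batch(lines: list[str]) -> list[dict]:
    """Hypothetical ingestion stage: parse raw comma-separated event lines."""
    return [dict(zip(("user", "event", "ts"), ln.split(","))) for ln in lines]


# Synthetic workload standing in for a real batch of events.
sample = [f"user{i},click,2024-01-01T00:00:{i % 60:02d}" for i in range(100_000)]

profiler = cProfile.Profile()
profiler.enable()
parse_batch(sample)
profiler.disable()

# Print the top 5 hotspots by cumulative time.
out = io.StringIO()
pstats.Stats(profiler, stream=out).sort_stats("cumulative").print_stats(5)
print(out.getvalue())
```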

Posted 3 months ago

Apply

5 - 10 years

7 - 12 Lacs

Hyderabad

Work from Office


Tecovas is looking for an Analytics Engineer to join our growing and dynamic Data Team. This position will play an integral role in democratizing data access and use across all departments at Tecovas. Reporting to the Director of Data, you will help build out the company's data pipelines, Data Warehouse, and other data products, and play a key role in ensuring Tecovas has a best-in-class data practice. This candidate is strongly encouraged to work from our HQ office in Austin, TX, with the ability to work remotely on other days.

What you'll do:
- Develop and maintain data models using dbt, ensuring a single-source-of-truth Data Warehouse.
- Coordinate cross-functionally to ensure business logic and metrics are accurately captured and aligned.
- Collaborate with Data Science, Analytics, Core Systems, and the rest of the Tech team to support advanced data projects.
- Advance data monitoring, security, and compliance efforts to align with modern best practices.
- Improve data infrastructure using software engineering best practices: data testing, observability, orchestration.
- Improve internal tech documentation and business-facing documentation / data dictionary.
- Develop and support Data Science and Advanced Analytics pipelines with creative and unique analytics engineering solutions.

Experience we're looking for:
- Bachelor's degree in computer science, engineering, or a related field.
- 5+ years of experience as a data engineer, analytics engineer, or similar role.
- Expertise with dbt.
- Expertise with modern data engineering best practices, including CDC, observability, quality testing, and performance and cost optimization.
- Strong experience with Python, SQL, Git.
- Experience with Fivetran, Stitch, or other ETL/ELT tools.
- Familiarity with cloud-based platforms like BigQuery, Airflow, or other tools (GCP preferred, but equivalent experience is welcome).
- Excellent interpersonal and communication skills.

What you bring to the table: You are highly organized and a self-starter. You feel confident working in a fast-paced environment. You are able to quickly learn new systems and implement new procedures. You can easily collaborate with cross-functional partners. You have a positive attitude and are motivated by a challenge.

Posted 3 months ago

Apply

7 - 11 years

22 - 25 Lacs

Hyderabad

Work from Office


Tecovas is the first direct-to-consumer western brand, founded with the simple goal of making the world's best western boots, apparel, and leather goods, and selling them at a fair price. We are a brand revolutionizing a category and welcoming first-time boot buyers and western enthusiasts alike.

Tecovas is looking for a Senior Data Engineer to join our growing and dynamic Data Team. This position will play an integral role in democratizing data access and use across all departments at Tecovas. Reporting directly to the Director of Data, you will help build out the company's data pipelines, Data Warehouse, and other data products, and play a key role in ensuring Tecovas has a best-in-class data practice. This candidate is strongly encouraged to work from our HQ office in Austin, TX, with the ability to work remotely on other days.

What you'll do:
- Develop and maintain scalable and efficient ELT pipelines to gather and store data across all departments at Tecovas (see the ELT sketch after this listing).
- Coordinate cross-functionally to ensure all relevant data is captured for analysis and reporting.
- Collaborate with Data Science, Analytics, Core Systems, and the rest of the Tech team to support advanced data projects.
- Assist in maintaining and improving data transformation models in dbt.
- Advance data monitoring, security, and compliance efforts.
- Manage Airflow, Cloud Functions, and other cloud infrastructure to ensure cost-effective solutions with minimal downtime.
- Improve internal tech documentation and business-facing documentation / data dictionary.
- Develop and support Data Science and Advanced Analytics pipelines with creative and unique data engineering solutions.

Experience we're looking for:
- Bachelor's degree in computer science, engineering, or a related field.
- 5+ years of experience as a data engineer, data scientist, or data analyst.
- Expertise with modern data engineering best practices, including CDC, observability, quality testing, and performance and cost optimization.
- Strong experience with Python, SQL, Git.
- Strong experience with dbt.
- Experience with Fivetran, Stitch, or other ETL/ELT tools.
- Experience with BigQuery, Airflow, and other cloud-based engineering tools. We use GCP, but other relevant experience will be considered.
- Excellent interpersonal and communication skills.

What you bring to the table: You are highly organized and a self-starter. You feel confident working in a fast-paced environment. You are able to quickly learn new systems and implement new procedures. You can easily collaborate with cross-functional partners. You have a positive attitude and are motivated by a challenge.
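
As an illustration of the ELT pipelines described above, here is a minimal sketch that loads staged JSON files from GCS into a raw BigQuery table; transformation would then happen in-warehouse, e.g. via dbt. Bucket, project, dataset, and table names are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    autodetect=True,  # infer the schema from the staged files
)

load_job = client.load_table_from_uri(
    "gs://example-staging/orders/*.json",  # hypothetical staging path
    "my-project.raw.orders",               # hypothetical target table
    job_config=job_config,
)
load_job.result()  # block until the load finishes

print(f"Loaded {client.get_table('my-project.raw.orders').num_rows} rows")
```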

Posted 3 months ago

Apply

6 - 9 years

35 Lacs

Bengaluru

Work from Office


We are hiring a Senior GCP Developer to join our high-performance data engineering team. This is a mission-critical role where you will design, build, and maintain scalable ETL pipelines and frameworks in a Data Mesh architecture. You will work with modern tools like Python, dbt, BigQuery (GCP), and SQL to deliver high-quality data products that power decision-making across the organization. We are looking for a highly skilled professional who thrives in demanding environments, takes ownership of their work, and delivers results with precision and reliability.

Key Responsibilities
- Design, build, and maintain ETL pipelines: develop robust, scalable, and efficient ETL workflows to ingest, transform, and load data into distributed data products within the Data Mesh architecture.
- Data transformation with dbt: use dbt to build modular, reusable transformation workflows that align with the principles of data products (see the sketch after this listing).
- Cloud expertise: leverage Google Cloud Platform (GCP) services such as BigQuery, Cloud Storage, Pub/Sub, and Dataflow to implement highly scalable data solutions.
- Data quality & governance: enforce strict data quality standards by implementing validation checks, anomaly detection mechanisms, and monitoring frameworks.
- Performance optimization: continuously optimize ETL pipelines for speed, scalability, and cost efficiency.
- Collaboration & ownership: work closely with data product owners, BI developers, and stakeholders to understand requirements and deliver on expectations; take full ownership of your deliverables.
- Documentation & standards: maintain detailed documentation of ETL workflows, enforce coding standards, and adhere to best practices in data engineering.
- Troubleshooting & issue resolution: proactively identify bottlenecks or issues in pipelines and resolve them quickly with minimal disruption.

Required Skills & Experience
- 10+ years (Lead) or 7+ years (Dev) of hands-on experience designing and implementing ETL workflows in large-scale environments.
- Advanced proficiency in Python for scripting, automation, and data processing.
- Expert-level knowledge of SQL for querying large datasets, with performance optimization techniques.
- Deep experience with modern transformation tools like dbt in production environments.
- Strong expertise in cloud platforms like Google Cloud Platform (GCP), with hands-on experience using BigQuery.
- Familiarity with Data Mesh principles and distributed data architectures is mandatory.
- Proven ability to handle complex projects under tight deadlines while maintaining high-quality standards.
- Exceptional problem-solving skills with a strong focus on delivering results.

What We Expect
This is a demanding role that requires:
1. A proactive mindset: you take initiative without waiting for instructions.
2. A commitment to excellence: no shortcuts or compromises on quality.
3. Accountability: you own your work end-to-end and deliver on time.
4. Attention to detail: precision matters; mistakes are not acceptable.

Location: Pan India
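
A hedged sketch of how the dbt-based delivery described above might be orchestrated: build one data product's models with dbt's CLI, fail fast on errors, then run a lightweight row-count sanity check in BigQuery. The tag, project, and table names are hypothetical.

```python
import subprocess

from google.cloud import bigquery

# Build only the models tagged for this data product; --fail-fast stops on
# the first error so a broken model never reaches consumers.
subprocess.run(
    ["dbt", "build", "--select", "tag:orders_product", "--fail-fast"],
    check=True,
)

# Post-run sanity check: an empty mart usually signals an upstream problem.
client = bigquery.Client()
rows = list(client.query(
    "SELECT COUNT(*) AS n FROM `my-project.marts.orders`"  # hypothetical table
).result())
if rows[0].n == 0:
    raise RuntimeError("orders mart is empty after dbt build")
```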

Posted 3 months ago

Apply

7 - 12 years

10 - 20 Lacs

Chennai

Hybrid

Naukri logo

Key Skills: DAO - AI/ML, Automation, Python, BigQuery, Project Management, Agile, SDLC, Java, SQL

Role & responsibilities
8+ years of experience in Data Science, preferably in the Automobile Engineering domain.

Professional Skills: Business Analysis, Analytical Thinking, Problem Solving, Decision Making, Leadership, Managerial, Time Management, Domain Knowledge
- Work simplification - methods that maximize output while minimizing expenditure and cost
- Analytics with data - interprets data and turns it into information that can offer ways to improve a business
- Communication - good verbal communication and interpersonal skills are essential for collaborating with customers
- Rigour - the ability to analyse qualitative data quickly and rigorously
- Adaptability - being able to adapt to changing environments and work processes

Technical Skills: Python/NumPy, Seaborn, Pandas, Selenium, Beautiful Soup (basic), Spotfire, ML libraries, RPA, R, IronPython, HTML, CSS, JavaScript, SQL, HQL, Git/GitLab, Spark, Scala, Web services, Spotfire/Tableau, JIRA

Tool Skills: Project management tools, documentation tools, modeling [wireframe] tools

Database Skills: MS SQL Server, PostgreSQL, MS Access, MongoDB

Posted 3 months ago

Apply

6 - 10 years

30 - 35 Lacs

Bengaluru

Work from Office

Naukri logo

We are seeking an experienced PySpark Developer / Data Engineer to design, develop, and optimize big data processing pipelines using Apache Spark and Python (PySpark). The ideal candidate should have expertise in distributed computing, ETL workflows, data lake architectures, and cloud-based big data solutions.

Key Responsibilities:
- Develop and optimize ETL/ELT data pipelines using PySpark on distributed computing platforms (Hadoop, Databricks, EMR, HDInsight).
- Work with structured and unstructured data to perform data transformation, cleansing, and aggregation.
- Implement data lake and data warehouse solutions on AWS (S3, Glue, Redshift), Azure (ADLS, Synapse), or GCP (BigQuery, Dataflow).
- Optimize PySpark jobs for performance tuning, partitioning, and caching strategies (a small PySpark sketch follows this listing).
- Design and implement real-time and batch data processing solutions.
- Integrate data pipelines with Kafka, Delta Lake, Iceberg, or Hudi for streaming and incremental updates.
- Ensure data security, governance, and compliance with industry best practices.
- Work with data scientists and analysts to prepare and process large-scale datasets for machine learning models.
- Collaborate with DevOps teams to deploy, monitor, and scale PySpark jobs using CI/CD pipelines, Kubernetes, and containerization.
- Perform unit testing and validation to ensure data integrity and reliability.

Required Skills & Qualifications:
- 6+ years of experience in big data processing, ETL, and data engineering.
- Strong hands-on experience with PySpark (Apache Spark with Python).
- Expertise in SQL, the DataFrame API, and RDD transformations.
- Experience with big data platforms (Hadoop, Hive, HDFS, Spark SQL).
- Knowledge of cloud data processing services (AWS Glue, EMR, Databricks, Azure Synapse, GCP Dataflow).
- Proficiency in writing optimized queries, partitioning, and indexing for performance tuning.
- Experience with workflow orchestration tools like Airflow, Oozie, or Prefect.
- Familiarity with containerization and deployment using Docker, Kubernetes, and CI/CD pipelines.
- Strong understanding of data governance, security, and compliance (GDPR, HIPAA, CCPA, etc.).
- Excellent problem-solving, debugging, and performance optimization skills.
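To make the partitioning and caching strategies above concrete, here is a minimal PySpark sketch, assuming a cluster with GCS connectivity; the bucket paths and column names are hypothetical. It aggregates raw orders, caches the result because it is reused twice, and writes the output partitioned by date:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Hypothetical input path -- raw order events stored as Parquet.
orders = spark.read.parquet("gs://example-bucket/raw/orders/")

daily = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "customer_id")
    .agg(F.sum("amount").alias("total_amount"))
)

# Cache because the aggregate is used twice: once written, once counted.
daily.cache()

(daily.write
    .mode("overwrite")
    .partitionBy("order_date")  # enables partition pruning downstream
    .parquet("gs://example-bucket/curated/daily_orders/"))

print(f"Wrote {daily.count()} daily customer rows")
```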

Posted 3 months ago

Apply

7 - 12 years

0 Lacs

Bengaluru, Noida

Work from Office

Naukri logo

Role & responsibilities

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must have skills: Google Cloud Data Services
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for overseeing the entire application development process and ensuring its successful implementation. This role requires strong leadership skills and expertise in Google Cloud Data Services.

Roles & Responsibilities:
- Lead the team in designing, building, and configuring applications according to project requirements.
- Act as the primary point of contact for all application-related matters, providing guidance and support to team members.
- Collaborate with multiple teams to gather requirements, define project scope, and ensure timely delivery of applications.
- Manage the team's performance and make decisions regarding resource allocation and task assignments.
- Engage with stakeholders to understand their needs and provide solutions to problems for both the immediate team and across multiple teams.
- Contribute to key decisions related to application design, architecture, and functionality.
- Provide technical expertise and guidance to team members, ensuring the successful implementation of applications.
- Stay updated with the latest trends and advancements in application development and recommend innovative solutions to improve processes and efficiency.

Professional & Technical Skills:
- Must-have skills: Proficiency in Google Cloud Data Services.
- Strong understanding of application development principles and best practices.
- Experience in designing and implementing scalable and secure applications.
- Hands-on experience with Google Cloud Platform services, such as Cloud Storage, BigQuery, and Dataflow.
- Knowledge of programming languages such as Java, Python, or Go.
- Familiarity with DevOps practices and tools for continuous integration and deployment.
- Experience working with Agile methodologies and tools like Jira or Trello.
- Excellent problem-solving and analytical skills.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Google Cloud Data Services.
- 15 years of full-time education is required.

Posted 3 months ago

Apply

10 - 13 years

25 - 37 Lacs

Hyderabad

Work from Office

Naukri logo

We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (18000+ experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We are looking for great new colleagues. That's where you come in!

REQUIREMENTS:
- Total experience of 10+ years.
- Excellent knowledge of and experience in big data engineering with Spark, Scala, and Hadoop.
- Hands-on expertise with GCP tools such as BigQuery, Dataflow, Spanner, Dataproc, and Pub/Sub.
- Strong experience in building and managing ETL pipelines for batch and real-time data processing (a Pub/Sub sketch follows this listing).
- Proficiency in databases such as SQL, hands-on experience in Python, and other programming languages used in data engineering.
- Familiarity with data modeling, data warehousing, and building distributed systems.
- Expertise in Spanner for high-availability, scalable database solutions.
- Knowledge of data governance and security practices in cloud-based environments.
- Problem-solving mindset with the ability to tackle complex data engineering challenges.
- Strong communication and teamwork skills, with the ability to mentor and collaborate effectively.

RESPONSIBILITIES:
- Writing and reviewing great quality code.
- Understanding the client's business use cases and technical requirements and converting them into a technical design that elegantly meets the requirements.
- Mapping decisions with requirements and translating the same to developers.
- Identifying different solutions and narrowing down the best option that meets the client's requirements.
- Defining guidelines and benchmarks for NFR considerations during project implementation.
- Writing and reviewing design documents explaining the overall architecture, framework, and high-level design of the application for the developers.
- Reviewing architecture and design on various aspects like extensibility, scalability, security, design patterns, user experience, NFRs, etc., and ensuring that all relevant best practices are followed.
- Developing and designing the overall solution for defined functional and non-functional requirements, and defining technologies, patterns, and frameworks to materialize it.
- Understanding and relating technology integration scenarios and applying these learnings in projects.
- Resolving issues raised during code review through exhaustive, systematic analysis of the root cause, and justifying the decisions taken.
- Carrying out POCs to make sure that suggested designs/technologies meet the requirements.
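As one hedged illustration of the real-time ingestion work described above, the sketch below publishes and pulls order events with the google-cloud-pubsub client; the project, topic, and subscription names are hypothetical. In a production pipeline a streaming consumer such as Dataflow would read the subscription instead of a synchronous pull:

```python
from google.cloud import pubsub_v1

PROJECT = "example-project"        # hypothetical
TOPIC = "order-events"             # hypothetical
SUBSCRIPTION = "order-events-etl"  # hypothetical

# Publish one order event as a JSON payload.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT, TOPIC)
future = publisher.publish(topic_path, b'{"order_id": "o-123", "amount": 250}')
print("published message", future.result())

# Pull it back synchronously, just to show the round trip.
subscriber = pubsub_v1.SubscriberClient()
sub_path = subscriber.subscription_path(PROJECT, SUBSCRIPTION)
response = subscriber.pull(request={"subscription": sub_path, "max_messages": 10})
for msg in response.received_messages:
    print("received:", msg.message.data)

# Acknowledge only if something was actually received.
if response.received_messages:
    subscriber.acknowledge(
        request={
            "subscription": sub_path,
            "ack_ids": [m.ack_id for m in response.received_messages],
        }
    )
```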

Posted 3 months ago

Apply

5 - 10 years

7 - 12 Lacs

Pune

Work from Office

Naukri logo

About The Role:

Job Title: Lead Engineer
Location: Pune, India

Role Description
We are seeking a highly motivated and experienced engineer with a strong foundation in containers, Google Kubernetes Engine (GKE), Anthos, and GCP to join our Container Platform team. In this critical role, you will be responsible for defining, prioritizing, and delivering exceptional customer experiences for our container orchestration and hybrid/multi-cloud solutions. You will work closely with the CSO, SRE, and cloud products teams to ensure successful product launches and ongoing feature updates.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leaves
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 yrs. and above

Your key responsibilities
- Assist the Product Owner in building and maintaining technical artifacts, including but not limited to architecture reference documents, IAM policy documents, and reusable code.
- Collaborate with engineers to prioritize the product backlog, ensuring alignment with the product roadmap and customer needs.
- Collaborate with CSO teams to define IAM policies and guardrails following the principle of least privilege.
- Work with the FinOps team to optimize cloud budgets for container platforms.
- Collaborate with the SRE and engineering teams in troubleshooting, helping CIO teams to unblock special cases.
- Work closely with stakeholders to gather and incorporate feedback on product features and functionality.
- Monitor product performance and gather user feedback to identify areas for improvement.
- Analyze product usage data and key performance indicators (KPIs) to measure product success and identify opportunities for optimization.
- Stay abreast of the latest developments in cloud-native technologies and GKE best practices.

Your skills and experience
Experience: Overall 12+ years of experience working in technology/infrastructure, with 5+ years of experience as a platform engineer/platform lead on a container platform.
Technical Expertise:
- 5+ years of hands-on experience working on containerization technologies - Docker, Kubernetes, and GKE.
- 3+ years of hands-on experience working on GCP, Anthos, and hybrid-cloud environments.
- 3+ years of hands-on experience with CI/CD tools like GitHub Actions or equivalent, IaC tools like Terraform, and Linux & shell scripting.
- CKA certification along with the above experience would be preferred.
Soft Skills:
- Data-driven decision-making and a customer-centric approach.
- Passion for technology and a strong desire to learn and grow.
- Ability to lead and mentor cross-functional teams, fostering a collaborative and innovative work environment.
- Be a brand ambassador of the container platform within the bank.

Benefits:
- Competitive salary and benefits package
- Opportunity to work on cutting-edge technologies
- Collaborative and innovative work environment
- Opportunities for professional development and growth

How we'll support you
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

Posted 3 months ago

Apply

13 - 20 years

30 - 40 Lacs

Chennai, Hyderabad

Work from Office

Naukri logo

Title: GCP Data Engineer
Experience: 14+ Years
Location: Chennai/Hyderabad
Skillset: GCP, BigQuery, Dataflow, Terraform, Airflow, Cloud SQL, Cloud Storage, Cloud Composer, Pub/Sub

If interested, kindly drop your CV to sharmeelasri26@gmail.com

Posted 3 months ago

Apply

Exploring BigQuery Jobs in India

BigQuery, a powerful cloud-based data warehouse provided by Google Cloud, is in high demand in the job market in India. Companies are increasingly relying on BigQuery to analyze and manage large datasets, driving the need for skilled professionals in this area.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Delhi
  4. Hyderabad
  5. Pune

Average Salary Range

The average salary range for BigQuery professionals in India varies based on experience level. Entry-level positions may start at around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15-20 lakhs per annum.

Career Path

In the field of BigQuery, a typical career progression may include roles such as Junior Developer, Developer, Senior Developer, and Tech Lead, eventually moving into senior technical or managerial positions such as Data Architect or Data Engineering Manager.

Related Skills

Alongside BigQuery, professionals in this field often benefit from having skills in SQL, data modeling, data visualization tools like Tableau or Power BI, and cloud platforms like Google Cloud Platform or AWS.

Interview Questions

  • What is BigQuery and how does it differ from traditional databases? (basic)
  • How can you optimize query performance in BigQuery? (medium)
  • Explain the concepts of partitions and clustering in BigQuery (see the sketch after this list). (medium)
  • What are some best practices for designing schemas in BigQuery? (medium)
  • How does BigQuery handle data encryption at rest and in transit? (advanced)
  • Can you explain how BigQuery pricing works? (basic)
  • What are the limitations of BigQuery in terms of data size and query complexity? (medium)
  • How can you schedule and automate tasks in BigQuery? (medium)
  • Describe your experience with BigQuery ML and its applications. (advanced)
  • How does BigQuery handle nested and repeated fields in a schema? (basic)
  • Explain the concept of slots in BigQuery and how they impact query processing. (medium)
  • What are some common use cases for BigQuery in real-world scenarios? (basic)
  • How does BigQuery handle data ingestion from various sources? (medium)
  • Describe your experience with BigQuery scripting and stored procedures. (medium)
  • What are the benefits of using BigQuery over traditional on-premises data warehouses? (basic)
  • How do you troubleshoot and optimize slow-running queries in BigQuery? (medium)
  • Can you explain the concept of streaming inserts in BigQuery? (medium)
  • How does BigQuery handle data security and access control? (advanced)
  • Describe your experience with BigQuery Data Transfer Service. (medium)
  • What are the differences between BigQuery and other cloud-based data warehousing solutions? (basic)
  • How do you handle data versioning and backups in BigQuery? (medium)
  • Explain how you would design a data pipeline using BigQuery and other GCP services. (advanced)
  • What are some common challenges you have faced while working with BigQuery and how did you overcome them? (medium)
  • How do you monitor and optimize costs in BigQuery? (medium)
  • Can you walk us through a recent project where you used BigQuery to derive valuable insights from data? (advanced)
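For the partitions-and-clustering question above, here is a minimal sketch, assuming the google-cloud-bigquery Python client; the project, dataset, table, and schema are hypothetical. It creates a day-partitioned table clustered by a key column, then runs a query that benefits from partition pruning:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical dataset and schema.
table = bigquery.Table(
    "example-project.sales.orders",
    schema=[
        bigquery.SchemaField("order_id", "STRING"),
        bigquery.SchemaField("customer_id", "STRING"),
        bigquery.SchemaField("order_ts", "TIMESTAMP"),
        bigquery.SchemaField("amount", "NUMERIC"),
    ],
)

# Partition by day on the timestamp column; cluster within each
# partition by customer_id so filters on it scan fewer blocks.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="order_ts",
)
table.clustering_fields = ["customer_id"]

client.create_table(table)

# Queries that filter on the partitioning column are charged only
# for the partitions they touch (partition pruning).
query = """
    SELECT customer_id, SUM(amount) AS total
    FROM `example-project.sales.orders`
    WHERE order_ts >= TIMESTAMP("2024-01-01")
    GROUP BY customer_id
"""
for row in client.query(query).result():
    print(row.customer_id, row.total)
```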

Closing Remark

As you explore opportunities in the BigQuery job market in India, remember to continuously upskill and stay updated with the latest trends in data analytics and cloud computing. Prepare thoroughly for interviews by practicing common BigQuery concepts and showcase your hands-on experience with the platform. With dedication and perseverance, you can excel in this dynamic field and secure rewarding career opportunities. Good luck!

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies