
1359 BigQuery Jobs - Page 19


5.0 - 7.0 years

30 - 31 Lacs

Bengaluru

Work from Office

Serko is a cutting-edge tech platform in global business travel & expense technology. When you join Serko, you become part of a team of passionate travellers and technologists bringing people together, using the world's leading business travel marketplace. We are proud to be an equal opportunity employer. We embrace the richness of diversity, showing up authentically to create a positive impact. There's an exciting road ahead of us, where travel needs real, impactful change. With offices in New Zealand, Australia, North America, and China, we are thrilled to be expanding our global footprint, landing our new hub in Bengaluru, India. With a rapid growth plan in place for India, we're hiring people from different backgrounds, experiences, abilities, and perspectives to help us build a world-class team and product.

Requirements: We are seeking an experienced and highly skilled Senior Site Reliability Engineer (Sr. SRE) to help ensure the performance, reliability, and scalability of our infrastructure and applications. As a Sr. SRE, you will bridge the gap between development and operations by applying software engineering principles to system administration tasks. You will be responsible for maintaining high availability, automating infrastructure, optimizing performance, and responding to incidents across our production environment.

What will you be doing: Joining our engineering team at Serko, you'll play a crucial role in developing critical features for our enterprise corporate customers. In this role you will:
- Focus on implementing best practices for monitoring on GCP.
- Identify and work on GetThere Critical User Journeys (CUJs) and continuously look for toil reduction (automation).
- Improve health rules and alerts to make the system more stable.
- Perform activities related to the software release process from development to production.
- Determine tools needed to support system health, monitor system performance, and develop and maintain performance-monitoring and error-tracking operational tools in Google Cloud.
- Work closely with multiple internal groups, including Development, System Owners, and Incident Management.
- Be responsible for creation and/or review of application Change Records (CRs).
- Be responsible for release and deployment of application CRs across multiple platforms, from integration through production.

Required Skills:
- Linux/Unix experience.
- Scripting abilities in Bash or Python.
- Hands-on experience with Google Cloud Platform (2-3 years); strong in Google Compute Engine (GCE).
- Production application support.
- Willingness to take on-call rotation as per the team roster, including weekends.
- Experience building cloud resources using IaC tools (Terraform or Ansible).
- Ability to handle multiple projects simultaneously.
- Good analytical capabilities and troubleshooting skills.
- Good written and verbal communication skills.
- Change management experience.

Desired Skills:
- General experience with distributed architecture and/or high-availability systems.
- Thorough understanding of a container orchestration platform (Kubernetes); Google Kubernetes Engine (GKE) preferred.
- GCP certification is desirable but not mandatory.
- Experience working with globally distributed teams.
- Experience with monitoring tools like AppD and defining health-check dashboards.
- ITSM/ITIL experience preferred.
- Knowledge of Rally, JIRA, and/or ServiceNow.
- Familiarity with CMDB repositories.
- Basic SQL query skills (Oracle or MySQL).

Benefits: At Serko we aim to create a place where people can come and do their best work. This means you'll be operating in an environment with great tools and support to enable you to perform at the highest level of your abilities, producing high-quality work and delivering innovative, efficient results. Our people are fully engaged, continuously improving, and encouraged to make an impact. Some of the benefits of working at Serko are:
- Competitive base pay.
- Discretionary incentive plan based on individual and company performance.
- Focus on development: access to a learning & development platform and the opportunity to own your career pathway.
- Family medical coverage, meal coupons, transport allowance, and mobile & internet reimbursement.
- Flexible work policy.
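The "improve health rules and alerts" responsibility above can be sketched in miniature. The rule shape, metric name, and thresholds below are hypothetical examples, not Serko's actual monitoring configuration:

```python
from dataclasses import dataclass

@dataclass
class HealthRule:
    metric: str        # metric name, e.g. "error_rate" (illustrative)
    threshold: float   # alert when the rolling average exceeds this
    window: int        # number of recent samples to average over

def evaluate(rule: HealthRule, samples: list[float]) -> bool:
    """Return True (fire an alert) when the rolling average of the
    last `window` samples breaches the rule's threshold."""
    recent = samples[-rule.window:]
    if len(recent) < rule.window:
        return False  # not enough data yet; stay quiet rather than flap
    return sum(recent) / len(recent) > rule.threshold

# Hypothetical error-rate rule over the last 3 samples.
rule = HealthRule(metric="error_rate", threshold=0.05, window=3)
print(evaluate(rule, [0.01, 0.02, 0.01]))        # healthy -> False
print(evaluate(rule, [0.01, 0.08, 0.09, 0.10]))  # degraded -> True
```

A real implementation would pull samples from Cloud Monitoring rather than in-memory lists; the rolling-window logic is the part that carries over.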

Posted 3 weeks ago


4.0 - 8.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a highly motivated and experienced Data Engineer, you will be responsible for designing, developing, and implementing solutions that enable seamless data integration across multiple cloud platforms. Your expertise in data lake architecture, Iceberg tables, and cloud compute engines like Snowflake, BigQuery, and Athena will ensure efficient and reliable data access for various downstream applications.

Your key responsibilities will include:
- Collaborating with stakeholders to understand data needs and define schemas.
- Designing and implementing data pipelines for ingesting, transforming, and storing data.
- Developing data transformation logic to make Iceberg tables compatible with the data access requirements of Snowflake, BigQuery, and Athena.
- Designing and implementing solutions for seamless data transfer and synchronization across different cloud platforms.
- Ensuring data consistency and quality across the data lake and target cloud environments.
- Analyzing data patterns and identifying performance bottlenecks in data pipelines.
- Implementing data optimization techniques to improve query performance and reduce data storage costs.
- Monitoring data lake health to proactively address potential issues.
- Collaborating and communicating with architects, leads, and other stakeholders to ensure data quality meets specific requirements.

To be successful in this position, you should have at least 4 years of experience as a Data Engineer, strong hands-on experience with data lake architectures and technologies, proficiency in SQL and scripting languages, and experience with data governance and security best practices. Excellent problem-solving and analytical skills, strong communication and collaboration skills, and familiarity with cloud-native data tools and services are also required. Additionally, certifications in relevant cloud technologies will be beneficial.

In return, GlobalLogic offers exciting projects in industries like high-tech, communication, media, healthcare, retail, and telecom. You will have the opportunity to collaborate with a diverse team of highly talented individuals in an open, laid-back environment. Work-life balance is prioritized with flexible work schedules, opportunities to work from home, and paid time off and holidays. Professional development opportunities include communication skills training, stress management programs, professional certifications, and technical and soft-skill trainings. GlobalLogic provides competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), extended maternity leave, annual performance bonuses, and referral bonuses. Fun perks such as sports events, cultural activities, food at subsidized rates, corporate parties, dedicated GL Zones, rooftop decks, and discounts for popular stores and restaurants are also part of the vibrant office culture at GlobalLogic.

About GlobalLogic: GlobalLogic is a leader in digital engineering, helping brands design and build innovative products, platforms, and digital experiences for the modern world. By integrating experience design, complex engineering, and data expertise, GlobalLogic helps clients accelerate their transition into tomorrow's digital businesses. Operating under Hitachi, Ltd., GlobalLogic contributes to driving innovation through data and technology for a sustainable society with a higher quality of life.
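The data-consistency responsibility above can be illustrated with a minimal sketch: compare an order-independent fingerprint of the same table read from two engines. Everything here (the row shapes, and pulling full result sets into memory) is simplified for illustration; production checks would typically page through results or compare per-partition aggregates instead:

```python
import hashlib

def table_fingerprint(rows: list[tuple]) -> tuple[int, str]:
    """Row count plus an order-independent checksum of the rows.
    Matching fingerprints from two engines (e.g. the lake and BigQuery)
    are strong evidence that the two copies agree."""
    digest = hashlib.sha256()
    # Sort the encoded rows so row order does not affect the checksum.
    for encoded in sorted(repr(row).encode() for row in rows):
        digest.update(encoded)
    return len(rows), digest.hexdigest()

lake_rows = [(1, "a"), (2, "b")]
bq_rows = [(2, "b"), (1, "a")]   # same data, different row order
assert table_fingerprint(lake_rows) == table_fingerprint(bq_rows)
print("fingerprints match")
```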

Posted 3 weeks ago


3.0 - 7.0 years

0 Lacs

Tamil Nadu

On-site

As a data engineer, you will be expected to be proficient in Python, SQL, and either Java or Scala, especially for Spark/Beam pipelines. Experience with BigQuery, Dataflow, Apache Beam, Airflow, and Kafka will be beneficial for this role. You will be responsible for building scalable batch and streaming pipelines to support machine learning or campaign analytics. Familiarity with ad tech, bid logs, or event tracking pipelines is considered a plus.

Your primary role will involve constructing the foundational data infrastructure to handle the ingestion, processing, and serving of bid logs, user events, and attribution data from various sources. Key responsibilities include:
- Building scalable data pipelines for real-time and batch ingestion from DSPs, attribution tools, and order management systems.
- Designing clean and queryable data models to facilitate machine learning training and campaign optimization.
- Enabling data joins across 1st-, 2nd-, and 3rd-party data sets such as device, app, geo, and segment information.
- Optimizing pipelines for freshness, reliability, and cost efficiency.
- Supporting event-level logging of auction wins, impressions, conversions, and click paths.

The ideal candidate for this role should possess skills in Apache Beam, Airflow, Kafka, Scala, SQL, BigQuery, attribution, Java, Dataflow, Spark, machine learning, and Python. If you are enthusiastic about data engineering and have a background in building scalable data pipelines, this position could be a great fit for you.
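As a toy illustration of the attribution joins described above, here is a pure-Python, in-memory stand-in for what a Beam/Dataflow pipeline would do at scale. The field names (device_id, campaign, won, value) are invented for the example:

```python
from collections import defaultdict

def join_events(bids: list[dict], conversions: list[dict]) -> list[dict]:
    """Attribute conversions to winning bids by device_id. A toy,
    in-memory stand-in for the Beam/Dataflow joins the role describes;
    field names are illustrative only."""
    wins_by_device = defaultdict(list)
    for bid in bids:
        if bid["won"]:
            wins_by_device[bid["device_id"]].append(bid["campaign"])
    joined = []
    for conv in conversions:
        for campaign in wins_by_device.get(conv["device_id"], []):
            joined.append({"device_id": conv["device_id"],
                           "campaign": campaign,
                           "value": conv["value"]})
    return joined

bids = [{"device_id": "d1", "campaign": "c1", "won": True},
        {"device_id": "d2", "campaign": "c1", "won": False}]
convs = [{"device_id": "d1", "value": 9.99}]
print(join_events(bids, convs))
# -> [{'device_id': 'd1', 'campaign': 'c1', 'value': 9.99}]
```

At scale the same key-and-join shape appears as a CoGroupByKey in Beam; the toy version keeps the logic visible.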

Posted 3 weeks ago


5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

Role Overview: As a Senior Python Backend Developer, you'll be at the forefront of developing robust, scalable backend systems. You will work closely with our engineering team to design, build, and optimize backend solutions that meet the needs of our growing user base. This role requires deep expertise in Python, a strong understanding of backend architecture, and a passion for solving complex technical challenges.

Requirements:
- Experience: 5+ years of professional experience in backend development with a strong focus on Python.
- Technical Skills: Proficiency in Python and popular frameworks (e.g., Django, Flask, FastAPI). Experience with RESTful APIs and microservices architecture.
- Database Knowledge: Expertise in relational and non-relational databases (e.g., MS SQL Server).
- Tools: Working knowledge of Google Cloud/BigQuery.
- Problem-Solving: Excellent problem-solving skills with the ability to tackle complex technical challenges.
- Communication: Strong verbal and written communication skills. Ability to work effectively in a collaborative team environment.
- Education: Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent experience.

Preferred Qualifications:
- Experience with containerization and orchestration tools (e.g., Docker, Kubernetes).
- Knowledge of frontend technologies and frameworks.
- Experience with Agile development methodologies.

Interested candidates should send their resume, a cover letter, and a portfolio of relevant projects to hr@kraniumhealth.com. Please include "Senior Python Backend Developer Application [Your Name]" in the subject line.
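The RESTful layering the requirements mention can be sketched with a stdlib-only dispatch table. A real service would use one of the named frameworks (Django, Flask, FastAPI); the /patients resource and its payload are purely hypothetical:

```python
import json

# Toy method+path router; frameworks like Flask/FastAPI do this
# (plus parsing, validation, middleware) via decorators just like these.
ROUTES = {}

def route(method: str, path: str):
    def register(handler):
        ROUTES[(method, path)] = handler
        return handler
    return register

@route("GET", "/patients")
def list_patients():
    # Hypothetical resource; a real handler would query a database.
    return 200, json.dumps([{"id": 1, "name": "Ada"}])

def handle(method: str, path: str):
    handler = ROUTES.get((method, path))
    if handler is None:
        return 404, json.dumps({"error": "not found"})
    return handler()

status, body = handle("GET", "/patients")
print(status, body)  # -> 200 [{"id": 1, "name": "Ada"}]
```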

Posted 3 weeks ago


2.0 - 6.0 years

0 Lacs

Thane, Maharashtra

On-site

The ideal candidate for this role should have prior experience in any backend tech stack, but also possess a strong desire to explore and delve deeper into technologies to discover innovative solutions. In this position, you will have the opportunity to:
- Contribute to the creation of cutting-edge enterprise solutions that make a real impact.
- Take ownership of features from end to end and lead the way in driving meaningful innovations.
- Thrive in a fast-paced startup culture where your opinion and input are highly valued.
- Collaborate with a team of talented individuals who are all focused on solving complex challenges.
- Enhance your skills across a range of technologies including Mongo, SQL, Dynamo, Redis, Firebase, Kafka, SQS, Elasticsearch, BigQuery, Lambda, Go, Node.js, Python, React, Next.js, Angular, AWS, GCP, and AI agents.

If you are passionate about technology and eager to make a difference through your work, this is the perfect opportunity for you to grow and excel in a dynamic and innovative environment.

Posted 3 weeks ago


6.0 - 11.0 years

15 - 30 Lacs

Pune

Hybrid

Software Engineer - Specialist

What you'll do:
- Demonstrate a deep understanding of cloud-native, distributed microservice-based architectures.
- Deliver solutions for complex business problems through standard Software Development Life Cycle (SDLC) practices.
- Build strong relationships with both internal and external stakeholders, including product, business, and sales partners.
- Demonstrate excellent communication skills, with the ability to both simplify complex problems and dive deeper if needed.
- Lead strong technical teams that deliver complex software solutions that scale.
- Work across teams to integrate our systems with existing internal systems.
- Participate in a tight-knit, globally distributed engineering team.
- Provide deep troubleshooting skills, with the ability to lead and solve production and customer issues under pressure.
- Leverage strong experience in full-stack software development and public cloud platforms like GCP and AWS.
- Mentor, coach, and develop junior and senior software, quality, and reliability engineers.
- Ensure compliance with secure software development guidelines and best practices.
- Define, maintain, and report SLAs, SLOs, and SLIs meeting EFX engineering standards, in partnership with the product, engineering, and architecture teams.
- Collaborate with architects, SRE leads, and other technical leadership on strategic technical direction, guidelines, and best practices.
- Drive up-to-date technical documentation, including support and end-user documentation and runbooks.
- Be responsible for implementation architecture decision-making associated with product features/stories and refactoring-work decisions.
- Create and deliver technical presentations to internal and external technical and non-technical stakeholders, communicating with clarity and precision, and presenting complex information in a concise, audience-appropriate format.

What experience you need:
- Bachelor's degree or equivalent experience.
- 5+ years of software engineering experience.
- 5+ years of experience writing, debugging, and troubleshooting code in mainstream Java and Spring Boot.
- 5+ years of experience with cloud technology: GCP, AWS, or Azure.
- 5+ years of experience designing and developing cloud-native solutions.
- 5+ years of experience designing and developing microservices using Java, Spring Boot, GCP SDKs, and GKE/Kubernetes.
- 5+ years of experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others.
- 5+ years of experience deploying and releasing software using Jenkins CI/CD pipelines, understanding infrastructure-as-code concepts, Helm charts, and Terraform constructs.

What could set you apart:
- Self-starter who identifies and responds to priority shifts with minimal supervision.
- Strong communication and presentation skills.
- Strong leadership qualities.
- Demonstrated problem-solving skills and the ability to resolve conflicts.
- Experience creating and maintaining product and software roadmaps.
- Experience working in a highly regulated environment.
- Experience on GCP in big data and distributed systems: Dataflow, Apache Beam, Pub/Sub, Bigtable, BigQuery, GCS.
- Experience with backend technologies such as Java/J2EE, Spring Boot, Golang, gRPC, SOA, and microservices.
- Source code control management systems (e.g., SVN/Git, GitHub, GitLab), build tools like Maven and Gradle, and CI/CD tools like Jenkins or GitLab.
- Agile environments (e.g., Scrum, XP).
- Relational databases (e.g., SQL Server, MySQL).
- Atlassian tooling (e.g., JIRA, Confluence) and GitHub.
- Developing with a modern JDK (v1.7+).
- Automated testing: JUnit, Selenium, LoadRunner, SoapUI.
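The SLA/SLO/SLI responsibility above often reduces to error-budget arithmetic. The formula below is the standard request-based calculation; the sample numbers are made up:

```python
def error_budget_remaining(slo: float, total: int, failed: int) -> float:
    """Fraction of the error budget left for a request-based SLI.
    With a 99.9% SLO over `total` requests, the budget is the number
    of failures you may tolerate before the SLO is breached."""
    allowed_failures = (1 - slo) * total
    if allowed_failures == 0:
        return 0.0 if failed else 1.0  # a 100% SLO has no budget
    return max(0.0, 1 - failed / allowed_failures)

# 99.9% SLO over 1,000,000 requests -> budget of 1,000 failed requests;
# 250 failures so far leaves 75% of the budget.
print(round(error_budget_remaining(0.999, 1_000_000, 250), 4))  # 0.75
```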

Posted 3 weeks ago


7.0 - 10.0 years

20 - 35 Lacs

Chennai

Work from Office

Experience: 7.00+ years. Salary: Confidential (based on experience). Shift: (GMT+05:30) Asia/Kolkata (IST). Opportunity Type: Hybrid (Chennai). Placement Type: Full-time Permanent Position. (*Note: This is a requirement for one of Uplers' clients - Forbes Advisor.)

What do you need for this opportunity? Must-have skills required: Agile, Program Management, data infrastructure.

Forbes Advisor is looking for: Program Manager - Data

Job Description: We're hiring a Program Manager to orchestrate complex, cross-functional data initiatives, from revenue-pipeline automation to analytics product launches. You'll be the connective tissue between Data Engineering, Analytics, RevOps, Product, and external partners, ensuring programs land on time, on scope, and with measurable impact. If you excel at turning vision into executable roadmaps, mitigating risk before it bites, and communicating clearly across technical and business audiences, we'd love to meet you.

Key Responsibilities:
- Own program delivery for multi-team data products (e.g., revenue-data pipelines, attribution models, partner-facing reporting APIs).
- Build and maintain integrated roadmaps, aligning sprint plans, funding, and resource commitments.
- Drive agile ceremonies (backlog grooming, sprint planning, retrospectives) and track velocity, burn-down, and cycle-time metrics.
- Create transparent status reporting (risks, dependencies, OKRs) tailored for audiences from engineers up to C-suite stakeholders.
- Proactively remove blockers by coordinating with Platform, IT, Legal/Compliance, and external vendors.
- Champion process optimisation: intake, prioritisation, change management, and post-mortems.
- Partner with RevOps and Media teams to ensure program outputs translate into revenue growth and faster decision making.
- Facilitate launch readiness (QA checklists, enablement materials, go-live runbooks) so new data products land smoothly.
- Foster a culture of documentation, psychological safety, and continuous improvement within the data organisation.

Experience required:
- 7+ years of program or project-management experience in data, analytics, SaaS, or high-growth tech.
- Proven success delivering complex, multi-stakeholder initiatives on aggressive timelines.
- Expertise with agile frameworks (Scrum/Kanban) and modern collaboration tools (Jira, Asana, Notion/Confluence, Slack).
- Strong understanding of data and cloud concepts (pipelines, ETL/ELT, BigQuery, dbt, Airflow/Composer).
- Excellent written and verbal communication; able to translate between technical teams and business leaders.
- Risk-management mindset: identify, quantify, and drive mitigation before issues escalate.
- Experience coordinating across time zones and cultures in a remote-first environment.

Nice to have:
- Formal certification (PMP, PMI-ACP, CSM, SAFe, or equivalent).
- Familiarity with GCP services, Looker/Tableau, or marketing-data stacks (Google Ads, Meta, GA4).
- Exposure to revenue operations, performance marketing, or subscription/affiliate business models.
- Background in change-management or process-improvement methodologies (Lean, Six Sigma).

Perks:
- Monthly long weekends: every third Friday off.
- Fitness and commute reimbursement.
- Remote-first culture with flexible hours and a high-trust environment.
- Opportunity to shape a world-class data platform inside a trusted global brand.
- Collaboration with talented engineers, analysts, and product leaders who value innovation and impact.

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload an updated resume.
Step 3: Increase your chances of getting shortlisted and meeting the client for the interview!

Forbes Advisor is a high-growth digital media and technology company that empowers consumers to make confident decisions about money, health, careers, and everyday life. Our global data organisation builds modern, AI-augmented pipelines that turn information into revenue-driving insight.

Posted 3 weeks ago


4.0 - 7.0 years

6 - 15 Lacs

Pune

Hybrid

Key Skills: Shell Scripting, GCP, BigQuery, SQL, Control-M, GitHub, Python, Jenkins, Ansible.

Roles and Responsibilities:
- Build, enhance, and provide production support for GCP-based applications using technologies like Google BigQuery, Python, and workflows.
- Contribute to end-to-end feature delivery, including requirement analysis, design, development, environment setup, data load, testing, deployment, and code reviews.
- Establish a digital environment and automate processes to reduce variation and ensure consistent, high-quality code and data.
- Collaborate with operations, development, and test engineers to identify and resolve operational issues such as performance, alerts, design defects, and more.
- Automate the CI/CD pipeline within a DevOps product/service team and promote a culture of continuous improvement.
- Address code-scanning findings with effective remediation solutions.
- Ensure compliance with IT General Controls such as SDLC and DEPL, and provide evidence during audit reviews when required.
- Be flexible to work in shifts and provide on-call support as needed.

Experience Requirements:
- Knowledge of Google Cloud Platform, BigQuery, shell scripting, SQL scripts, and batch scheduling tools such as Control-M.
- 4-7 years of experience in data warehouse or regulatory reporting projects.
- Familiarity with risk and compliance processes.
- Understanding of production support procedures, including change, incident, problem, and service management.
- Basic knowledge of DevOps models and tools such as Jenkins, Ansible, and GitHub.
- Ability to work independently and resolve production issues, collaborating with global teams across time zones.
- Knowledge of Agile delivery processes and DevOps practices; understanding of CI/CD pipelines and DevOps tooling is an added advantage.
- Willingness to continually enhance skills across areas such as business knowledge, interface understanding, impact assessments, development, code scanning, deployments, and operational support.

Education: B.Tech/M.Tech (Dual), B.Tech, or M.Tech.
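Much of the production-support automation described here amounts to small, reliable wrappers around flaky steps. A retry-with-backoff helper is a representative sketch; the job callable, attempt count, and delays are illustrative, not a specific client's standard:

```python
import time

def run_with_retries(job, attempts: int = 3, base_delay: float = 0.05):
    """Re-run a flaky batch step with exponential backoff: the kind of
    small automation a production-support rotation accumulates around
    data loads and scheduled jobs."""
    for attempt in range(1, attempts + 1):
        try:
            return job()
        except Exception:
            if attempt == attempts:
                raise  # budget exhausted; surface the real error
            time.sleep(base_delay * 2 ** (attempt - 1))

# Simulated load step that fails twice, then succeeds.
calls = {"n": 0}
def flaky_load():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient load failure")
    return "loaded"

print(run_with_retries(flaky_load))  # succeeds on the third attempt: loaded
```

In practice the same wrapper is applied to BigQuery load jobs or Control-M-scheduled steps, with alerting when the final attempt fails.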

Posted 3 weeks ago


7.0 - 10.0 years

25 - 30 Lacs

Pune

Hybrid

Role Description: We are seeking a senior, skilled Tableau and Google BigQuery professional to join our team for a project involving the modernization of existing Tableau reports in Google BigQuery.

Skills & Qualifications:
- Bachelor's degree in computer science, information technology, or a related field, with 8+ years of experience in the IT/software field.
- Proven experience working with Tableau, including creating and maintaining dashboards and reports.
- Prior experience working with Cognos, including creating and maintaining dashboards and reports.
- Strong understanding of SQL and database concepts.
- Familiarity with ETL processes and data validation techniques.
- Hands-on experience with Google BigQuery and related components/services like Airflow, Composer, etc.
- Strong communication and collaboration abilities.
- Good to have: prior experience in data/report migration from on-premises to cloud.

Posted 3 weeks ago


2.0 - 5.0 years

3 - 6 Lacs

Mumbai

Work from Office

Skill required: Tech for Operations - Automation Anywhere
Designation: App Automation Eng Analyst
Qualifications: BE
Years of Experience: 3 to 5 years

About Accenture: Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com

What would you do? You will be part of the Technology for Operations (TFO) team that acts as a trusted advisor and partner to Accenture Operations. The team provides innovative and secure technologies to help clients build an intelligent operating model, driving exceptional results, and works closely with the sales, offering, and delivery teams to identify and build innovative solutions. Major sub-deals include AHO (Application Hosting Operations), ISMT (Infrastructure Management), and Intelligent Automation. You will automate processes end-to-end with cognitive software robots using the robotic process automation software Automation Anywhere Enterprise.

What are we looking for?
- Adaptable and flexible
- Ability to perform under pressure
- Problem-solving skills
- Ability to establish strong client relationships
- Agility for quick learning
This request is raised for Contract Conversion.

Roles and Responsibilities: In this role you are required to analyze and solve lower-complexity problems. Your day-to-day interaction is with peers within Accenture before updating supervisors. You may have limited exposure to clients and/or Accenture management. You will be given moderate-level instruction on daily work tasks and detailed instructions on new assignments. The decisions you make impact your own work and may impact the work of others. You will be an individual contributor as part of a team, with a focused scope of work. Please note that this role may require you to work in rotational shifts.

Qualification: BE

Posted 3 weeks ago


1.0 - 4.0 years

2 - 5 Lacs

Gurugram

Work from Office

Location: Bangalore/Hyderabad/Pune. Experience level: 8+ years.

About the Role: We are looking for a technical and hands-on Lead Data Engineer to help drive the modernization of our data transformation workflows. We currently rely on legacy SQL scripts orchestrated via Airflow, and we are transitioning to a modular, scalable, CI/CD-driven DBT-based data platform. The ideal candidate has deep experience with DBT and modern data stack design, and has previously led similar migrations, improving code quality, lineage visibility, performance, and engineering best practices.

Key Responsibilities:
- Lead the migration of legacy SQL-based ETL logic to DBT-based transformations.
- Design and implement a scalable, modular DBT architecture (models, macros, packages).
- Audit and refactor legacy SQL for clarity, efficiency, and modularity.
- Improve CI/CD pipelines for DBT: automated testing, deployment, and code-quality enforcement.
- Collaborate with data analysts, platform engineers, and business stakeholders to understand current gaps and define future data pipelines.
- Own Airflow orchestration redesign where needed (e.g., DBT Cloud/API hooks or airflow-dbt integration).
- Define and enforce coding standards, review processes, and documentation practices.
- Coach junior data engineers on DBT and SQL best practices.
- Provide lineage and impact-analysis improvements using DBT's built-in tools and metadata.

Must-Have Qualifications:
- 8+ years of experience in data engineering.
- Proven success in migrating legacy SQL to DBT, with visible results.
- Deep understanding of DBT best practices, including model layering, Jinja templating, testing, and packages.
- Proficiency in SQL performance tuning, modular SQL design, and query optimization.
- Experience with Airflow (Composer, MWAA), including DAG refactoring and task orchestration.
- Hands-on experience with modern data stacks (e.g., Snowflake, BigQuery, etc.).
- Familiarity with data testing and CI/CD for analytics workflows.
- Strong communication and leadership skills; comfortable working cross-functionally.

Nice-to-Have:
- Experience with DBT Cloud or DBT Core integrations with Airflow.
- Familiarity with data governance and lineage tools (e.g., dbt docs, Alation).
- Exposure to Python (for custom Airflow operators/macros or utilities).
- Previous experience mentoring teams through modern data stack transitions.
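The lineage and impact-analysis work mentioned above can be approximated by scanning model SQL for DBT-style ref() calls (DBT derives its real lineage from the compiled manifest; the model names below are invented):

```python
import re

def model_lineage(models: dict[str, str]) -> dict[str, set[str]]:
    """Extract each model's upstream dependencies by scanning its SQL
    for {{ ref('...') }} calls. A toy version of the lineage graph DBT
    builds internally, useful for sketching impact analysis."""
    ref_pattern = re.compile(r"\{\{\s*ref\(['\"](\w+)['\"]\)\s*\}\}")
    return {name: set(ref_pattern.findall(sql)) for name, sql in models.items()}

models = {
    "stg_orders": "select * from raw.orders",
    "fct_revenue": "select * from {{ ref('stg_orders') }} where amount > 0",
}
print(model_lineage(models))
# -> {'stg_orders': set(), 'fct_revenue': {'stg_orders'}}
```

Inverting this mapping gives the downstream impact set of any model slated for refactoring during the migration.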

Posted 3 weeks ago


3.0 - 6.0 years

5 - 9 Lacs

Chennai

Work from Office

Description:
- Analyzing and translating business needs into long-term solution data models.
- Evaluating existing data systems.
- Working with the development team to create conceptual data models and data flows.
- Developing best practices for data coding to ensure consistency within the system.
- Reviewing modifications of existing systems for cross-compatibility.
- Implementing data strategies and developing physical data models.
- Updating and optimizing local and metadata models.
- Evaluating implemented data systems for variances, discrepancies, and efficiency.
- Maintaining logical and physical data models along with accurate metadata.
- Analyzing data-related system integration challenges and proposing appropriate solutions with a strategic approach.

Should have strong knowledge of databases, cloud technologies, and Data Vault architecture.
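The Data Vault knowledge this role asks for centres on hubs, links, and satellites keyed by hashed business keys. Here is a minimal hub-record builder; the column names follow common Data Vault conventions but are invented for the example:

```python
import hashlib
from datetime import datetime, timezone

def hub_row(business_key: str, source: str) -> dict:
    """Build a Data Vault hub record: a deterministic hash key derived
    from the (normalised) business key, plus load metadata. Column
    names are illustrative, not a specific client standard."""
    hkey = hashlib.md5(business_key.upper().encode()).hexdigest()
    return {
        "hub_customer_hkey": hkey,            # surrogate hash key
        "customer_bk": business_key,          # original business key
        "load_ts": datetime.now(timezone.utc).isoformat(),
        "record_source": source,              # lineage back to the feed
    }

row = hub_row("CUST-001", "crm")
print(row["customer_bk"], row["record_source"])  # CUST-001 crm
```

Because the hash is derived only from the normalised business key, the same customer arriving from two source systems maps to one hub row.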

Posted 3 weeks ago


8.0 - 13.0 years

7 - 12 Lacs

Pune

Work from Office

Job Title: Business Functional Analyst; Corporate Title: Associate; Location: Pune, India

Role Description: Business Functional Analysis is responsible for business solution design in complex project environments (e.g., transformational programmes). Work includes:
- Identifying the full range of business requirements and translating requirements into specific functional specifications for solution development and implementation.
- Analysing business requirements and the associated impacts of the changes.
- Designing and assisting businesses in developing optimal target-state business processes.
- Creating and executing against roadmaps that focus on solution development and implementation.
- Answering questions of methodological approach with varying levels of complexity.
- Aligning with other key stakeholder groups (such as Project Management and Software Engineering) to support the link between the business divisions and the solution providers for all aspects of identifying, implementing, and maintaining solutions.

What we'll offer you:
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities:
- Write clear and well-structured business requirements/documents.
- Convert roadmap features into smaller user stories.
- Analyse process issues and bottlenecks and make improvements.
- Communicate and validate requirements with relevant stakeholders.
- Perform data discovery, analysis, and modelling.
- Assist with project management for selected projects.
- Understand and translate business needs into data models supporting long-term solutions.
- Understand existing SQL/Python code and convert it to business requirements.
- Write advanced SQL and Python scripts.

Your skills and experience:
- A minimum of 8 years of experience in business analysis or a related field.
- Exceptional analytical and conceptual thinking skills.
- Proficient in SQL.
- Proficient in Python for data engineering.
- Experience in automating ETL testing using Python and SQL.
- Exposure to GCP services for cloud storage, data lake, database, and data warehouse, such as BigQuery, GCS, Dataflow, Cloud Composer, gsutil, and shell scripting.
- Previous experience in Procurement and Real Estate would be a plus.
- Competency in JIRA, Confluence, draw.io, and Microsoft applications including Word, Excel, PowerPoint, and Outlook.
- Previous banking domain experience is a plus.
- Good problem-solving skills.
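The ETL-testing skill above (automating tests with Python and SQL) can be shown end-to-end with stdlib sqlite3 standing in for BigQuery. The table names, the transform, and the reconciliation rules are all invented for illustration:

```python
import sqlite3

# Minimal automated ETL test: load source rows, apply the transform
# under test, then assert SQL reconciliation queries against it.
conn = sqlite3.connect(":memory:")
conn.execute("create table src (id integer, amount real)")
conn.executemany("insert into src values (?, ?)", [(1, 10.0), (2, -3.0)])

# Transform under test: keep only positive amounts.
conn.execute("create table tgt as select * from src where amount > 0")

# Check 1: no qualifying source rows were dropped.
src_pos = conn.execute("select count(*) from src where amount > 0").fetchone()[0]
tgt_all = conn.execute("select count(*) from tgt").fetchone()[0]
assert src_pos == tgt_all

# Check 2: target totals reconcile with the filtered source.
src_sum = conn.execute("select sum(amount) from src where amount > 0").fetchone()[0]
tgt_sum = conn.execute("select sum(amount) from tgt").fetchone()[0]
assert src_sum == tgt_sum
print("ETL checks passed")
```

Against BigQuery the same pattern runs the reconciliation queries through the client library inside a scheduled test job instead of an in-memory database.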

Posted 3 weeks ago


7.0 - 12.0 years

30 - 35 Lacs

Pune

Work from Office

: Job TitleSenior Engineer LocationPune, India Corporate TitleAVP Role Description Investment Banking is technology centric businesses, with an increasing move to real-time processing, an increasing appetite from customers for integrated systems and access to supporting data. This means that technology is more important than ever for business. The IB CARE Platform aims to increase the productivity of both Google Cloud and on-prem application development by providing a frictionless build and deployment platform that offers service and data reusability. The platform provides the chassis and standard components of an application ensuring reliability, usability and safety and gives on-demand access to services needed to build, host and manage applications on the cloud/on-prem. In addition to technology services the platform aims to have compliance baked in, enforcing controls/security reducing application team involvement in SDLC and ORR controls enabling teams to focus more on application development and release to production faster. We are looking for a platform engineer to join a global team working across all aspects of the platform from GCP/on-prem infrastructure and application deployment through to the development of CARE based services. Deutsche Bank is one of the few banks with the scale and network to compete aggressively in this space, and the breadth of investment in this area is unmatched by our peers. Joining the team is a unique opportunity to help build a platform to support some of our most mission critical processing systems. What well offer you 100% reimbursement under childcare assistance benefit (gender neutral) Sponsorship for Industry relevant certifications and education Accident and Term life Insurance Your Key Responsibilities As a CARE platform engineer you will be working across the board on activities to build/support the platform and liaising with tenants. 
To be successful in this role, the below are key responsibility areas: Responsible for managing and monitoring cloud computing systems and providing technical support to ensure the systems' efficiency and security. Work with platform leads and platform engineers at a technical level. Liaise with tenants regarding onboarding and providing platform expertise. Contribute to the platform offering as part of Sprint deliverables. Support the production platform as part of the wider team. Your skills and experience: Understanding of GCP and services such as GKE, IAM, identity services and Cloud SQL. Kubernetes/service mesh configuration. Experience in infrastructure-as-code tooling such as Terraform. Proficient in SDLC/DevOps best practices. GitHub experience, including Git workflow. Exposure to modern deployment tooling, such as ArgoCD, desirable. Programming experience (such as Java/Python) desirable. A strong team player comfortable in a cross-cultural and diverse operating environment. Result-oriented, with the ability to deliver under tight timelines. Ability to successfully resolve conflicts in a globally matrix-driven organization. Excellent communication and collaboration skills. Must be comfortable with navigating ambiguity to extract meaningful risk insights. How we'll support you: About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 3 weeks ago

Apply

6.0 - 10.0 years

27 - 32 Lacs

Bengaluru

Work from Office

Job Title: Business Functional Analyst. Location: Bangalore, India. Corporate Title: AVP. Role Description: Within the Securities Services division, the Fund Services product family has new opportunities for suitable candidates to be part of their data and digital transformation program. The Business Functional Analyst will help us build and maintain cloud-based digital data analytics solutions, including microservices. You will be responsible for designing, developing, testing and deploying web services and APIs, working in close collaboration with business, product and operations in an Agile culture. Successful candidates are expected to be experts in their fields, hands-on, and passionate to create and deliver. You will bring disciplined and expert approaches to quality software development, focused on optimising the application of technologies and good governance practices to deliver positive impacts and value to business and clients. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support. Responsible for enhancing, modifying and/or maintaining applications in the Securities Services Technology environment. Business analysts document requirements, contribute to test planning and execution, and contribute to support activities for the Securities Services systems architecture. Employees work closely with business partners in defining requirements for system applications. Employees typically have in-depth knowledge of analysis tools and are clearly recognized as content experts by peers. Individual contributor role. Typically requires 6-10 years of applicable experience. Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery.
At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support. What we'll offer you: 100% reimbursement under childcare assistance benefit (gender neutral); sponsorship for industry-relevant certifications and education; accident and term life insurance. Your key responsibilities: Responsible for analyzing business requirements. Responsible for designing the user interface, paying attention to UX. Responsible for documenting end-to-end flows. Responsible for documenting specific requirements arising from the end-to-end flows. Participating fully in the development process through the entire software lifecycle. Participating fully in the agile software development process. Participate in regular meetings with stakeholders; prepare and document meetings; track progress. Use BDD techniques, collaborating closely with users, developers, and other testers. Make sure we are building the right thing. Ensure that the software the team builds is reliable and easy to support in production. Be prepared to take your turn on call, providing 3rd-line support when it's needed. Help your team to build, test and release software within short lead times and with a minimum of waste. Work to develop and maintain a highly automated Continuous Delivery pipeline.
Help create a culture of learning and continuous improvement within your team and beyond. Self-motivated and multi-tasking, with strong analytical and problem-solving skills; solution-oriented with proven results. Your skills and experience: Experience in analysis or operations of fund accounting for a fund manager or fund services provider. Experience in doing analysis for products like SimCorp Dimension and/or Multifonds or a similar application. Experience in analysis for building interfaces between applications, including vendor apps. Knowledge of data for fund accounting and performance attribution is good to have. Knowledge of the latest app development paradigms (microservices, containerization, APIs) is good to have. Knowledge of SQL and relational databases is good to have. Knowledge and experience of GCP BigQuery is good to have. Experience working in an agile team, practicing Scrum, Kanban or XP. The ideal candidate will also have: Behavior Driven Development, particularly experience of how it can be used to define requirements in a collaborative manner to ensure the team builds the right thing and creates a system of living documentation. It will be an added advantage if the candidate has exposure to architecture and design approaches that support rapid, incremental, and iterative delivery, such as Domain-Driven Design, CQRS, Event Sourcing and microservices. Ability to quickly learn new tools/systems and understand technology and functional design. Business Competencies: Communication - Experienced; Financial Management - Basic; Industry Knowledge - Experienced; Innovation - Basic; Managing Complexity - Basic; Product Knowledge (internal & external) - Experienced; Risk Management - Basic. How we'll support you

Posted 3 weeks ago

Apply

7.0 - 12.0 years

40 - 45 Lacs

Pune

Work from Office

Job Title: Data Platform Engineer - Tech Lead. Location: Pune, India. Role Description: DB Technology is a global team of tech specialists, spread across multiple trading hubs and tech centers. We have a strong focus on promoting technical excellence; our engineers work at the forefront of financial services innovation using cutting-edge technologies. The DB Pune location plays a prominent role in our global network of tech centers and is well recognized for its engineering culture and strong drive to innovate. We are committed to building a diverse workforce and to creating excellent opportunities for talented engineers and technologists. Our tech teams and business units use agile ways of working to create the best solutions for the financial markets. CB Data Services and Data Platform: We are seeking an experienced Software Engineer with strong leadership skills to join our dynamic tech team. In this role, you will lead a group of engineers working on cutting-edge technologies in Hadoop, Big Data, GCP, Terraform, BigQuery, Dataproc and data management. You will be responsible for overseeing the development of robust data pipelines, ensuring data quality, and implementing efficient data management solutions. Your leadership will be critical in driving innovation, ensuring high standards in data infrastructure, and mentoring team members. Your responsibilities will include working closely with data engineers, analysts, cross-functional teams, and other stakeholders to ensure that our data platform meets the needs of our organization and supports our data-driven initiatives. Join us in building and scaling our tech solutions, including a hybrid data platform, to unlock new insights and drive business growth. If you are passionate about data engineering, we want to hear from you! Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance and securities finance.
We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support. What we'll offer you: 100% reimbursement under childcare assistance benefit (gender neutral); sponsorship for industry-relevant certifications and education; accident and term life insurance. Your key responsibilities: Technical Leadership: Lead a cross-functional team of engineers in the design, development, and implementation of on-prem and cloud-based data solutions. Provide hands-on technical guidance and mentorship to team members, fostering a culture of continuous learning and improvement. Collaborate with product management and stakeholders to define technical requirements and establish delivery priorities. Architectural and Design Capabilities: Architect and implement scalable, efficient, and reliable data management solutions to support complex data workflows and analytics. Evaluate and recommend tools, technologies, and best practices to enhance the data platform. Drive the adoption of microservices, containerization, and serverless architectures within the team. Quality Assurance: Establish and enforce best practices in coding, testing, and deployment to maintain high-quality code standards.
Oversee code reviews and provide constructive feedback to promote code quality and team growth. Your skills and experience: Technical Skills: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 7+ years of experience in software engineering, with a focus on Big Data and GCP technologies such as Hadoop, PySpark, Terraform, BigQuery, Dataproc and data management. Proven experience in leading software engineering teams, with a focus on mentorship, guidance, and team growth. Strong expertise in designing and implementing data pipelines, including ETL processes and real-time data processing. Hands-on experience with Hadoop ecosystem tools such as HDFS, MapReduce, Hive, Pig, and Spark. Hands-on experience with cloud platforms, particularly Google Cloud Platform (GCP) and its data management services (e.g., Terraform, BigQuery, Cloud Dataflow, Cloud Dataproc, Cloud Storage). Solid understanding of data quality management and best practices for ensuring data integrity. Familiarity with containerization and orchestration tools such as Docker and Kubernetes is a plus. Excellent problem-solving skills and the ability to troubleshoot complex systems. Strong communication skills and the ability to collaborate with both technical and non-technical stakeholders. Leadership Abilities: Proven experience in leading technical teams, with a track record of delivering complex projects on time and within scope. Ability to inspire and motivate team members, promoting a collaborative and innovative work environment. Strong problem-solving skills and the ability to make data-driven decisions under pressure. Excellent communication and collaboration skills. Proactive mindset, attention to detail, and a constant desire to improve and innovate. How we'll support you: About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day.
This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

32 - 37 Lacs

Bengaluru

Work from Office

Job Title: AI Engineer, AVP. Location: Bangalore, India. Role Description: We are seeking a talented and experienced AI Engineer to join our team. The ideal candidate will be hands-on and drive the design, development, and implementation of AI-based solutions for CB Tech. This role involves working with large datasets, conducting experiments, and staying updated with the latest advancements in AI and Machine Learning. This person is expected to innovate and lead the efforts in modernizing the engineering landscape by identifying AI use cases and providing local support, e.g., test automation using AI. If you are actively coding, have a passion for AI and want to be part of developing innovative products, then apply today. What we'll offer you: 100% reimbursement under childcare assistance benefit (gender neutral); sponsorship for industry-relevant certifications and education; accident and term life insurance. Your key responsibilities: What You'll Do: Design, develop and implement AI and Gen-AI based agentic software systems on the cloud. Write MCP servers and clients. Collaborate with other development teams and SMEs to integrate shared services into products. Learn Deutsche Bank's AI Governance framework and operate within safe AI principles. Leverage architecture decision trees to pick strategic AI patterns to solve business problems. Integrate Gen-AI APIs with cloud-native presentation (GKE, Cloud Run) and persistence layers (PostgreSQL, BigQuery). Run systems at scale while continuing to innovate and evolve. Work with data engineers and scientists to ensure effective data collection and preparation for training AI models. Continuously monitor the performance of AI solutions and implement improvements. Lead training sessions and create comprehensive documentation to empower end users. Function as an active member of an agile team.
Your skills and experience: AI Expertise: Proficiency in frameworks (LangChain, Streamlit or similar), libraries (scikit-learn or similar) and cloud platforms (GCP preferable) for AI/ML. Prompt Engineering & RAG: Skills in crafting effective prompts and enhancing AI outputs with external data integration. NLP Knowledge: Strong understanding of natural language processing and conversational AI technologies. Deployment & Operations: Experience in model deployment, monitoring, optimization (MLOps), and problem-solving. Proficiency with cloud-native orchestration systems (Docker/Kubernetes). Proficiency in Python or Java, and SQL. Knowledge of RESTful design. Experience working with different types of enterprise and real-world data sets: structured, semi-structured and unstructured. Experience putting ML/AI into production, and the ability to talk through best practices and pitfalls. Relationship and consensus-building skills. Skills That Will Help You Excel: Stakeholder Communication: Ability to explain AI concepts to non-technical audiences and collaborate cross-functionally. Adaptability & Innovation: Flexibility in learning new tools and developing innovative solutions. Experience in GCP Vertex AI. Experience with cloud-native databases/warehouses (PostgreSQL and BigQuery). Experience in data visualization and observability, with a focus on real-time serving and monitoring of time-series data with alerts. Thought Leadership & Advocacy: Develop awareness of industry developments and best practices; provide thought leadership in emerging technologies as they relate to AI topics. How we'll support you: About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm
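The retrieval step behind the RAG skills listed above can be sketched in a few lines of Python. This is a toy illustration only, not the bank's actual stack: the bag-of-words "embedding", the vocabulary and the sample documents are all invented for the example.

```python
import math

def embed(text):
    # Toy "embedding": bag-of-words counts over a tiny fixed vocabulary.
    # Real systems would use a learned embedding model instead.
    vocab = ["payment", "trade", "fraud", "report", "cloud"]
    words = text.lower().split()
    return [words.count(w) for w in vocab]

def cosine(a, b):
    # Cosine similarity between two vectors; 0.0 if either is all zeros.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, documents, k=1):
    # Rank documents by similarity to the query embedding and return the
    # top-k, which a RAG pipeline would then inject into the LLM prompt.
    q = embed(query)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

docs = [
    "trade settlement report for the cloud platform",
    "fraud alert on a payment",
    "cloud cost report",
]
print(retrieve("suspicious payment fraud", docs))
```

Production retrieval would swap the toy embedding for a real model and a vector store, but the rank-then-inject shape stays the same.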

Posted 3 weeks ago

Apply

9.0 - 14.0 years

30 - 35 Lacs

Pune

Work from Office

Job Title: Production Specialist, AVP. Location: Pune, India. Role Description: Our organization within Deutsche Bank is AFC Production Services. We are responsible for providing technical L2 application support for business applications. The AFC (Anti-Financial Crime) line of business has a current portfolio of 25+ applications. The organization is in the process of transforming itself using Google Cloud and many new technology offerings. As an Assistant Vice President, your role will include hands-on production support and being actively involved in technical issue resolution across multiple applications. You will also work as an application lead and will be responsible for technical and operational processes for all applications you support. Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.
What we'll offer you: 100% reimbursement under childcare assistance benefit (gender neutral); sponsorship for industry-relevant certifications and education; accident and term life insurance. Your key responsibilities: Provide technical support by handling and consulting on BAU, incidents/emails/alerts for the respective applications. Perform post-mortems and root cause analysis using ITIL standards of Incident Management, Service Request Fulfillment, Change Management, Knowledge Management, and Problem Management. Manage the regional L2 team and vendor teams supporting the application; ensure the team is up to speed and picks up the support duties. Build up technical subject matter expertise on the applications being supported, including business flows, application architecture, and hardware configuration. Define and track KPIs, SLAs and operational metrics to measure and improve application stability and performance. Conduct real-time monitoring to ensure application SLAs are achieved and maximum application availability (uptime), using an array of monitoring tools. Build and maintain effective and productive relationships with stakeholders in business, development, infrastructure, and third-party systems/data providers and vendors. Assist in the process to approve application code releases, as well as tasks assigned to support. Keep key stakeholders informed using communication templates. Approach support with a proactive attitude, a desire to seek root cause, in-depth analysis, and strive to reduce inefficiencies and manual effort. Mentor and guide junior team members, fostering technical upskilling and knowledge sharing. Provide strategic input into disaster recovery planning, failover strategies and business continuity procedures. Collaborate and deliver on initiatives that drive stability in the environment.
Perform reviews of all open production items with the development team and push for updates and resolutions to outstanding tasks and recurring issues. Drive service resilience by implementing SRE (Site Reliability Engineering) principles, ensuring proactive monitoring, automation and operational efficiency. Ensure regulatory and compliance adherence, managing audits, access reviews, and security controls in line with organizational policies. The candidate will have to work in shifts as part of a rota covering APAC and EMEA hours, between 07:00 IST and 9:00 PM IST (2 shifts). In the event of major outages or issues, we may ask for flexibility to help provide appropriate cover. Weekend on-call coverage needs to be provided on a rotational/need basis. Your skills and experience: 9-15 years of experience in providing hands-on IT application support. Experience in managing vendor teams providing 24x7 support. Team lead experience and experience in an investment bank or financial institution preferred. Bachelor's degree from an accredited college or university with a concentration in Computer Science or an IT-related discipline (or equivalent work experience/diploma/certification). ITIL v3 Foundation certification or higher preferred. Knowledgeable in cloud products like Google Cloud Platform (GCP) and hybrid applications. Strong understanding of ITIL/SRE/DevOps best practices for supporting a production environment. Understanding of KPIs, SLOs, SLAs and SLIs. Monitoring tools: knowledge of Elastic Search, Control-M, Grafana, Geneos, OpenShift, Prometheus, Google Cloud Monitoring, Airflow, Splunk. Working knowledge of the creation of dashboards and reports for senior management. Red Hat Enterprise Linux (RHEL): professional skill in searching logs, process commands, starting/stopping processes, and using OS commands to aid in tasks needed to resolve or investigate issues. Shell scripting knowledge a plus. Understanding of database concepts and exposure to working with Oracle, MS SQL, BigQuery and similar
databases. Ability to work across countries, regions, and time zones with a broad range of cultures and technical capability. Skills That Will Help You Excel: Strong written and oral communication skills, including the ability to communicate technical information to a non-technical audience, and good analytical and problem-solving skills. Proven experience in leading L2 support teams, including managing vendor teams and offshore resources. Able to train, coach, and mentor, and know where each technique is best applied. Experience with GCP or another public cloud provider to build applications. Experience in an investment bank, financial institution or large corporation using enterprise hardware and software. Knowledge of Actimize, Mantas, and case management software is good to have. Working knowledge of Big Data (Hadoop/Secure Data Lake) is a plus. Prior experience in automation projects is great to have. Exposure to Python, shell, Ansible or another scripting language for automation and process improvement. Strong stakeholder management skills, ensuring seamless coordination between business, development, and infrastructure teams. Ability to manage high-pressure issues, coordinating across teams to drive swift resolution. Strong negotiation skills with interface teams to drive process improvements and efficiency gains. How we'll support you: About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
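For the SLO/SLA/SLI understanding this posting asks for, the core arithmetic is the error budget: the fraction of a period a service may be unavailable while still meeting its SLO. A minimal sketch (the 99.9% target and 30-day window are illustrative assumptions, not figures from the role):

```python
def error_budget_minutes(slo_target, period_days=30):
    # An availability SLO of, say, 99.9% leaves (1 - 0.999) of the
    # period as error budget, expressed here in minutes.
    total_minutes = period_days * 24 * 60
    return total_minutes * (1 - slo_target)

def budget_consumed(downtime_minutes, slo_target, period_days=30):
    # Fraction of the period's error budget burned by observed downtime.
    return downtime_minutes / error_budget_minutes(slo_target, period_days)

# 99.9% over 30 days allows roughly 43.2 minutes of downtime,
# so a single 20-minute outage burns almost half the monthly budget.
print(round(error_budget_minutes(0.999), 1))
print(round(budget_consumed(20, 0.999), 2))
```

Teams typically alert on the budget burn rate rather than raw uptime, which is what makes the SLI/SLO distinction operationally useful.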

Posted 3 weeks ago

Apply

15.0 - 20.0 years

32 - 40 Lacs

Pune

Work from Office

Job Title: Senior Engineer, VP. Location: Pune, India. Role Description: The Engineer is responsible for managing or performing work across multiple areas of the bank's overall IT platform/infrastructure, including analysis, development, and administration. It may also involve taking functional oversight of engineering delivery for specific departments. Work includes: Planning and developing entire engineering solutions to accomplish business goals. Building reliability and resiliency into solutions with appropriate testing and reviewing throughout the delivery lifecycle. Ensuring maintainability and reusability of engineering solutions. Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow. Reviewing engineering plans and quality to drive re-use and improve engineering capability. Participating in industry forums to drive adoption of innovative technologies, tools and solutions in the Bank. Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance and securities finance.
We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support. What we'll offer you: 100% reimbursement under childcare assistance benefit (gender neutral); sponsorship for industry-relevant certifications and education; accident and term life insurance. Your key responsibilities: The candidate is expected to: Be a hands-on engineering lead involved in analysis, design, design/code reviews, coding and release activities. Champion engineering best practices and guide/mentor the team to achieve high performance. Work closely with business stakeholders, the Tribe Lead, the Product Owner and the Lead Architect to successfully deliver the business outcomes. Acquire functional knowledge of the business capability being digitized/re-engineered. Demonstrate ownership, inspire others, show innovative thinking and a growth mindset, and collaborate for success. Your skills and experience: Minimum 15 years of IT industry experience in full stack development. Expert in Java, Spring Boot, NodeJS, ReactJS. Strong experience in Big Data processing: Apache Spark, Hadoop, BigQuery, Dataproc, Dataflow, etc. Strong experience in Kubernetes and the OpenShift container platform. Experience in data streaming, i.e.
Kafka, Pub/Sub, etc. Experience of working on public cloud: GCP preferred, AWS or Azure. Knowledge of various distributed/multi-tiered architecture styles: microservices, data mesh, integration patterns, etc. Experience with modern software product delivery practices, processes and tooling, and BizDevOps skills such as CI/CD pipelines using Jenkins, GitHub Actions, etc. Experience in leading teams and mentoring developers. Key Skills: Java, Spring Boot, NodeJS, SQL/PLSQL, ReactJS. Advantageous: Prior experience in the Banking/Finance domain; having worked on hybrid cloud solutions, preferably using GCP; having worked on product development. How we'll support you: About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 3 weeks ago

Apply

5.0 - 7.0 years

13 - 17 Lacs

Bengaluru

Work from Office

A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat. In your role, you will be responsible for: Being skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and to interact directly with customers. Required education: Bachelor's degree. Preferred education: Master's degree. Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer. An ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
End-to-end functional knowledge of the data pipeline/transformation implementation that the candidate has done; should understand the purpose/KPIs for which the data transformation was done. Preferred technical and professional experience: Experience with AEM core technologies: OSGi services, Apache Sling, Granite framework, Java Content Repository API, Java 8+, localization. Familiarity with build tools such as Jenkins and Maven; knowledge of version control tools, especially Git; knowledge of patterns and good practices to design and develop quality, clean code; knowledge of HTML, CSS, JavaScript and jQuery; familiarity with task management, bug tracking, and collaboration tools like JIRA and Confluence.

Posted 3 weeks ago

Apply

3.0 - 5.0 years

11 - 16 Lacs

Bengaluru

Work from Office

Educational qualification: Bachelor of Engineering
Service Line: Data & Analytics Unit

Responsibilities:
Knowledge of design principles and fundamentals of architecture
Understanding of performance engineering
Knowledge of quality processes and estimation techniques
Basic understanding of the project domain
Ability to translate functional/non-functional requirements to system requirements
Ability to design and code complex programs
Ability to write test cases and scenarios based on the specifications
Good understanding of SDLC and agile methodologies
Awareness of the latest technologies and trends
Logical thinking and problem-solving skills, along with an ability to collaborate

Technical and Professional: Technology-Cloud Platform-GCP Database-Google BigQuery

Preferred Skills:
Technology-Cloud Platform-Google Big Data-GCP
Technology-Cloud Platform-GCP Core Services-GCP
Technology-Cloud Platform-GCP Data Analytics
Technology-Cloud Platform-GCP Database

Posted 3 weeks ago

Apply

0.0 - 3.0 years

6 - 8 Lacs

Noida

Work from Office

Engineer with 3+ years of experience working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner, streaming, etc.)
1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.)
1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.)
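This listing names SQL window functions among the core skills. A dependency-free Python sketch of what `ROW_NUMBER() OVER (PARTITION BY dept ORDER BY salary DESC)` computes, using invented sample rows (the table and names are illustrative only):

```python
from itertools import groupby
from operator import itemgetter

# Illustrative rows, as if read from an employees table.
employees = [
    {"dept": "eng",   "name": "asha",   "salary": 90},
    {"dept": "eng",   "name": "bala",   "salary": 120},
    {"dept": "sales", "name": "chitra", "salary": 70},
    {"dept": "sales", "name": "dev",    "salary": 85},
]

def row_number_by_dept(rows):
    """Emulate ROW_NUMBER() OVER (PARTITION BY dept ORDER BY salary DESC)."""
    ranked = []
    # Sort by partition key, then by salary descending within each partition.
    ordered = sorted(rows, key=lambda r: (r["dept"], -r["salary"]))
    for _, group in groupby(ordered, key=itemgetter("dept")):
        for n, row in enumerate(group, start=1):
            ranked.append({**row, "row_number": n})
    return ranked

# Top earner per department = rows where the window rank is 1.
top_earners = [r["name"] for r in row_number_by_dept(employees) if r["row_number"] == 1]
print(top_earners)  # ['bala', 'dev']
```

Interviews for roles like this often ask candidates to express exactly this "top-N per group" pattern both in SQL and in Pandas/PySpark.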

Posted 3 weeks ago

Apply

4.0 - 9.0 years

9 - 19 Lacs

Chennai

Hybrid

Position Description:

Role: As a Software Engineer on our team, you will be instrumental in developing and maintaining key features for our applications. You'll be involved in all stages of the software development lifecycle, from design and implementation to testing and deployment.

Responsibilities:
Develop and Maintain Application Features: Implement new features and maintain existing functionality for both the front end and back end of our applications.
Front-End Development: Build user interfaces using React or Angular, ensuring a seamless and engaging user experience.
Back-End Development: Design, develop, and maintain robust and scalable back-end services using [Backend Tech - e.g., Node.js, Python/Django, Java/Spring].
Cloud Deployment: Deploy and manage applications on Google Cloud Platform (GCP), leveraging services like [GCP Tech - e.g., App Engine, Cloud Functions, Kubernetes].
Performance Optimization: Identify and address performance bottlenecks to ensure optimal speed and scalability of our applications.
Code Reviews: Participate in code reviews to maintain code quality and share knowledge with team members.
Unit Testing: Write and maintain unit tests to ensure the reliability and correctness of our code.
SDLC Participation: Actively participate in all phases of the software development lifecycle, including requirements gathering, design, implementation, testing, and deployment.
Collaboration: Work closely with product managers, designers, and other engineers to deliver high-quality software that meets user needs.

Skills Required: Python, GCP, Angular, DevOps
Skills Preferred: API, Tekton, Terraform
Experience Required: 4+ years of professional software development experience
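Unit testing is listed as a core responsibility in the posting above. A minimal sketch of the style in Python, using a hypothetical fare-formatting helper (the function, its rules, and its name are invented for illustration):

```python
def format_fare(amount_paise: int) -> str:
    """Render an integer paise amount as a rupee string, e.g. 123456 -> '₹1234.56'."""
    if amount_paise < 0:
        raise ValueError("fare cannot be negative")
    rupees, paise = divmod(amount_paise, 100)
    return f"₹{rupees}.{paise:02d}"

# Assertion-style unit tests: one normal path, one boundary, one error case.
assert format_fare(123456) == "₹1234.56"
assert format_fare(5) == "₹0.05"
try:
    format_fare(-1)
except ValueError:
    pass
else:
    raise AssertionError("negative fares must be rejected")
```

In practice these assertions would live in a test framework such as pytest, but the habit the posting asks for is the same: cover the normal path, boundaries, and failure modes.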

Posted 3 weeks ago

Apply

0.0 - 1.0 years

8 - 10 Lacs

Hyderabad

Work from Office

Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow
Programming languages: Java; scripting languages such as Python, Shell script, SQL
5+ years of experience in IT application delivery with proven experience in agile development methodologies
1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, and data processing with Dataflow)
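Dataflow jobs are typically written as a chain of parse/filter/project transforms. A dependency-free toy in plain Python showing that same shape (Apache Beam itself is not used here; the records and fields are invented for illustration):

```python
def pipeline(records):
    """Mirror the transform chain of a typical Dataflow/Beam job:
    parse CSV lines -> drop malformed rows -> project typed tuples."""
    parsed = (line.split(",") for line in records)
    valid = (cols for cols in parsed if len(cols) == 2 and cols[1].isdigit())
    return [(name, int(count)) for name, count in valid]

lines = ["alice,3", "bogus-row", "bob,7"]
print(pipeline(lines))  # [('alice', 3), ('bob', 7)]
```

A real Dataflow job would express the same steps as Beam `PTransform`s running over unbounded or batch collections, but the mental model of composable stages is identical.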

Posted 3 weeks ago

Apply

4.0 - 8.0 years

22 - 25 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Engineer with 3+ years of experience working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner, streaming, etc.)
1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.)
1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.)
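The SQL side of this listing calls out CTEs and aggregate functions. A self-contained sketch using Python's built-in sqlite3 module (the table and data are invented for illustration; BigQuery's Standard SQL supports the same `WITH` syntax):

```python
import sqlite3

# In-memory database with a toy sales table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("north", 100), ("north", 50), ("south", 30)])

# The CTE pre-aggregates per region; the outer query filters on the total.
query = """
WITH region_totals AS (
    SELECT region, SUM(amount) AS total
    FROM sales
    GROUP BY region
)
SELECT region, total FROM region_totals WHERE total > 40 ORDER BY region
"""
rows = conn.execute(query).fetchall()
print(rows)  # [('north', 150)]
```

The same pattern in BigQuery would simply swap the connection for the BigQuery client; the `WITH ... GROUP BY` structure is unchanged.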

Posted 3 weeks ago

Apply