4.0 - 9.0 years
10 - 20 Lacs
Bengaluru
Work from Office
Job Title: GCP Data Engineer
Work Mode: Onsite 4 days a week
Base Location: Bangalore
Experience Required: 4+ Years

Job Summary:
We are looking for a GCP Data Engineer with strong expertise in BigQuery and hands-on experience building scalable data pipelines and analytical solutions on Google Cloud Platform. The ideal candidate will have a solid background in data modeling, ETL/ELT processes, and performance optimization, with exposure to other GCP data services such as Dataflow, Pub/Sub, and Dataproc.

Key Responsibilities:
- Design, develop, and optimize data pipelines and ETL processes using BigQuery as the primary data warehouse.
- Implement data models, partitioning, and clustering strategies for high-performance analytics.
- Work with Dataflow, Pub/Sub, and Composer (Airflow) for real-time and batch processing pipelines.
- Collaborate with cross-functional teams to integrate data solutions into business workflows.
- Ensure data quality, governance, and security standards are met.
- Perform performance tuning and cost optimization for BigQuery and related GCP services.
- Troubleshoot and resolve production data issues, ensuring reliability and scalability.

Required Skills:
- 4+ years of experience in data engineering with a strong focus on BigQuery.
- Proficiency in SQL and experience with Python for data processing.
- Hands-on experience with GCP services: Dataflow, Pub/Sub, Dataproc, Composer.
- Strong understanding of data modeling, partitioning, and performance optimization.
- Knowledge of CI/CD pipelines and version control tools (Git).

Preferred Skills:
- Experience with streaming data pipelines and real-time analytics.
- Exposure to Terraform or other Infrastructure as Code (IaC) tools.
- Familiarity with data lake architectures and hybrid data solutions.
- Basic understanding of ML pipelines on GCP (e.g., Vertex AI) is a plus.

Tech Stack:
- Cloud: GCP (BigQuery, Dataflow, Pub/Sub, Dataproc, Composer)
- Programming: Python, SQL
- Orchestration: Airflow (Composer)
- Version Control / CI-CD: Git, Cloud Build, Jenkins
- Focus Areas: BigQuery optimization, scalable data pipelines, GCP ecosystem integration
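The responsibilities above call out partitioning and clustering strategies in BigQuery. As a rough illustration of the kind of DDL involved (the table and column names here are invented, not taken from the posting), a small Python helper can render a partitioned, clustered table definition:

```python
def partitioned_table_ddl(table, columns, partition_col, cluster_cols):
    """Build a BigQuery CREATE TABLE statement with daily time
    partitioning and clustering. Illustrative sketch only; the
    schema below is a made-up example, not a real project table."""
    cols = ",\n  ".join(f"{name} {typ}" for name, typ in columns)
    return (
        f"CREATE TABLE IF NOT EXISTS {table} (\n  {cols}\n)\n"
        f"PARTITION BY DATE({partition_col})\n"
        f"CLUSTER BY {', '.join(cluster_cols)}"
    )

ddl = partitioned_table_ddl(
    "analytics.events",
    [("event_ts", "TIMESTAMP"), ("user_id", "STRING"), ("country", "STRING")],
    partition_col="event_ts",
    cluster_cols=["user_id", "country"],
)
print(ddl)
```

Partitioning on the timestamp lets BigQuery prune whole days of data from a scan, while clustering on the high-cardinality filter columns reduces the bytes read within each partition.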
Posted Date not available
5.0 - 8.0 years
5 - 8 Lacs
Chennai
Work from Office
Mandatory Skills: GCP, Python, SQL, SnapLogic, Airflow, Dataflow

4+ years of experience as a GCP Data Engineer.
- GCP components (Composer, BigQuery, Dataflow, Pub/Sub, Cloud Functions, GCS storage): Intermediate level
- SQL: Advanced level
- ETL processing: Advanced level
- Python: Advanced level
- REST API: Must Have
- Java: Advanced level
- Other components in GCP: Good to Have
- SnapLogic: Must Have
- Redshift: Good to Have
6.0 - 10.0 years
4 - 8 Lacs
Tiruchirapalli
Work from Office
Experience Level: 6-8 Years in SAP

- Excellent business communicator who talks business language, with customer-facing skills and accountability and ownership of customer success.
- Problem solver and team player.
- Open to travel, domestic and international.
- Collaborate with clients to understand their business requirements and translate them into technical specs for Datasphere and SAC.
- Experience in configuring, developing, testing, and deploying SAC.
- Experience in developing and configuring stories, dashboards, models, and data acquisition routines in SAC.
- Familiarity with SAP BW/4HANA and/or SAP HANA development best practices and/or SAP Analytics Cloud APIs.
- Data modelling and data transformation in Datasphere/SAC.
- Experience working on SAP BW bridge to integrate and transfer data between systems.
- Experience in developing and maintaining SAP Datasphere objects: tables, views, lookups, data flows, task chains.
- Experience in building SAC stories, models, teams, and roles.
- Hands-on expertise in period closing and year closing processes.
- Should have hands-on experience preparing key deliverables as per pre-defined project templates.

Qualifications: Any Graduate / Post Graduate
15.0 - 20.0 years
35 - 40 Lacs
Pune
Work from Office
Job Title: Lead Engineer
Location: Pune, India

Role Description:
The Engineer is responsible for managing or performing work across multiple areas of the bank's overall IT platform/infrastructure, including analysis, development, and administration. It may also involve taking functional oversight of engineering delivery for specific departments. Work includes:
- Planning and developing entire engineering solutions to accomplish business goals
- Building reliability and resiliency into solutions with appropriate testing and reviewing throughout the delivery lifecycle
- Ensuring maintainability and reusability of engineering solutions
- Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow
- Reviewing engineering plans and quality to drive re-use and improve engineering capability
- Participating in industry forums to drive adoption of innovative technologies, tools and solutions in the Bank

What we'll offer you:
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your Key Responsibilities:
The candidate is expected to:
- Be a hands-on engineering lead involved in analysis, design, design/code reviews, coding and release activities
- Champion engineering best practices and guide/mentor the team to achieve high performance
- Work closely with business stakeholders, the Tribe Lead, Product Owner, and Lead Architect to successfully deliver the business outcomes
- Acquire functional knowledge of the business capability being digitized/re-engineered
- Demonstrate ownership, inspire others, show innovative thinking and a growth mindset, and collaborate for success

Your Skills & Experience:
- Minimum 15 years of IT industry experience in full-stack development
- Expert in Java, Spring Boot, NodeJS, SQL/PLSQL, ReactJS
- Strong experience in big data processing: Apache Spark, Hadoop, BigQuery, Dataproc, Dataflow, etc.
- Strong experience in Kubernetes and the OpenShift container platform
- Experience with databases (Oracle, PostgreSQL, MongoDB, Redis/Hazelcast); should understand data modeling, normalization, and performance optimization
- Experience with message queues (RabbitMQ/IBM MQ, JMS) and data streaming, i.e. Kafka, Pub/Sub, etc.
- Experience working on public cloud (GCP preferred; AWS or Azure)
- Knowledge of various distributed/multi-tiered architecture styles: microservices, data mesh, integration patterns, etc.
- Experience with modern software product delivery practices, processes and tooling, and BizDevOps skills such as CI/CD pipelines using Jenkins, GitHub Actions, etc.
- Experience designing solutions based on DDD and implementing Clean/Hexagonal Architecture for efficient systems that can handle large-scale operation
- Experience leading teams and mentoring developers
- Focus on quality: experience with TDD, BDD, stress and contract tests
- Proficient in working with APIs (Application Programming Interfaces) and familiar with data formats like JSON, XML, YAML, Parquet, etc.

Key Skills: Java, Spring Boot, NodeJS, SQL/PLSQL, ReactJS

Advantageous:
- Prior experience in the Banking/Finance domain
- Experience with hybrid cloud solutions, preferably using GCP
- Experience in product development

How we'll support you:

About us and our teams:
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group.
We welcome applications from all people and promote a positive, fair and inclusive work environment.
11.0 - 16.0 years
40 - 45 Lacs
Pune
Work from Office
About The Role:
Job Title: IT Architect Specialist, AVP
Location: Pune, India

Role Description:
This role is for a senior business functional analyst for Group Architecture. It will be instrumental in establishing and maintaining bank-wide data policies, principles, standards and tool governance. The Senior Business Functional Analyst acts as a link between the business divisions and the data solution providers to align the target data architecture against the enterprise data architecture principles and apply agreed best practices and patterns. Group Architecture partners with each division of the bank to ensure that architecture is defined, delivered, and managed in alignment with the bank's strategy and in accordance with the organization's architectural standards.

What we'll offer you:
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leaves
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 yrs. and above

Your key responsibilities:
- Data Architecture: Work closely with stakeholders to understand their data needs, break out business requirements into implementable building blocks, and design the solution's target architecture.
- GCP Data Architecture & Migration: Strong working experience with GCP data architecture is a must (BigQuery, Dataplex, Cloud SQL, Dataflow, Apigee, Pub/Sub, ...), plus an appropriate GCP architecture-level certification. Experience handling hybrid architectures and patterns addressing non-functional requirements like data residency, compliance (e.g. GDPR), and security and access control.
- Experience developing reusable components and reference architectures using Infrastructure as Code (IaC) platforms such as Terraform.
- Data Mesh: Proficiency in Data Mesh design strategies that embrace the decentralized nature of data ownership, plus good domain knowledge to ensure that the data products developed are aligned with business goals and provide real value.
- Data Management Tooling: Assess various tools and solutions offering data governance capabilities such as data catalogue, data modelling and design, metadata management, data quality and lineage, and fine-grained data access management. Assist in development of the medium- to long-term target state of the technologies within the data governance domain.
- Collaborate with stakeholders, including business leaders, project managers, and development teams, to gather requirements and translate them into technical solutions.

Your skills and experience:
- Extensive experience in data architecture within Financial Services
- Strong technical knowledge of data integration patterns, batch and stream processing, data lake / data lakehouse / data warehouse / data mart, caching patterns, and policy-based fine-grained data access
- Proven experience working on data management principles, data governance, data quality, data lineage and data integration, with a focus on Data Mesh
- Knowledge of data modelling concepts like dimensional modelling and 3NF; experience with systematic, structured review of data models to enforce conformance to standards
- High-level understanding of data management solutions, e.g. Collibra, Informatica Data Governance, etc.
- Proficiency at data modelling and experience with different data modelling tools
- Very good understanding of streaming and non-streaming ETL and ELT approaches for data ingest
- Strong analytical and problem-solving skills, with the ability to identify complex business requirements and translate them into technical solutions
How we'll support you:
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs
9.0 - 14.0 years
27 - 42 Lacs
Hyderabad
Work from Office
Overview:
We are seeking an experienced and strategic Business Analyst / Functional Lead to drive solution definition, business alignment, and successful delivery of Real-Time Decisioning initiatives, playing a critical role in translating complex business needs into actionable functional requirements, guiding cross-functional teams, and shaping customer-centric decisioning strategies across digital channels.

Responsibilities:
- Gather, analyze, and document business and functional requirements for decisioning use cases (e.g., next-best-action, personalized offers, customer journeys).
- Act as the primary liaison between business stakeholders, product owners, and technical teams for real-time decisioning solutions.
- Define and maintain decision logic, business rules, and outcome scenarios in alignment with marketing and CX goals.
- Facilitate all Agile ceremonies, including sprint planning, daily stand-ups, reviews, and retrospectives.
- Guide the team in Agile practices, track sprint progress, and manage delivery risks.
- Remove blockers and coordinate across business, design, tech, QA, and operations teams.
- Maintain the ADO board, backlog grooming, sprint metrics, and continuous improvement initiatives.
- Collaborate with solution architects to design customer-centric, scalable real-time decisioning frameworks.
- Lead discovery and requirement workshops with marketing, data, and technology stakeholders.
- Own the functional design documents, user stories, and solution blueprints; ensure clarity, accuracy, and traceability.
- Work with engineering teams to define test scenarios and validate decisioning outputs.
- Support rollout, training, and adoption of decisioning platforms across business units.
- Continuously monitor and optimize decisioning logic and KPIs in partnership with analytics teams.
Qualifications:
- 9-14 years of total IT experience, with at least 3+ years of relevant work experience as an RTD Functional Lead or in business analysis, functional consulting, or similar roles in MarTech, AdTech, or CX platforms.
- Bachelor's or Master's degree in computer science, information technology, or a related field.
- Strong understanding of real-time decisioning platforms such as Salesforce Marketing Cloud Personalization / Interaction Studio and CleverTap.
- Proven ability to map customer journeys and define decision strategies based on personas, behavior, and context.
- Skilled in requirement gathering, functional documentation, user story writing, and backlog management.
- Excellent understanding of data flows, business rules, segmentation, and targeting.
- Ability to translate business needs into logical rules, decision tables, and KPIs.
- Strong communication and stakeholder management skills across business and technical audiences.
5.0 - 9.0 years
20 - 25 Lacs
Pune
Work from Office
Primary Responsibilities:
- Provide engineering leadership, mentorship, and technical direction to a small team of engineers (~6 members).
- Partner with your Engineering Manager to ensure engineering tasks are understood, broken down, and implemented to the highest quality standards.
- Collaborate with members of the team to solve challenging engineering tasks on time and with high quality.
- Engage in code reviews and training of team members.
- Support continuous deployment pipeline code.
- Situationally troubleshoot production issues alongside the support team.
- Continually research and recommend product improvements.
- Create and integrate features for our enterprise software solution using the latest Python technologies.
- Assist with and adhere to enforcement of project deadlines and schedules.
- Evaluate, recommend, and propose solutions to existing systems.
- Actively communicate with team members to clarify requirements and overcome obstacles to meet the team goals.
- Leverage open-source and other technologies and languages outside of the Python platform.
- Develop cutting-edge solutions to maximize the performance, scalability, and distributed processing capabilities of the system.
- Provide troubleshooting and root cause analysis for production issues that are escalated to the engineering team.
- Work with development teams in an agile context as it relates to software development, including Kanban, automated unit testing, test fixtures, and pair programming.

Requirements:
- 4-8 or more years of experience as a Python developer on enterprise projects using Python, Flask, FastAPI, Django, PyTest, Celery and other Python frameworks.
- Software development experience including: object-oriented programming, concurrency programming, modern design patterns, RESTful service implementation, micro-service architecture, test-driven development, and acceptance testing.
- Familiarity with tools used to automate the deployment of an enterprise software solution to the cloud: Terraform, GitHub Actions, Concourse, Ansible, etc.
- Proficiency with Git as a version control system.
- Experience with Docker and Kubernetes.
- Experience with relational SQL and NoSQL databases, including MongoDB and MSSQL.
- Experience with object-oriented languages: Python, Java, Scala, C#, etc.
- Experience with testing tools such as PyTest, Wiremock, xUnit, mocking frameworks, etc.
- Experience with GCP technologies such as BigQuery, GKE, GCS, Dataflow, Kubeflow, and/or Vertex AI.
- Excellent problem solving and communication skills.
- Experience with Java and Spring a big plus.

For individuals with disabilities that need additional assistance at any point, please email .
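The role emphasizes test-driven development with PyTest. As a minimal, hypothetical illustration of that style (the function and its tests are invented for this sketch, not part of any real product), the implementation is written only after the expected behavior is pinned down in tests:

```python
def paginate(items, page, page_size):
    """Return one page of items; pages are 1-indexed.
    Hypothetical helper used only to demonstrate TDD style."""
    if page < 1 or page_size < 1:
        raise ValueError("page and page_size must be >= 1")
    start = (page - 1) * page_size
    return items[start:start + page_size]

# PyTest discovers and runs functions named test_*; plain asserts suffice.
def test_paginate_returns_requested_slice():
    assert paginate(list(range(10)), page=2, page_size=3) == [3, 4, 5]

def test_paginate_last_page_may_be_short():
    assert paginate(list(range(10)), page=4, page_size=3) == [9]
```

Running `pytest` against a file like this executes both tests; the same asserts also run under a plain interpreter, which keeps the example self-contained.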
15.0 - 20.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary:
As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processes to enhance efficiency and effectiveness.

Professional & Technical Skills:
- Must-have skills: Proficiency in Snowflake Data Warehouse.
- Good-to-have skills: Experience with data modeling and database design.
- Strong understanding of ETL processes and data integration techniques.
- Familiarity with cloud platforms and services related to data storage and processing.
- Experience in performance tuning and optimization of data queries.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- A 15-year full-time education is required.
5.0 - 7.0 years
7 - 9 Lacs
Gurugram
Work from Office
Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and interface directly with customers.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.

Preferred technical and professional experience:
- Intuitive individual with an ability to manage change and proven time management.
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed.
- Up-to-date technical knowledge, gained by attending educational workshops and reviewing publications.
10.0 - 15.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Position Description:
Education Qualification: Bachelor's degree in Computer Science or a related field, or higher, with a minimum of 8 years of relevant experience.

Your future duties and responsibilities:
- Scan and/or review, research, and analyze the business domain and organizations, and generate requirements.
- Prepare, articulate, create and clarify business requirements to product teams and other stakeholders.
- Actively host forums or conduct working group / user group sessions to collate and analyze feedback from the end users.
- Participate in industry forums or community groups; this helps in understanding the underlying business nuances and future trends, enabling the ERP to react faster.
- Act as and demonstrate being the Subject Matter Expert for the areas identified or owned by him/her.
- Mentor and groom Business Analysts and SMEs in the team.
- Participate in transitioning the requirements and use cases to the designers and developers, in a manner that helps determine the various integrations, flows and dependencies of features across the product.
- Review solution design documents, user stories and other functional artifacts to ensure that they are in alignment with the business need.
- Work closely with Product Managers and internal stakeholders in identifying gaps, enhancements and priorities for the product.
- Assist sales teams and other stakeholders with demos, RFPs, product evaluations, prototypes, and brainstorming activities.
- Work with the delivery/development teams by driving towards the product roadmap.
- Should be able to advise alternative solutions and provide the trade-offs as well as a final recommendation.
- Ability to research across provinces or international markets to understand variances or alignment from a global perspective; prior work in the Government ERP domain will be an added advantage.
- Should have knowledge of business processes like the Procurement lifecycle, Accounts Payable lifecycle, and Accounts Receivable lifecycle.

Required qualifications to be successful in this role:
Must-have skills: ERP Financial Functional Expert. Communication has to be excellent, with knowledge of financial sub-modules like Budget, Accounts Receivable, Accounts Payable, Procurement, and Inventory.
Good-to-have skills: Government domain knowledge.

Skills: Account Management, Business Process Analysis, English, Finance & Accounting, Procurement.
5.0 - 8.0 years
6 - 10 Lacs
Chennai, Bengaluru
Work from Office
Position Description:
Job Type: Work from Office (Monday - Friday)
Shift Timing: 12:30 PM to 9:30 PM

Strong Financial Services (preferably Banking) experience; translate financial and accounting concepts into business and systems requirements; data analysis; identify data anomalies and provide remediation options; data mapping; strong database design concepts; good familiarity with SQL; assist in the creation of metadata, data lineage and data flow diagrams; support UAT planning and execution functions.

Your future duties and responsibilities:
- Collaborate with business stakeholders to understand their goals, processes, and requirements.
- Gather, document, and analyze business requirements, user stories, and workflows.
- Translate business needs into functional and technical specifications.
- Liaise between business units and IT teams to ensure solutions align with business objectives.
- Assist in the design, testing, implementation, and support of business systems and applications.
- Develop process models, data flow diagrams, and use cases.
- Support system integrations, data migrations, and application enhancements.

Required qualifications to be successful in this role:
Must-have skills: financial and accounting concepts, data analysis, identifying data anomalies, SQL.
Good to have: strong Financial Services (preferably Banking) experience; creation of metadata, data lineage and data flow diagrams; UAT planning and execution.

Skills: Apache Spark, Python, SQL
5.0 - 7.0 years
20 - 25 Lacs
Chennai
Work from Office
Business unit: Finance

The Senior Analyst - MI COE role requires an experienced designer of data visualizations and complex reporting, with a strong understanding of financial metrics and how they influence business performance. The Senior Analyst, MI COE would support the BI Product Manager (Falcon) in the Touchless MI agenda for Lubricants projects by understanding the business requirements and delivering solutions as per expectations.

Position description - Accountabilities:
- Swift understanding of the business model and expectations from the business, linking them to the business strategy and the way KPIs are measured.
- Creative in building visualizations in Microsoft Power BI and SharePoint in the best possible way to derive insights.
- Facilitate design review sessions with key stakeholders to refine and elaborate the data visualizations.
- Work on validation and testing to ensure solutions meet the requirements.
- Create the design specification, deployment plans and other technical documents for respective design activities.
- Support troubleshooting problems, providing workarounds, etc.
- Estimate the magnitude and time requirements to complete all tasks and provide accurate and timely updates to the team on progress.
- Ensure on-time, high-quality deliverables, meeting project milestones and deadlines (Project Plan On A Page) with minimal supervision.
- Assist and mentor other team members in the business application, development technologies, etc.
- Participate in peer review of work products such as code, designs, and test plans produced by other team members.
- Run and maintain the tools; work on change management for tools.
- Ensure IRM compliance of tools, maintain evidence for user access management, and support any system audit.
- Display strong personal effectiveness, particularly in stakeholder management and analytical skills.
- Work in a global and cross-cultural environment, able to coach and motivate other team members.
Required Skills & Experience:
- 5-7 years in a role developing BI tools and data.
- Expert knowledge of Power BI; familiarity with Power Platform.
- Background in Financial Reporting is preferred.
- Exceptional data modelling skills, especially harmonizing across diverse data sources.
- Demonstrated experience developing end-to-end data flow structures, resulting in intuitive BI dashboards with high uptake.
- Must be an analytical thinker with a strong design sense, who understands the implications of the various design options available for a given visualization.
- Candidates should be results-driven, detail-oriented, and work well within a dynamic and creative team.
- Candidates should have a proven ability to work with end users to refine identified business needs through in-depth design reviews and information sessions.
- Good written and oral communication skills as well as presentation skills.
- Ability to learn quickly and adapt to new environments.

Technical Skills:
- Global reporting system exposure (GSAP, GPMR, HANA & ECC).
- Data modelling skills via Alteryx, Python, SQL.
- Data extraction through SQL.
- Coding skills would be a plus (Python, VBA, R, etc.).
6.0 - 10.0 years
2 - 6 Lacs
Pune
Work from Office
We need someone with 6+ years of experience who has hands-on experience migrating Google Analytics UA360 data to BigQuery.
- Experience working with Google Cloud data products (Cloud SQL, Spanner, Cloud Storage, Pub/Sub, Dataflow, Dataproc, Bigtable, BigQuery, Dataprep, Composer, etc.)
- Experience with IoT architectures and building real-time data streaming pipelines
- Experience operationalizing machine learning models on large datasets
- Demonstrated leadership and self-direction; a willingness to teach others and learn new techniques
- Demonstrated skills in selecting the right statistical tools given a data analysis problem
- Understanding of Chaos Engineering
- Understanding of PCI, SOC 2, and HIPAA compliance standards
- Understanding of the principle of least privilege and security best practices
- Experience working with Google Support
- Understanding of cryptocurrency and blockchain technology
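A recurring step in UA360-to-BigQuery migration work is reshaping the UA export's one-row-per-session layout (with a repeated `hits` record) into the one-row-per-event layout newer pipelines expect. A minimal, illustrative sketch of that flattening, with field names simplified from the real export schema and sample values invented:

```python
import json

# A toy session row shaped loosely like the UA360 BigQuery export:
# one row per session, with a repeated `hits` record inside it.
session = {
    "fullVisitorId": "12345",
    "visitStartTime": 1700000000,
    "hits": [
        {"hitNumber": 1, "type": "PAGE", "page": {"pagePath": "/home"}},
        {"hitNumber": 2, "type": "EVENT", "page": {"pagePath": "/cart"}},
    ],
}

def flatten_session(row):
    """Yield one flat event dict per hit in a session row."""
    for hit in row.get("hits", []):
        yield {
            "visitor_id": row["fullVisitorId"],
            "session_start": row["visitStartTime"],
            "hit_number": hit["hitNumber"],
            "hit_type": hit["type"],
            "page_path": hit.get("page", {}).get("pagePath"),
        }

events = list(flatten_session(session))
print(json.dumps(events, indent=2))
```

In BigQuery itself the same reshaping is typically done with an `UNNEST(hits)` cross join; the Python version above just makes the row-to-event transformation explicit.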
10.0 - 15.0 years
16 - 20 Lacs
Chennai
Work from Office
If you like improving and impacting the business with a passion for accounting, reporting and analysis and innovation this could be your chance to make your mark in the energy industry. You will grow in a supportive team working for a significant Shell business powering the lives of millions and ensuring decisions are made based on accurate, timely and insightful analysis. Where you fit in One of the SAU in IGU (Integrated Gas, Deep Water or Conventional Oil and Gas) The Upstream Deep Water Reporting & Analysis Organization is a team of 800 colleagues providing accurate, timely and insightful analysis to deliver more and cleaner energy solutions. The PAR Lead will lead a team of Finance Advisors in Finance Operations R&A who directly support the one of the IGU business. This role serves as an integrator for the One Finance team and requires close collaboration with the Finance in the Business, Business, organizations across Finance Operations (natural teams), Group Reporting, controllers, Tax etc. This role will lead a team that is accountable to deliver Planning, LE and Appraisal for the business including monthly closing and review of Financial statements related for DW. Whats the role? Accountabilities are broadly in these areas: Leading a team (6 10 FTE) overseeing Planning, LE, Appraisal and monthly review of DW business Owner of Monthly, Quarterly and Annual Group Reporting processes Perform timely and effective financial reviews and commentaries to ensure group returns are correct. Collaborate closely with other Finance Operations teams and onshore Finance to successfully deliver various processes Group Reporting , statutory reporting, PAR team and Group. 
Build team capability and expertise around the corporate reporting system, Where relevant coach other team members in areas of expertise Champion and lead continuous improvement initiatives The Individual is expected to support the delivery of Performance & Appraisal reporting to one of the IGU organization. This role holder will help the organisation plan and manage their spend for maximum return. The role holder will be expected to render operational support to the month close and PAR related activities. Deliver periodic Management information & close the books on time Support annual or other strategic plans for the Business and relevant constituent parts, including Cash & Finance CAPEX related support. Support the team to develop materials for quarterly or other scheduled appraisal of performance Support in the preparation of periodic financial forecasts Support for Business model changes & new business roll out This role specifically requires understanding the business and working closely with senior stake holders (FM/VP/SVP/GM/EVPs) to provide them financial advices and key indicators for performance, support on business cases IP, strong support to the asset teams to work the latest estimate and ensure full potential of the business is reflected correctly. Role requires strong stake holder management and articulation skills. What we need from you? A Bachelors or Masters degree or Professional Qualification (CA/ICWA/CIMA/ACCA/MBA), in Accounting/Finance with exceptional numeracy skills - having prior relevant experience in Reporting and analysis will be considered as an added advantage. Min of 10 years of related experience Must possess strong analytical skills and be willing to work with ambiguous data. 
Financial appraisal and finance controllership experience. Interest in developing a deep understanding of financial and management information systems and data flows, as well as a passion for leveraging technology to automate. A proactive approach and the ability to identify and help resolve First Time Right issues (e.g. accounting and reporting issues, MRD issues) in a dynamic environment. Support for the development and design of new MI and Finance support processes, with a continuous focus on improvement opportunities (ESSA), identifying actions to reduce complexity and promote best practice. Passion for the Deep Water business and the energy transition, and a curiosity for how the business delivers value.
Posted Date not available
4.0 - 6.0 years
7 - 8 Lacs
hyderabad, chennai, bengaluru
Work from Office
Job Title: Apache NiFi Engineer (ETL/ELT) Location: India Experience: 4+ Years Employment Type: Full-time About the Role: We are looking for a skilled Apache NiFi Engineer with a strong background in ETL/ELT pipeline development, data integration, and Linux-based systems to join our data engineering team supporting mission-critical operations in the supply chain domain. This role demands expertise in building scalable data flows, optimizing data movement, and ensuring data reliability for downstream analytics and operational systems. Key Responsibilities: Design, develop, and maintain scalable data pipelines using Apache NiFi. Work on ETL/ELT processes to ingest, transform, and deliver data from diverse sources (e.g., APIs, databases, files). Ensure data quality, governance, lineage, and security across all data flows. Collaborate with data analysts, supply chain operations, and other engineering teams to gather requirements and deliver robust solutions. Monitor and troubleshoot NiFi workflows, system performance, and job failures. Implement best practices for data integration and real-time/near-real-time data processing. Leverage scripting (e.g., Bash, Python) on Linux systems for automation and orchestration. Document system architecture, flow designs, and data mappings. Required Skills & Qualifications: 4+ years of professional experience in data engineering roles. Strong hands-on experience with Apache NiFi in a production environment. Proficient in ETL/ELT concepts and tools. Solid working knowledge of Linux/Unix environments and scripting. Familiarity with supply chain data (inventory, logistics, procurement, etc.) is a plus. Experience integrating data from relational databases, flat files, APIs, and cloud sources. Working knowledge of data formats such as JSON, XML, Avro, Parquet, etc. Ability to debug, optimize, and scale NiFi processors and flows. Strong problem-solving skills and attention to detail.
Excellent communication and collaboration skills. Preferred Qualifications: Experience with cloud platforms (AWS, Azure, GCP) and cloud-native data services. Knowledge of other integration tools like Kafka, Airflow, or Informatica. Experience in data warehousing and BI tools. Familiarity with CI/CD and version control systems (Git, Jenkins, etc.).
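NiFi flows are assembled visually rather than coded, but the parse-validate-route logic this posting describes can be sketched in plain Python. Everything below (the `sku`/`qty` field names and the validation rule) is invented for illustration, not taken from the posting:

```python
import json

def transform_records(raw_lines):
    """Parse JSON lines, drop invalid records, and normalize fields --
    the kind of cleanup an ETL flow would route through NiFi processors."""
    out = []
    for line in raw_lines:
        try:
            rec = json.loads(line)
        except json.JSONDecodeError:
            continue  # a real flow would route this to a failure relationship
        if not rec.get("sku"):
            continue  # hypothetical data-quality rule: records must carry a SKU
        out.append({
            "sku": str(rec["sku"]).upper(),  # normalize casing
            "qty": int(rec.get("qty", 0)),   # coerce quantity to an integer
        })
    return out
```

In an actual NiFi flow the two `continue` branches would instead route FlowFiles to failure or invalid relationships, so bad records stay inspectable rather than silently disappearing.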
Posted Date not available
2.0 - 4.0 years
4 - 8 Lacs
chennai
Work from Office
Position Overview As a key member of our data infrastructure team, you will design, develop, and maintain robust data pipelines, data warehouses, and analytics platforms that serve as the backbone of our digital marketing product development. In this role, you will collaborate with cross-functional teams including Data Scientists, Product Managers, and Marketing Technology specialists to ensure seamless data flow from various marketing channels, ad platforms, and customer touchpoints to our analytics dashboards and reporting systems. You'll be responsible for building scalable, reliable, and efficient data solutions that can handle high-volume marketing data processing and real-time campaign analytics. What You'll Do: Design and implement enterprise-grade data pipelines for marketing data ingestion and processing Build and optimize data warehouses and data lakes to support digital marketing analytics Ensure data quality, security, and compliance across all marketing data systems Create data models and schemas that support marketing attribution, customer journey analysis, and campaign performance tracking Develop monitoring and alerting systems to maintain data pipeline reliability for critical marketing operations Collaborate with product teams to understand digital marketing requirements and translate them into technical solutions You'll be the backbone behind the data infrastructure that powers CustomerLabs' digital marketing platform, making marketers' lives easier and better. Your work directly translates to smarter automation, clearer insights, and more successful campaigns - helping marketers focus on what they do best while we handle the complex data heavy lifting. We believe in the power of data to transform lives, just like the Dragon Warrior transformed the Valley of Peace. Every line of code you write, every pipeline you build, and every insight you enable has the potential to change how marketers work and succeed. 
We're not just building data systems - we're building the future of digital marketing, one insight at a time. Key Responsibilities: Data Pipeline Development: Design, build, and maintain robust, scalable data pipelines and ETL/ELT processes. Develop data ingestion frameworks to collect data from various sources (databases, APIs, files, streaming sources). Implement data transformation and cleaning processes to ensure data quality and consistency. Optimize data pipeline performance and reliability. Data Infrastructure Management: Design and implement data warehouse architectures. Manage and optimize database systems (SQL and NoSQL). Implement data lake solutions and data governance frameworks. Ensure data security, privacy, and compliance with regulatory requirements. Data Modeling and Architecture: Design and implement data models for analytics and reporting. Create and maintain data dictionaries and documentation. Develop data schemas and database structures. Implement data versioning and lineage tracking. Collaboration and Support: Work closely with Data Scientists, Analysts, and Business stakeholders. Provide technical support for data-related issues and queries. Monitoring and Maintenance: Implement monitoring and alerting systems for data pipelines. Perform regular maintenance and optimization of data systems. Troubleshoot and resolve data pipeline issues. Conduct performance tuning and capacity planning. Required Qualifications: Experience: 2+ years of experience in data engineering or related roles. Proven experience with ETL/ELT pipeline development. Experience with a cloud data platform (GCP). Experience with big data technologies (Spark, Kafka). Technical Skills: Programming Languages: Python, SQL, Golang (preferred). Databases: PostgreSQL, MySQL, Redis. Big Data Tools: Apache Spark, Apache Kafka, Apache Airflow, dbt, Dataform. Cloud Platforms: GCP (BigQuery, Dataflow, Cloud Run, Cloud SQL, Cloud Storage, Pub/Sub, App Engine, Compute Engine, etc.)
Data Warehousing: Google BigQuery. Version Control: Git, GitHub. Containerization: Docker. Soft Skills: Strong problem-solving and analytical thinking. Excellent communication and collaboration skills. Ability to work independently and in team environments. Strong attention to detail and data quality. Continuous learning mindset. Preferred Qualifications: Additional Experience: Experience with real-time data processing and streaming. Knowledge of machine learning pipelines and MLOps. Experience with data governance and data catalog tools. Familiarity with business intelligence tools (Tableau, Power BI, Looker, etc.). Experience using AI-powered tools such as Cursor, Claude, Copilot, ChatGPT to accelerate coding, automate tasks, or assist in system design. Industry: Software Development. Employment Type: Full-time
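As a rough illustration of the marketing-attribution modeling this role supports, here is a minimal last-touch attribution sketch. The event shape (user, channel, kind) and the channel names are assumptions for the example, not part of any product described above:

```python
from collections import defaultdict

def last_touch_attribution(events):
    """Credit each conversion to the last ad touchpoint seen for that user.
    `events` is assumed to be time-ordered tuples of (user, channel, kind),
    where kind is either "touch" or "conversion"."""
    last_channel = {}           # user -> most recent channel touched
    credit = defaultdict(int)   # channel -> conversions attributed
    for user, channel, kind in events:
        if kind == "touch":
            last_channel[user] = channel
        elif kind == "conversion" and user in last_channel:
            credit[last_channel[user]] += 1
    return dict(credit)
```

Last-touch is the simplest attribution model; a warehouse-backed pipeline would typically compute the same join in SQL over event tables and also support multi-touch variants.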
Posted Date not available
5.0 - 7.0 years
5 - 9 Lacs
bengaluru
Work from Office
We are looking for a Python Developer with 5+ years of experience and hands-on expertise in Google Cloud Platform (GCP) services. The ideal candidate will be responsible for developing and deploying scalable applications, data processing pipelines, and cloud-native solutions using Python and GCP tools. Responsibilities: Develop and maintain Python-based applications and services. Design and implement data pipelines and cloud functions using GCP services such as Cloud Functions, Cloud Run, Pub/Sub, Dataflow, and BigQuery. Integrate APIs and third-party services. Optimize performance and scalability of cloud-based applications. Collaborate with DevOps and data teams to build CI/CD pipelines and manage infrastructure using tools like Terraform or Deployment Manager. Write clean, maintainable, and well-documented code. Troubleshoot and resolve technical issues in production and development environments.
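One pattern worth knowing for the Pub/Sub work above: Pub/Sub delivers messages at least once, so consumers are usually written to be idempotent. A minimal stdlib sketch of that pattern follows; the message shape and the in-memory store are illustrative only:

```python
class IdempotentConsumer:
    """Process each message id at most once, even if the broker redelivers."""

    def __init__(self, handler):
        self.handler = handler
        self.seen = set()  # in-memory stand-in for a durable dedup store

    def receive(self, msg_id, payload):
        if msg_id in self.seen:
            return False  # duplicate delivery: acknowledge without reprocessing
        self.handler(payload)
        self.seen.add(msg_id)
        return True
```

A production service would back `seen` with a durable store (e.g. Cloud SQL or Redis) so deduplication survives restarts, and would add the id only after the handler commits.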
Posted Date not available
2.0 - 7.0 years
5 - 9 Lacs
noida
Work from Office
About the role: You will need to work closely and communicate effectively with internal and external stakeholders in an ever-changing, rapid-growth environment with tight deadlines. This role involves analyzing healthcare data and modeling on proprietary tools. You should be able to take up new initiatives independently and collaborate with external and internal stakeholders, be a strong team player, and be able to create and define SOPs and TATs for ongoing and upcoming projects. What you will need: A graduate degree in any discipline (preferably via regular attendance) from a recognized educational institute with a good academic track record. At least 2 years of live hands-on experience with advanced analytical tools (Power BI, Tableau, SQL), plus a solid understanding of SSIS (ETL) with strong SQL and PL/SQL. Connecting to data sources, importing data and transforming data for business intelligence. Expertise in DAX and visuals in Power BI, with live hands-on experience on an end-to-end project. Strong mathematical skills to help collect, measure, organize and analyze data. Interpret data, analyze results using advanced analytical tools and techniques, and provide ongoing reports. Identify, analyze, and interpret trends or patterns in complex data sets. Ability to communicate with technical and business resources at many levels in a manner that supports progress and success. Ability to understand, appreciate and adapt to new business cultures and ways of working. Demonstrates initiative and works independently with minimal supervision.
Posted Date not available
4.0 - 9.0 years
3 - 7 Lacs
mumbai, pune, bengaluru
Work from Office
Your Role Design, implement, and administer Cribl Stream, Edge, and Search environments for enterprise-scale observability. Deploy, configure, maintain, and optimize Cribl solutions for data streaming, log routing, and telemetry processing. Design and manage data pipelines for logs, metrics, and observability data across distributed systems. Configure parsers, reducers, enrichments, filters, and routing rules for efficient log processing. Integrate Cribl with monitoring and SIEM platforms such as Splunk, Elastic, and Datadog. Your Profile 4 to 12 years of experience in observability and log management using Cribl solutions. Expertise in deploying, configuring, and managing Cribl Stream, Edge, and Search. Proficiency in Python, Bash, and familiarity with automation tools and DevOps pipelines. Strong understanding of observability architecture, log processing, and telemetry data flows. Experience integrating Cribl with SIEM and monitoring tools like Splunk, Elastic, and Datadog.
Posted Date not available
4.0 - 8.0 years
8 - 14 Lacs
hyderabad, bengaluru
Work from Office
Looking for a GCP Data Engineer with 4+ years of experience. Must have strong skills in Python, BigQuery, Dataflow, Pub/Sub, GCS, Airflow, and DevOps. Experience in cloud migration, automation, and scripting (Python/Shell) required. GCP certification is a plus.
Posted Date not available
5.0 - 10.0 years
18 - 25 Lacs
chennai
Hybrid
Skills Required: Full Stack + Data Engineer position: Spring Boot, Angular, Cloud Computing. Skills Preferred: Google Cloud Platform, Dataflow, Dataproc, Data Fusion, Tekton, Cloud SQL, Airflow, PostgreSQL, PySpark, Python, APIs. Experience Required: 5+ years of overall experience with proficiency in Java, Angular or any JavaScript technology, with experience designing and deploying cloud-based data pipelines and microservices using GCP tools like Dataflow and Dataproc. Ability to leverage best-in-class data platform technologies (Apache Beam, Kafka) to deliver platform features, and to design and orchestrate platform services that deliver data platform capabilities. Service-Oriented Architecture and Microservices: Strong understanding of SOA, microservices, and their application within a cloud data platform context. Develop robust, scalable services using Java Spring Boot, Python, Angular, and GCP technologies. Full-Stack Development: Knowledge of front-end and back-end technologies, enabling collaboration on data access and visualization layers (e.g., React, Node.js). Design and develop RESTful APIs for seamless integration across platform services. Implement robust unit and functional tests to maintain high standards of test coverage and quality. Database Management: Experience with relational (e.g., PostgreSQL, MySQL) and NoSQL databases, as well as columnar databases like BigQuery. Data Governance and Security: Understanding of data governance frameworks and implementing RBAC, encryption, and data masking in cloud environments. CI/CD and Automation: Familiarity with CI/CD pipelines, Infrastructure as Code (IaC) tools like Terraform, and automation frameworks. Manage code changes with GitHub, and troubleshoot and resolve application defects efficiently. Ensure adherence to SDLC best practices, independently managing feature design, coding, testing, and production releases.
Problem-Solving: Strong analytical skills with the ability to troubleshoot complex data platform and microservices issues.
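The data governance bullet above mentions implementing RBAC; stripped to its core, role-based access control is a grant lookup. A toy sketch follows, with the roles, actions and resource names all invented for illustration:

```python
# Hypothetical grant table: role -> set of permitted (action, resource) pairs.
# A real platform would load this from a policy store rather than hard-code it.
ROLE_GRANTS = {
    "analyst": {("read", "sales_mart")},
    "engineer": {("read", "sales_mart"), ("write", "sales_mart")},
}

def can_access(role, action, resource):
    """Return True only if the role holds an explicit grant for the action."""
    return (action, resource) in ROLE_GRANTS.get(role, set())
```

Denying by default (unknown roles get an empty grant set) is the important design choice here; production systems layer role hierarchies and resource wildcards on top of the same lookup.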
Posted Date not available
5.0 - 10.0 years
27 - 32 Lacs
mumbai
Work from Office
Role Description: The Process and Service Excellence team (PSXT) is part of the PB Operating Model function under the global PB Chief Operating Office. The purpose of the team is to combine process know-how with operational improvements and automation capabilities. The objective is to increase client satisfaction and capture operational efficiencies through front-to-back digitalization, automation and non-tech process improvements in close collaboration with key stakeholders. PSXT strives to link the overall PB strategy to client journeys, product offering, IT platform and organizational set-up by ensuring a process design with E2E views and cost transparency. We work closely with Process Owners and the Transformation organisation across the PB division to drive accountability, ensure stakeholder alignment, deliver process and service enhancements, and take care of communications, employee engagement and feedback loops for continuous process improvements. We are seeking a driven, capable and experienced Process Excellence Analyst (Assistant Vice President) with a solid foundation in banking processes to actively support the excellence of key private bank processes in alignment with the Target Operating Model and strategic enterprise architecture. The successful candidate will contribute to the design and implementation of process enhancements internationally, focusing on automation, digitalization, and stakeholder engagement. This role is ideal for a professional with several years of experience looking to take on more ownership and influence within a collaborative change delivery environment. Your key responsibilities: Actively contribute to process and service excellence initiatives, projects or key workstreams. Analyse processes, including impact analysis, with the aim of improving client experience and automation and reducing costs and processing times in alignment with target architecture and process design principles. Identify pain points, control gaps, and improvement opportunities.
Perform process mapping and supply data and analytics capabilities to the organisation. Support in assessing budget requirements and creating cost views. Contribute to the design of optimized to-be processes using best-practice frameworks. Cooperate with business and IT stakeholders and control functions to optimize processes. Support stakeholder analysis and interactions and help manage relationships with Process Owners, Front Office, Operations, control functions and technology teams. Actively contribute to workshops, requirements sessions, and process walkthroughs to gather input and drive alignment. Communicate effectively with both technical and non-technical stakeholders. Your skills and experience: Bachelor's degree in Business, Finance, Economics, or Information Systems. 6 to 8 years of experience as a Process Expert, Consultant, Business Analyst or Change Practitioner, preferably in personal banking, private banking or wealth management. Hands-on experience with process analysis, modelling and documentation. Six Sigma Green Belt or equivalent certification (beneficial). Working knowledge of relevant private banking systems, platforms, or data flows is advantageous. Understanding of key regulatory impacts (e.g., KYC/AML, MiFID II) preferred. Proficiency in business analysis and process mapping tools. Key Competencies: Strong analytical and conceptual thinking skills. Detail-oriented with the ability to see the bigger picture. Prior process data mining and analytics experience is beneficial. Clear and structured communicator with confidence in leading discussions. Comfortable working in fast-paced, multi-stakeholder environments. Proactive, collaborative, and eager to take initiative and accountability.
Posted Date not available
4.0 - 5.0 years
7 - 8 Lacs
chennai
Hybrid
Job Title: Data Engineering Engineer II Location: Chennai Work Type: Hybrid Overview: TekWissen is a global workforce management provider with operations throughout India and many other countries in the world. The below client is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place, one that benefits lives, communities and the planet. Position Description: Employees in this job function are responsible for designing, building, and maintaining data solutions, including data infrastructure, pipelines, etc., for collecting, storing, processing and analyzing large volumes of data efficiently and accurately. Key Responsibilities: Collaborate with business and technology stakeholders to understand current and future data requirements. Design, build and maintain reliable, efficient and scalable data infrastructure for data collection, storage, transformation, and analysis. Plan, design, build and maintain scalable data solutions, including data pipelines, data models, and applications, for efficient and reliable data workflow. Design, implement and maintain existing and future data platforms, such as data warehouses, data lakes, and data lakehouses, for structured and unstructured data. Design and develop analytical tools, algorithms, and programs to support data engineering activities, such as writing scripts and automating tasks. Ensure optimum performance and identify improvement opportunities. Skills Required: GCP, BigQuery, Dataflow, Dataproc, Data Fusion. Experience Required: Engineer II: 4+ years of data engineering work experience. Education Required: Bachelor's Degree. TekWissen Group is an equal opportunity employer supporting workforce diversity.
Posted Date not available
14.0 - 16.0 years
22 - 27 Lacs
gurugram
Work from Office
Job Title - GN - SONG - Service - CX - Value Architect - Senior Manager Management Level: 6 - Senior Manager Location: Gurugram Must-have skills: Sales, Distribution and Marketing Strategy Good-to-have skills: Ability to leverage design thinking, business process optimization, and stakeholder management skills. Job Summary: This role involves driving strategic initiatives, managing business transformations, and leveraging industry expertise to create value-driven solutions. Roles & Responsibilities: Provide strategic advisory services, conduct market research, and develop data-driven recommendations to enhance business performance. You will work closely with our clients as consulting professionals who design, build and implement initiatives that can help enhance business performance. As part of these, you are expected to: Translate strategic objectives into high-impact use cases in the specific area of expertise. Understand clients' business priorities and focus areas to identify the right business scenarios and impacted value levers (KPIs) to include in the business case. Ideate and execute compelling value creation workshops. Conduct detailed qualitative and quantitative research to lay the foundation of a strong business case. Own every stage of the value creation process, from research and identification to value drafting and dashboarding. Define value architecting requirements and work with Accenture teams to deliver solutions. Advise clients on industry best practices and client examples of value creation and realization. Accurately estimate time to complete work. Continually experiment with new tools and technologies and sharpen analytical skills. Ability to research and provide strategic, goal-driven solutions for clients. Lead and collaborate with both offshore and onshore cross-functional and technical teams, including client-side managers, business heads, and other stakeholders across the organization.
Provide useful contributions to team meetings and conversations, actively participating in client meetings and workshops. Ability to create hypotheses based on an understanding of clients' issues. Bring your best skills forward to excel in the role: Apply best-of-breed Excel practices: deep-dive with solid knowledge of formulas and macros to bring in speed and efficiency. Maximize experience in developing interactive models: use relevant dashboard creation platforms (Power BI, Tableau, etc.) to design and apply interactive dashboards. Innovate with creativity: demonstrate an ability to work in a fast-paced environment and to abstract value into a compelling business story. Participate in pre-sales activities, including responses to RFPs, creating proofs of concept, creating effective presentations, demonstrating solutions during client orals, and the effort and cost estimation process. Participate in practice-specific initiatives, including creating points of view, creating reusable assets in the contact center space, performing analysis of industry research and market trends, and bringing in innovative solutions. Professional & Technical Skills: Relevant experience in the required domain. Strong analytical, problem-solving, and communication skills. Ability to work in a fast-paced, dynamic environment. Deep understanding of the Customer Service function for two industries. Good understanding of sales and marketing as a function. Solid experience in developing quantitative models and conducting qualitative and quantitative research. Anchoring client and senior stakeholder conversations. Creating engaging storyboards using the best data visualization tools such as Power BI, Tableau, etc. Passionate about storytelling, with excellent visual skills (using PowerPoint and Figma). Additional Information: Opportunity to work on innovative projects. Career growth and leadership exposure.
About Our Company | Accenture Qualification Experience: 14 to 16 Years Educational Qualification: Minimum 15 Years of Education
Posted Date not available