
905 Data Flow Jobs - Page 29

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

1.0 - 4.0 years

3 - 7 Lacs

Kolkata

Work from Office

Project Role: Application Support Engineer
Project Role Description: Act as software detectives, providing a dynamic service that identifies and solves issues within multiple components of critical business systems.
Must-Have Skills: SAP Ariba
Good-to-Have Skills: NA
Minimum Experience: 5 years
Educational Qualification: 15 years full-time education

Summary: As an Application Support Engineer, you will act as a software detective, providing a dynamic service that identifies and solves issues within multiple components of critical business systems. Your typical day will involve collaborating with various teams to troubleshoot and resolve software-related challenges, ensuring that business operations run smoothly and efficiently. You will engage in problem-solving activities, analyze system performance, and contribute to the continuous improvement of application support processes, all while maintaining a focus on delivering exceptional service to stakeholders.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor system performance and proactively address potential issues.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in SAP Ariba.
- Strong understanding of application support processes and methodologies.
- Experience with troubleshooting and resolving software issues.
- Familiarity with system integration and data flow management.
- Ability to work collaboratively in a team-oriented environment.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP Ariba.
- This position is based at our Kolkata office.
- 15 years of full-time education is required.

Posted 3 months ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must-Have Skills: SAP BW/4HANA Data Modeling & Development
Good-to-Have Skills: NA
Minimum Experience: 3 years
Educational Qualification: 15 years full-time education

Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. A typical day involves collaborating with cross-functional teams to gather insights, analyzing user needs, and translating them into functional specifications. You will engage in discussions to refine application designs and ensure alignment with business objectives, while also participating in testing and validation processes to guarantee that the applications meet the defined requirements effectively.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with stakeholders to gather and analyze requirements for application design.
- Participate in the testing and validation of applications to ensure they meet business needs.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in SAP BW/4HANA Data Modeling & Development.
- Good to have: SAP ABAP, CDS views.
- Strong understanding of data modeling concepts and best practices.
- Experience with application design methodologies and tools.
- Ability to analyze and interpret complex business requirements.
- Familiarity with integration techniques and data flow management.

Additional Information:
- The candidate should have a minimum of 3 years of experience in SAP BW/4HANA Data Modeling & Development.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 3 months ago

Apply

7.0 - 12.0 years

13 - 18 Lacs

Pune

Work from Office

Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application; model and design the application data structure, storage, and integration.
Must-Have Skills: Microsoft Power Business Intelligence (BI)
Good-to-Have Skills: NA
Minimum Experience: 7.5 years
Educational Qualification: 15 years full-time education

Summary: As a Data Architect, you will define the data requirements and structure for the application, and model and design the application data structure, storage, and integration. You will play a crucial role in shaping the data architecture of the organization and ensuring seamless data flow.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead data architecture discussions and decisions.
- Develop data models and database designs.
- Implement data governance policies.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Microsoft Power Business Intelligence (BI).
- Strong understanding of data modeling and database design.
- Experience with ETL processes and tools.
- Knowledge of data integration and data warehousing concepts.
- Hands-on experience with SQL and database management.
- Good-to-Have Skills: Experience with data visualization tools.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Microsoft Power Business Intelligence (BI).
- This position is based at our Pune office.
- 15 years of full-time education is required.

Posted 3 months ago

Apply

7.0 - 12.0 years

13 - 18 Lacs

Bengaluru

Work from Office

Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application; model and design the application data structure, storage, and integration.
Must-Have Skills: SAP HCM On Premise ABAP, SAP ABAP BOPF
Good-to-Have Skills: NA
Minimum Experience: 7.5 years
Educational Qualification: 15 years full-time education

Summary: As a Data Architect, you will define the data requirements and structure for the application, and model and design the application data structure, storage, and integration. You will play a crucial role in shaping the data architecture of the project and ensuring seamless data flow.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead data governance initiatives to ensure data quality and integrity.
- Develop data models and database designs for efficient data storage.
- Implement data security measures to protect sensitive information.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in SAP HCM On Premise ABAP and SAP ABAP BOPF.
- Strong understanding of data modeling and database design.
- Experience in data integration and ETL processes.
- Knowledge of data governance and data security best practices.
- Hands-on experience with SAP HANA database management.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in SAP HCM On Premise ABAP.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 3 months ago

Apply

5.0 - 9.0 years

14 - 19 Lacs

Chennai

Work from Office

Project description
We are seeking a highly skilled Senior Power BI Developer with strong expertise in Power BI, SQL Server, and data modeling to join our Business Intelligence team. In this role, you will lead the design and development of interactive dashboards, robust data models, and data pipelines that empower business stakeholders to make informed decisions. You will work collaboratively with cross-functional teams and drive the standardization and optimization of our BI architecture.

Responsibilities
Power BI Dashboard Development (UI Dashboards)
- Design, develop, and maintain visually compelling, interactive Power BI dashboards aligned with business needs.
- Collaborate with business stakeholders to gather requirements, develop mockups, and refine dashboard UX.
- Implement advanced Power BI features such as bookmarks, drill-throughs, dynamic tooltips, and DAX calculations.
- Conduct regular UX/UI audits and performance tuning on reports.

Data Modeling in SQL Server & Dataverse
- Build and manage scalable, efficient data models in Power BI, Dataverse, and SQL Server.
- Apply best practices in dimensional modeling (star/snowflake schema) to support analytical use cases.
- Ensure data consistency, accuracy, and alignment across multiple sources and business areas.
- Optimize models and queries for performance and load times.

Power BI Dataflows & ETL Pipelines
- Develop and maintain reusable Power BI Dataflows for centralized data transformations.
- Create ETL processes using Power Query, integrating data from diverse sources including SQL Server, Excel, APIs, and Dataverse.
- Automate data refresh schedules and monitor dependencies across datasets and reports.
- Ensure efficient data pipeline architecture for reuse, scalability, and maintenance.

Skills
Must have
- Experience: 6+ years in Business Intelligence or Data Analytics with a strong focus on Power BI and SQL Server.
- Technical Skills: Expert-level Power BI development, including DAX, custom visuals, and report optimization. Strong knowledge of SQL (T-SQL) and relational database design. Experience with Dataverse and Power Platform integration. Proficiency in Power Query, Dataflows, and ETL development.
- Modeling: Proven experience in dimensional modeling, star/snowflake schemas, and performance tuning.
- Data Integration: Skilled in connecting and transforming data from various sources, including APIs, Excel, and cloud data services.
- Collaboration: Ability to work with stakeholders to define KPIs, business logic, and dashboard UX.
Nice to have
- N/A
Other
- Languages: English, C1 Advanced
- Seniority: Senior
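The dimensional-modeling skill this listing asks for (a star schema: a central fact table joined to dimension tables) can be sketched with an in-memory SQLite database; all table and column names here are invented for illustration, not taken from the listing.

```python
import sqlite3

# Minimal star-schema sketch: one fact table joined to two dimensions.
# Table and column names are hypothetical examples.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date,
    product_key INTEGER REFERENCES dim_product,
    amount REAL
);
INSERT INTO dim_date VALUES (1, 2024), (2, 2025);
INSERT INTO dim_product VALUES (10, 'Hardware'), (20, 'Software');
INSERT INTO fact_sales VALUES (1, 10, 100.0), (1, 20, 50.0), (2, 10, 25.0);
""")

# A typical analytical query: aggregate facts by dimension attributes.
rows = con.execute("""
SELECT d.year, p.category, SUM(f.amount) AS total
FROM fact_sales f
JOIN dim_date d USING (date_key)
JOIN dim_product p USING (product_key)
GROUP BY d.year, p.category
ORDER BY d.year, p.category
""").fetchall()
```

The same shape carries over to SQL Server or a Power BI semantic model: facts hold measures, dimensions hold the attributes you slice by.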

Posted 3 months ago

Apply

3.0 - 5.0 years

10 - 13 Lacs

Chennai

Work from Office

- 3+ years of experience as an engineer in a GCP environment and with its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner, streaming, etc.)
- 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.)
- 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.)
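The SQL features this listing names (CTEs, window functions, aggregates) can be sketched against SQLite's in-memory engine; the table and column names are invented for illustration.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE events (user_id TEXT, ts INTEGER, amount REAL)")
con.executemany("INSERT INTO events VALUES (?, ?, ?)",
                [("a", 1, 10.0), ("a", 2, 5.0), ("b", 1, 7.0)])

# CTE defines the input set; window functions rank each user's events by
# timestamp and compute a per-user running total without collapsing rows.
rows = con.execute("""
WITH recent AS (
    SELECT user_id, ts, amount FROM events
)
SELECT user_id, ts,
       ROW_NUMBER() OVER (PARTITION BY user_id ORDER BY ts DESC) AS rn,
       SUM(amount)  OVER (PARTITION BY user_id) AS user_total
FROM recent
ORDER BY user_id, rn
""").fetchall()
```

BigQuery's Standard SQL supports the same constructs with essentially identical syntax.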

Posted 3 months ago

Apply

5.0 - 7.0 years

15 - 17 Lacs

Chennai

Work from Office

Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow
Programming Languages: Java; scripting languages such as Python, Shell Script, SQL
- 5+ years of experience in IT application delivery with proven experience in agile development methodologies
- 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, and data processing with Dataflow)

Posted 3 months ago

Apply

5.0 - 10.0 years

15 - 20 Lacs

Chennai

Work from Office

Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow
Programming Languages: Java; scripting languages such as Python, Shell Script, SQL
- 5+ years of experience in IT application delivery with proven experience in agile development methodologies
- 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, and data processing with Dataflow)
Mandatory Key Skills: Google Cloud Platform, GCS, Dataproc, BigQuery, Dataflow, Composer, Data Processing, Java

Posted 3 months ago

Apply

3.0 - 8.0 years

10 - 18 Lacs

Guwahati

Work from Office

- Design and implement scalable data architectures to optimize data flow and analytics capabilities.
- Develop ETL pipelines, data warehouses, and real-time data processing systems.
- Must have expertise in SQL, Python, and cloud data platforms such as AWS Redshift or Google BigQuery.
- Work closely with data scientists to enhance machine learning models with structured and unstructured data.
- Prior experience handling large-scale datasets is preferred.
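The ETL pipelines this listing describes follow an extract-transform-load pattern; a minimal plain-Python sketch of the transform step, with invented source records and field names:

```python
# Minimal batch ETL sketch. The raw records, field names, and cleaning
# rules below are hypothetical examples, not from any specific system.
raw = [
    {"id": "1", "city": " kolkata ", "revenue": "100.5"},
    {"id": "2", "city": "Pune", "revenue": None},
]

def transform(rec):
    # Normalize types and strings; drop rows with missing revenue.
    if rec["revenue"] is None:
        return None
    return {"id": int(rec["id"]),
            "city": rec["city"].strip().title(),
            "revenue": float(rec["revenue"])}

# "Load" here is just collecting the clean rows; a real pipeline would
# write them to a warehouse table (e.g. Redshift or BigQuery).
loaded = [t for r in raw if (t := transform(r)) is not None]
```

The same shape (validate, coerce types, drop or quarantine bad rows) scales up whether the engine is plain Python, Spark, or a warehouse-native ELT job.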

Posted 3 months ago

Apply

3.0 - 8.0 years

10 - 18 Lacs

Kochi

Work from Office

- Design and implement scalable data architectures to optimize data flow and analytics capabilities.
- Develop ETL pipelines, data warehouses, and real-time data processing systems.
- Must have expertise in SQL, Python, and cloud data platforms such as AWS Redshift or Google BigQuery.
- Work closely with data scientists to enhance machine learning models with structured and unstructured data.
- Prior experience handling large-scale datasets is preferred.

Posted 3 months ago

Apply

3.0 - 8.0 years

10 - 18 Lacs

Kanpur

Work from Office

- Design and implement scalable data architectures to optimize data flow and analytics capabilities.
- Develop ETL pipelines, data warehouses, and real-time data processing systems.
- Must have expertise in SQL, Python, and cloud data platforms such as AWS Redshift or Google BigQuery.
- Work closely with data scientists to enhance machine learning models with structured and unstructured data.
- Prior experience handling large-scale datasets is preferred.

Posted 3 months ago

Apply

5.0 - 8.0 years

17 - 20 Lacs

Kolkata

Work from Office

Key Responsibilities
- Architect and implement scalable data solutions using GCP (BigQuery, Dataflow, Pub/Sub, Cloud Storage, Composer, etc.) and Snowflake.
- Lead the end-to-end data architecture, including ingestion, transformation, storage, governance, and consumption layers.
- Collaborate with business stakeholders, data scientists, and engineering teams to define and deliver the enterprise data strategy.
- Design robust data pipelines (batch and real-time) ensuring high data quality, security, and availability.
- Define and enforce data governance, data cataloging, and metadata management best practices.
- Evaluate and select appropriate tools and technologies to optimize data architecture and cost efficiency.
- Mentor junior architects and data engineers, guiding them on design best practices and technology standards.
- Collaborate with DevOps teams to ensure smooth CI/CD pipelines and infrastructure automation for data.

Skills & Qualifications
- 3+ years of experience in data architecture, data engineering, or enterprise data platform roles.
- 3+ years of hands-on experience in Google Cloud Platform (especially BigQuery, Dataflow, Cloud Composer, Data Catalog).
- 3+ years of experience designing and implementing Snowflake-based data solutions.
- Deep understanding of modern data architecture principles (Data Lakehouse, ELT/ETL, Data Mesh, etc.).
- Proficient in Python, SQL, and orchestration tools like Airflow / Cloud Composer.
- Experience in data modeling (3NF, Star, Snowflake schemas) and designing data marts and warehouses.
- Strong understanding of data privacy, compliance (GDPR, HIPAA), and security principles in cloud environments.
- Familiarity with tools like dbt, Apache Beam, Looker, Tableau, or Power BI is a plus.
- Excellent communication and stakeholder management skills.
- GCP or Snowflake certification preferred (e.g., GCP Professional Data Engineer, SnowPro).

Preferred Qualifications
- Experience working with hybrid or multi-cloud data strategies.
- Exposure to ML/AI pipelines and support for data science workflows.
- Prior experience leading architecture reviews, PoCs, and technology roadmaps.
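The ingestion-transformation-consumption layering and Airflow/Composer orchestration this listing describes boil down to running pipeline steps in dependency order (a DAG). A minimal sketch using Python's standard-library topological sorter; the step names and dependencies are invented for illustration.

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline steps and their upstream dependencies, mirroring
# an ingestion -> transformation -> quality -> consumption layering.
deps = {
    "ingest_raw": set(),
    "transform": {"ingest_raw"},
    "quality_checks": {"transform"},
    "publish_marts": {"quality_checks"},
}

# An orchestrator (Airflow, Cloud Composer, Dagster, ...) does essentially
# this ordering, plus scheduling, retries, and monitoring.
order = list(TopologicalSorter(deps).static_order())
```

In Airflow the same structure would be expressed as task dependencies (`ingest >> transform >> checks >> publish`); the scheduler guarantees the ordering shown here.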

Posted 3 months ago

Apply

10.0 - 15.0 years

11 - 15 Lacs

Jhagadia

Work from Office

- Develop, implement, and maintain the organization's MIS to ensure accurate and real-time reporting of key business metrics.
- Oversee the preparation and distribution of daily, weekly, and monthly reports to various departments and senior management.
- Ensure data accuracy, integrity, and consistency across all reporting platforms.
- Design and maintain dashboards for business performance monitoring.
- Analyze data trends and provide insights to management for informed decision-making.
- Establish and maintain cost accounting systems and procedures for accurate tracking of material, labor, and overhead costs.
- Review and update cost standards, analyzing variances and taking corrective actions when necessary.
- Collaborate with other departments to monitor and control project costs, ensuring alignment with budget and financial goals.
- Perform cost analysis and prepare cost reports to monitor financial performance and support pricing decisions.
- Conduct regular audits to ensure compliance with costing policies and industry standards.
- Provide regular cost analysis reports, highlighting variances between actual and budgeted figures, and recommend corrective actions.
- Support financial forecasting and budgeting processes by providing relevant data and insights.
- Assist in month-end and year-end closing processes by ensuring accurate costing and reporting entries.
- Review profitability analysis reports and identify areas for cost optimization.

Posted 3 months ago

Apply

5.0 - 7.0 years

15 - 20 Lacs

Chennai

Work from Office

Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow
Programming Languages: Java; scripting languages such as Python, Shell Script, SQL
- 5+ years of experience in IT application delivery with proven experience in agile development methodologies
- 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, and data processing with Dataflow)
Mandatory Key Skills: Agile Development, Data Processing, Python, Shell Script, SQL, Google Cloud Platform, GCS, Dataproc, BigQuery, Dataflow

Posted 3 months ago

Apply

7.0 - 9.0 years

19 - 22 Lacs

Chennai

Work from Office

This role is for a Software Engineer with 7+ years of experience, data engineering knowledge, and the following skill set:
1. End-to-end full stack
2. GCP services such as BigQuery, Astronomer, Terraform, Airflow, Dataflow; GCP architecture
3. Python full stack; Java with cloud
Mandatory Key Skills: Software Engineering, BigQuery, Terraform, Airflow, Dataflow, GCP Architecture, Java, Cloud, Data Engineering

Posted 3 months ago

Apply

4.0 - 6.0 years

7 - 8 Lacs

Gurugram

Work from Office

Job Title: Ab Initio Developer
Location: Gurugram
Experience: 4-5 years
Employment Type: Full Time

Job Summary:
We are seeking an experienced Ab Initio Developer to design, develop, and maintain high-volume, enterprise-grade ETL solutions for our data warehouse environment. The ideal candidate will have strong technical expertise in Ab Initio components, SQL, and UNIX scripting, and the ability to work collaboratively with both business and technical teams to deliver robust data integration solutions.

Key Responsibilities:
- Analyze, design, implement, and maintain large-scale, multi-terabyte data warehouse ETL applications that operate 24/7 with high performance and reliability.
- Develop logical and physical data models to support data warehousing and business intelligence initiatives.
- Lead and participate in complex ETL development projects using Ab Initio, ensuring quality and efficiency.
- Translate business requirements into system and data flows, mappings, and transformation logic.
- Create detailed design documentation, including high-level (HLD) and low-level (LLD) design specifications.
- Conduct design reviews, capture feedback, and facilitate additional sessions as required.
- Develop, test, and deploy ETL workflows using Ab Initio components such as Rollup, Scan, Join, Partition, Gather, Merge, Interleave, and Lookup.
- Perform SQL database programming and optimize SQL queries for performance.
- Develop and maintain UNIX shell scripts to automate ETL workflows and system processes.
- Collaborate with Release Management, Configuration Management, Quality Assurance, Architecture, Database Support, and other development teams.
- Ensure adherence to source control standards using EME or similar tools.
- Provide ongoing support and maintenance of ETL processes and troubleshoot issues as needed.

Required Skills & Qualifications:
- Hands-on development experience with Ab Initio components (Rollup, Scan, Join, Partition by Key, Round Robin, Gather, Merge, Interleave, Lookup, etc.)
- Strong background in designing and delivering complex, large-volume data warehouse applications
- Experience with source-code control tools such as EME
- Proficiency in SQL database programming, including query optimization and performance tuning
- Good working knowledge of UNIX scripting and Oracle SQL/PL-SQL
- Strong technical expertise in preparing detailed design documents (HLD, LLD) and unit testing
- Ability to understand and communicate effectively with both business and technical stakeholders
- Strong problem-solving skills and attention to detail
- Ability to work independently as well as part of a team
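The Ab Initio components named above are dataflow operators; a Rollup, for instance, is conceptually a keyed aggregation over records. A plain-Python analogue (the record fields and key are invented for illustration, and this is only a conceptual sketch, not Ab Initio itself):

```python
from collections import defaultdict

# Conceptual analogue of an Ab Initio Rollup: aggregate records by key.
# Field names ("account", "amount") are hypothetical examples.
records = [
    {"account": "A", "amount": 10.0},
    {"account": "A", "amount": 2.5},
    {"account": "B", "amount": 7.0},
]

totals = defaultdict(float)
for rec in records:
    totals[rec["account"]] += rec["amount"]

# One output record per key, as a Rollup would emit.
rollup = dict(sorted(totals.items()))
```

Join, Partition, and Gather map similarly onto keyed merge, key-based routing, and stream recombination.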

Posted 3 months ago

Apply

4.0 - 6.0 years

7 - 8 Lacs

Gurugram

Work from Office

Job Title: Ab Initio Developer
Location: Gurugram
Experience: 4-5 years
Employment Type: Full Time

Job Summary:
We are seeking an experienced Ab Initio Developer to design, develop, and maintain high-volume, enterprise-grade ETL solutions for our data warehouse environment. The ideal candidate will have strong technical expertise in Ab Initio components, SQL, and UNIX scripting, and the ability to work collaboratively with both business and technical teams to deliver robust data integration solutions.

Key Responsibilities:
- Analyze, design, implement, and maintain large-scale, multi-terabyte data warehouse ETL applications that operate 24/7 with high performance and reliability.
- Develop logical and physical data models to support data warehousing and business intelligence initiatives.
- Lead and participate in complex ETL development projects using Ab Initio, ensuring quality and efficiency.
- Translate business requirements into system and data flows, mappings, and transformation logic.
- Create detailed design documentation, including high-level (HLD) and low-level (LLD) design specifications.
- Conduct design reviews, capture feedback, and facilitate additional sessions as required.
- Develop, test, and deploy ETL workflows using Ab Initio components such as Rollup, Scan, Join, Partition, Gather, Merge, Interleave, and Lookup.
- Perform SQL database programming and optimize SQL queries for performance.
- Develop and maintain UNIX shell scripts to automate ETL workflows and system processes.
- Collaborate with Release Management, Configuration Management, Quality Assurance, Architecture, Database Support, and other development teams.
- Ensure adherence to source control standards using EME or similar tools.
- Provide ongoing support and maintenance of ETL processes and troubleshoot issues as needed.

Skills and Qualifications:
- 4-5 years of experience in Ab Initio development.
- Ab Initio: proficient in using Ab Initio tools such as GDE and the Enterprise Metadata Environment (EME).
- ETL Concepts: understanding of ETL processes, data transformations, and data warehousing.
- SQL: knowledge of SQL for data retrieval and manipulation.
- Unix/Linux Shell Scripting: familiarity with Unix/Linux shell scripting for automation and scripting tasks.
- Problem-Solving: ability to identify and solve technical issues related to Ab Initio applications.

Posted 3 months ago

Apply

0.0 - 1.0 years

0 Lacs

New Delhi, Jammu

Work from Office

Teqtive IT Services Pvt Ltd is looking for a Graphics Design Intern to join our dynamic team and embark on a rewarding career journey.
- Collaborating with clients or team members to determine design requirements and project goals
- Developing and creating visual content
- Selecting and manipulating appropriate images, fonts, and other design elements to enhance the visual impact of designs
- Using graphic design software, such as Adobe Photoshop, Illustrator, and InDesign, to produce final designs
- Presenting design concepts and revisions to clients or team members
- Managing multiple projects and meeting tight deadlines
- Ensuring designs meet brand guidelines and quality standards

Posted 3 months ago

Apply

1.0 - 5.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Job Title: Data Engineer
Experience: 5-8 Years
Location: Delhi, Pune, Bangalore (Hyderabad & Chennai also acceptable)
Time Zone: Aligned with UK time zone
Notice Period: Immediate joiners only

Role Overview:
We are seeking experienced Data Engineers to design, develop, and optimize large-scale data processing systems. You will play a key role in building scalable, efficient, and reliable data pipelines in a cloud-native environment, leveraging your expertise in GCP, BigQuery, Dataflow, Dataproc, and more.

Key Responsibilities:
- Design, build, and manage scalable and reliable data pipelines for real-time and batch processing.
- Implement robust data processing solutions using GCP services and open-source technologies.
- Create efficient data models and write high-performance analytics queries.
- Optimize pipelines for performance, scalability, and cost-efficiency.
- Collaborate with data scientists, analysts, and engineering teams to ensure smooth data integration and transformation.
- Maintain high data quality, enforce validation rules, and set up monitoring and alerting.
- Participate in code reviews, deployment activities, and production support.

Technical Skills Required:
- Cloud Platforms: GCP (Google Cloud Platform), mandatory
- Key GCP Services: Dataproc, BigQuery, Dataflow
- Programming Languages: Python, Java, PySpark
- Data Engineering Concepts: data ingestion, Change Data Capture (CDC), ETL/ELT pipeline design
- Strong understanding of distributed computing, data structures, and performance tuning

Required Qualifications & Attributes:
- 5-8 years of hands-on experience in data engineering roles
- Proficiency in building and optimizing distributed data pipelines
- Solid grasp of data governance and security best practices in cloud environments
- Strong analytical and problem-solving skills
- Effective verbal and written communication skills
- Proven ability to work independently and in cross-functional teams

Posted 3 months ago

Apply

10.0 - 15.0 years

30 - 40 Lacs

Bhopal, Pune, Gurugram

Hybrid

Job Title: Senior Data Engineer (GCP | Big Data | Airflow | dbt)
Company: Xebia
Location: All Xebia locations
Experience: 10+ Years
Employment Type: Full Time
Notice Period: Immediate to max 30 days only

Job Summary
Join the digital transformation journey of one of the world’s most iconic global retail brands! As a Senior Data Engineer, you’ll be part of a dynamic Digital Technology organization, helping build modern, scalable, and reliable data products to power business decisions across the Americas. You'll work in the Operations Data Domain, focused on ingesting, processing, and optimizing high-volume data pipelines using Google Cloud Platform (GCP) and other modern tools.

Key Responsibilities
- Design, develop, and maintain highly scalable big data pipelines (batch & streaming)
- Collaborate with cross-functional teams to understand data needs and deliver efficient solutions
- Architect robust data solutions using GCP-native services (BigQuery, Pub/Sub, Cloud Functions, etc.)
- Build and manage modern Data Lake/Lakehouse platforms
- Create frameworks and reusable components for scalable ingestion and processing
- Implement data governance and security, and ensure regulatory compliance
- Mentor junior engineers and lead an offshore team of 8+ engineers
- Monitor pipeline performance, troubleshoot bottlenecks, and ensure data quality
- Engage in code reviews, CI/CD deployments, and agile product releases
- Contribute to internal best practices and engineering standards

Must-Have Skills & Qualifications
- 8+ years in data engineering with strong hands-on experience in production-grade pipelines
- Expertise in GCP data services (BigQuery, Vertex AI, Pub/Sub, etc.)
- Proficiency in dbt (Data Build Tool) for data transformation
- Strong programming skills in Python, Java, or Scala
- Advanced SQL & NoSQL knowledge
- Experience with Apache Airflow for orchestration
- Hands-on with Git, GitHub Actions, and Jenkins for CI/CD
- Solid understanding of data warehousing (BigQuery, Snowflake, Redshift)
- Exposure to tools like Hadoop, Spark, Kafka, Databricks (nice to have)
- Familiarity with BI tools like Tableau, Power BI, or Looker (optional)
- Strong leadership qualities to manage offshore engineering teams
- Excellent communication skills and stakeholder management experience

Preferred Education
Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field

Notice Period Requirement
Only immediate joiners or candidates with a maximum 30-day notice period will be considered.

How to Apply
If you are passionate about solving real-world data problems and want to be part of a global data-driven transformation, apply now by sending your resume to vijay.s@xebia.com with the subject line: "Sr Data Engineer Application – [Your Name]"

Kindly include the following details in your email:
- Full Name
- Total Experience
- Current CTC
- Expected CTC
- Current Location
- Preferred Location
- Notice Period / Last Working Day
- Key Skills

Please do not apply if you are currently in process for any other role at Xebia or have recently interviewed.

Posted 3 months ago

Apply

5.0 - 8.0 years

27 - 42 Lacs

Bengaluru

Work from Office

Years of Experience: 5-12 years
Location: PAN India

OFSAA Data Modeler
- Experience in designing, building, and customizing the OFSAA data model, and in validating the data model.
- Excellent knowledge of data model guidelines for Staging, Processing, and Reporting tables.
- Knowledge of data model support for configuring UDPs and subtype/supertype relationship enhancements.
- Experience on the OFSAA platform (OFSAAI) with one or more of the following OFSAA modules:
  - OFSAA Financial Solution Data Foundation (preferred)
  - OFSAA Data Integration Hub (optional)
- Good in SQL and PL/SQL.
- Strong in data warehouse principles and ETL/data flow tools.
- Excellent analytical and communication skills.

OFSAA Integration SME (DIH / batch run framework)
- Experience in ETL processes; familiar with OFSAA.
- DIH setup: EDS, EDD, T2T, etc.
- Familiar with different seeded tables, SCD, DIM, hierarchies, lookups, etc.
- Has worked with FSDF and knows the STG, CSA, and FACT table structures.
- Has worked with different APIs, out-of-the-box connectors, etc.
- Familiar with Oracle patching and SRs.

Posted 3 months ago

Apply

5.0 - 10.0 years

15 - 27 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Job Title: Data Engineer - GCP
Company: Xebia
Location: Hybrid - any Xebia location
Experience: 5+ years
Salary: As per industry standards
Job Type: Full time

About the Role: Xebia is hiring a seasoned Data Engineer (L4) to join a high-impact team building scalable data platforms using GCP, Databricks, and Airflow. If you thrive on architecting future-ready solutions and have strong experience in big-data transformations, we'd love to hear from you.

Project Overview: We currently manage 1000+ data pipelines using Databricks clusters for end-to-end data transformation (Raw -> Silver -> Gold), with orchestration handled via Airflow, all on Google Cloud Platform (GCP). Curated datasets are delivered through BigQuery and Databricks Notebooks. Our roadmap includes migrating to a GCP-native data processing framework optimized for Spark workloads.

Key Responsibilities:
- Design and implement a GCP-native data processing framework
- Analyze and plan the migration of existing workloads to a cloud-native architecture
- Ensure data availability, integrity, and consistency
- Build reusable tools and standards for the data engineering team
- Collaborate with stakeholders and document processes thoroughly

Required Experience:
- 5+ years in data engineering with strong data architecture experience
- Hands-on expertise in Databricks, Airflow, BigQuery, and PySpark
- Deep knowledge of GCP services for data processing (Dataflow, Dataproc, etc.)
- Familiarity with data lake table formats such as Delta and Iceberg
- Experience with orchestration tools (Airflow, Dagster, or similar)

Key Skills:
- Python programming
- Strong understanding of data lake architectures and cloud-native best practices
- Excellent problem-solving and communication skills

Notice Period Requirement: Only immediate joiners or candidates with a maximum 30-day notice period will be considered.

How to Apply: Interested candidates can share their details and updated resume with vijay.s@xebia.com in the following format:
- Full Name:
- Total Experience (must be 5+ years):
- Current CTC:
- Expected CTC:
- Current Location:
- Preferred Location:
- Notice Period / Last Working Day (if serving notice):
- Primary Skill Set:

Note: Please apply only if you have not recently applied or interviewed for any open roles at Xebia.
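The project overview above describes a Raw -> Silver -> Gold (medallion) layering. As a minimal sketch of what each layer typically does, here is the pattern in plain Python; the real stack is PySpark on Databricks, and the record fields (`event_id`, `amount`, `country`) are hypothetical examples, not taken from the posting.

```python
def to_silver(raw_records):
    """Silver layer: cleanse raw events -- drop malformed rows,
    normalise types, and deduplicate on the event id."""
    seen, silver = set(), []
    for rec in raw_records:
        if rec.get("event_id") is None or rec.get("amount") is None:
            continue  # drop malformed rows
        if rec["event_id"] in seen:
            continue  # deduplicate
        seen.add(rec["event_id"])
        silver.append({"event_id": rec["event_id"],
                       "amount": float(rec["amount"]),
                       "country": str(rec.get("country", "unknown"))})
    return silver

def to_gold(silver_records):
    """Gold layer: curated aggregate -- total amount per country."""
    totals = {}
    for rec in silver_records:
        totals[rec["country"]] = totals.get(rec["country"], 0.0) + rec["amount"]
    return totals

raw = [
    {"event_id": 1, "amount": "10.5", "country": "IN"},
    {"event_id": 1, "amount": "10.5", "country": "IN"},  # duplicate
    {"event_id": 2, "amount": None},                      # malformed
    {"event_id": 3, "amount": "4.5", "country": "IN"},
]
gold = to_gold(to_silver(raw))  # {"IN": 15.0}
```

In the described setup, each layer would be a Databricks job reading and writing Delta tables, with Airflow scheduling the Raw-to-Silver and Silver-to-Gold steps as dependent tasks.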

Posted 3 months ago

Apply

1.0 - 3.0 years

10 - 15 Lacs

Kolkata, Gurugram, Bengaluru

Hybrid

Salary: 10 to 16 LPA
Exp: 1 to 3 years
Location: Gurgaon / Bangalore / Kolkata (Hybrid)
Notice: Immediate to 30 days
Key Skills: GCP, Cloud, Pub/Sub, Data Engineer

Posted 3 months ago

Apply

3.0 - 8.0 years

15 - 30 Lacs

Gurugram, Bengaluru

Hybrid

Salary: 15 to 30 LPA
Exp: 3 to 8 years
Location: Gurgaon / Bangalore (Hybrid)
Notice: Immediate to 30 days
Key Skills: GCP, Cloud, Pub/Sub, Data Engineer

Posted 3 months ago

Apply

7.0 - 11.0 years

9 - 13 Lacs

Hyderabad

Work from Office

We're Hitachi Digital Services, a global digital solutions and transformation business with a bold vision of our world's potential. We're people-centric and here to power good. Every day, we future-proof urban spaces, conserve natural resources, protect rainforests, and save lives. This is a world where innovation, technology, and deep expertise come together to take our company and customers from what's now to what's next. We make it happen through the power of acceleration.

Imagine the sheer breadth of talent it takes to bring a better tomorrow closer to today. We don't expect you to fit every requirement; your life experience, character, perspective, and passion for achieving great things in the world are equally as important to us.

The Team
We're a leader in cutting-edge innovation, the transformative power of cloud technology, and converged and hyperconverged solutions. Our mission is to empower clients to securely store, manage, and modernize their digital core, unlocking valuable insights and driving data-driven value. This strong, diverse, and collaborative group of technology professionals works with teams to support our customers as they store, enrich, activate, and monetise their data, bringing value to every line of their business.

The Role
We are seeking an experienced Data Architect with expertise in Workday Reporting and data automation. The ideal candidate will have 10-12 years of experience, with a strong background in data architecture, reporting, and process automation.

Key Responsibilities

Workday Reporting Expertise
- Design and develop complex Workday reports (Advanced, Composite, and Matrix reports)
- Deliver data-driven insights using Workday's reporting tools
- Ensure the integrity of reporting solutions and their alignment with organizational goals

Data Architecture
- Create and implement robust data architecture frameworks
- Manage seamless end-to-end data flows and system integrations
- Optimize data storage, retrieval, and transformation processes for performance and scalability

Automation and Process Optimization
- Develop automation strategies for repetitive tasks using tools and scripts
- Innovate data automation solutions to minimize manual effort
- Maintain quality, consistency, and timeliness in automated processes

Stakeholder Collaboration
- Partner with HR, IT, and business teams to understand reporting and data needs
- Serve as a subject matter expert in Workday Reporting and data automation
- Lead workshops and training sessions to enhance team understanding of reporting tools and processes

Continuous Improvement
- Identify and implement opportunities to improve reporting and data processes
- Stay updated on emerging trends in data architecture and Workday technologies

Championing Diversity, Equity, and Inclusion
Diversity, equity, and inclusion (DEI) are integral to our culture and identity. Diverse thinking, a commitment to allyship, and a culture of empowerment help us achieve powerful results. We want you to be you, with all the ideas, lived experience, and fresh perspective that brings. We support your uniqueness and encourage people from all backgrounds to apply and realize their full potential as part of our team.

How We Look After You
We help take care of your today and tomorrow with industry-leading benefits, support, and services that look after your holistic health and wellbeing. We're also champions of life balance and offer flexible arrangements that work for you (role and location dependent). We're always looking for new ways of working that bring out our best, which leads to unexpected ideas. So here, you'll experience a sense of belonging, and discover autonomy, freedom, and ownership as you work alongside talented people you enjoy sharing knowledge with.

We're proud to say we're an equal opportunity employer and welcome all applicants for employment without attention to race, colour, religion, sex, sexual orientation, gender identity, national origin, veteran status, age, disability status, or any other protected characteristic. Should you need reasonable accommodations during the recruitment process, please let us know so that we can do our best to set you up for success.

Posted 3 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies