
905 Data Flow Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2.0 - 6.0 years

6 - 10 Lacs

Pune

Work from Office

You bring systems design experience with the ability to architect and explain complex system interactions, data flows, common interfaces, and APIs. You bring a deep understanding of and experience with software development and programming languages such as Java/Kotlin and shell scripting. You have hands-on experience, as a senior software developer, with: Java/Kotlin, Spring, Spring Boot, WireMock, Docker, Terraform, GCP services (Kubernetes, Cloud SQL, Pub/Sub, Storage, Logging, Dashboards), Oracle & Postgres, SQL, PgWeb, Git, GitHub & GitHub Actions. A GCP Professional Data Engineer certification is expected.

Responsibilities:
• Data pipeline development: Designing, implementing, and optimizing data pipelines on GCP using PySpark for efficient and scalable data processing.
• ETL workflow development: Building and maintaining ETL workflows for extracting, transforming, and loading data into various GCP services.
• GCP service utilization: Leveraging GCP services such as BigQuery, Cloud Storage, Dataflow, and Dataproc for data storage, processing, and analysis.
• Data transformation: Using PySpark for data manipulation, cleansing, enrichment, and validation.
• Performance optimization: Ensuring the performance and scalability of data processing jobs on GCP.
• Collaboration: Working with data scientists, analysts, and other stakeholders to understand data requirements and translate them into technical solutions.
• Data quality and governance: Implementing and maintaining data quality standards, security measures, and compliance with data governance policies on GCP.
• Troubleshooting and support: Diagnosing and resolving issues related to data pipelines and infrastructure.
• Staying updated: Keeping abreast of the latest GCP services, PySpark features, and best practices in data engineering.

Required skills:
• GCP expertise: Strong understanding of GCP services like BigQuery, Cloud Storage, Dataflow, and Dataproc.
• PySpark proficiency: Demonstrated experience in using PySpark for data processing, transformation, and analysis.
• Python programming: Solid Python programming skills for data manipulation and scripting.
• Data modeling and ETL: Experience with data modeling, ETL processes, and data warehousing concepts.
• SQL: Proficiency in SQL for querying and manipulating data in relational databases.
• Big data concepts: Understanding of big data principles and distributed computing concepts.
• Communication and collaboration: Ability to effectively communicate technical solutions and collaborate with cross-functional teams.
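
To make the PySpark responsibilities above concrete, here is a minimal sketch of the kind of pipeline step such roles involve: read raw files from Cloud Storage, clean them, and write to BigQuery. The bucket, paths, and table names are hypothetical, and the BigQuery write assumes the spark-bigquery connector is available on the cluster (it is pre-installed on Dataproc).

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-clean").getOrCreate()

# Hypothetical GCS source: raw CSV drops from an upstream system.
raw = spark.read.option("header", True).csv("gs://example-raw-bucket/orders/*.csv")

# Typical cleansing/validation: dedupe, type the timestamp, drop bad amounts.
cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .filter(F.col("amount").cast("double") > 0)
)

# Write to a hypothetical BigQuery table via the spark-bigquery connector.
(cleaned.write.format("bigquery")
        .option("table", "example-project.analytics.orders_clean")
        .option("temporaryGcsBucket", "example-tmp-bucket")
        .mode("overwrite")
        .save())
```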

Posted just now

Apply

3.0 - 8.0 years

10 - 18 Lacs

Chandigarh

Work from Office

Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred. Mandatory Key Skills: data analytics, ETL, SQL, Python, Google BigQuery, AWS Redshift, data architecture.
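
To illustrate the Redshift loading pattern such postings imply, a minimal sketch: Redshift ingests bulk data most efficiently through its COPY command issued over a standard PostgreSQL connection. The cluster endpoint, credentials, S3 path, and IAM role ARN below are all hypothetical.

```python
import psycopg2

# Hypothetical Redshift cluster endpoint and credentials.
conn = psycopg2.connect(
    host="example-cluster.abc123.ap-south-1.redshift.amazonaws.com",
    port=5439, dbname="analytics", user="etl_user", password="REDACTED",
)

# COPY pulls files straight from S3 in parallel: the idiomatic bulk load.
copy_sql = """
    COPY analytics.events
    FROM 's3://example-bucket/events/2024-06-01/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/ExampleRedshiftCopyRole'
    FORMAT AS PARQUET;
"""
with conn, conn.cursor() as cur:
    cur.execute(copy_sql)  # the context manager commits on success
conn.close()
```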

Posted just now

Apply

3.0 - 8.0 years

10 - 18 Lacs

Varanasi

Work from Office

Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred. Mandatory Key Skills: ETL pipelines, data warehouses, SQL, Python, AWS Redshift, Google BigQuery, ETL.

Posted 2 hours ago

Apply

3.0 - 8.0 years

10 - 18 Lacs

Coimbatore

Work from Office

Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred. Mandatory Key Skills: Python, AWS Redshift, Google BigQuery, ETL pipelines, data warehousing, data architectures, SQL.

Posted 2 hours ago

Apply

3.0 - 8.0 years

10 - 18 Lacs

Mysuru

Work from Office

Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred. Mandatory Key Skills: data analytics, ETL, SQL, Python, Google BigQuery, AWS Redshift, data architecture.

Posted 2 hours ago

Apply

3.0 - 8.0 years

10 - 18 Lacs

Kanpur

Work from Office

Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred. Mandatory Key Skills: Python, data warehousing, ETL, Amazon Redshift, BigQuery, data engineering, data architecture, AWS, machine learning, data flow, ETL pipelines, real-time data processing, Java, Spring Boot, microservices, Spark, Kafka, Cassandra, Scala, NoSQL, MongoDB, REST, Redis, SQL.

Posted 3 hours ago

Apply

3.0 - 8.0 years

10 - 18 Lacs

Nagpur

Work from Office

Design and implement scalable data architectures to optimize data flow and analytics capabilities. Develop ETL pipelines, data warehouses, and real-time data processing systems. Must have expertise in SQL, Python, and cloud data platforms like AWS Redshift or Google BigQuery. Work closely with data scientists to enhance machine learning models with structured and unstructured data. Prior experience in handling large-scale datasets is preferred. Mandatory Key Skills: SQL, Python, data warehousing, ETL, Amazon Redshift, BigQuery, data engineering, AWS, machine learning, real-time data processing, Java, Spring Boot, microservices, Spark, Kafka, Cassandra, Scala, NoSQL, MongoDB, Redis, data architecture.

Posted 3 hours ago

Apply

4.0 - 8.0 years

12 - 22 Lacs

Chennai

Work from Office

Job Summary: As a GCP Data Engineer, you will be responsible for developing, optimizing, and maintaining data pipelines and infrastructure. Your expertise in SQL and Python will be instrumental in managing and transforming data, while your familiarity with cloud technologies will be an asset as we explore opportunities to enhance data engineering processes.

Role & responsibilities:
• Building scalable data pipelines: Design, implement, and maintain end-to-end data pipelines to efficiently extract, transform, and load (ETL) data from diverse sources. Ensure data pipelines are reliable, scalable, and performance-oriented.
• SQL expertise: Write and optimize complex SQL queries for data extraction, transformation, and reporting. Collaborate with analysts and data scientists to provide structured data for analysis.
• Cloud platform experience: Utilize cloud services to enhance data processing and storage capabilities. Work towards the integration of tools into the data ecosystem.
• Documentation and collaboration: Document data pipelines, procedures, and best practices to facilitate knowledge sharing. Collaborate closely with cross-functional teams to understand data requirements and deliver solutions.

Required skills:
• 4+ years of experience with SQL and Python; 4+ years with GCP (BigQuery, Dataflow, GCS, Dataproc).
• 4+ years of experience building data pipelines from scratch in a highly distributed and fault-tolerant manner.
• Comfortable with a broad array of relational and non-relational databases.
• Proven track record of building applications in a data-focused role (cloud and traditional data warehouse).
• Experience with Cloud SQL, Cloud Functions, Pub/Sub, Cloud Composer, etc.
• Inquisitive, proactive, and interested in learning new tools and techniques.
• Familiarity with big data and machine learning tools and platforms.
• Comfortable with open-source technologies including Apache Spark, Hadoop, and Kafka.
• Strong oral, written, and interpersonal communication skills.
• Comfortable working in a dynamic environment where problems are not always well-defined.

We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status.
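
The posting mentions Cloud Composer, which is managed Apache Airflow, so DAG authoring comes up often for such roles. A minimal Airflow 2.x sketch, assuming hypothetical task callables and a daily schedule:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    """Pull raw records from a source system (stubbed)."""

def transform():
    """Clean and reshape the extracted records (stubbed)."""

def load():
    """Write the transformed records to the warehouse (stubbed)."""

with DAG(
    dag_id="daily_sales_etl",   # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",          # Airflow 2.4+; older releases use schedule_interval
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load  # linear extract -> transform -> load chain
```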

Posted 3 hours ago

Apply

7.0 - 12.0 years

7 - 17 Lacs

Pune, Chennai, Bengaluru

Work from Office

• Hands-on experience in object-oriented programming using Python, PySpark, APIs, SQL, BigQuery, and GCP
• Building data pipelines for huge volumes of data
• Dataflow, Dataproc, and BigQuery
• Deep understanding of ETL concepts
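
Given the emphasis on Python, APIs, and BigQuery here, a minimal sketch of a parameterized BigQuery query issued through the official Python client; the project, dataset, and table names are hypothetical, and credentials are assumed to come from the environment:

```python
import datetime

from google.cloud import bigquery

client = bigquery.Client()  # project and credentials resolved from the environment

# Hypothetical dataset/table; @ds is a named query parameter, not string formatting.
sql = """
    SELECT user_id, COUNT(*) AS orders
    FROM `example-project.analytics.orders`
    WHERE order_date = @ds
    GROUP BY user_id
    ORDER BY orders DESC
    LIMIT 10
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("ds", "DATE", datetime.date(2024, 6, 1)),
    ]
)
for row in client.query(sql, job_config=job_config).result():
    print(row["user_id"], row["orders"])
```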

Posted 5 hours ago

Apply

8.0 - 12.0 years

25 - 37 Lacs

Pune, Bengaluru

Work from Office

We're looking for an experienced GCP Technical Lead to architect, design, and lead the development of scalable cloud-based solutions. The ideal candidate has strong expertise in Google Cloud Platform (GCP), data engineering, and modern cloud-native architectures, along with the ability to mentor a team of engineers.

Key Responsibilities:
• Lead the design and development of GCP-based solutions (BigQuery, Dataflow, Composer, Pub/Sub, GKE, etc.).
• Define cloud architecture best practices and ensure adherence to security, scalability, and performance standards.
• Collaborate with stakeholders to understand requirements and translate them into technical designs and roadmaps.
• Lead and mentor a team of cloud/data engineers, providing guidance on technical challenges.
• Implement and optimize ETL/ELT pipelines, data lake, and data warehouse solutions on GCP.
• Drive DevOps/CI-CD practices using Cloud Build, Terraform, or similar tools.
• Ensure cost optimization, monitoring, and governance within GCP environments.
• Work with cross-functional teams on cloud migrations and modernization projects.

Required Skills & Qualifications:
• Strong experience in GCP services: BigQuery, Dataflow, Pub/Sub, Cloud Storage, Composer, GKE, etc.
• Expertise in data engineering, ETL, and cloud-native development.
• Hands-on experience with Python, SQL, and shell scripting.
• Knowledge of Terraform, Kubernetes, and CI/CD pipelines.
• Familiarity with data security, IAM, and compliance on GCP.
• Proven experience in leading technical teams and delivering large-scale cloud solutions.
• Excellent problem-solving, communication, and leadership skills.

Preferred:
• GCP Professional Cloud Architect / Data Engineer certification.
• Experience with machine learning pipelines (Vertex AI, AI Platform).
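
Pub/Sub features in both the responsibilities and the required skills above; a minimal publisher sketch using the official Python client, with a hypothetical project ID, topic, and payload:

```python
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()  # credentials resolved from the environment
topic_path = publisher.topic_path("example-project", "order-events")  # hypothetical

# publish() is asynchronous; the returned future resolves to the server-assigned ID.
future = publisher.publish(
    topic_path,
    data=b'{"order_id": 42, "status": "CREATED"}',
    source="checkout",  # message attributes must be strings
)
print("published message", future.result())
```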

Posted 5 hours ago

Apply

10.0 - 12.0 years

6 - 11 Lacs

Bengaluru

Work from Office

Oracle Health & AI (OHAI) is a newly formed business unit committed to transforming the healthcare industry through our expertise in IaaS and SaaS. Our mission is to deliver patient-centered care and make advanced clinical tools accessible globally. We're assembling a team of innovative technologists to build the next-generation health platform, a greenfield initiative driven by entrepreneurship, creativity, and energy. If you thrive in a fast-paced, innovative environment, we invite you to help us create a world-class engineering team with a meaningful impact. The OHAI Patient Accounting Analytics Team focuses on delivering cutting-edge reporting metrics and visualizations for healthcare financial data. Our goal is to transform healthcare by automating insurance and patient billing processes, helping optimize operations and reimbursement. Our solutions leverage a blend of reporting platforms across both on-premises and cloud infrastructure. As we expand reporting capabilities on the cloud and make use of AI, we're looking for talented professionals to join us on this exciting journey.

Responsibilities: We are looking for an accomplished and experienced Consulting Member of Technical Staff with in-depth knowledge of Oracle Analytics Cloud (OAC) to lead the design, development, integration, and optimization of analytics solutions. In this role, you will serve as a technical leader, guiding solution architecture and promoting best practices in data modeling, visualization, and cloud analytics. You will collaborate with cross-functional teams and provide mentorship to other engineers. In addition to strong proficiency in OAC, experience with Oracle Machine Learning (OML) on Autonomous Data Warehouse (ADW) and within OAC is required. The ideal candidate will be skilled in integrating OML capabilities, developing advanced analytics solutions, and supporting data-driven business strategies to unlock actionable insights.

Minimum Qualifications:
• Bachelor's/Master's degree in Computer Science, Information Systems, Data Science, or a related field.
• 10+ years of experience in analytics, business intelligence, or data engineering roles, with 3+ years hands-on with Oracle Analytics Cloud.
• Deep expertise in OAC features: Data Flows, Visualization, Semantic Modeling, Security, and Scheduling.
• Advanced SQL skills and proficiency integrating OAC with diverse data sources (Oracle DB, REST APIs, cloud and on-prem sources).
• Experience with cloud infrastructure and deployment (OCI preferred).
• Demonstrated ability to deliver scalable, enterprise-grade analytics solutions.
• Knowledge of security, privacy, and role-based access best practices.
• Strong collaboration, documentation, and presentation skills.

Preferred Qualifications:
• Experience with healthcare / financial systems.
• Oracle Analytics Cloud and/or OCI certifications.
• Experience with other BI/analytics platforms (e.g., Tableau, Power BI).
• Proficiency in scripting/programming for automation (e.g., Python, Shell).

Posted 5 hours ago

Apply

10.0 - 14.0 years

11 - 15 Lacs

Chennai, Bengaluru

Work from Office

An experienced consulting professional who understands solutions, industry best practices, multiple business processes, and technology designs within a product/technology family. Operates independently to provide quality work products to an engagement. Performs varied and complex duties and tasks that need independent judgment in order to implement Oracle products and technology to meet customer needs. Applies Oracle methodology, company procedures, and leading practices. Demonstrates expertise to deliver functional and technical solutions on moderately complex customer engagements. May act as the team lead on projects. Effectively consults with management of customer organizations. Participates in business development activities. Develops and configures detailed solutions for moderately complex projects. 10-12 years of experience relevant to this position. Ability to communicate effectively, build rapport with team members and clients, and travel as needed.

Responsibilities: The candidate is expected to have 10 to 12 years of expert domain knowledge in HCM covering the hire-to-retire cycle, and must have been part of at least 5 end-to-end HCM implementations, of which at least 2 should have been with HCM Cloud. The candidate must have expert working experience in one or more of these modules along with the Payroll module: Time and Labor, Absence Management, Talent, Benefits, Compensation, Recruiting (ORC), and Core HR. An in-depth understanding of HCM Cloud business processes and their data flows is required. The candidate should have been in client-facing roles and interacted with customers in requirement-gathering workshops, design, configuration, testing, and go-live. Should have strong written and verbal communication skills, personal drive, flexibility, teamwork, problem-solving, influencing and negotiating skills, organizational awareness and sensitivity, engagement delivery, continuous improvement, knowledge sharing, and client management. Good leadership capability with strong planning and follow-up skills, mentorship, work allocation, monitoring, and status updates to the Project Manager. Assist in the identification, assessment, and resolution of complex functional issues/problems. Interact with the client frequently around specific work efforts/deliverables. The candidate should be open to domestic or international travel for short as well as long durations.

Posted 5 hours ago

Apply

10.0 - 14.0 years

19 - 25 Lacs

Chennai, Bengaluru

Work from Office

An experienced consulting professional who understands solutions, industry best practices, multiple business processes, and technology designs within a product/technology family. Operates independently to provide quality work products to an engagement. Performs varied and complex duties and tasks that need independent judgment in order to implement Oracle products and technology to meet customer needs. Applies Oracle methodology, company procedures, and leading practices. Demonstrates expertise to deliver functional and technical solutions on moderately complex customer engagements. May act as the team lead on projects. Effectively consults with management of customer organizations. Participates in business development activities. Develops and configures detailed solutions for moderately complex projects. 10-12 years of experience relevant to this position. Ability to communicate effectively, build rapport with team members and clients, and travel as needed.

Responsibilities: The candidate is expected to have 10 to 12 years of expert domain knowledge in HCM covering the hire-to-retire cycle, and must have been part of at least 5 end-to-end HCM implementations, of which at least 2 should have been with HCM Cloud. The candidate must have expert working experience in one or more of these modules along with the Payroll module: Time and Labor, Absence Management, Talent, Benefits, Compensation, Recruiting (ORC), and Core HR. An in-depth understanding of HCM Cloud business processes and their data flows is required. The candidate should have been in client-facing roles and interacted with customers in requirement-gathering workshops, design, configuration, testing, and go-live. Should have strong written and verbal communication skills, personal drive, flexibility, teamwork, problem-solving, influencing and negotiating skills, organizational awareness and sensitivity, engagement delivery, continuous improvement, knowledge sharing, and client management. Good leadership capability with strong planning and follow-up skills, mentorship, work allocation, monitoring, and status updates to the Project Manager. Assist in the identification, assessment, and resolution of complex functional issues/problems. Interact with the client frequently around specific work efforts/deliverables. The candidate should be open to domestic or international travel for short as well as long durations.

Posted 5 hours ago

Apply

6.0 - 11.0 years

8 - 12 Lacs

Bengaluru

Work from Office

At Oracle Health, we put humans at the heart of every conversation. Our mission is to create a human-centric healthcare experience powered by unified global data. As a global leader, we're looking for a Data Engineer with BI experience to join an exciting project replacing existing data warehouse systems with Oracle's own data warehouse, which will store all internal corporate data and provide insights that help our teams make critical business decisions. Join us and create the future!

Roles and Responsibilities:
• Proficient in writing and optimizing SQL queries for data extraction.
• Translate client requirements into a technical design that junior team members can implement.
• Develop code that aligns with the technical design and coding standards.
• Review designs and code implemented by other team members; recommend better designs and more efficient code.
• Conduct peer design and code reviews for early detection of defects and code quality issues.
• Document ETL processes and data flow diagrams.
• Optimize data extraction and transformation processes for better performance.
• Perform data quality checks and debug issues.
• Conduct root cause analysis for data issues and implement fixes.
• Collaborate with more experienced developers on larger projects, and with stakeholders on requirements.
• Participate in requirements, design, and implementation discussions.
• Participate in learning and development opportunities to enhance technical skills.
• Test the storage system after transferring the data.
• Exposure to Business Intelligence platforms like OAC, Power BI, or Tableau.

Technical skill set:
• Strong in PL/SQL concepts such as tables, keys, and DDL/DML commands; proficient in writing and debugging complex SQL queries, views, and stored procedures.
• Strong hands-on Python / PySpark programming.
• Strong in data modelling, ETL/ELT concepts, and programming/scripting in Python.
• Proficient in the following ETL process automation tools: Oracle Data Integrator (ODI), Oracle Data Flow, Oracle Database / Autonomous Data Warehouse.
• Working knowledge of a cloud platform such as Oracle Cloud (preferred), Microsoft Azure, or AWS.
• Able to create technical designs, build prototypes, build and maintain high-performing data pipelines, and optimize ETL pipelines.
• Good knowledge of Business Intelligence development tools like OAC and Power BI.
• Good to have: Microsoft ADF, Data Lakes, Databricks.
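
The role calls out data quality checks explicitly; one lightweight pattern is asserting a handful of invariants on each batch before it is loaded. A minimal pandas sketch, with hypothetical file path, column names, and thresholds:

```python
import pandas as pd

# Hypothetical staging extract produced by an upstream ETL step.
df = pd.read_parquet("staging/customers.parquet")

# A handful of invariants asserted before the batch is allowed to load.
checks = {
    "row_count_nonzero": len(df) > 0,
    "no_null_keys": df["customer_id"].notna().all(),
    "unique_keys": df["customer_id"].is_unique,
    "valid_email_share": df["email"].str.contains("@", na=False).mean() > 0.95,
}

failed = [name for name, ok in checks.items() if not ok]
if failed:
    raise ValueError(f"data quality checks failed: {failed}")
print("all data quality checks passed")
```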

Posted 5 hours ago

Apply

10.0 - 14.0 years

19 - 25 Lacs

Chennai, Bengaluru

Hybrid

An experienced consulting professional who has an understanding of solutions, industry best practices, multiple business processes, and technology designs within a product/technology family. Operates independently to provide quality work products to an engagement. Performs varied and complex duties and tasks that need independent judgment in order to implement Oracle products and technology to meet customer needs. Applies Oracle methodology, company procedures, and leading practices. Demonstrates expertise to deliver functional and technical solutions on moderately complex customer engagements. May act as the team lead on projects. Effectively consults with management of customer organizations. Participates in business development activities. Develops and configures detailed solutions for moderately complex projects. 10-12 years of experience relevant to this position. Ability to communicate effectively, build rapport with team members and clients, and travel as needed.

Responsibilities: The candidate is expected to have 10 to 12 years of expert domain knowledge in HCM covering the hire-to-retire cycle, and must have been part of at least 5 end-to-end HCM implementations, of which at least 2 should have been with HCM Cloud. The candidate must have expert working experience in one or more of these modules along with the Payroll module: Time and Labor, Absence Management, Talent, Benefits, Compensation, and Recruiting (ORC). An in-depth understanding of HCM Cloud business processes and their data flows is required. The candidate should have been in client-facing roles and interacted with customers in requirement-gathering workshops, design, configuration, testing, and go-live. Should have strong written and verbal communication skills, personal drive, flexibility, teamwork, problem-solving, influencing and negotiating skills, organizational awareness and sensitivity, engagement delivery, continuous improvement, knowledge sharing, and client management. Good leadership capability with strong planning and follow-up skills, mentorship, work allocation, monitoring, and status updates to the Project Manager. Assist in the identification, assessment, and resolution of complex functional issues/problems. Interact with the client frequently around specific work efforts/deliverables. The candidate should be open to domestic or international travel for short as well as long durations.

Posted 5 hours ago

Apply

6.0 - 9.0 years

15 - 20 Lacs

Bhubaneswar, Hyderabad

Work from Office

Company: UpwardIQ Software Solutions Pvt. Ltd. | Role: Sr. Java Backend Developer | Experience: 6+ years | Location: Hyderabad | Work Mode: Hybrid (3 days in office)

Role & responsibilities:
• Build reactive microservices using Spring WebFlux.
• Design REST APIs and integrate with frontend and third-party systems.
• Work with MongoDB: schema design, indexing, aggregation.
• Collaborate with cross-functional teams in an Agile environment.
• Ensure high performance, maintainability, and scalability.

Key skills required: Java, Spring Boot, Spring WebFlux; MongoDB schema design and aggregation pipelines; reactive programming with Project Reactor; REST API development; unit and integration testing (JUnit, Mockito); Git, Docker, basic CI/CD.

If interested, send your updated resume to soniyak@upwardiq.com.
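
This role is Java/Spring WebFlux, but MongoDB aggregation pipelines are expressed as the same stage documents regardless of driver. To stay consistent with the other Python sketches in this piece, here is the pipeline shape in pymongo; the connection string, database, collection, and fields are hypothetical:

```python
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # hypothetical connection string
orders = client["shop"]["orders"]                  # hypothetical database/collection

# Top 10 customers by delivered-order revenue: match -> group -> sort -> limit.
pipeline = [
    {"$match": {"status": "DELIVERED"}},
    {"$group": {"_id": "$customer_id",
                "total": {"$sum": "$amount"},
                "orders": {"$sum": 1}}},
    {"$sort": {"total": -1}},
    {"$limit": 10},
]
for doc in orders.aggregate(pipeline):
    print(doc)
```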

Posted 6 hours ago

Apply

0.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Job Title: AI Engineering Specialist

Job Description: Happiest Minds is seeking a talented and experienced AI Engineering Specialist to join our dynamic team. As an AI Engineering Specialist, you will play a critical role in designing and implementing innovative artificial intelligence solutions that enhance our products and services.

Key Responsibilities:
• Design, develop, and implement AI applications using Java/Python and other relevant technologies.
• Collaborate with cross-functional teams to gather requirements and deliver solutions tailored to client needs.
• Build responsive web applications using React JS and Angular 10 to integrate AI functionalities.
• Develop and optimize AI models, algorithms, and frameworks, with a focus on Generative AI and advanced machine learning techniques.
• Conduct data analysis and model evaluations to ensure high-quality outputs and performance.
• Stay up to date with the latest trends in AI and machine learning, contributing to continuous improvement of existing solutions.
• Mentor and guide junior engineers and team members in AI best practices and methodologies.

Required Skills and Qualifications:
• Minimum 5 and maximum 8 years of relevant experience in application development.
• Proficiency in programming languages such as Java and Python.
• Hands-on experience with front-end frameworks like React JS and Angular 10.
• Strong knowledge of artificial intelligence concepts, tools, and methodologies, particularly in Generative AI.
• Demonstrated ability to work collaboratively in a team-oriented environment.
• Excellent problem-solving skills and attention to detail.
• Strong communication skills, both verbal and written.

Posted 1 day ago

Apply

4.0 - 7.0 years

0 - 0 Lacs

Mohali

Work from Office

Position: Senior Analyst | Location: Mohali | Department: Data Engineering & Analytics | Experience Required: 4-7 years

Role Overview: We are seeking a highly skilled Senior Analyst with strong expertise in data engineering and analytics to join our team. The ideal candidate will have solid experience working with SQL, BigQuery, Python, Pandas, Airflow, and cloud technologies on Google Cloud Platform (GCP). You will play a key role in designing, building, and optimizing data pipelines, ensuring data quality, and supporting analytics initiatives across the organization.

Key Responsibilities:
• Develop, optimize, and maintain scalable ETL/ELT pipelines using Airflow and GCP services.
• Write efficient, high-performance SQL queries for data extraction, transformation, and analysis.
• Work extensively with BigQuery to design and optimize large-scale analytical queries and datasets.
• Utilize Python and Pandas for data wrangling, cleaning, and advanced analytical workflows.
• Collaborate with stakeholders to gather requirements, translate them into technical solutions, and deliver actionable insights.
• Manage version control and collaboration workflows using Git/Bitbucket.
• Ensure best practices in data governance, quality, and security across data processes.
• Contribute to the automation and monitoring of pipelines for reliability and efficiency.
• (Preferred) Work with Apache Spark for large-scale distributed data processing where required.

Required Skills & Qualifications:
• Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
• 4+ years of professional experience in data engineering, analytics, or a related field.
• Strong expertise in SQL (query optimization, advanced joins, window functions).
• Hands-on experience with BigQuery and GCP services.
• Proficiency in Python (with libraries like Pandas and NumPy) for data manipulation and analysis.
• Experience with Airflow for orchestration and scheduling.
• Familiarity with Git/Bitbucket for version control and collaborative development.
• Strong problem-solving and analytical skills with attention to detail.

Preferred Skills (Nice to Have):
• Working knowledge of Apache Spark for distributed data processing.
• Exposure to data visualization tools (e.g., Power BI).
• Experience with CI/CD pipelines and DevOps practices for data workflows.

Interested candidates can drop their CV at rashi.malhotra@primotech.com.
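
Window functions are called out in the required skills; the sketch below shows the classic per-group running total. It runs on SQLite (which supports window functions from version 3.25) purely to stay self-contained; the same query shape carries over to BigQuery:

```python
import sqlite3

# In-memory stand-in engine so the example is self-contained.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, month TEXT, revenue REAL);
    INSERT INTO sales VALUES
        ('north', '2024-01', 100), ('north', '2024-02', 140),
        ('south', '2024-01', 90),  ('south', '2024-02', 70);
""")

# Per-region running total: the window is partitioned by region, ordered by month.
rows = conn.execute("""
    SELECT region, month, revenue,
           SUM(revenue) OVER (PARTITION BY region ORDER BY month) AS running_total
    FROM sales
    ORDER BY region, month
""").fetchall()

for region, month, revenue, running_total in rows:
    print(region, month, revenue, running_total)
```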

Posted 2 days ago

Apply

9.0 - 14.0 years

27 - 42 Lacs

Hyderabad

Work from Office

Overview: We are seeking an experienced and strategic Business Analyst / Functional Lead to drive solution definition, business alignment, and successful delivery of Real-Time Decisioning initiatives. You will play a critical role in translating complex business needs into actionable functional requirements, guiding cross-functional teams, and shaping customer-centric decisioning strategies across digital channels.

Responsibilities:
• Gather, analyze, and document business and functional requirements for decisioning use cases (e.g., next-best-action, personalized offers, customer journeys).
• Act as the primary liaison between business stakeholders, product owners, and technical teams for real-time decisioning solutions.
• Define and maintain decision logic, business rules, and outcome scenarios in alignment with marketing and CX goals.
• Facilitate all Agile ceremonies, including sprint planning, daily stand-ups, reviews, and retrospectives.
• Guide the team in Agile practices, track sprint progress, and manage delivery risks.
• Remove blockers and coordinate across business, design, tech, QA, and operations teams.
• Maintain the ADO board, backlog grooming, sprint metrics, and continuous improvement initiatives.
• Collaborate with solution architects to design customer-centric, scalable real-time decisioning frameworks.
• Lead discovery and requirement workshops with marketing, data, and technology stakeholders.
• Own the functional design documents, user stories, and solution blueprints; ensure clarity, accuracy, and traceability.
• Work with engineering teams to define test scenarios and validate decisioning outputs.
• Support rollout, training, and adoption of decisioning platforms across business units.
• Continuously monitor and optimize decisioning logic and KPIs in partnership with analytics teams.

Qualifications:
• 9-14 years of total IT experience, with at least 3+ years of relevant experience as an RTD Functional Lead or in business analysis, functional consulting, or similar roles in MarTech, AdTech, or CX platforms.
• Bachelor's or Master's degree in computer science, information technology, or a related field.
• Strong understanding of real-time decisioning platforms such as Salesforce Marketing Cloud Personalization / Interaction Studio, or CleverTap.
• Proven ability to map customer journeys and define decision strategies based on personas, behavior, and context.
• Skilled in requirement gathering, functional documentation, user story writing, and backlog management.
• Excellent understanding of data flows, business rules, segmentation, and targeting.
• Ability to translate business needs into logical rules, decision tables, and KPIs.
• Strong communication and stakeholder management skills across business and technical audiences.

Posted 2 days ago

Apply

5.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office

About The Role
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: IBM Sterling B2B Integrator
Good-to-have skills: DevOps
Minimum 5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application development aligns with business objectives, overseeing project timelines, and facilitating communication among stakeholders to drive project success. You will also engage in problem-solving activities, providing guidance and support to your team while ensuring that best practices are followed throughout the development process.

Roles & Responsibilities:
• Expected to be an SME.
• Collaborate with and manage the team to perform.
• Responsible for team decisions.
• Engage with multiple teams and contribute to key decisions.
• Provide solutions to problems for the immediate team and across multiple teams.
• Facilitate training and knowledge-sharing sessions to enhance team capabilities.
• Monitor project progress and implement necessary adjustments to meet deadlines.

Professional & Technical Skills:
• Must-have: Proficiency in IBM Sterling B2B Integrator.
• Good-to-have: Experience with DevOps.
• Strong understanding of application design and architecture principles.
• Experience in managing the application lifecycle and deployment processes.
• Familiarity with integration patterns and data flow management.

Additional Information:
• The candidate should have a minimum of 5 years of experience in IBM Sterling B2B Integrator.
• This position is based at our Bengaluru office.
• 15 years of full-time education is required.

Posted 3 days ago

Apply

1.0 - 4.0 years

4 - 8 Lacs

Bengaluru

Work from Office

About The Role
Project Role: Technology Support Engineer
Project Role Description: Resolve incidents and problems across multiple business system components and ensure operational stability. Create and implement Requests for Change (RFC) and update knowledge base articles to support effective troubleshooting. Collaborate with vendors and help service management teams with issue analysis and resolution.
Must-have skills: TIBCO Administration
Good-to-have skills: NA
Minimum 7.5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As a Technology Support Engineer, you will engage in resolving incidents and problems across various business system components, ensuring operational stability throughout the day. Your responsibilities will include creating and implementing Requests for Change, updating knowledge base articles, and collaborating with vendors to assist service management teams in issue analysis and resolution. Each day will involve a mix of troubleshooting, documentation, and teamwork to maintain seamless operations and enhance service delivery.

Roles & Responsibilities:
• Expected to be an SME; collaborate with and manage the team to perform.
• Responsible for team decisions.
• Engage with multiple teams and contribute to key decisions.
• Provide solutions to problems for the immediate team and across multiple teams.
• Facilitate training sessions for team members to enhance their skills and knowledge.
• Monitor system performance and proactively identify areas for improvement.

Professional & Technical Skills:
• Must-have: Proficiency in TIBCO Administration.
• Strong understanding of system integration and data flow management.
• Experience with incident management and problem resolution processes.
• Familiarity with service management tools and methodologies.
• Ability to create and maintain comprehensive documentation for processes and procedures.

Additional Information:
• The candidate should have a minimum of 7.5 years of experience in TIBCO Administration.
• This position is based at our Bengaluru office.
• 15 years of full-time education is required.

Posted 3 days ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

About The Role
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: TIBCO Administration
Good-to-have skills: TIBCO BusinessWorks
Minimum 3 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and troubleshooting to ensure that the applications function as intended, contributing to the overall success of the projects you are involved in.

Roles & Responsibilities:
• Expected to perform independently and become an SME.
• Active participation/contribution in team discussions is required.
• Contribute to providing solutions to work-related problems.
• Assist in the documentation of application processes and workflows.
• Engage in continuous learning to stay updated with the latest technologies and best practices.

Professional & Technical Skills:
• Must-have: Proficiency in TIBCO Administration.
• Good-to-have: Experience with TIBCO BusinessWorks.
• Strong understanding of application development methodologies.
• Experience with troubleshooting and resolving application issues.
• Familiarity with integration patterns and data flow management.

Additional Information:
• The candidate should have a minimum of 3 years of experience in TIBCO Administration.
• This position is based at our Bengaluru office.
• 15 years of full-time education is required.

Posted 3 days ago

Apply

6.0 - 11.0 years

6 - 15 Lacs

Chennai

Hybrid

Position Description: Employees in this job function are responsible for designing, building, and maintaining data solutions, including data infrastructure and pipelines, for collecting, storing, processing, and analyzing large volumes of data efficiently and accurately.

Key Responsibilities:
1) Collaborate with business and technology stakeholders to understand current and future data requirements.
2) Design, build, and maintain reliable, efficient, and scalable data infrastructure for data collection, storage, transformation, and analysis.
3) Plan, design, build, and maintain scalable data solutions, including data pipelines, data models, and applications, for efficient and reliable data workflow.
4) Design, implement, and maintain existing and future data platforms such as data warehouses, data lakes, and data lakehouses for structured and unstructured data.
5) Design and develop analytical tools, algorithms, and programs to support data engineering activities, such as writing scripts and automating tasks.
6) Ensure optimum performance and identify improvement opportunities.

Skills Required: Python, Dataproc, Dataflow, GCP Cloud Run, Agile software development, Dataform, Terraform, BigQuery, Data Fusion, GCP, Cloud SQL, Kafka

Experience Required:
• Bachelor's degree in Computer Science, Engineering, or a related technical field.
• 5+ years of SQL development experience.
• 5+ years of analytics/data product development experience.
• 3+ years of Google Cloud experience with solutions designed and implemented at production scale.
• Experience working with GCP-native (or equivalent) services like BigQuery, Google Cloud Storage, Dataflow, Dataproc, etc.
• 2+ years working with Airflow for scheduling and orchestration of data pipelines.
• 1+ year working with Terraform to provision Infrastructure as Code.
• 2+ years of professional development experience in Python.

Experience Preferred:
• In-depth understanding of Google's product technology (or another cloud platform) and its underlying architectures.
• Experience with a development ecosystem such as Tekton/Cloud Build and Git.
• Experience working with DBT/Dataform.

Education Required: Bachelor's degree

Additional Information: You will work on ingesting, transforming, and analyzing large datasets to support the Enterprise Securitization Solution. Experience with large-scale solutions and operationalization of data lakes, data warehouses, and analytics platforms on Google Cloud Platform or other cloud environments is a must. You will work in a collaborative environment that leverages paired programming, on a small agile team delivering curated data products, working effectively with product owners, data champions, and other technical experts. You will demonstrate technical knowledge and communication skills, with the ability to advocate for well-designed solutions, develop exceptional analytical data products using both streaming and batch ingestion patterns on Google Cloud Platform with solid data warehouse principles, and be the Subject Matter Expert in Data Engineering with a focus on GCP-native services and other well-integrated third-party technologies.
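
Kafka appears in the required skills; a minimal producer sketch using the kafka-python package, with a hypothetical broker address, topic, and payload:

```python
import json

from kafka import KafkaProducer  # kafka-python package

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",  # hypothetical broker address
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Hypothetical topic and event payload.
producer.send("vehicle-telemetry", {"vin": "TEST123", "speed_kmph": 72})
producer.flush()  # block until buffered records are acknowledged
```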

Posted 3 days ago

Apply

7.0 - 12.0 years

1 - 2 Lacs

Pune

Work from Office

Job Title: GCP Data Engineer | Location: Pune, India | Experience: 5+ years | Full-time, work from office

Job Description: We are seeking an experienced GCP Data Engineer with strong expertise in building and managing scalable data pipelines on Google Cloud Platform. The role involves designing ETL/ELT workflows, optimizing BigQuery and Dataflow solutions, and ensuring secure, efficient data availability for analytics and reporting.

Posted 3 days ago

Apply

7.0 - 12.0 years

25 - 27 Lacs

Hyderabad

Work from Office

3+ years of experience as an engineer working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner, streaming, etc.). 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1-2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.). Keywords: Dataproc, PySpark, Dataflow, Kafka, Cloud Storage, Terraform, OOP, Cloud Spanner, Hadoop, Java, Hive, Spark, MapReduce, big data, GCP, AWS, JavaScript, MySQL, PostgreSQL, SQL Server, Oracle, Bigtable, software development, SQL, Python development, Python, BigQuery, Pandas

Posted 3 days ago

Apply

Exploring Data Flow Jobs in India

The data flow job market in India is booming with opportunities for skilled professionals. With the increasing reliance on data-driven decision-making across industries, the demand for data flow experts is on the rise. Whether you are a recent graduate or an experienced professional looking to transition into this field, there are ample job openings waiting for you in India.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Delhi-NCR

These cities are known for their strong presence of tech companies and offer a plethora of opportunities for data flow roles.

Average Salary Range

The average salary range for data flow professionals in India varies based on experience and expertise. Entry-level positions can expect to earn between INR 4-6 lakhs per annum, while experienced professionals with advanced skills can command salaries upwards of INR 15 lakhs per annum.

Career Path

In the data flow domain, a typical career path may include roles such as:

  • Junior Data Analyst
  • Data Engineer
  • Data Scientist
  • Senior Data Architect
  • Chief Data Officer

As you gain experience and expertise, you can progress to higher positions with increased responsibilities and leadership opportunities.

Related Skills

Apart from expertise in data flow tools and technologies, professionals in this field are often expected to have skills in:

  • Data visualization
  • Machine learning
  • Statistical analysis
  • Programming languages (Python, R, SQL)

Interview Questions

  • What is ETL and how does it relate to data flow? (basic)
  • Explain the difference between batch processing and real-time processing. (medium)
  • How would you handle missing data in a dataset? (medium)
  • Can you explain the concept of data normalization and why it is important? (medium)
  • What is the difference between supervised and unsupervised learning? (basic)
  • How would you optimize a data pipeline for performance? (advanced)
  • Can you describe a challenging data flow problem you encountered in a previous project and how you solved it? (advanced)
  • What is the role of Apache Kafka in data flow architectures? (medium)
  • How do you ensure data quality and consistency in a data flow process? (medium)
  • Explain the concept of data lineage and its importance in data flow management. (advanced)
  • What are the advantages of using a distributed data processing framework like Apache Spark? (medium)
  • How do you handle data security and privacy issues in a data flow environment? (advanced)
  • Can you explain the concept of data partitioning and its benefits in parallel processing? (medium)
  • How would you approach data profiling and data quality assessment in a new dataset? (medium)
  • What are the key components of a data flow architecture? (basic)
  • How do you handle data skew in distributed data processing? (advanced; a salting sketch follows this list)
  • Explain the concept of data replication and its use cases in data flow management. (medium)
  • How do you stay updated with the latest trends and technologies in the data flow domain? (basic)
  • Can you describe a scenario where you had to optimize a data flow process for cost efficiency? (advanced)
  • What are the common challenges faced in designing and implementing data pipelines? (medium)
  • How do you ensure data integrity and consistency in a distributed data processing environment? (advanced)
  • Can you explain the difference between stream processing and batch processing? (basic)
  • Describe a time when you had to troubleshoot a data flow issue in a production environment. (medium)
  • How would you handle a sudden increase in data volume in a data flow pipeline? (advanced)
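
One of the advanced questions above concerns data skew; a common remedy is key salting: spread each hot key across N synthetic sub-keys, replicate the smaller join side N times, and join on (key, salt) so no single task receives the whole hot key. A minimal PySpark sketch, with hypothetical paths and column names:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("skew-salting").getOrCreate()

# Hypothetical inputs: facts is heavily skewed on customer_id; dims is small.
facts = spark.read.parquet("warehouse/facts")
dims = spark.read.parquet("warehouse/dims")

N = 16  # salt fan-out; tune to the observed skew

# Each fact row gets a random salt in [0, N); dims is replicated once per salt.
salted_facts = facts.withColumn("salt", (F.rand() * N).cast("long"))
salted_dims = dims.crossJoin(spark.range(N).withColumnRenamed("id", "salt"))

# Joining on (customer_id, salt) spreads each hot key over N tasks.
joined = salted_facts.join(salted_dims, on=["customer_id", "salt"]).drop("salt")
joined.write.mode("overwrite").parquet("warehouse/joined")
```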

Closing Remark

As you embark on your journey to explore data flow jobs in India, remember to equip yourself with the necessary skills and knowledge to stand out in a competitive job market. Prepare diligently, showcase your expertise, and apply confidently to secure exciting opportunities in this growing field. Good luck!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click


Download the Mobile App

Instantly access job listings, apply easily, and track applications.
