
905 Data Flow Jobs - Page 15

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

15.0 - 20.0 years

15 - 19 Lacs

Gurugram

Work from Office

Project Role: Technology Architect
Project Role Description: Design and deliver technology architecture for a platform, product, or engagement. Define solutions to meet performance, capability, and scalability needs.
Must-have skills: SAP Plant Maintenance (PM)
Good-to-have skills: NA
Minimum experience required: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As a Technology Architect, you will design and deliver technology architecture for a platform, product, or engagement. Your typical day will involve collaborating with various teams to define solutions that meet performance, capability, and scalability needs. You will engage in discussions to ensure that the architecture aligns with business objectives and technical requirements, while also addressing any challenges that arise during development. You will be expected to stay current with the latest technology trends and best practices so that the solutions you propose are innovative and effective.

Roles & Responsibilities:
- Act as a subject matter expert (SME).
- Collaborate with and manage the team to perform.
- Be responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Evaluate and recommend new technologies that can improve system performance.

Professional & Technical Skills:
- Must-have: Proficiency in SAP Plant Maintenance (PM).
- Strong understanding of technology architecture principles.
- Experience with system integration and data flow management.
- Ability to analyze and optimize system performance.
- Familiarity with cloud technologies and their application in architecture design.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in SAP Plant Maintenance (PM).
- This position is based at our Gurugram office.
- 15 years of full-time education is required.

Posted 1 month ago

Apply

6.0 - 11.0 years

15 - 25 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Warm welcome from SP Staffing Services! We are reaching out to you regarding a permanent opportunity.

Job Description:
Experience: 6-12 years
Location: Hyderabad / Bangalore / Pune / Gurgaon
Skill: GCP Data Engineer

Interested candidates can share their resume with sangeetha.spstaffing@gmail.com along with the following details:
- Full Name as per PAN
- Mobile No.
- Alternate No. / WhatsApp No.
- Total Experience
- Relevant Experience in GCP
- Relevant Experience in BigQuery
- Relevant Experience in Big Data
- Current CTC
- Expected CTC
- Notice Period (Official)
- Notice Period (Negotiable) / Reason
- Date of Birth
- PAN Number
- Reason for Job Change
- Offer in Pipeline (Current Status)
- Availability for a virtual interview on weekdays between 10 AM and 4 PM (please mention a time)
- Current Residential Location
- Preferred Job Location
- Whether educational percentages in 10th, 12th, and UG are all above 50%
- Any gaps in education or career? If yes, please mention the duration in months/years

Posted 1 month ago

Apply

3.0 - 8.0 years

0 - 1 Lacs

Bengaluru

Hybrid

Position: Contract to Hire (C2H)
Skill: SAP Production Control
Experience: 3+ years
Location: Bangalore
Notice Period: Immediate to 15 days

- 1-2 years of experience in an SAP Basis / administration support role
- Experience with different SAP systems in the landscape and their data flows
- SAP ECC / SAP S/4 batch job administration and scheduling
- Experience with Redwood BPA batch job administration and scheduling is an added advantage
- Experience with SAP BI / Business Objects Data Services (BODS) is an added advantage
- This role involves working in 24x7 rotational shifts
- Configure and schedule SAP batch jobs and job chains in Redwood BPA
- Troubleshoot failed batch processes per set guidelines
- Monitor and kill batch jobs in SAP S/4 via SM37 and SM50 during troubleshooting
- Monitor Business Objects Data Services (BODS)
- Support multiple SAP Security tasks and projects after training

Posted 1 month ago

Apply

4.0 - 7.0 years

12 - 22 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Job Description
Location: Kumbalgodu, Kengeri, Bangalore (Onsite)
Type: Full-time | Monday to Saturday
Experience: 3+ years in ERP Implementation

We at Girish Exports are transitioning from a legacy ERP (Visual Gems) to a custom-built system on Zoho Creator. We're looking for a practical, hands-on ERP Implementation Lead who understands real-world operations and knows how to bring tech and people together.

What You'll Do:
- Lead the planning and rollout of our ERP system across departments
- Work closely with developers and business users to map operations into usable system workflows
- Design modular data flows that connect upstream and downstream processes
- Collaborate with department heads to drive adoption and coordinate training plans
- Ensure the ERP system supports teams such as merchandising, production, stores, finance, HR, and maintenance
- Identify bottlenecks, simplify processes, and make sure solutions work in the real world, not just on paper
- Occasional travel to factory units will be required; expenses will be fully covered by the company

You Should Have:
- 3+ years of ERP implementation experience in complex, real-world setups
- Mandatory hands-on experience with Zoho Creator
- Strong understanding of operational workflows, data architecture, and process mapping
- Ability to work with non-tech users (shop floor, stores, admin) and ensure smooth adoption
- Excellent communication and cross-functional collaboration skills
- A mindset focused on outcomes, not just systems

Why Join Us?
If you're excited by the idea of driving real change and making a tangible impact on day-to-day operations, this is the role for you. You'll help shape a custom-built ERP system from the ground up, and if using data-driven insights to improve how things actually work on the ground excites you, you'll thrive here.

Posted 1 month ago

Apply

4.0 - 9.0 years

5 - 14 Lacs

Pune, Chennai, Bengaluru

Work from Office

Dear Candidate,

This is with reference to your profile on the job portal. Deloitte India Consulting has an immediate requirement for the following role.

Job Summary: We are looking for a skilled GCP Data Engineer to design, build, and maintain scalable data pipelines and solutions on Google Cloud Platform. The ideal candidate will have hands-on experience with GCP services, data warehousing, ETL processes, and big data technologies.

Key Responsibilities:
- Design and implement scalable data pipelines using Cloud Dataflow, Apache Beam, and Cloud Composer.
- Develop and maintain data models and data marts in BigQuery.
- Build ETL/ELT workflows to ingest, transform, and load data from various sources.
- Optimize data storage and query performance in BigQuery and other GCP services.
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements.
- Ensure data quality, integrity, and security across all data solutions.
- Monitor and troubleshoot data pipeline issues and implement improvements.

Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 3+ years of experience in data engineering, with at least 1-2 years on Google Cloud Platform.
- Proficiency in SQL, Python, and Apache Beam.
- Hands-on experience with GCP services such as BigQuery, Cloud Storage, Cloud Pub/Sub, Cloud Dataflow, and Cloud Composer.
- Experience with data modeling, data warehousing, and ETL/ELT processes.
- Familiarity with CI/CD pipelines, Terraform, and Git.
- Strong problem-solving and communication skills.

Nice to Have:
- GCP certifications (e.g., Professional Data Engineer).

In case you are interested, please share your updated resume with smouni@deloitte.com along with the following details (mandatory):
- Candidate Name
- Mobile No.
- Email ID
- Skill
- Total Experience
- Education Details
- Current Location
- Requested Location
- Current Firm
- Current CTC
- Expected CTC
- Notice Period / LWD
- Feedback
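For context on the kind of pipeline work this posting describes, here is a minimal sketch of a batch Apache Beam pipeline submitted to Cloud Dataflow that reads CSV files from Cloud Storage and loads them into BigQuery. The project, bucket, table, and column names are illustrative assumptions, not details from the posting.

```python
# A minimal, hedged sketch of a batch Beam pipeline on Dataflow.
# All project/bucket/table/column names are assumptions for illustration.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_csv_line(line):
    """Turn a 'user_id,event,amount' CSV line into a BigQuery-ready dict."""
    user_id, event, amount = line.split(",")
    return {"user_id": user_id, "event": event, "amount": float(amount)}


def run():
    options = PipelineOptions(
        runner="DataflowRunner",                  # use "DirectRunner" to test locally
        project="example-project",                # assumed project id
        region="us-central1",
        temp_location="gs://example-bucket/tmp",  # assumed staging bucket
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromGCS" >> beam.io.ReadFromText("gs://example-bucket/events/*.csv")
            | "ParseCsv" >> beam.Map(parse_csv_line)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                schema="user_id:STRING,event:STRING,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```

Swapping the runner for DirectRunner lets the same pipeline be exercised locally before submitting a Dataflow job; orchestration with Cloud Composer typically wraps this submission in an Airflow task.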

Posted 1 month ago

Apply

12.0 - 20.0 years

25 - 40 Lacs

Kolkata, Hyderabad, Pune

Work from Office

GCP Data Architect

Posted 1 month ago

Apply

5.0 - 10.0 years

12 - 22 Lacs

Kolkata, Hyderabad, Pune

Work from Office

GCP Engineer, Lead GCP Engineer

Posted 1 month ago

Apply

12.0 - 17.0 years

27 - 35 Lacs

Madurai, Chennai

Hybrid

Dear Candidate,

Greetings of the day! I am Kantha, and I'm reaching out to you regarding an exciting opportunity with TechMango. You can connect with me on LinkedIn (https://www.linkedin.com/in/kantha-m-ashwin-186ba3244/) or by email: kanthasanmugam.m@techmango.net

Techmango Technology Services is a full-scale software development services company founded in 2014 with a strong focus on emerging technologies, with the primary objective of delivering strategic solutions aligned with its business partners' goals. We are a leading software and mobile app development company, driven by the mantra "Clients' Vision is our Mission," and we hold ourselves to that statement. Our aim is to be a technologically advanced and trusted organization providing high-quality, cost-efficient services built on long-term client relationships. We operate in the USA (Chicago, Atlanta), Dubai (UAE), and India (Bangalore, Chennai, Madurai, Trichy).

Job Title: Technical GCP Data Architect/Lead
Location: Madurai
Experience: 12+ years
Notice Period: Immediate

Job Summary: We are seeking a hands-on Technical GCP Data Architect/Lead with deep expertise in real-time streaming data architectures to help design, build, and optimize data pipelines in our Google Cloud Platform (GCP) environment. The ideal candidate will have strong architectural vision and be comfortable rolling up their sleeves to build scalable, low-latency streaming data pipelines using Pub/Sub, Dataflow (Apache Beam), and BigQuery.

Key Responsibilities:
- Architect and implement end-to-end streaming data solutions on GCP using Pub/Sub, Dataflow, and BigQuery.
- Design real-time ingestion, enrichment, and transformation pipelines for high-volume event data.
- Work closely with stakeholders to understand data requirements and translate them into scalable designs.
- Optimize streaming pipeline performance, latency, and throughput.
- Build and manage orchestration workflows using Cloud Composer (Airflow).
- Drive schema design, partitioning, and clustering strategies in BigQuery for both real-time and batch datasets.
- Define SLAs, monitoring, logging, and alerting for streaming jobs using Cloud Monitoring, Error Reporting, and Stackdriver.
- Apply data modeling experience.
- Ensure robust security, encryption, and access controls across all data layers.
- Collaborate with DevOps on CI/CD automation of data workflows using Terraform, Cloud Build, and Git.
- Document streaming architecture, data lineage, and deployment runbooks.

Required Skills & Experience:
- 10+ years of experience in data engineering or architecture.
- 3+ years of hands-on GCP data engineering experience.
- Strong expertise in Google Pub/Sub, Dataflow (Apache Beam), BigQuery (including streaming inserts), Cloud Composer (Airflow), and Cloud Storage (GCS).
- Solid understanding of streaming design patterns, exactly-once delivery, and event-driven architecture.
- Deep knowledge of SQL and NoSQL data modeling.
- Hands-on experience with monitoring and performance tuning of streaming jobs.
- Experience using Terraform or equivalent for infrastructure as code.
- Familiarity with CI/CD pipelines for data workflows.
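As an illustration of the Pub/Sub -> Dataflow -> BigQuery streaming pattern this listing names, below is a minimal sketch of a windowed Apache Beam streaming pipeline that counts events per type in fixed 60-second windows. The subscription, table, and field names are assumptions for illustration only.

```python
# A minimal, hedged sketch of a streaming Beam pipeline (Pub/Sub -> windows -> BigQuery).
# Subscription, project, table, and field names are illustrative assumptions.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms import window


def run():
    options = PipelineOptions(streaming=True, project="example-project")
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/example-project/subscriptions/events-sub")
            | "Decode" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "Window" >> beam.WindowInto(window.FixedWindows(60))   # 1-minute windows
            | "KeyByType" >> beam.Map(lambda event: (event["event_type"], 1))
            | "CountPerType" >> beam.CombinePerKey(sum)
            | "ToRow" >> beam.Map(lambda kv: {"event_type": kv[0], "event_count": kv[1]})
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.event_counts",
                schema="event_type:STRING,event_count:INTEGER",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
        )


if __name__ == "__main__":
    run()
```

In a real deployment, the window size, triggers, and dead-letter handling would be tuned to the latency and exactly-once requirements the role mentions.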

Posted 1 month ago

Apply

4.0 - 8.0 years

20 - 35 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Salary: 20 to 35 LPA
Experience: 3 to 7 years
Location: Gurgaon / Pune / Bengaluru
Notice Period: Immediate to 30 days

Job Profile: Experienced Data Engineer with a strong foundation in designing, building, and maintaining scalable data pipelines and architectures. Skilled in transforming raw data into clean, structured formats for analytics and business intelligence. Proficient in modern data tools and technologies such as SQL, T-SQL, Python, Databricks, and cloud platforms (Azure). Adept at data wrangling, modeling, ETL/ELT development, and ensuring data quality, integrity, and security. A collaborative team player with a track record of enabling data-driven decision-making across business units.

As a Data Engineer, the candidate will work on assignments for one of our Utilities clients. Collaborating with cross-functional teams and stakeholders involves gathering data requirements, aligning business goals, and translating them into scalable data solutions. The role includes working closely with data analysts, scientists, and business users to understand needs, designing robust data pipelines, and ensuring data is accessible, reliable, and well documented. Regular communication, iterative feedback, and joint problem-solving are key to delivering high-impact, data-driven outcomes that support organizational objectives. This position requires a proven track record of transforming processes and driving customer value and cost savings, with experience running end-to-end analytics for large-scale organizations.

Responsibilities:
- Design, build, and maintain scalable data pipelines to support analytics, reporting, and advanced modeling needs.
- Collaborate with consultants, analysts, and clients to understand data requirements and translate them into effective data solutions.
- Ensure data accuracy, quality, and integrity through validation, cleansing, and transformation processes.
- Develop and optimize data models, ETL workflows, and database architectures across cloud and on-premises environments.
- Support data-driven decision-making by delivering reliable, well-structured datasets and enabling self-service analytics.
- Build and deploy end-to-end data pipelines in the cloud through seamless integration with Azure.
- Manage scalable Databricks clusters for large datasets and complex computations, optimizing performance and cost.

Must have:
- Client engagement experience and collaboration with cross-functional teams
- Data engineering background in Databricks
- Ability to work effectively as an individual contributor or in collaborative team environments
- Effective communication and thought leadership with a proven record

Candidate Profile:
- Bachelor's/Master's degree in economics, mathematics, computer science/engineering, operations research, or related analytics areas
- 3+ years of experience, which must be in data engineering
- Hands-on experience with SQL, Python, Databricks, and cloud platforms such as Azure
- Prior experience managing and delivering end-to-end projects
- Outstanding written and verbal communication skills
- Able to work in a fast-paced, continuously evolving environment and ready to take on uphill challenges
- Able to understand cross-cultural differences and work with clients across the globe
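To ground the Databricks/Azure pipeline work described above, here is a minimal PySpark sketch of a cleansing step that writes a date-partitioned Delta table. The storage paths, column names, and table layout are hypothetical and only illustrate the general approach, assuming it runs on a Databricks cluster where Delta Lake is available.

```python
# A minimal, hedged PySpark sketch of a cleanse-and-write step on Databricks.
# ADLS paths and column names are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("meter_readings_cleanup").getOrCreate()

RAW_PATH = "abfss://raw@examplestorage.dfs.core.windows.net/meter_readings/"        # assumed
CURATED_PATH = "abfss://curated@examplestorage.dfs.core.windows.net/meter_readings/"  # assumed

raw_df = spark.read.json(RAW_PATH)

clean_df = (
    raw_df
    .dropDuplicates(["meter_id", "reading_ts"])               # remove duplicate readings
    .withColumn("reading_ts", F.to_timestamp("reading_ts"))   # normalise timestamps
    .filter(F.col("kwh").isNotNull() & (F.col("kwh") >= 0))   # drop invalid readings
    .withColumn("reading_date", F.to_date("reading_ts"))      # derive a partition column
)

# Write a partitioned Delta table so downstream queries can prune by date.
(
    clean_df.write.format("delta")
    .mode("overwrite")
    .partitionBy("reading_date")
    .save(CURATED_PATH)
)
```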

Posted 1 month ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Surat

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms. Key Responsibilities: Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT Integrate and manage data from various sources using SAP Data Services Develop and maintain scalable data models, data marts, and data warehouses Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs Implement best practices in data governance, data lineage, and metadata management Monitor data quality, troubleshoot issues, and ensure data integrity Optimize Snowflake data warehouse performance (partitioning, caching, query tuning) Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins) Document architecture, data flows, and technical specifications
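One common way to automate the "deploy DBT models with CI/CD tools" responsibility above is a small CI step that shells out to the dbt CLI. The sketch below assumes dbt is installed on the runner and a Snowflake profile with a target named "prod" is already configured; the names are illustrative, not prescribed by the posting.

```python
# A minimal, hedged CI sketch: run and test dbt models before promoting.
# Assumes dbt and a configured Snowflake profile/target ("prod") on the runner.
import subprocess
import sys


def run_dbt(command, target="prod"):
    """Run a dbt command and fail the CI job on a non-zero exit code."""
    result = subprocess.run(
        ["dbt", command, "--target", target],
        capture_output=True,
        text=True,
    )
    print(result.stdout)
    if result.returncode != 0:
        print(result.stderr, file=sys.stderr)
        sys.exit(result.returncode)


if __name__ == "__main__":
    run_dbt("run")   # build/refresh the models in Snowflake
    run_dbt("test")  # run schema and data tests before deployment is accepted
```

A script like this typically runs from a Git-triggered pipeline (e.g., Jenkins or a Git-hosted CI), which matches the CI/CD tooling the listing mentions.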

Posted 1 month ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Varanasi

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms. Key Responsibilities: Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT Integrate and manage data from various sources using SAP Data Services Develop and maintain scalable data models, data marts, and data warehouses Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs Implement best practices in data governance, data lineage, and metadata management Monitor data quality, troubleshoot issues, and ensure data integrity Optimize Snowflake data warehouse performance (partitioning, caching, query tuning) Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins) Document architecture, data flows, and technical specifications

Posted 1 month ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Visakhapatnam

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms. Key Responsibilities: Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT Integrate and manage data from various sources using SAP Data Services Develop and maintain scalable data models, data marts, and data warehouses Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs Implement best practices in data governance, data lineage, and metadata management Monitor data quality, troubleshoot issues, and ensure data integrity Optimize Snowflake data warehouse performance (partitioning, caching, query tuning) Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins) Document architecture, data flows, and technical specifications.

Posted 1 month ago

Apply

3.0 - 7.0 years

3 - 7 Lacs

Pune, Maharashtra, India

On-site

Responsibilities:
- Design, develop, and maintain data pipelines using Python and SQL on GCP.
- Apply Agile methodologies and ETL, ELT, data movement, and data processing skills.
- Work with Cloud Composer to manage and process batch data jobs efficiently.
- Develop and optimize complex SQL queries for data analysis, extraction, and transformation.
- Develop and deploy Google Cloud services using Terraform.
- Implement CI/CD pipelines using GitHub Actions.
- Consume and host REST APIs using Python.
- Monitor and troubleshoot data pipelines, resolving any issues in a timely manner.
- Ensure team collaboration using Jira, Confluence, and other tools.
- Quickly learn new and existing technologies.
- Apply strong problem-solving skills.
- Write advanced SQL and Python scripts.
- Google Cloud Professional Data Engineer certification is an added advantage.

Your skills and experience:
- 6+ years of IT experience as a hands-on technologist.
- Proficient in Python for data engineering.
- Proficient in SQL.
- Hands-on experience with GCP Cloud Composer, Dataflow, BigQuery, Cloud Functions, and Cloud Run; GKE is good to have.
- Hands-on experience hosting and consuming REST APIs.
- Proficient in Terraform (HashiCorp).
- Experienced with GitHub and GitHub Actions.
- Experienced in CI/CD.
- Experience automating ETL testing using Python and SQL.
- Good to have: Apigee.
- Good to have: Bitbucket.
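As an example of the Cloud Composer batch-job management this listing mentions, here is a minimal Airflow DAG sketch that schedules a daily BigQuery load. The DAG id, project, dataset, table, and SQL are hypothetical and shown only to illustrate the pattern.

```python
# A minimal, hedged Cloud Composer (Airflow) DAG sketch: one daily BigQuery load.
# DAG id, project, dataset, table, and query are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_events_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",   # 02:00 UTC every day
    catchup=False,
    tags=["batch", "bigquery"],
) as dag:
    load_events = BigQueryInsertJobOperator(
        task_id="load_events",
        configuration={
            "query": {
                "query": (
                    "SELECT * FROM `example-project.raw.events` "
                    "WHERE _PARTITIONDATE = CURRENT_DATE()"
                ),
                "destinationTable": {
                    "projectId": "example-project",
                    "datasetId": "analytics",
                    "tableId": "events_daily",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )
```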

Posted 1 month ago

Apply

9.0 - 14.0 years

9 - 14 Lacs

Bengaluru, Karnataka, India

On-site

Role Description
Deutsche Bank has set ambitious goals in the areas of Sustainable Finance, ESG Risk Mitigation, and Corporate Sustainability. As climate change brings new challenges and opportunities, the Bank is investing in a Sustainability Technology Platform, sustainability data products, and various sustainability applications that will support these goals. As part of this initiative, we are building a global team of technologists who are passionate about climate change and want to contribute to the greater good by applying their technology skill set, predominantly in cloud/hybrid architecture. In this role, we are seeking a highly experienced GCP Data & BI Subject Matter Expert (SME) to join our growing team. In this senior role, you will be a trusted advisor, providing technical expertise and strategic direction across all things data and BI on GCP.

Your key responsibilities

Technical Expertise
- In-depth knowledge of GCP data services (BigQuery, Cloud Storage, Dataflow, etc.).
- Design and optimize complex data pipelines for efficient data ingestion, transformation, and analysis.
- Partner with the product management group and other business stakeholders to gather requirements, translate them into technical specifications, and design effective BI solutions (Tableau, Looker).
- Design and develop complex data models, leveraging expertise in relational and dimensional modeling techniques.
- Advocate for best practices in data governance, security, and compliance on GCP.

Collaboration & Mentorship
- Collaborate with data engineers, analysts, and business stakeholders to understand data requirements and drive data-driven decision-making.
- Mentor and guide junior team members on GCP technologies and BI best practices.
- Foster a culture of innovation and continuous improvement within the data and BI domain.

Staying Current
- Track emerging trends and innovations in GCP, BI tools, and data analytics methodologies.
- Proactively research and recommend new technologies and solutions to enhance our data and BI capabilities.

Your skills and experience
- 9+ years of experience in data warehousing, data management, and business intelligence.
- Proven expertise in Google Cloud Platform (GCP) and its data services (BigQuery, Cloud Storage, Dataflow, etc.).
- Strong understanding of data governance, security, and compliance principles on GCP.
- Experience designing and implementing complex data pipelines.
- In-depth knowledge of relational and dimensional modeling techniques for BI.
- Experience with T-SQL, PL/SQL, or ANSI SQL.
- Experience with leading BI tools and platforms (Tableau, Looker).
- Excellent communication, collaboration, and problem-solving skills.
- Ability to translate technical concepts into clear, actionable insights for business stakeholders.
- Strong leadership presence and the ability to influence and inspire others.
- Knowledge of Sustainable Finance / ESG Risk / CSRD / Regulatory Reporting is a plus.
- Knowledge of cloud infrastructure and data governance best practices is a plus.
- Knowledge of Terraform is a plus.

Posted 1 month ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Lucknow

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms. Key Responsibilities: Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT Integrate and manage data from various sources using SAP Data Services Develop and maintain scalable data models, data marts, and data warehouses Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs Implement best practices in data governance, data lineage, and metadata management Monitor data quality, troubleshoot issues, and ensure data integrity Optimize Snowflake data warehouse performance (partitioning, caching, query tuning) Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins) Document architecture, data flows, and technical specifications

Posted 1 month ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Ludhiana

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms. Key Responsibilities: Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT Integrate and manage data from various sources using SAP Data Services Develop and maintain scalable data models, data marts, and data warehouses Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs Implement best practices in data governance, data lineage, and metadata management Monitor data quality, troubleshoot issues, and ensure data integrity Optimize Snowflake data warehouse performance (partitioning, caching, query tuning) Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins) Document architecture, data flows, and technical specifications

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Jaipur, Rajasthan

On-site

As a Senior Business Analyst, you will play a crucial role in bridging the gap between customers, stakeholders, and technical teams. Your responsibilities will include gathering, documenting, and managing business and functional requirements while preparing comprehensive documentation such as BRDs, user stories, and use cases. It will be essential for you to break down requirements into epics, user stories, and tasks to ensure clarity and understanding.

You will be tasked with analyzing current and future state processes to identify improvement opportunities, and with driving project planning, risk assessment, and effort estimation. Collaboration with development, QA, and product teams will be key to ensuring smooth project delivery. Additionally, you will be involved in UAT planning and execution, change management, training, and communication planning. Your role will extend to working with data science and AI teams to translate business needs into AI/ML use cases and to contributing towards identifying AI opportunities across internal and customer-facing processes.

Your proficiency in requirement elicitation, documenting business and functional requirements, and hands-on experience with tools like JIRA, Confluence, and Azure DevOps will be crucial. The ability to create BRDs, user stories, process flows, and wireframes, along with strong stakeholder management and communication skills, will be essential in this role. A deep understanding of Agile methodologies, Scrum frameworks, backlog grooming, estimation, and Agile ceremonies is a prerequisite. Familiarity with SQL and advanced Excel for data analysis, exposure to AI/ML-driven solutions, and strong problem-solving abilities are also required.

Preferred skills include Scrum Master certification, business analytics certifications, experience with wireframing/prototyping tools, exposure to BI/analytics tools, familiarity with cloud platforms, experience with API integrations and data migration projects, domain knowledge in specific industries, and an understanding of AI concepts and ethical considerations.

A bachelor's degree in Computer Science, MCA, or a related technical field is a must, along with excellent interpersonal skills, a customer-centric attitude, an ownership mindset, and leadership skills. Being flexible, adaptable, and a quick learner of new domains and tools will be beneficial in this role. A passion for emerging technologies like AI, ML, RPA, and automation will further enhance your performance in this position.

Posted 1 month ago

Apply

4.0 - 9.0 years

10 - 20 Lacs

Bengaluru

Remote

Job Description:
Job Title: Apache Beam Software Engineer
Work Mode: Remote
Base Location: Bengaluru
Experience Required: 4 to 6 years

Job Summary: We are looking for a Software Engineer with hands-on experience in Apache Beam, Google Cloud Dataflow, and Dataproc, focusing on building reusable data processing frameworks. This is not a traditional data engineering role. The ideal candidate will have strong software development skills in Java or Python and experience building scalable, modular data processing components and frameworks for batch and streaming use cases.

Key Responsibilities:
- Design and develop framework-level components using Apache Beam, GCP Dataflow, and Dataproc.
- Build scalable, reusable libraries and abstractions in Python or Java for distributed data processing.
- Work closely with architects to implement best practices for designing high-performance data frameworks.
- Ensure software reliability, maintainability, and testability through strong coding and automation practices.
- Participate in code reviews, architectural discussions, and performance tuning initiatives.
- Contribute to internal tooling or SDK development for data engineering platforms.

Required Skills:
- 4 to 6 years of experience as a Software Engineer working on distributed systems or data processing frameworks.
- Strong programming skills in Java and/or Python.
- Deep experience with Apache Beam and GCP Dataflow.
- Hands-on experience with GCP Dataproc, especially for building scalable custom batch or streaming jobs.
- Solid understanding of streaming vs. batch processing concepts.
- Familiarity with CI/CD pipelines, GitHub, and test automation.

Preferred Skills:
- Experience with workflow orchestration tools such as Airflow (Composer).
- Exposure to Pub/Sub and BigQuery (from a system integration perspective).
- Understanding of observability, logging, and error handling in distributed data pipelines.
- Experience building internal libraries, SDKs, or tools to support data teams.

Tech Stack:
- Cloud: GCP (Dataflow, Dataproc, Pub/Sub, Composer)
- Programming: Java, Python
- Frameworks: Apache Beam
- DevOps: GitHub, CI/CD (Cloud Build, Jenkins)
- Focus areas: framework/library development, scalable distributed data processing, component-based architecture
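Since this role centers on reusable framework components rather than one-off pipelines, here is a minimal sketch of the kind of composite Apache Beam PTransform such a framework might expose: a JSON parser that routes unparseable records to a dead-letter side output. The class and tag names are illustrative assumptions.

```python
# A minimal, hedged sketch of a reusable framework component in Apache Beam:
# a composite PTransform with a dead-letter side output. Names are illustrative.
import json

import apache_beam as beam


class ParseJsonWithDeadLetter(beam.PTransform):
    """Parse JSON lines; route unparseable records to a 'dead_letter' output."""

    VALID = "valid"
    DEAD_LETTER = "dead_letter"

    def expand(self, lines):
        return lines | beam.ParDo(_ParseFn()).with_outputs(
            self.DEAD_LETTER, main=self.VALID
        )


class _ParseFn(beam.DoFn):
    def process(self, line):
        try:
            yield json.loads(line)
        except (ValueError, TypeError):
            # Tag bad records instead of failing the bundle.
            yield beam.pvalue.TaggedOutput(ParseJsonWithDeadLetter.DEAD_LETTER, line)


# Example usage inside any pipeline (DirectRunner for a local check):
with beam.Pipeline() as p:
    results = (
        p
        | beam.Create(['{"id": 1}', "not json"])
        | "Parse" >> ParseJsonWithDeadLetter()
    )
    results.valid | "LogValid" >> beam.Map(print)
    results.dead_letter | "LogBad" >> beam.Map(lambda record: print("bad:", record))
```

Packaging parsing and error routing into one transform keeps pipeline code uniform across teams, which is the kind of reuse this posting targets.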

Posted 1 month ago

Apply

1.0 - 4.0 years

2 - 5 Lacs

Hyderabad

Work from Office

Job Overview
Plan A Technologies is looking for an experienced SQL Developer with hands-on experience in designing, developing, and optimizing high-performance SQL solutions in high-volume environments. This role requires deep technical expertise in SQL development and performance tuning, and a solid understanding of data flows, ETL processes, and relational database structures. The ideal candidate will have a proven ability to translate complex business requirements into scalable and secure database logic, collaborate cross-functionally with product and engineering teams, and thrive in dynamic, data-intensive industries such as finance, gaming, or transactional systems. This is a fast-paced job with room for significant career growth. Please note: you must have at least 8+ years of experience as an SQL developer to be considered for this role.

Job Responsibilities & Experience
- 8+ years of strong SQL development experience in a high-volume environment
- Develop and maintain complex, high-performing SQL scripts, stored procedures, functions, and views
- Exposure to ETL, data migration, or data warehousing concepts
- Translate business requirements into scalable, efficient, and secure database logic
- Perform in-depth query performance analysis and optimization on large-scale data sets
- Collaborate with product managers, analysts, and developers to build and enhance database components
- Identify bottlenecks and troubleshoot performance or logic issues in existing systems
- Expertise in query optimization, indexing strategies, and execution plan analysis
- Experience in financial, gaming, or transactional systems
- Deep understanding of relational database concepts
- Strong problem-solving and analytical mindset with the ability to understand and interpret business logic
- Self-motivated and able to take ownership of deliverables with minimal supervision
- Excellent verbal and written communication skills for interacting with cross-functional teams and business users
- Solid written and verbal English skills
- Initiative and drive to do great things

About the Company / Benefits
Plan A Technologies is an American software development and technology advisory firm that brings top-tier engineering talent to clients around the world. Our software engineers tackle custom product development projects, staff augmentation, major integrations and upgrades, and much more. The team is far more hands-on than the giant outsourcing shops, but still big enough to handle major enterprise clients. Read more about us here: PlanAtechnologies.

Location: Work from home 100% of the time, or come in to one of our global offices; up to you.
Great colleagues and an upbeat work environment: you'll join an excellent team of supportive engineers and project managers who work hard but don't compete with each other.
Benefits: you'll get a generous vacation schedule, a brand-new laptop, and other goodies.

If this sounds like you, we'd love to hear from you!

Posted 1 month ago

Apply

2.0 - 5.0 years

9 - 13 Lacs

Ahmedabad

Work from Office

Role Expectations

Advanced Analytics, Leadership & Cross-functional Collaboration
- Lead cross-functional projects using advanced data modelling and analysis techniques to discover insights that will guide strategic decisions and uncover optimization opportunities
- Manage and optimize processes for data intake, validation, mining, and engineering, as well as modelling, visualization, and communication deliverables
- Develop strategies for effective data analysis and reporting
- Define company-wide metrics and relevant data sources
- Build systems to transform raw data into actionable business insights
- Work closely with business leaders and stakeholders to understand their needs and translate them into functional and technical requirements
- Champion a data-driven culture, promoting the use of insights across the organization for informed decision-making
- Communicate technical complexities clearly to non-technical stakeholders and align diverse teams around common goals
- Lead digital transformation initiatives to foster innovation and modernization

IT Management and Strategic Planning
- Oversee the architectural planning, development, and operation of all IT systems, ensuring their scalability, performance, security, and continuous integration/continuous deployment (CI/CD)
- Evaluate, select, and implement cutting-edge technology platforms and infrastructure to enable business growth and competitive advantage
- Develop and manage an IT budget to optimize resource allocation and ensure ROI on IT investments
- Establish IT policies, standards, and procedures in line with best practices and regulatory compliance
- Drive the talent management lifecycle of the IT team, including hiring, training, coaching, and performance management

Profile We Are Looking For
- Working in/anchoring the analytics team for the DTC business and marketplaces
- A person who lives, breathes, and dreams numbers
- Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field; a Master's degree or MBA is a plus
- Minimum of 3-4 years of experience in IT management and/or advanced analytics roles
- Proficiency in advanced analytics techniques (e.g., machine learning) and tools (e.g., Python, R, SQL, Tableau, Hadoop)
- Extensive experience with IT systems architecture, cloud-based solutions (AWS, Google Cloud, Azure), and modern development methodologies (Agile, Scrum, DevOps)
- Proven ability to lead and develop a high-performing team
- Strong communication, strategic thinking, and project management skills
- Familiarity with data privacy standards and regulations (e.g., GDPR, CCPA)
- Experience in creating breakthrough visualizations
- Understanding of RDBMS, data architecture/schemas, data integrations, data models, and data flows is a must

Technical (ideal to have)
- Exposure to our tech stack: PHP
- Knowledge of Microsoft workflows
- Experience in the beauty and personal care industry is desirable

Posted 1 month ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Vadodara

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms. Key Responsibilities: Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT Integrate and manage data from various sources using SAP Data Services Develop and maintain scalable data models, data marts, and data warehouses Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs Implement best practices in data governance, data lineage, and metadata management Monitor data quality, troubleshoot issues, and ensure data integrity Optimize Snowflake data warehouse performance (partitioning, caching, query tuning) Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins) Document architecture, data flows, and technical specifications

Posted 1 month ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Agra

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms. Key Responsibilities: Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT Integrate and manage data from various sources using SAP Data Services Develop and maintain scalable data models, data marts, and data warehouses Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs Implement best practices in data governance, data lineage, and metadata management Monitor data quality, troubleshoot issues, and ensure data integrity Optimize Snowflake data warehouse performance (partitioning, caching, query tuning) Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins) Document architecture, data flows, and technical specifications

Posted 1 month ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Nagpur

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms. Key Responsibilities: Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT Integrate and manage data from various sources using SAP Data Services Develop and maintain scalable data models, data marts, and data warehouses Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs Implement best practices in data governance, data lineage, and metadata management Monitor data quality, troubleshoot issues, and ensure data integrity Optimize Snowflake data warehouse performance (partitioning, caching, query tuning) Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins) Document architecture, data flows, and technical specifications

Posted 1 month ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Jaipur

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms. Key Responsibilities: Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT Integrate and manage data from various sources using SAP Data Services Develop and maintain scalable data models, data marts, and data warehouses Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs Implement best practices in data governance, data lineage, and metadata management Monitor data quality, troubleshoot issues, and ensure data integrity Optimize Snowflake data warehouse performance (partitioning, caching, query tuning) Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins) Document architecture, data flows, and technical specifications

Posted 1 month ago

Apply

6.0 - 10.0 years

16 - 25 Lacs

Faridabad

Work from Office

Job Summary: We are looking for a highly skilled Senior Data Engineer with expertise in Snowflake, DBT (Data Build Tool), and SAP Data Services (SAP DS). The ideal candidate will be responsible for building scalable data pipelines, designing robust data models, and ensuring high data quality across enterprise platforms. Key Responsibilities: Design, build, and optimize data pipelines and ETL/ELT workflows using Snowflake and DBT Integrate and manage data from various sources using SAP Data Services Develop and maintain scalable data models, data marts, and data warehouses Work closely with data analysts, business stakeholders, and BI teams to support reporting and analytics needs Implement best practices in data governance, data lineage, and metadata management Monitor data quality, troubleshoot issues, and ensure data integrity Optimize Snowflake data warehouse performance (partitioning, caching, query tuning) Automate data workflows and deploy DBT models with CI/CD tools (e.g., Git, Jenkins) Document architecture, data flows, and technical specifications

Posted 1 month ago

Apply