Home
Jobs

607 Dataflow Jobs - Page 25

JobPe aggregates results for easy application access, but you apply directly on the original job portal.

4 - 8 years

10 - 13 Lacs

Chennai, Hyderabad

Work from Office

Source: Naukri

What you'll be doing...
You will lead multiple software engineering teams, internal and external, to develop and maintain the Verizon Value real-time personalization platform. This includes implementing omni-channel personalization by building capabilities such as real-time data capture and event triggers, cross-channel transaction flexibility, and customer communications and journey visibility; delivering personalized engagement, dynamic segmentation, and faster decision-making based on real-time customer behavior and preferences; unlocking customer insights; and using predictive analytics to measure campaign success, improve strategy, and drive higher ROI. You will work with and develop the team to deliver high-quality, high-value hyper-personalization across all value brands.

Where you'll be working...
In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.

What we're looking for...
You'll need to have:
  • Bachelor's degree and four or more years of work experience.
  • Four or more years of omni-channel personalization and unified customer profile management for existing customers via a CDP.
  • Experience building and maintaining real-time data capture and event triggers that provide immediate resolution of critical issues through real-time notifications (SMS, email, or push), reducing churn, improving engagement, and increasing conversion rates.
  • Experience with Tealium Tag Manager and Tealium AudienceStream.
  • Experience with BigQuery, Cassandra, and Oracle databases.
  • Extensive knowledge of or experience with PZAI models.
  • Experience developing and building Next Best Experiences using PEGA Decisioning.
  • Experience architecting and overseeing the integration of various data sources, including transactional systems and global and regional data assets.
  • Experience in data pipeline development: building new scripts to develop metrics, or re-engineering scripts created by data science teams so they can be easily plugged into existing ML Ops frameworks.
  • Experience designing and developing the integration of AI/ML model outputs, and building the ML Ops components to support and manage the lifecycle of AI-based models built by the regional and global data science teams.
  • Experience establishing and enforcing data quality standards and governance policies around data and metrics management.
  • Providing leadership on complex projects and mentoring the team, with specific emphasis on cross-channel hyper-personalization experiences.
  • Acting as a liaison between IT and the business community to develop business and system requirements based on input gathered from a variety of sources, including analysis and feedback from end users, subject matter experts, and architects.
  • Presenting problem statements, data, conclusions, and recommendations in a consumable, meaningful, and insightful manner.

Even better if you have one or more of the following:
  • Knowledge of real-time streaming (Kafka).
  • Knowledge of Google Analytics, Airflow, and Dataflow.
  • Knowledge of or experience with multi-brand/multi-channel hyper-personalization.

Posted 3 months ago

Apply

4 - 9 years

15 - 18 Lacs

Hyderabad

Hybrid

Source: Naukri

GCP Data Engineer
Skills: GCP, BigQuery, Python, Airflow, Dataflow, SQL, etc.
Location: Hyderabad (hybrid mode)
Budget: max 30% hike on current CTC

Posted 3 months ago

Apply

4 - 9 years

8 - 12 Lacs

Hyderabad

Hybrid

Source: Naukri

We're seeking curious engineers to work within our feature teams. You will support customers by ensuring the products we deliver are fit for purpose and meet the quality and standards expected. We're looking for someone who wants to further develop their engineering skillset; we'll support you to take on technical challenges and develop your understanding of how data solutions and services are developed, tested, and implemented. If this excites you, you would be an asset to our team!

We are pioneering the transformation of current processes from our on-premises platforms and have started our journey in deploying solutions on the Google Cloud Platform. This role sits within our platform's Analytics Lab, which is looking to create strategic data products and build data and analytics capability. We work using agile delivery practices, so a self-led individual capable of accurately estimating and planning their own work would be valued highly.

What you'd get involved with:
  • Designing, developing, maintaining, and improving data processes to support regulatory and prudential change with high-quality solutions, providing oversight and leadership to help others do the same.
  • Building data pipelines for current and future analytics and reporting solutions.
  • Implementing and embodying engineering standards, using constructive feedback to create opportunities for learning.
  • Working with the Product Owner and customers to understand, refine, and prioritise items for the feature team backlog.
  • Using strong problem-solving skills and a combination of technical knowledge, experience, and judgement to identify available options and clearly set out the way forward.

Key skills required:
  • Passion for software and data engineering, adopting the mindset of a curious engineer.
  • Experience with dbt, SQL, Python, Java, SAS, or other open-source technologies used for analytics.
  • Ability to understand business requirements and create business-ready solutions.
  • Well-developed interpersonal, communication, and influencing skills, particularly the ability to convey key business information arising from complex issues to non-technical people.

Desirable skills include:
  • Cloud understanding, particularly GCP.
  • Knowledge of Terraform.
  • Data engineering background and good knowledge of waterfall and agile development practices.
  • Insights into industry solutions for data management, storage, and analytics, coupled with experience of financial data, including credit risk, capital, and impairment processes.

Posted 3 months ago

Apply

13 - 20 years

45 - 50 Lacs

Pune

Work from Office

Source: Naukri

Role Description
As a solution architect supporting the business-aligned, strategic, or regulatory book of work, you will work closely with the projects' business stakeholders, business analysts, data solution architects, engineers, the lead solution architect, the principal architect, and the Chief Architect to help projects break business requirements into implementable building blocks and design the solution's target architecture. It is the solution architect's responsibility to align the design with the general (enterprise) architecture principles, apply agreed best practices and patterns, and help the engineering teams deliver against the architecture in an event-driven, service-oriented environment.

Your key responsibilities:
  • In projects, work with SMEs and stakeholders to derive the individual components of the solution.
  • Design the target architecture for a new solution, or when adding new capabilities to an existing solution.
  • Assure proper documentation of the High-Level Design and Low-Level Design of the delivered solution.
  • Quality-assure the delivery against the agreed and approved architecture, i.e. provide delivery guidance and governance.
  • Prepare the High-Level Design for review and approval by design authorities for projects to proceed into implementation.
  • Support creation of the Low-Level Design as it is being delivered to support final go-live.

Your skills and experience:
  • Very proficient at designing/architecting solutions in an event-driven environment leveraging service-oriented principles.
  • Proficient in Java and the delivery of Spring-based services.
  • Proficient at building systems in a decoupled, event-driven environment leveraging messaging/streaming, i.e. Kafka.
  • Very good experience designing and implementing data streaming and data ingest pipelines for heavy throughput while providing different delivery guarantees (at-least-once, exactly-once).
  • Very good understanding of non-streaming ETL and ELT approaches for data ingest.
  • Solid understanding of containerized, distributed systems and of building auto-scalable, stateless services in a cluster (concepts of quorum and consensus).
  • Solid understanding of standard RDBMS systems, with data-engineering-level proficiency in T-SQL and/or PL/SQL and/or PL/pgSQL, preferably all three.
  • Good understanding of how RDBMSs generally work; specific tuning experience on SQL Server, Oracle, or PostgreSQL is welcome.
  • Understanding of modelling/implementing different data modelling approaches, as well as their respective pros and cons (e.g. normalized, denormalized, star, snowflake, Data Vault 2.0, ...).
  • Strong working experience with GCP architecture is a benefit (Apigee, BigQuery, Pub/Sub, Cloud SQL, Dataflow, ...); an appropriate GCP architecture-level certification even more so.
  • Experience with other languages is more than welcome (C#, Python).
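The listing above distinguishes at-least-once from exactly-once delivery guarantees for streaming pipelines. As a rough illustration (all names here are invented for the sketch, not drawn from Kafka or any specific library): under at-least-once delivery the producer retries until it receives an acknowledgement, so the consumer may see duplicates, and deduplicating on a stable message ID on the consumer side yields effectively exactly-once processing.

```python
# At-least-once delivery sketch: the producer retries until acknowledged,
# so the consumer may receive duplicates; deduplicating on a stable
# message ID gives effectively exactly-once *processing*.

def deliver_at_least_once(messages, transport):
    """Retry each message until the transport acknowledges it."""
    for msg_id, payload in messages:
        while not transport.send(msg_id, payload):
            pass  # no ack: retry, possibly duplicating delivery

class FlakyTransport:
    """Loses the ack for the first attempt of every message; the payload
    still arrives, so each message ends up delivered twice."""
    def __init__(self):
        self.received = []   # what the consumer actually sees
        self.attempts = {}

    def send(self, msg_id, payload):
        n = self.attempts.get(msg_id, 0) + 1
        self.attempts[msg_id] = n
        self.received.append((msg_id, payload))  # delivery happened
        return n >= 2        # ack is lost on the first attempt

def dedupe(received):
    """Consumer-side idempotency: process each message ID exactly once."""
    seen, out = set(), []
    for msg_id, payload in received:
        if msg_id not in seen:
            seen.add(msg_id)
            out.append(payload)
    return out

transport = FlakyTransport()
deliver_at_least_once([(1, "debit"), (2, "credit")], transport)
print(len(transport.received))     # 4: each message delivered twice
print(dedupe(transport.received))  # ['debit', 'credit']
```

The design trade-off the listing alludes to: true exactly-once delivery is expensive (it needs transactional coordination between producer, broker, and consumer), so practical systems often settle for at-least-once delivery plus idempotent, deduplicating consumers.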

Posted 3 months ago

Apply

5 - 8 years

0 Lacs

Greater Kolkata Area

On-site

Source: LinkedIn

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager

Job Description & Summary
A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation, and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients run their business more effectively and understand what business questions can be answered and how to unlock the answers.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes, and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences, or any status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Job Description
We are seeking skilled and dynamic Data Engineers specializing in AWS, Azure, Databricks, and GCP. The ideal candidate will have a strong background in data engineering, with a focus on data ingestion, transformation, and warehousing. They should also possess excellent knowledge of PySpark or Spark, and a proven ability to optimize performance in Spark job executions.

Key Responsibilities
  • Design, build, and maintain scalable data pipelines for a variety of cloud platforms, including AWS, Azure, Databricks, and GCP.
  • Implement data ingestion and transformation processes to facilitate efficient data warehousing.
  • Utilize cloud services to enhance data processing capabilities:
    AWS: Glue, Athena, Lambda, Redshift, Step Functions, DynamoDB, SNS.
    Azure: Data Factory, Synapse Analytics, Functions, Cosmos DB, Event Grid, Logic Apps, Service Bus.
    GCP: Dataflow, BigQuery, Dataproc, Cloud Functions, Bigtable, Pub/Sub, Data Fusion.
  • Optimize Spark job performance to ensure high efficiency and reliability.
  • Stay proactive in learning and implementing new technologies to improve data processing frameworks.
  • Collaborate with cross-functional teams to deliver robust data solutions.
  • Work on Spark Streaming for real-time data processing as necessary.

Qualifications
  • 8-11 years of experience in data engineering with a strong focus on cloud environments.
  • Proficiency in PySpark or Spark is mandatory.
  • Proven experience with data ingestion, transformation, and data warehousing.
  • In-depth knowledge of and hands-on experience with cloud services (AWS/Azure/GCP).
  • Demonstrated ability in performance optimization of Spark jobs.
  • Strong problem-solving skills and the ability to work independently as well as in a team.
  • Cloud certification (AWS, Azure, or GCP) is a plus.
  • Familiarity with Spark Streaming is a bonus.

Mandatory Skill Sets: Python, PySpark, SQL with (AWS or Azure or GCP)
Preferred Skill Sets: Python, PySpark, SQL with (AWS or Azure or GCP)
Years of Experience Required: 8-11 years
Education Qualification: BE/BTech, ME/MTech, MBA, MCA
Degrees/Field of Study Required: Bachelor of Engineering
Required Skills: Natural Language Processing (NLP)
Optional Skills: Accepting Feedback, Active Listening, Analytical Reasoning, Analytical Thinking, Application Software, Business Data Analytics, Business Management, Business Technology, Business Transformation, Coaching and Feedback, Communication, Creativity, Documentation Development, Embracing Change, Emotional Regulation, Empathy, Implementation Research, Implementation Support, Implementing Technology, Inclusion, Intellectual Curiosity, Learning Agility, Optimism, Performance Assessment {+ 21 more}
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No

Posted 4 months ago

Apply

5 - 8 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

WHAT YOU DO AT AMD CHANGES EVERYTHING We care deeply about transforming lives with AMD technology to enrich our industry, our communities, and the world. Our mission is to build great products that accelerate next-generation computing experiences - the building blocks for the data center, artificial intelligence, PCs, gaming and embedded. Underpinning our mission is the AMD culture. We push the limits of innovation to solve the world’s most important challenges. We strive for execution excellence while being direct, humble, collaborative, and inclusive of diverse perspectives. AMD together we advance_ MTS SOFTWARE DEVELOPMENT ENGINEER - PERFORMANCE ARCHITECT The Role Performance modelling and evaluation of ACAP workloads to eliminate bottlenecks as early as possible and guide the architecture of future generation devices. This is a challenging role in the FPGA Silicon Architecture Group in AECG business unit of AMD in Hyderabad. About The Team AECG group in AMD designs cutting edge FPGAs and Adaptable SOCs consisting of processor subsystems and associated peripherals, programmable fabric, memory controllers, I/O interfaces and interconnect. 
Key Responsibilities
  • Modelling and simulation of workload dataflow networks and clock-accurate SOC components.
  • Performance analysis and identification of bottlenecks.
  • Quick prototyping, long-term design decisions, and exploring novel architectures.
  • Enhancement of the existing tools and knowledge base.
  • Collaborating with architects in the development of next-generation devices.
  • Collaborating with customer-facing teams to identify scope for optimization in future market scenarios.
  • Breaking down system-level designs into simpler dataflow models to identify bottlenecks and capture memory and communication overheads.
  • Knowledge sharing with teammates through thorough documentation.

Preferred Experience
  • Experience in SOC architecture or performance analysis.
  • Experience in modelling and simulation.
  • Experience in developing clock-accurate models and analytical models of dataflows.
  • Strong background in computer architecture, hardware performance metrics, and bottlenecks.
  • Experience in performance profiling, creating experiments to address various use cases, and doing design-space exploration.
  • Good to have: experience creating designs for ACAP devices or HLS.
  • Good communication skills.

Academic Credentials
B.Tech/M.Tech/PhD in Electrical/Computer Engineering, Computer Science, or related fields, with appropriate prior experience.

Benefits offered are described at: AMD benefits at a glance. AMD does not accept unsolicited resumes from headhunters, recruitment agencies, or fee-based recruitment services. AMD and its subsidiaries are equal opportunity, inclusive employers and will consider all applicants without regard to age, ancestry, color, marital status, medical condition, mental or physical disability, national origin, race, religion, political and/or third-party affiliation, sex, pregnancy, sexual orientation, gender identity, military or veteran status, or any other characteristic protected by law.
We encourage applications from all qualified candidates and will accommodate applicants’ needs under the respective laws throughout all stages of the recruitment and selection process.

Posted 5 months ago

Apply

0.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

Source: Indeed

Job Information
Date Opened: 03/20/2023
Job Type: Permanent
RSD No: 6676
Industry: IT Services
Min Experience: 6
Max Experience: 8
City: Chennai
State/Province: Tamil Nadu
Country: India
Zip/Postal Code: 600018

Job Description
6-8 years of IT experience, with 6+ years as a Google Cloud Platform Data Engineer; hands-on experience with BigQuery and Java is a must.
  • Expert in core and advanced Java; Python is value-added.
  • Hands-on experience building Apache Beam/Cloud Dataflow pipelines for batch and streaming using Java and Python.
  • Hands-on experience with Apache Airflow/Cloud Composer to create end-to-end pipelines.
  • Design and implement data engineering, ingestion, and curation functions.
  • Understanding of data warehousing concepts.
  • Experience implementing SQL standards and various SQL scenarios.

At Indium, diversity, equity, and inclusion (DEI) are the cornerstones of our values. We champion DEI through a dedicated council, expert sessions, and tailored training programs, ensuring an inclusive workplace for all. Our initiatives, including the WE@IN women empowerment program and our DEI calendar, foster a culture of respect and belonging. Recognized with the Human Capital Award, we are committed to creating an environment where every individual thrives. Join us in building a workplace that values diversity and drives innovation.
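Batch pipelines like the Apache Beam/Cloud Dataflow ones this role asks for follow a read → transform → aggregate → write shape, where Beam expresses each stage as a PCollection transform (ReadFromText, ParDo, GroupByKey/Combine). A minimal standard-library sketch of that shape, with each stage as a plain function for illustration (the data and stage names are invented, not Beam APIs):

```python
# Batch pipeline shape: read -> transform -> aggregate.
# In Beam each stage would be a PCollection transform; here each is a
# plain function so the sketch runs with only the standard library.
from collections import Counter

def read(lines):
    """Source stage: yield raw records (cf. Beam's ReadFromText)."""
    yield from lines

def transform(records):
    """ParDo-like stage: parse CSV rows and filter malformed ones."""
    for rec in records:
        parts = rec.split(",")
        if len(parts) == 2 and parts[1].isdigit():
            yield parts[0], int(parts[1])

def aggregate(pairs):
    """GroupByKey/Combine-like stage: sum values per key."""
    totals = Counter()
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

raw = ["a,1", "b,2", "a,3", "bad-row"]
result = aggregate(transform(read(raw)))
print(result)  # {'a': 4, 'b': 2}
```

Because each stage is a generator, records stream through one at a time rather than materializing intermediate lists, which is the same laziness a Dataflow runner exploits when it fuses and parallelizes transforms.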

Posted 2 years ago

Apply

Exploring Dataflow Jobs in India

The dataflow job market in India is currently experiencing a surge in demand for skilled professionals. With the increasing reliance on data-driven decision-making in various industries, the need for individuals proficient in managing and analyzing dataflow is on the rise. This article aims to provide job seekers with valuable insights into the dataflow job landscape in India.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Pune
  4. Hyderabad
  5. Delhi

These cities are known for their thriving tech ecosystems and are home to numerous companies actively hiring for dataflow roles.

Average Salary Range

The average salary range for dataflow professionals in India varies based on experience levels. Entry-level positions can expect to earn between INR 4-6 lakhs per annum, while experienced professionals can command salaries upwards of INR 12-15 lakhs per annum.

Career Path

In the dataflow domain, a typical career path may involve starting as a Junior Data Analyst or Data Engineer, progressing to roles such as Senior Data Scientist or Data Architect, and eventually reaching positions like Tech Lead or Data Science Manager.

Related Skills

In addition to expertise in dataflow tools and technologies, dataflow professionals are often expected to have proficiency in programming languages such as Python or R, knowledge of databases like SQL, and familiarity with data visualization tools like Tableau or Power BI.

Interview Questions

  • What is dataflow and how is it different from data streaming? (basic)
  • Explain the difference between batch processing and real-time processing. (medium)
  • How do you handle missing or null values in a dataset? (basic)
  • Can you explain the concept of data lineage? (medium)
  • What is the importance of data quality in dataflow processes? (basic)
  • How do you optimize dataflow pipelines for performance? (medium)
  • Describe a time when you had to troubleshoot a dataflow issue. (medium)
  • What are some common challenges faced in dataflow projects? (medium)
  • How do you ensure data security and compliance in dataflow processes? (medium)
  • What are the key components of a dataflow architecture? (medium)
  • Explain the concept of data partitioning in dataflow. (advanced)
  • How would you handle a sudden increase in data volume in a dataflow pipeline? (advanced)
  • What role does data governance play in dataflow processes? (medium)
  • Can you discuss the advantages and disadvantages of using cloud-based dataflow solutions? (medium)
  • How do you stay updated with the latest trends and technologies in dataflow? (basic)
  • What is the significance of metadata in dataflow management? (medium)
  • Walk us through a dataflow project you have worked on from start to finish. (medium)
  • How do you ensure data quality and consistency across different data sources in a dataflow pipeline? (medium)
  • What are some best practices for monitoring and troubleshooting dataflow pipelines? (medium)
  • How do you handle data transformations and aggregations in a dataflow process? (basic)
  • What are the key performance indicators you would track in a dataflow project? (medium)
  • How do you collaborate with cross-functional teams in a dataflow project? (basic)
  • Can you explain the concept of data replication in dataflow management? (advanced)
  • How do you approach data modeling in a dataflow project? (medium)
  • Describe a challenging dataflow problem you encountered and how you resolved it. (advanced)

Closing Remark

As you navigate the dataflow job market in India, remember to showcase your skills and experiences confidently during interviews. Stay updated with the latest trends in dataflow and continuously upskill to stand out in a competitive job market. Best of luck in your job search journey!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
