
547 Dataflow Jobs - Page 16

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

4.0 years

0 Lacs

India

Remote


Job Post: AI/ML Engineer
Experience: 4+ years
Location: Remote

Key Responsibilities:
- Design, build, and maintain ML infrastructure on GCP using tools such as Vertex AI, GKE, Dataflow, BigQuery, and Cloud Functions.
- Develop and automate ML pipelines for model training, validation, deployment, and monitoring using tools like Kubeflow Pipelines, TFX, or Vertex AI Pipelines (see the sketch below).
- Work with Data Scientists to productionize ML models and support experimentation workflows.
- Implement model monitoring and alerting for drift, performance degradation, and data quality issues.
- Manage and scale containerized ML workloads using Kubernetes (GKE) and Docker.
- Set up CI/CD workflows for ML using tools like Cloud Build, Bitbucket, Jenkins, or similar.
- Ensure proper security, versioning, and compliance across the ML lifecycle.
- Maintain documentation, artifacts, and reusable templates for reproducibility and auditability.
- A GCP Machine Learning Engineer certification is a plus.
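To make the pipeline tooling above concrete, here is a minimal sketch of a two-step Vertex AI pipeline using the KFP SDK (kfp and google-cloud-aiplatform packages). The project, region, bucket, and component logic are illustrative assumptions, not details from the posting.

```python
# Minimal sketch of defining, compiling, and submitting a Vertex AI pipeline;
# all names and URIs below are placeholders.
from kfp import dsl, compiler
from google.cloud import aiplatform

@dsl.component(base_image="python:3.10")
def validate_data(rows: int) -> int:
    # Placeholder validation step; a real component would read from BigQuery/GCS.
    if rows <= 0:
        raise ValueError("empty training set")
    return rows

@dsl.component(base_image="python:3.10")
def train_model(rows: int) -> str:
    # Placeholder training step; returns a hypothetical model artifact URI.
    return f"gs://example-bucket/models/trained-on-{rows}-rows"

@dsl.pipeline(name="example-training-pipeline")
def pipeline(rows: int = 1000):
    validated = validate_data(rows=rows)
    train_model(rows=validated.output)

if __name__ == "__main__":
    compiler.Compiler().compile(pipeline_func=pipeline, package_path="pipeline.json")
    aiplatform.init(project="example-project", location="us-central1")
    job = aiplatform.PipelineJob(
        display_name="example-training-pipeline",
        template_path="pipeline.json",
        pipeline_root="gs://example-bucket/pipeline-root",
    )
    job.run()  # blocks until the pipeline finishes
```

In practice, components like these would be versioned as reusable templates, which is exactly the reproducibility requirement the posting describes.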

Posted 2 weeks ago

Apply

6.0 years

0 Lacs

Gurugram, Haryana, India

Remote


What would a typical day at your work be like?
- You will lead and manage the delivery of projects and be responsible for the delivery of project and team goals.
- Build and support data ingestion and processing pipelines. This entails extract, load, and transform of data from a wide variety of sources using the latest data frameworks and technologies.
- Design, build, test, and maintain machine learning infrastructure and frameworks to empower data scientists to rapidly iterate on model development.
- Own and lead client engagement and communication on technical projects. Define project scopes and track project progress and delivery.
- Plan and execute project architecture and allocate work to the team.
- Keep up to date with advances in big data technologies and run pilots to design a data architecture that scales with increased data volumes.
- Partner with software engineering teams to drive completion of multi-functional projects.

What do we expect?
- Minimum 6 years of overall experience in data engineering, including 2+ years leading a team as team lead and doing project management.
- Experience working with a global team and remote clients.
- Hands-on experience in building data pipelines on various infrastructures (see the sketch below).
- Knowledge of statistical and machine learning techniques, and hands-on experience integrating machine learning into data pipelines.
- Ability to work hands-on with the data engineers in the team on design and development of solutions using the relevant big data technologies and data warehouse concepts.
- Strong knowledge of advanced SQL, data warehousing concepts, and data mart design.
- Strong experience with modern data platform components such as Spark and Python.
- Experience setting up and maintaining data warehouses (Google BigQuery, Redshift, Snowflake) and data lakes (GCS, AWS S3, etc.) for an organization.
- Experience building data pipelines with AWS Glue, Azure Data Factory, and Google Dataflow.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra/MongoDB.
- Strong problem-solving and communication skills.
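As an illustration of the pipeline-building skills this listing asks for, below is a minimal sketch of a batch Apache Beam job that reads CSV files from Cloud Storage and writes to BigQuery; with the right pipeline options it runs on Dataflow. All project, bucket, and schema names are placeholders.

```python
# Minimal batch pipeline with Apache Beam's Python SDK (pip install apache-beam[gcp]).
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_csv_line(line: str) -> dict:
    # Toy transform: split a "user_id,amount" CSV row into a BigQuery-ready dict.
    user_id, amount = line.split(",")
    return {"user_id": user_id, "amount": float(amount)}

if __name__ == "__main__":
    # Pass runner="DataflowRunner" plus project/region/temp_location to run on GCP;
    # the default DirectRunner executes locally for testing.
    options = PipelineOptions()
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/input/*.csv")
            | "Parse" >> beam.Map(parse_csv_line)
            | "Write" >> beam.io.WriteToBigQuery(
                "example-project:analytics.transactions",
                schema="user_id:STRING,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )
```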

Posted 2 weeks ago

Apply

0.0 - 5.0 years

0 Lacs

Gurugram, Haryana

On-site


Senior Data Engineer (GCP, Python)
Gurgaon, India | Information Technology | Job ID 314204

About the Role:
Grade Level (for internal use): 10
S&P Global Mobility - The Role: Senior Data Engineer

Department overview:
Automotive Insights at S&P Mobility leverages technology and data science to provide unique insights, forecasts, and advisory services spanning every major market and the entire automotive value chain—from product planning to marketing, sales, and the aftermarket. We provide the most comprehensive data spanning the entire automotive lifecycle—past, present, and future. With over 100 years of history, unmatched credentials, and a larger customer base than any other provider, we are the industry benchmark for clients around the world, helping them make informed decisions to capitalize on opportunity and avoid risk. Our solutions are used by nearly every major OEM, 90% of the top 100 tier-one suppliers, media agencies, governments, insurance companies, and financial stakeholders to provide actionable insights that enable better decisions and better results.

Position summary:
S&P Global is seeking an experienced and driven Senior Data Engineer who is passionate about delivering high-value, high-impact solutions to the world's most demanding, high-profile clients. The ideal candidate must have at least 5 years of experience developing and deploying data pipelines on Google Cloud Platform (GCP), and should be passionate about building high-quality, reusable pipelines using cutting-edge technologies. This role involves designing, building, and maintaining scalable data pipelines, optimizing workflows, and ensuring data integrity across multiple systems. The candidate will collaborate with data scientists, analysts, and software engineers to develop robust and efficient data solutions.

Responsibilities:
- Design, develop, and maintain scalable ETL/ELT pipelines.
- Optimize and automate data ingestion, transformation, and storage processes.
- Work with structured and unstructured data sources, ensuring data quality and consistency.
- Develop and maintain data models, warehouses, and databases.
- Collaborate with cross-functional teams to support data-driven decision-making.
- Ensure data security, privacy, and compliance with industry standards.
- Troubleshoot and resolve data-related issues in a timely manner.
- Monitor and improve system performance, reliability, and scalability.
- Stay up to date with emerging data technologies and recommend improvements to our data architecture and engineering practices.

What you will need:
- Strong programming skills in Python.
- 5+ years of experience in data engineering, ETL development, or a related role.
- Proficiency in SQL and experience with relational (PostgreSQL, MySQL, etc.) and NoSQL (DynamoDB, MongoDB, etc.) databases.
- Proficiency building data pipelines on Google Cloud Platform (GCP) using services such as Dataflow, Cloud Batch, BigQuery, BigTable, Cloud Functions, Cloud Workflows, and Cloud Composer (see the Composer/Airflow sketch below).
- Strong understanding of data modeling, data warehousing, and data governance principles.
- Ability to mentor junior data engineers and assist them with technical challenges.
- Familiarity with orchestration tools like Apache Airflow, and with containerization and orchestration.
- Experience with version control systems (Git) and CI/CD pipelines.
- Excellent problem-solving skills and ability to work in a fast-paced environment.
- Excellent communication skills.
- Hands-on experience with Snowflake is a plus.
- Experience with big data technologies is a plus.
- Experience with AWS is a plus.
- Ability to convert business queries into technical documentation.

Education and Experience:
- Bachelor's degree in Computer Science, Information Systems, Information Technology, or a similar major, or a Certified Development Program.
- 5+ years of experience building data pipelines using Python and GCP (Google Cloud Platform).

About Company Statement:
S&P Global delivers essential intelligence that powers decision making. We provide the world's leading organizations with the right data, connected technologies and expertise they need to move ahead. As part of our team, you'll help solve complex challenges that equip businesses, governments and individuals with the knowledge to adapt to a changing economic landscape. S&P Global Mobility turns invaluable insights captured from automotive data into help for our clients to understand today's market, reach more customers, and shape the future of automotive mobility.

About S&P Global Mobility:
At S&P Global Mobility, we provide invaluable insights derived from unmatched automotive data, enabling our customers to anticipate change and make decisions with conviction. Our expertise helps them to optimize their businesses, reach the right consumers, and shape the future of mobility. We open the door to automotive innovation, revealing the buying patterns of today and helping customers plan for the emerging technologies of tomorrow. For more information, visit www.spglobal.com/mobility.

What's In It For You?

Our Purpose:
Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology—the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People:
We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits:
We take care of you, so you can take care of business. We care about our people. That's why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include:
- Health & Wellness: Health care coverage designed for the mind and body.
- Flexible Downtime: Generous time off helps keep you energized for your time on.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference.
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global:
At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Equal Opportunity Employer:
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority - Ratings (Strategic Workforce Planning)

Job ID: 314204
Posted On: 2025-05-30
Location: Gurgaon, Haryana, India
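Since this posting calls out Cloud Composer and Apache Airflow, here is a minimal sketch of a daily DAG, assuming Airflow 2.x; the DAG id, schedule, and task body are illustrative placeholders.

```python
# Minimal Cloud Composer / Apache Airflow DAG orchestrating a daily ETL step.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_and_load(**context):
    # Placeholder task body; a real task might load a GCS file into BigQuery.
    print(f"running for logical date {context['ds']}")

with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(
        task_id="extract_and_load",
        python_callable=extract_and_load,
    )
```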

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

India

Remote


Job Description
KLDiscovery, a leading global provider of electronic discovery, information governance and data recovery services, is currently seeking a Senior Software Engineer for an exciting new opportunity. This person will develop core parts of our eDiscovery offerings, including software development, testing, and systems automation. They will collaborate with team members, product owners, designers, architects, and other development teams to research relevant technologies and build innovative solutions that enhance our offerings and exceed customer needs. If you like working in a creative, technology-driven, high-energy, collaborative, casual environment, and you have strong software development abilities, this is the opportunity for you! Hybrid or remote, work-from-home opportunity.

Responsibilities:
- Create, validate, and review program code per specifications.
- Develop automated unit and API tests.
- Support bug fixes and implement enhancements to applications in production.
- Create, design, and review software documentation.
- Utilize, communicate, and enforce coding standards.
- Provide technical support for applications in production within defined SLAs.
- Adhere to development processes and workflows.
- Assist and mentor the team, demonstrating technical excellence.
- Detect problems and areas that need improvement early, and raise issues.

Qualifications:
- Fluent English (C1)
- At least 4 years of commercial, hands-on software development experience in C#/.NET and C++
- Experience with ASP.NET Core Blazor
- Experience with desktop applications (WinForms preferred)
- Experience with background jobs and workers (e.g., Hangfire)
- Experience with Angular is a plus
- Creating dataflow/sequence/C4 diagrams
- Good understanding of at least one architectural/design pattern: MVC/MVP/MVVM/Clean/Screaming/Hexagonal architectures
- .NET memory model and performance optimization solutions
- Writing functional tests and structure tests
- Understanding modularity and vertical slices
- Data privacy and securing desktop apps
- Ability to design functionalities based on requirements
- Experience with Entity Framework Core

Our Cultural Values:
Entrepreneurs at heart, we are a customer-first team sharing one goal and one vision. We seek team members who are:
- Humble - No one is above another; we all work together to meet our clients' needs and we acknowledge our own weaknesses
- Hungry - We all are driven internally to be successful and to continually expand our contribution and impact
- Smart - We use emotional intelligence when working with one another and with clients
Our culture shapes our actions, our products, and the relationships we forge with our customers.

Who We Are:
KLDiscovery provides technology-enabled services and software to help law firms, corporations, government agencies and consumers solve complex data challenges. The company, with offices in 26 locations across 17 countries, is a global leader in delivering best-in-class eDiscovery, information governance and data recovery solutions to support the litigation, regulatory compliance, internal investigation and data recovery and management needs of our clients. Serving clients for over 30 years, KLDiscovery offers data collection and forensic investigation, early case assessment, electronic discovery and data processing, application software and data hosting for web-based document reviews, and managed document review services. In addition, through its global Ontrack Data Recovery business, KLDiscovery delivers world-class data recovery, email extraction and restoration, data destruction and tape management.

KLDiscovery has been recognized as one of the fastest growing companies in North America by both Inc. Magazine (Inc. 5000) and Deloitte (Deloitte's Technology Fast 500), and CEO Chris Weiler has been honored as a past Ernst & Young Entrepreneur of the Year™. Additionally, KLDiscovery is an Orange-level Relativity Best in Service Partner, a Relativity Premium Hosting Partner, and maintains ISO/IEC 27001 certified data centers.

KLDiscovery is an Equal Opportunity Employer.

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


What you'll do:
- Design, develop, and operate high-scale applications across the full engineering stack.
- Design, develop, test, deploy, maintain, and improve software.
- Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.).
- Work across teams to integrate our systems with existing internal systems, Data Fabric, and the CSA Toolset.
- Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
- Participate in a tight-knit, globally distributed engineering team.
- Triage product or system issues and debug/track/resolve them by analyzing the sources of issues and their impact on network or service operations and quality.
- Research, create, and develop software applications to extend and improve on Equifax solutions.
- Manage individual project priorities, deadlines, and deliverables.
- Collaborate on scalability issues involving access to data and information.
- Actively participate in Sprint planning, Sprint retrospectives, and other team activities.

What experience you need:
- Bachelor's degree or equivalent experience
- 5+ years of software engineering experience
- 5+ years of experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, and CSS
- 5+ years of experience with cloud technology: GCP, AWS, or Azure
- 5+ years of experience designing and developing cloud-native solutions
- 5+ years of experience designing and developing microservices using Java, SpringBoot, GCP SDKs, and GKE/Kubernetes
- 5+ years of experience deploying and releasing software using Jenkins CI/CD pipelines, with an understanding of infrastructure-as-code concepts, Helm Charts, and Terraform constructs

What could set you apart:
- Self-starter who identifies and responds to priority shifts with minimal supervision.
- Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others (a streaming sketch follows below).
- UI development (e.g., HTML, JavaScript, Angular, and Bootstrap)
- Experience with backend technologies such as Java/J2EE, SpringBoot, SOA, and microservices
- Source code control management systems (e.g., SVN/Git, GitHub) and build tools like Maven and Gradle
- Agile environments (e.g., Scrum, XP)
- Relational databases (e.g., SQL Server, MySQL)
- Atlassian tooling (e.g., JIRA, Confluence, and GitHub)
- Developing with a modern JDK (v1.7+)
- Automated testing: JUnit, Selenium, LoadRunner, SoapUI
- Cloud certification strongly preferred
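For the Dataflow/Apache Beam experience called out above, a minimal streaming sketch follows, assuming the apache-beam[gcp] package; the subscription, window size, and print sink are placeholders for a real pipeline.

```python
# Minimal streaming Dataflow job reading from Pub/Sub with Apache Beam's Python SDK.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions
from apache_beam.transforms import window

if __name__ == "__main__":
    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True  # enable streaming mode

    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromPubSub(
                subscription="projects/example-project/subscriptions/example-sub"
            )
            | "Decode" >> beam.Map(lambda b: b.decode("utf-8"))
            | "Window" >> beam.WindowInto(window.FixedWindows(60))  # 60-second windows
            | "CountPerWindow" >> beam.combiners.Count.Globally().without_defaults()
            | "Log" >> beam.Map(print)  # stand-in for a real sink such as BigQuery
        )
```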

Posted 2 weeks ago

Apply

10.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


We are looking for an experienced Integration Technical Lead with over 10 years of in-depth experience in Oracle Fusion Middleware technologies such as SOA Suite, Oracle Service Bus (OSB), and Oracle Data Integrator (ODI). The candidate will be responsible for leading integration initiatives, including custom development, platform customization, and day-to-day operational support. A strong interest in Google Cloud Platform (GCP) is highly desirable, with clear opportunities for training and skill development.

ShyftLabs is a growing data product company founded in early 2020 that works primarily with Fortune 500 companies. We deliver digital solutions built to help accelerate the growth of businesses in various industries by focusing on creating value through innovation.

Job Responsibilities:
1. Integration Leadership & Development:
- Lead end-to-end integration design and development across on-premise and cloud systems using Oracle SOA, OSB, and ODI.
- Drive new integration projects, from requirements gathering through to deployment and support.
- Develop, customize, and maintain reusable integration components and templates.
- Translate complex business processes into scalable, secure, and performant integration solutions.
2. Platform Customization & Optimization:
- Customize Oracle Fusion Middleware components to meet specific business needs and performance objectives.
- Evaluate existing integrations and enhance them for greater efficiency and lower latency.
- Implement best practices in integration design, error handling, and performance tuning.
3. Operational Excellence & Support:
- Own the operational stability of integration platforms, including monitoring, incident resolution, and root cause analysis.
- Manage daily operations such as deployments, patches, backups, and performance reviews.
- Collaborate with IT support teams to maintain integration SLAs, uptime, and reliability.
4. Cloud Integration & GCP Adoption:
- Contribute to the design of hybrid and cloud-native integration architectures using GCP.
- Learn and eventually implement integration patterns using tools like Apigee, Pub/Sub, Cloud Functions, and Dataflow (a Pub/Sub-triggered Cloud Function sketch follows below).
- Participate in the GCP migration initiative for legacy integration assets.

Basic Qualifications:
- 10+ years of hands-on experience with Oracle SOA Suite, OSB, and ODI in enterprise environments.
- Expertise in web services (REST/SOAP), XML, XSD, XSLT, XPath, and service orchestration.
- Strong skills in platform customization, new integration development, integration monitoring, alerting and troubleshooting processes, and long-term system maintenance.
- Experience with performance optimization, fault tolerance, and secure integrations.
- Excellent communication and team leadership skills.

Preferred Qualifications:
- Exposure to Google Cloud Platform (GCP), or strong interest and ability to learn.
- Familiarity with GCP services for integration (Pub/Sub, Cloud Storage/Functions).
- Understanding of containerized deployments using Docker and Kubernetes.
- Experience with DevOps tools and CI/CD pipelines for integration delivery.

We are proud to offer a competitive salary alongside a strong insurance package. We pride ourselves on the growth of our employees, offering extensive learning and development resources.
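To illustrate the Pub/Sub and Cloud Functions integration pattern this role would grow into, here is a minimal sketch of a Pub/Sub-triggered Cloud Function (2nd gen) using the functions-framework package; the event fields and routing logic are assumptions for illustration.

```python
# Minimal Pub/Sub-triggered Cloud Function; the topic binding is configured at
# deploy time (e.g., gcloud functions deploy ... --gen2 --trigger-topic=TOPIC).
import base64
import json

import functions_framework

@functions_framework.cloud_event
def handle_integration_event(cloud_event):
    # Pub/Sub delivers the message payload base64-encoded inside the CloudEvent.
    payload = base64.b64decode(cloud_event.data["message"]["data"]).decode("utf-8")
    event = json.loads(payload)

    # Placeholder routing logic; a real function might call a downstream REST
    # service or write to BigQuery/Cloud Storage.
    print(f"received event {event.get('id')} of type {event.get('type')}")
```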

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

India

On-site


Description

The Position:
We are seeking a seasoned engineer with a passion for changing the way millions of people save energy. You'll work within the Engineering team to build and improve our platforms to deliver flexible and creative solutions to our utility partners and end users, and help us achieve our ambitious goals for our business and the planet.

We are seeking a skilled and passionate Data Engineer - Business Intelligence with expertise in data engineering and BI reporting to join our development team. As a Data Engineer, you will play a crucial role in developing different components, harnessing the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and data processing, and identify the crucial data required for insightful analysis. You'll tackle obstacles related to database integration and untangle complex, unstructured data sets. You will also work on creating BI reports as well as developing a Business Intelligence platform that will enable users to create reports and dashboards based on their requirements. You will coordinate with the rest of the team working on different layers of the infrastructure; therefore, a commitment to collaborative problem solving, sophisticated design, and quality product is important. You will own the development and its quality independently and be responsible for high-quality deliverables. And you will work with a great team with excellent benefits.

Responsibilities & Skills:
You should:
- Be excited to work with talented, committed people in a fast-paced environment.
- Have proven experience as a Data Engineer with a focus on BI reporting.
- Design, build, and maintain high-performance solutions with reusable, reliable code.
- Use a rigorous approach to product improvement and customer satisfaction.
- Love developing great software as a seasoned product engineer.
- Be ready, able, and willing to jump onto a call with stakeholders to help solve problems.
- Be able to deliver against several initiatives simultaneously.
- Have a strong eye for detail and quality of code.
- Have an agile mindset.
- Have strong problem-solving skills and attention to detail.

Required Skills (Data Engineer):
- You ideally have 2+ years of professional experience.
- Design, build, and maintain scalable data pipelines and ETL processes to support business analytics and reporting needs.
- Strong experience with SQL for querying and transforming large datasets, and optimizing query performance in relational databases.
- Proficiency in Python for building and automating data pipelines, ETL processes, and data integration workflows.
- Familiarity with big data frameworks such as Apache Spark or PySpark for distributed data processing.
- Strong understanding of data modeling principles for building scalable and efficient data architectures (e.g., star schema, snowflake schema).
- Good to have: experience with Databricks for managing and processing large datasets, implementing Delta Lake, and leveraging its collaborative environment.
- Knowledge of Google Cloud Platform (GCP) services like BigQuery, Dataflow, Pub/Sub, and Cloud Storage for end-to-end data engineering solutions (see the BigQuery sketch below).
- Familiarity with version control systems such as Git and CI/CD pipelines for managing code and deploying workflows.
- Awareness of data governance and security best practices, including access control, data masking, and compliance with industry standards.
- Exposure to monitoring and logging tools like Datadog, Cloud Logging, or the ELK stack for maintaining pipeline reliability.
- Ability to understand business requirements and translate them into technical requirements.
- Inclination to design solutions for complex data problems.
- Ability to deliver against several initiatives simultaneously as a multiplier.
- Demonstrable experience with writing unit and functional tests.

Required Skills (BI Reporting):
- Strong experience in developing Business Intelligence reports and dashboards via tools such as Tableau, PowerBI, Sigma, etc.
- Ability to analyse and deeply understand the data, relate it to the business application, and derive meaningful insights from the data.

The following experiences are not required, but you'll stand out from other applicants if you have any of them, in our order of importance:
- You are an experienced developer - a minimum of 2+ years of professional experience.
- Work experience and strong proficiency in Python, SQL, and BI reporting and associated frameworks (like Flask, FastAPI, etc.).
- Experience with cloud infrastructure like AWS/GCP or another cloud service provider.
- CI/CD experience.
- You are a Git guru and revel in collaborative workflows.
- You work on the command line confidently and are familiar with all the goodies the Linux toolkit can provide.
- Familiarity with Apache Spark and PySpark.

Qualifications:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

Uplight provides equal employment opportunities to all employees and applicants and prohibits discrimination and harassment of any type without regard to race (including hair texture and hairstyles), color, religion (including head coverings), age, sex, national origin, caste, disability status, genetics, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
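As a small illustration of the BigQuery work described above, here is a sketch of pulling aggregated report data into a DataFrame with the google-cloud-bigquery client; the project, dataset, table, and column names are invented for the example.

```python
# Minimal BigQuery pull for a BI report (pip install google-cloud-bigquery).
from google.cloud import bigquery

def daily_energy_savings(project: str = "example-project"):
    client = bigquery.Client(project=project)
    query = """
        SELECT DATE(event_ts) AS day, SUM(kwh_saved) AS total_kwh_saved
        FROM `example-project.analytics.savings_events`
        WHERE event_ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
        GROUP BY day
        ORDER BY day
    """
    # to_dataframe() requires the pandas/db-dtypes extras to be installed.
    return client.query(query).to_dataframe()

if __name__ == "__main__":
    print(daily_energy_savings().head())
```

A result set shaped like this feeds directly into a Tableau, PowerBI, or Sigma dashboard, which is the BI-reporting half of the role.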

Posted 2 weeks ago

Apply

0 years

0 Lacs

Trivandrum, Kerala, India

On-site


At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

L&A Business Consultant
Working as part of the Consulting team, you will take part in engagements related to a wide range of topics. Some examples of domains in which you will support our clients include the following:
- Proficiency in Individual and Group Life Insurance concepts, different types of annuity products, etc.
- Proficiency in different insurance plans: Qualified/Non-Qualified Plans, IRA, Roth IRA, CRA, SEP.
- Solid knowledge of the policy life cycle: Illustrations/Quote/Rating; New Business & Underwriting; Policy Servicing and Administration; Billing & Payment; Claims Processing; Disbursement (systematic withdrawals, RMD, surrenders); Regulatory Changes & Taxation.
- Understanding of business rules for pay-out.
- Understanding of upstream and downstream interfaces for the policy lifecycle.
- Experience in DXC platforms: Vantage, wmA, nbA, CSA, Cyber-life, Life70, Life Asia, PerformancePlus.

Consulting Skills:
- Experience in creating business process maps for future-state architecture, creating a WBS for the overall conversion strategy, and requirement refinement processes in multi-vendor engagements.
- Requirements gathering and elicitation: writing BRDs and FSDs, conducting JAD sessions and workshops to capture requirements, and working closely with the Product Owner.
- Work with the client to define the most optimal future-state operational process and related product configuration.
- Define scope by providing innovative solutions and challenging all new client requirements and change requests, while ensuring that the client gets the required business value.
- Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams.
- Work closely with the product design development team to analyse and extract functional enhancements.
- Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle.

Technology Skills:
- Experienced in data migration projects, ensuring seamless transfer of data between systems while maintaining data integrity and security.
- Skilled in data analytics, utilizing various tools and techniques to extract insights and drive informed decision-making.
- Strong understanding of data governance principles and best practices, ensuring data quality and compliance.
- Collaborative team player, able to work closely with stakeholders and technical teams to define requirements and implement effective solutions.
- Industry certifications (AAPA/LOMA) will be an added advantage.
- Experience with these COTS products is preferable: FAST, ALIP, OIPA, wmA.

We expect you to work effectively as a team member and build good relationships with the client. You will have the opportunity to expand your domain knowledge and skills and will be able to collaborate frequently with other EY professionals with a wide variety of expertise.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Kolkata, West Bengal, India

On-site


At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

L&A Business Consultant
Working as part of the Consulting team, you will take part in engagements related to a wide range of topics. Some examples of domains in which you will support our clients include the following:
- Proficiency in Individual and Group Life Insurance concepts, different types of annuity products, etc.
- Proficiency in different insurance plans: Qualified/Non-Qualified Plans, IRA, Roth IRA, CRA, SEP.
- Solid knowledge of the policy life cycle: Illustrations/Quote/Rating; New Business & Underwriting; Policy Servicing and Administration; Billing & Payment; Claims Processing; Disbursement (systematic withdrawals, RMD, surrenders); Regulatory Changes & Taxation.
- Understanding of business rules for pay-out.
- Understanding of upstream and downstream interfaces for the policy lifecycle.
- Experience in DXC platforms: Vantage, wmA, nbA, CSA, Cyber-life, Life70, Life Asia, PerformancePlus.

Consulting Skills:
- Experience in creating business process maps for future-state architecture, creating a WBS for the overall conversion strategy, and requirement refinement processes in multi-vendor engagements.
- Requirements gathering and elicitation: writing BRDs and FSDs, conducting JAD sessions and workshops to capture requirements, and working closely with the Product Owner.
- Work with the client to define the most optimal future-state operational process and related product configuration.
- Define scope by providing innovative solutions and challenging all new client requirements and change requests, while ensuring that the client gets the required business value.
- Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams.
- Work closely with the product design development team to analyse and extract functional enhancements.
- Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle.

Technology Skills:
- Experienced in data migration projects, ensuring seamless transfer of data between systems while maintaining data integrity and security.
- Skilled in data analytics, utilizing various tools and techniques to extract insights and drive informed decision-making.
- Strong understanding of data governance principles and best practices, ensuring data quality and compliance.
- Collaborative team player, able to work closely with stakeholders and technical teams to define requirements and implement effective solutions.
- Industry certifications (AAPA/LOMA) will be an added advantage.
- Experience with these COTS products is preferable: FAST, ALIP, OIPA, wmA.

We expect you to work effectively as a team member and build good relationships with the client. You will have the opportunity to expand your domain knowledge and skills and will be able to collaborate frequently with other EY professionals with a wide variety of expertise.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

Andhra Pradesh

On-site


About the Role:
We are seeking experienced Data Analysts to join our growing team. The ideal candidate will have a strong background in data analysis, complex SQL queries, and experience working within large-scale data warehouse environments. Familiarity with cloud technologies such as GCP or AWS is mandatory, and prior exposure to AWS EMR and Apache Airflow is highly desirable.

Key Responsibilities:
- Perform deep data analysis to support business decision-making, reporting, and strategic initiatives.
- Write and optimize complex SQL queries for data extraction, transformation, and reporting across large, distributed datasets.
- Work extensively within data warehouse environments to design, test, and deliver data solutions.
- Collaborate with data engineers, business analysts, and stakeholders to understand requirements and translate them into technical deliverables.
- Analyze large, complex datasets to identify trends, patterns, and opportunities for business growth.
- Develop, maintain, and optimize ETL/ELT pipelines; familiarity with Apache Airflow for workflow orchestration is a plus.
- Work with cloud-native tools on GCP or AWS to manage and analyze data effectively.
- Support the development of data quality standards and ensure data integrity across all reporting platforms.
- Document data models, queries, processes, and workflows for knowledge sharing and scalability.

Required Skills & Experience:
- Minimum 7 years of professional experience in data analysis.
- Strong, demonstrable expertise in SQL, including writing, debugging, and optimizing complex queries.
- Solid experience working within a data warehouse environment (e.g., BigQuery, Redshift, Snowflake, etc.).
- Hands-on experience with GCP (BigQuery, Dataflow) or AWS (Redshift, Athena, S3, EMR).
- Knowledge of data modeling concepts, best practices, and data architecture principles.
- Understanding of ETL processes and tools; hands-on experience with Apache Airflow is a strong plus.
- Strong analytical thinking, attention to detail, and problem-solving skills.
- Ability to work in a fast-paced environment and manage multiple priorities.

About Virtusa:
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.

Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.

Posted 2 weeks ago

Apply

7.0 - 12.0 years

25 - 30 Lacs

Coimbatore

Work from Office


Job Summary:
We are seeking a Senior Data & AI/ML Engineer (Lead) with strong expertise in Google Cloud Platform (GCP) and hands-on experience in building, deploying, and managing machine learning solutions at scale. The ideal candidate will lead AI/ML initiatives, mentor a team of engineers, and collaborate cross-functionally to drive data-driven innovation and business value.

Key Responsibilities:
- Lead the end-to-end design and implementation of scalable AI/ML models and data pipelines on GCP.
- Drive architecture and design discussions for AI/ML solutions across cloud-native environments.
- Collaborate with data scientists, analysts, and business stakeholders to define requirements and deliver intelligent solutions.
- Manage and optimize data pipelines using tools such as Dataflow, Pub/Sub, BigQuery, Cloud Functions, etc.
- Deploy ML models using Vertex AI, AI Platform, or custom CI/CD pipelines (a deployment sketch follows below).
- Monitor model performance and manage model retraining, versioning, and lifecycle.
- Ensure best practices in data governance, security, and compliance.
- Mentor junior engineers and data scientists; provide leadership in code reviews and project planning.

Required Skills:
- 8+ years of experience in data engineering, machine learning, or AI application development.
- Strong programming skills in Python (preferred) and/or Java/Scala.
- Hands-on experience with GCP services: BigQuery, Vertex AI, Cloud Functions, Dataflow, Pub/Sub, GCS, etc.
- Proficiency with ML libraries/frameworks like TensorFlow, PyTorch, and scikit-learn.
- Deep understanding of data modeling, feature engineering, and model evaluation techniques.
- Experience with Docker, Kubernetes, and MLOps tools.
- Strong background in SQL and data warehousing concepts.
- Exposure to data security and compliance best practices (GDPR, HIPAA, etc.).

Nice to Have:
- GCP certification (e.g., Professional Machine Learning Engineer, Professional Data Engineer).
- Experience with streaming data architectures.
- Familiarity with AI ethics, explainability, and bias mitigation techniques.

Education:
Bachelor's or Master's degree in Computer Science, Data Science, Artificial Intelligence, or a related field.
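To ground the Vertex AI deployment responsibility above, here is a loose sketch of registering and deploying a trained model with the google-cloud-aiplatform SDK. The artifact URI, serving container image, and feature vector are placeholder assumptions; real values depend on the trained model and available prebuilt containers.

```python
# Minimal sketch of deploying a model to a Vertex AI endpoint; all names,
# URIs, and images are placeholders.
from google.cloud import aiplatform

aiplatform.init(project="example-project", location="us-central1")

# Register a model artifact (e.g., a scikit-learn model saved to GCS) in the
# Vertex AI Model Registry, using a prebuilt serving container.
model = aiplatform.Model.upload(
    display_name="example-churn-model",
    artifact_uri="gs://example-bucket/models/churn/",
    serving_container_image_uri=(
        "us-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.1-3:latest"
    ),
)

# Deploy to an autoscaling endpoint for online predictions.
endpoint = model.deploy(
    machine_type="n1-standard-2",
    min_replica_count=1,
    max_replica_count=2,
)

# Online prediction; the feature vector shape depends on the trained model.
print(endpoint.predict(instances=[[0.2, 0.7, 1.0, 3.5]]))
```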

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Job Description
Industrial System Analytics (ISA) is a product group that develops cloud analytic solutions using GCP tools and techniques. This position within the Vehicle Product Management product line is ideal for a technically and product-oriented individual who has experience managing products and building roadmaps, keeping strictly to timelines, enhancing customer satisfaction, and designing, building, deploying, and supporting cloud applications, and who is interested in working with a portfolio of strategic analytic solutions.

Tech Anchor/Solution Architect:
We are seeking a highly technical and experienced individual to fill the role of Tech Anchor/Solution Architect within our Industrial System Analytics (ISA) team. As a Tech Anchor, you will provide technical leadership and guidance to the development team, driving the design and implementation of cloud analytic solutions using GCP tools and techniques.

Responsibilities:
- Provide technical guidance, mentorship, and code-level support to the development team
- Work with the team to develop and implement solutions using GCP tools (BigQuery, GCS, Dataflow, Dataproc, etc.) and APIs/microservices
- Ensure adherence to security, legal, and Ford standard/policy compliance
- Drive effective and efficient delivery from the team, focusing on speed
- Identify risks and implement mitigation/contingency plans
- Assess the overall health of the product and raise key decisions
- Manage onboarding of new resources
- Lead the design and architecture of complex systems, ensuring scalability, reliability, and performance
- Participate in code reviews and contribute to improving code quality
- Champion Agile software processes, culture, best practices, and techniques
- Ensure effective usage of Rally and derive meaningful insights
- Ensure implementation of DevSecOps and software craftsmanship practices (CI/CD, TDD, pair programming)

Technical Requirements:
- Bachelor's/Master's/PhD in Engineering, Computer Science, or a related field
- Senior-level experience (8+ years) as a software engineer
- Deep and broad knowledge of:
  - Programming languages: Java, JavaScript, Python, SQL
  - Front-end technologies: React, Angular, HTML, CSS
  - Back-end technologies: Node.js, Python frameworks (Django, Flask), Java frameworks (Spring)
  - Cloud technologies: GCP (BigQuery, GCS, Dataflow, Dataproc, etc.)
  - Deployment practices: Docker, Kubernetes, CI/CD pipelines
- Experience with Agile software development methodologies
- Understanding of or exposure to CI, CD, and test-driven development (GitHub, Terraform/Tekton, 42Crunch, SonarQube, FOSSA, Checkmarx, etc.)

Good to Have:
- Experience with GCP services such as Cloud Run, Cloud Build, Cloud Source Repositories, and Cloud Workflows
- Knowledge of containerization using Docker and Kubernetes
- Experience with serverless architecture and event-driven design patterns
- Familiarity with machine learning and data science concepts
- Experience with data engineering and data warehousing
- Certification in GCP or other cloud platforms

Soft Skills:
- Strong communication and collaboration skills
- Ability to work in a fast-paced, agile environment
- Proactive attitude and start-up mindset

Posted 2 weeks ago

Apply

8.0 - 13.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


We are looking for a skilled Lead Data Engineer to become an integral part of our vibrant team. In this role, you will take charge of designing, developing, and maintaining data integration solutions tailored to our clients' needs. You will oversee a team of engineers, ensuring the delivery of high-quality, scalable, and efficient data integration solutions. This role presents a thrilling challenge for a seasoned data integration expert who is passionate about technology and excels in a fast-paced, dynamic setting.

Responsibilities:
- Design, develop, and maintain client-specific data integration solutions
- Oversee a team of engineers to guarantee high-quality, scalable, and efficient delivery of data integration solutions
- Work with cross-functional teams to comprehend business requirements and create suitable data integration solutions
- Ensure the security, reliability, and efficiency of data integration solutions
- Create and update documentation, including technical specifications, data flow diagrams, and data mappings
- Stay informed and up to date with the latest data integration methods and tools

Requirements:
- Bachelor's degree in Computer Science, Information Systems, or a related field
- 8-13 years of experience in data engineering, data integration, or related fields
- Proficiency in cloud-native or Spark-based ETL tools such as AWS Glue, Azure Data Factory, or GCP Dataflow
- Strong knowledge of SQL for data querying and manipulation
- Background in Snowflake for cloud data warehousing (see the connector sketch below)
- Familiarity with at least one cloud platform such as AWS, Azure, or GCP
- Experience in leading a team of engineers on data integration projects
- Good verbal and written communication skills in English at a B2 level

Nice to have:
- Background in ETL using Python
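For the Snowflake warehousing requirement above, a minimal sketch using the snowflake-connector-python package follows; the account, credentials, and query are placeholders, and production code should prefer key-pair auth or a secrets manager.

```python
# Minimal sketch of querying Snowflake from Python.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_org-example_account",
    user="ETL_USER",
    password="***",          # placeholder; never hard-code real credentials
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    # Parameterized query: the connector binds %s placeholders safely.
    cur.execute(
        "SELECT customer_id, SUM(amount) FROM orders "
        "WHERE order_date >= %s GROUP BY customer_id",
        ("2025-01-01",),
    )
    for customer_id, total in cur.fetchall():
        print(customer_id, total)
finally:
    conn.close()
```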

Posted 2 weeks ago

Apply

5.0 - 8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Our organization is seeking a skilled Senior Data Engineer to become an integral part of our team. In this role, you will focus on projects related to data integration and ETL on cloud-based platforms. You will take charge of creating and executing sophisticated data solutions, ensuring data accuracy, dependability, and accessibility.

Responsibilities:
- Create and execute sophisticated data solutions on cloud-based platforms
- Build ETL processes utilizing SQL, Python, and other applicable technologies (see the sketch below)
- Maintain data accuracy, reliability, and accessibility for all stakeholders
- Work with cross-functional teams to comprehend data integration needs and specifications
- Produce and sustain documentation, including technical specifications, data flow diagrams, and data mappings
- Enhance and tune data integration processes for optimal performance and efficiency, guaranteeing data accuracy and integrity

Requirements:
- Bachelor's degree in Computer Science, Electrical Engineering, or a related field
- 5-8 years of experience in data engineering
- Proficiency in cloud-native or Spark-based ETL tools such as AWS Glue, Azure Data Factory, or GCP Dataflow
- Strong knowledge of SQL for data querying and manipulation
- Qualifications in Snowflake for data warehousing
- Familiarity with cloud platforms like AWS, GCP, or Azure for data storage and processing
- Excellent problem-solving skills and attention to detail
- Good verbal and written communication skills in English at a B2 level

Nice to have:
- Background in ETL using Python
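As a small example of the SQL-plus-Python ETL work described above, here is a sketch of an extract-transform-load step using pandas and SQLAlchemy; the connection strings, tables, and columns are invented for illustration.

```python
# Minimal ETL step: extract from a source Postgres database, transform with
# pandas, load into a warehouse staging table.
import pandas as pd
from sqlalchemy import create_engine

SOURCE_URL = "postgresql+psycopg2://etl_user:***@source-db:5432/sales"
TARGET_URL = "postgresql+psycopg2://etl_user:***@warehouse:5432/analytics"

def run_etl() -> int:
    source = create_engine(SOURCE_URL)
    target = create_engine(TARGET_URL)

    # Extract: pull yesterday's orders from the operational database.
    df = pd.read_sql(
        "SELECT order_id, customer_id, amount, created_at "
        "FROM orders WHERE created_at >= CURRENT_DATE - 1",
        source,
    )

    # Transform: basic cleansing and a derived load-audit column.
    df = df.dropna(subset=["customer_id"])
    df["amount"] = df["amount"].astype(float)
    df["load_date"] = pd.Timestamp.utcnow()

    # Load: append into a staging table in the warehouse.
    df.to_sql("stg_orders", target, if_exists="append", index=False)
    return len(df)

if __name__ == "__main__":
    print(f"loaded {run_etl()} rows")
```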

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


JD for a Databricks Data Engineer

Key Responsibilities:
- Design, develop, and maintain high-performance data pipelines using Databricks and Apache Spark.
- Implement medallion architecture (Bronze, Silver, Gold layers) for efficient data processing (see the sketch below).
- Optimize Delta Lake tables, partitioning, Z-ordering, and performance tuning in Databricks.
- Develop ETL/ELT processes using PySpark, SQL, and Databricks Workflows.
- Manage Databricks clusters, jobs, and notebooks for batch and real-time data processing.
- Work with Azure Data Lake, AWS S3, or GCP Cloud Storage for data ingestion and storage.
- Implement CI/CD pipelines for Databricks jobs and notebooks using DevOps tools.
- Monitor and troubleshoot performance bottlenecks, cluster optimization, and cost management.
- Ensure data quality, governance, and security using Unity Catalog, ACLs, and encryption.
- Collaborate with data scientists, analysts, and business teams to deliver insights.

Required Skills & Experience:
- 5+ years of hands-on experience in Databricks, Apache Spark, and Delta Lake.
- Strong SQL, PySpark, and Python programming skills.
- Experience in Azure Data Factory (ADF), AWS Glue, or GCP Dataflow.
- Expertise in performance tuning, indexing, caching, and parallel processing.
- Hands-on experience with Lakehouse architecture and Databricks SQL.
- Strong understanding of data governance, lineage, and cataloging (e.g., Unity Catalog).
- Experience with CI/CD pipelines (Azure DevOps, GitHub Actions, or Jenkins).
- Familiarity with Airflow, Databricks Workflows, or other orchestration tools.
- Strong problem-solving skills with experience troubleshooting Spark jobs.

Nice to Have:
- Hands-on experience with Kafka, Event Hubs, or real-time streaming in Databricks.
- Certifications in Databricks, Azure, AWS, or GCP.
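To make the medallion architecture bullet concrete, here is a minimal Bronze-to-Gold sketch with PySpark and Delta Lake, assuming a Databricks runtime where the delta format is available; paths and column names are placeholders.

```python
# Minimal medallion (Bronze -> Silver -> Gold) flow with PySpark and Delta Lake.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: land raw JSON as-is, plus ingestion metadata.
raw = (
    spark.read.json("/mnt/landing/orders/")
    .withColumn("_ingested_at", F.current_timestamp())
)
raw.write.format("delta").mode("append").save("/mnt/bronze/orders")

# Silver: cleanse and conform the bronze data.
silver = (
    spark.read.format("delta").load("/mnt/bronze/orders")
    .dropDuplicates(["order_id"])
    .filter(F.col("amount") > 0)
    .withColumn("order_date", F.to_date("created_at"))
)
silver.write.format("delta").mode("overwrite").save("/mnt/silver/orders")

# Gold: a business-level aggregate for reporting.
gold = silver.groupBy("order_date").agg(F.sum("amount").alias("daily_revenue"))
gold.write.format("delta").mode("overwrite").save("/mnt/gold/daily_revenue")
```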

Posted 2 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Join Us

About VOIS:
VO IS (Vodafone Intelligent Solutions) is a strategic arm of Vodafone Group Plc, creating value and enhancing quality and efficiency across 28 countries, and operating from 7 locations: Albania, Egypt, Hungary, India, Romania, Spain and the UK. Over 29,000 highly skilled individuals are dedicated to being Vodafone Group's partner of choice for talent, technology, and transformation. We deliver the best services across IT, Business Intelligence Services, Customer Operations, Business Operations, HR, Finance, Supply Chain, HR Operations, and many more. Established in 2006, VO IS has evolved into a global, multi-functional organisation, a Centre of Excellence for Intelligent Solutions focused on adding value and delivering business outcomes for Vodafone.

What You'll Do:
We are seeking a highly experienced Solutions Architect to lead the design and implementation of end-to-end solutions across employee data, HR dashboards, and people analytics. This individual will play a key role in integrating data from SAP SuccessFactors into our Google Cloud-based data lake, designing ML-driven analytics models, and enabling data-driven decisions through Qlik dashboards, with a roadmap to transition to SAP Analytics Cloud (SAC). The ideal candidate will have deep expertise in both the HR domain and enterprise technology architecture, with the ability to connect business needs with scalable, efficient, and secure data and analytics solutions.

Who You Are:
Key accountabilities and decision ownership:
- Strong expertise in SAP SuccessFactors (Employee Central, Talent modules, Workforce Analytics).
- Deep understanding of people data models, HR metrics, and employee lifecycle data.
- Proven experience with Google Cloud Platform, including BigQuery, Vertex AI, Dataflow, etc.
- Hands-on experience in designing and deploying machine learning models for HR use cases.
- Experience with Qlik for dashboarding and SAC (SAP Analytics Cloud) is preferred.
- Lead end-to-end solution design for employee data and people analytics use cases, ensuring alignment with business and IT strategy.
- Architect data flows from SAP SuccessFactors into the Google Cloud Platform for advanced analytics.
- Define and implement ML/AI-based models to derive actionable insights on workforce trends, talent, and engagement.
- Deliver analytics and dashboard solutions on time, with thought leadership on end-to-end systems architecture.

Not a perfect fit?
Worried that you don't meet all the desired criteria exactly? At Vodafone we are passionate about empowering people and creating a workplace where everyone can thrive, whatever their personal or professional background. If you're excited about this role but your experience doesn't align exactly with every part of the job description, we encourage you to still apply, as you may be the right candidate for this role or another opportunity.

VOIS Equal Opportunity Employer Commitment India:
VO IS is proud to be an Equal Employment Opportunity Employer. We celebrate differences and we welcome and value diverse people and insights. We believe that being authentically human and inclusive powers our employees' growth and enables them to create a positive impact on themselves and society. We do not discriminate based on age, colour, gender (including pregnancy, childbirth, or related medical conditions), gender identity, gender expression, national origin, race, religion, sexual orientation, status as an individual with a disability, or other applicable legally protected characteristics.

As a result of living and breathing our commitment, our employees have helped us get certified as a Great Place to Work in India for four years running. We have also been highlighted among the Top 10 Best Workplaces for Millennials, Equity, and Inclusion, Top 50 Best Workplaces for Women, Top 25 Best Workplaces in IT & IT-BPM and 10th Overall Best Workplaces in India by the Great Place to Work Institute in 2024. These achievements position us among a select group of trustworthy and high-performing companies which put their employees at the heart of everything they do. By joining us, you are part of our commitment. We look forward to welcoming you into our family, which represents a variety of cultures, backgrounds, perspectives, and skills!

Who We Are:
We are a leading international telco, serving millions of customers. At Vodafone, we believe that connectivity is a force for good. If we use it for the things that really matter, it can improve people's lives and the world around us. Through our technology we empower people, connecting everyone regardless of who they are or where they live, and we protect the planet whilst helping our customers do the same.

Belonging at Vodafone isn't a concept; it's lived, breathed, and cultivated through everything we do. You'll be part of a global and diverse community, with many different minds, abilities, backgrounds and cultures. We're committed to increasing diversity, ensuring equal representation, and making Vodafone a place where everyone feels safe, valued and included. If you require any reasonable adjustments or have an accessibility request as part of your recruitment journey, for example, extended time or breaks in between online assessments, please refer to https://careers.vodafone.com/application-adjustments/ for guidance. Together we can.

Posted 2 weeks ago

Apply

4.0 - 6.0 years

0 Lacs

Pune, Maharashtra, India

On-site


_VOIS Intro About _VOIS: _VO IS (Vodafone Intelligent Solutions) is a strategic arm of Vodafone Group Plc, creating value and enhancing quality and efficiency across 28 countries, and operating from 7 locations: Albania, Egypt, Hungary, India, Romania, Spain and the UK. Over 29,000 highly skilled individuals are dedicated to being Vodafone Group’s partner of choice for talent, technology, and transformation. We deliver the best services across IT, Business Intelligence Services, Customer Operations, Business Operations, HR, Finance, Supply Chain, HR Operations, and many more. Established in 2006, _VO IS has evolved into a global, multi-functional organisation, a Centre of Excellence for Intelligent Solutions focused on adding value and delivering business outcomes for Vodafone. _VOIS Centre Intro About _VOIS India: In 2009, _VO IS started operating in India and now has established global delivery centres in Pune, Bangalore and Ahmedabad. With more than 14,500 employees, _VO IS India supports global markets and group functions of Vodafone, and delivers best-in-class customer experience through multi-functional services in the areas of Information Technology, Networks, Business Intelligence and Analytics, Digital Business Solutions (Robotics & AI), Commercial Operations (Consumer & Business), Intelligent Operations, Finance Operations, Supply Chain Operations and HR Operations and more. Job Role Related Content (Role specific) Design, develop, and maintain scalable data pipelines and ETL processes using GCP services such as BigQuery, Cloud Data Fusion, Dataflow, Pub/Sub, Cloud Storage, Composer ,Cloud Function, Cloud RUN . Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver high-quality data solutions. Implement data integration solutions to ingest, process, and store large volumes of structured and unstructured data from various sources. Optimize and tune data pipelines for performance, reliability, and cost-efficiency. . Ensure data quality and integrity through data validation, cleansing, and transformation processes. Develop and maintain data models, schemas, and metadata to support data analytics and reporting. Monitor and troubleshoot data pipeline issues, ensuring timely resolution and minimal disruption to data workflows. Stay up-to-date with the latest GCP technologies and best practices, and provide recommendations for continuous improvement. Mentor and guide junior data engineers, fostering a culture of knowledge sharing and collaboration. 4 to 6 years of experience in data engineering, with a strong focus on GCP. Proficiency in GCP services such as BigQuery, Cloud Data Fusion, Dataflow, Pub/Sub, Cloud Storage, Composer ,Cloud Function, Cloud RUN. Strong programming skills in Python, PLSQL. Experience with SQL and NoSQL databases. Knowledge of data warehousing concepts and best practices. Familiarity with data integration tools and frameworks. Excellent problem-solving and analytical skills. _VOIS Equal Opportunity Employer Commitment India _VO IS is proud to be an Equal Employment Opportunity Employer. We celebrate differences and we welcome and value diverse people and insights. We believe that being authentically human and inclusive powers our employees’ growth and enables them to create a positive impact on themselves and society. 
We do not discriminate based on age, colour, gender (including pregnancy, childbirth, or related medical conditions), gender identity, gender expression, national origin, race, religion, sexual orientation, status as an individual with a disability, or other applicable legally protected characteristics. As a result of living and breathing our commitment, our employees have helped us get certified as a Great Place to Work in India for four years running. We have also been highlighted among the Top 5 Best Workplaces for Diversity, Equity, and Inclusion, Top 10 Best Workplaces for Women, Top 25 Best Workplaces in IT & IT-BPM, and 14th Overall Best Workplace in India by the Great Place to Work Institute in 2023. These achievements position us among a select group of trustworthy and high-performing companies which put their employees at the heart of everything they do. By joining us, you are part of our commitment. We look forward to welcoming you into our family, which represents a variety of cultures, backgrounds, perspectives, and skills! Apply now, and we'll be in touch!
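
For readers weighing up roles like the one above, the streaming pattern these GCP data engineering postings keep naming – Pub/Sub into BigQuery via Dataflow – can be sketched in a few lines of Apache Beam. This is a minimal illustration only, not anything from the posting; the project, topic, and table names are placeholders, and it assumes the destination table already exists.

```python
# Minimal Apache Beam streaming sketch: Pub/Sub -> parse JSON -> BigQuery.
# Topic and table names are illustrative placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # On GCP you would also pass --runner=DataflowRunner plus project/region
    # flags; streaming=True because the source is an unbounded Pub/Sub topic.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/events")  # placeholder topic
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",  # placeholder table
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```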

Posted 3 weeks ago

Apply

10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Job Description

This role is critical in bringing all customer-related data sources into the Data Platform and delivering transformed customer knowledge and insights to the enterprise through real-time interfaces and data solutions. The ideal candidate will have deep expertise in software engineering, data engineering, cloud platforms, and product-centric thinking, coupled with strong leadership and execution capabilities. The candidate should also have experience with and an understanding of customer data management and its activation for business use cases.

Responsibilities

Strategic Leadership: Define and drive the roadmap for Ford's Customer 360 Data Platform, aligning with business and technology goals.

Data Ingestion and Integration into Enterprise Data Platform (EDP): Design and oversee data ingestion strategies for both batch and streaming data from various sources, including Oracle, MySQL, PostgreSQL, DB2, Mainframe, Kafka, file systems and third-party vendor sources. Ensure seamless integration of structured and unstructured data into the enterprise data platform. Lead efficient and scalable ingestion operations, managing a group of Data Stewards and Product Owners in close collaboration with EDP ingestion teams. Demand intake, review, and prioritization based on business drivers, along with running efficient operations, are key deliverables.

Web Services Development & Adoption: Oversee the design, development, and deployment of scalable web services, ensuring broad enterprise adoption.

Data Governance & Compliance: Work with the Enterprise Data Platform team to implement best-in-class data governance, security, and privacy frameworks to ensure compliance with industry regulations.

AI/ML Enablement: Lead efforts to integrate AI-driven capabilities such as SQL-to-code conversion, LLM-powered debugging, and self-service ingestion frameworks.

Engineering & Operational Excellence: Establish best practices for data engineering, DevSecOps, reliability, and observability.

Talent & Organization Building: Attract, develop, and mentor top talent in data engineering and platform development, fostering a high-performance team culture.

Cross-functional Collaboration: Partner with business units, product teams, and other technology leaders to ensure seamless data accessibility and usability across Ford.

Qualifications

Bachelor's degree in Computer Science, Engineering, Data Science, or a related field; an advanced degree is a plus. 10+ years of experience in data engineering, cloud platforms, or enterprise-scale data management, with at least 3 years in a leadership role. Proven track record of delivering large-scale data platforms, metadata catalogs, operations, services, and real-time and streaming architectures. Expertise in GCP (Cloud Run, Apigee, PostgreSQL, Dataflow, Pub/Sub) and other Modern Data Stack technologies. Experience in implementing data governance, classification, and security policies at an enterprise level. Exceptional leadership skills with a demonstrated ability to drive alignment, influence stakeholders, and lead diverse global teams. Excellent communication and executive presence, capable of engaging with senior leadership and business partners.
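
As a small, hedged illustration of the streaming ingestion this role oversees – not Ford's actual implementation – publishing a customer event to a Pub/Sub topic from Python looks roughly like this; the project id, topic name, and payload are hypothetical:

```python
# Publish a single customer event to a Pub/Sub topic.
# Project, topic, and event fields are illustrative placeholders.
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "customer-events")

event = {"customer_id": "c-123", "event": "profile_updated"}
future = publisher.publish(topic_path, json.dumps(event).encode("utf-8"))
print(future.result())  # blocks until Pub/Sub returns the message id
```

Downstream, a Dataflow or Cloud Run consumer would subscribe to the same topic and land the events in the platform's stores.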

Posted 3 weeks ago

Apply

5.0 - 7.0 years

0 Lacs

Pune, Maharashtra, India

On-site


You are passionate about quality and how customers experience the products you test. You have the ability to create, maintain and execute test plans to verify requirements. As a Quality Engineer at Equifax, you will be a catalyst in both the development and the testing of high-priority initiatives. You will develop and test new products to support technology operations while maintaining exemplary standards. As a collaborative member of the team, you will deliver QA services (code quality, testing services, performance engineering, development collaboration and continuous integration). You will conduct quality control tests to ensure full compliance with specified standards and end-user requirements. You will execute tests using established plans and scripts, document problems in an issues log, and retest to ensure problems are resolved. You will create test files to thoroughly test program logic and verify system flow. You will identify, recommend and implement changes to enhance the effectiveness of QA strategies.

What you will do

Independently develop scalable and reliable automated tests and frameworks for testing software solutions. Specify and automate test scenarios and test data for a highly complex business by analyzing integration points, data flows, personas, authorization schemes and environments. Develop regression suites, develop automation scenarios, and move automation to an agile continuous testing model. Proactively and collaboratively take part in all testing-related activities while establishing partnerships with key stakeholders in Product, Development/Engineering, and Technology Operations.

What Experience You Need

Bachelor's degree in a STEM major or equivalent experience. 5-7 years of software testing experience. Able to create and review test automation according to specifications. Ability to write, debug, and troubleshoot code in Java, Spring Boot, TypeScript/JavaScript, HTML, CSS. Creation and use of big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others with respect to software validation. Created test strategies and plans. Led complex testing efforts or projects. Participated in Sprint Planning as the Test Lead. Collaborated with Product Owners, SREs, and Technical Architects to define testing strategies and plans. Design and development of microservices using Java, Spring Boot, GCP SDKs, GKE/Kubernetes. Deploy and release software using Jenkins CI/CD pipelines; understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs. Cloud certification strongly preferred.

What could set you apart

An ability to demonstrate successful performance of our Success Profile skills, including:

Attention to Detail - Define test case candidates for automation that are outside of product specifications, e.g. negative testing; create thorough and accurate documentation of all work, including status updates to summarize project highlights; validate that processes operate properly and conform to standards.

Automation - Automate defined test cases and test suites per project.

Collaboration - Collaborate with Product Owners and the development team to plan and assist with user acceptance testing; collaborate with product owners, development leads and architects on functional and non-functional test strategies and plans.

Execution - Develop scalable and reliable automated tests; develop performance testing scripts to assure products adhere to the documented SLO/SLI/SLAs; specify the need for test data types for automated testing; create automated tests and test data for projects; develop automated regression suites; integrate automated regression tests into the CI/CD pipeline; work with teams on E2E testing strategies and plans against multiple product integration points.

Quality Control - Perform defect analysis and in-depth technical root cause analysis, identifying trends and recommendations to resolve complex functional issues and process improvements; analyze results of functional and non-functional tests and make recommendations for improvements.

Performance / Resilience - Understand application and network architecture as inputs to create performance and resilience test strategies and plans for each product and platform; conduct performance and resilience testing to ensure the products meet SLAs/SLOs.

Quality Focus - Review test cases for complete functional coverage; review the quality section of the Production Readiness Review for completeness; recommend changes to existing testing methodologies for the effectiveness and efficiency of product validation; ensure communications are thorough and accurate for all work documentation, including status and project updates.

Risk Mitigation - Work with Product Owners, QE and development team leads to track and determine prioritization of defect fixes.
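
For the Dataflow/Apache Beam validation work this posting describes, Beam's own testing utilities let a QE assert on a pipeline's output in an ordinary unit test. A minimal sketch, with an illustrative transform rather than anything from the role:

```python
# Validate a Beam transform with TestPipeline and assert_that.
# The transform and input data are illustrative placeholders.
import apache_beam as beam
from apache_beam.testing.test_pipeline import TestPipeline
from apache_beam.testing.util import assert_that, equal_to


def scale(x):
    return x * 10


# assert_that registers a pipeline-level assertion that fails the test
# if the PCollection's contents differ from the expected list.
with TestPipeline() as p:
    output = p | beam.Create([1, 2, 3]) | beam.Map(scale)
    assert_that(output, equal_to([10, 20, 30]))
```

The same pattern scales up to asserting on parsed records, windowed aggregates, or dead-letter outputs before a pipeline ever reaches Dataflow.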

Posted 3 weeks ago

Apply

0 years

0 Lacs

Andhra Pradesh, India

On-site


Extensive experience in IT data analytics projects; hands-on experience in migrating on-premise ETLs to Google Cloud Platform (GCP) using cloud-native tools such as BigQuery, Cloud Dataproc, Google Cloud Storage and Composer; SQL concepts, Presto SQL, Hive SQL, Python (Pandas, NumPy, SciPy, Matplotlib) and PySpark.

Design and implement scalable data pipelines using Google Cloud Dataflow. Develop and optimize BigQuery datasets for efficient data analysis and reporting. Collaborate with cross-functional teams to integrate data solutions with business processes. Automate data workflows using Cloud Composer and Apache Airflow. Ensure data security and compliance with GCP Identity and Access Management. Mentor junior engineers in best practices for cloud-based data engineering. Implement real-time data processing with Google Cloud Pub/Sub and Dataflow. Continuously evaluate and adopt new GCP tools and technologies.
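
To make the Cloud Composer/Airflow automation mentioned above concrete, a minimal DAG that schedules a daily BigQuery rollup might look like the following; the DAG id, schedule, and SQL are illustrative placeholders, not part of the posting:

```python
# Minimal Airflow DAG sketch for Composer-style orchestration: one task
# that runs a daily BigQuery aggregation. All names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_sales_rollup",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_to_reporting_table",
        configuration={
            "query": {
                "query": (
                    "SELECT order_date, SUM(amount) AS total "
                    "FROM `my-project.sales.orders` GROUP BY order_date"
                ),
                "useLegacySql": False,
            }
        },
    )
```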

Posted 3 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Position Overview

Job Title: Data Engineer for Private Bank One Data Platform on Google Cloud
Corporate Title: Associate
Location: Pune, India

Role Description

As part of one of the internationally staffed agile teams of the Private Bank One Data Platform, you are part of the "TDI PB Germany Enterprise & Data" division. The focus here is on the development, design, and provision of different solutions in the field of data warehousing, reporting and analytics for the Private Bank, to ensure that necessary data is provided for operational and analytical purposes. The PB One Data Platform is the new strategic data platform of the Private Bank and uses the Google Cloud Platform as its basis. With Google as a close partner, we are following Deutsche Bank's cloud strategy with the aim of transferring or rebuilding a significant share of today's on-prem applications to the Google Cloud Platform.

What We'll Offer You

As part of our flexible scheme, here are just some of the benefits that you'll enjoy: best-in-class leave policy; gender-neutral parental leave; 100% reimbursement under the childcare assistance benefit (gender neutral); sponsorship for industry-relevant certifications and education; an Employee Assistance Program for you and your family members; comprehensive hospitalization insurance for you and your dependents; accident and term life insurance; and complimentary health screening for those aged 35 and above.

Your Key Responsibilities

Work within software development applications as a Data Engineer to provide fast and reliable data solutions for warehousing, reporting, and Customer and Business Intelligence solutions. Partner with Service/Backend Engineers to integrate data provided by legacy IT solutions into the databases you design, and make it accessible to the services consuming those data. Focus on the design and setup of databases, data models, and data transformations (ETL) for critical online banking business processes in the context of Customer Intelligence, Financial Reporting and performance controlling. Contribute to data harmonization as well as data cleansing. Bring a passion for constantly learning and applying new technologies and programming languages in a constantly evolving environment. Build solutions that are highly scalable and can be operated flawlessly under high-load scenarios. Together with your team, you will run and develop your application self-sufficiently. You'll collaborate with Product Owners as well as team members on the design and implementation of data analytics solutions and act as support during the conception of products and solutions. When you see a process running with high manual effort, you'll automate it, optimizing not only our operating model but also giving yourself more time for development.

Your Skills and Experience

Mandatory Skills

Hands-on development work building scalable data engineering pipelines and other data engineering/modelling work using Java/Python. Excellent knowledge of SQL and NoSQL databases. Experience working in a fast-paced and Agile work environment. Working knowledge of public cloud environments.

Preferred Skills

Experience in Dataflow (Apache Beam)/Cloud Functions/Cloud Run. Knowledge of workflow management tools such as Apache Airflow/Composer. Demonstrated ability to write clear code that is well-documented and stored in a version control system (GitHub). Knowledge of GCS buckets, Google Pub/Sub, and BigQuery. Knowledge of ETL processes in the Data Warehouse/Data Lake environment and how to automate them.

Nice to have: knowledge of provisioning cloud resources using Terraform; knowledge of shell scripting; experience with Git, CI/CD pipelines, Docker, and Kubernetes; knowledge of Google Cloud Monitoring & Alerting; knowledge of Cloud Run, Dataform, and Cloud Spanner; knowledge of the Data Vault 2.0 data warehouse methodology; knowledge of New Relic. Excellent analytical and conceptual thinking. Excellent communication skills, strong independence and initiative, and the ability to work in agile delivery teams. Good communication skills and experience working with distributed teams (especially Germany + India).

How We'll Support You

Training and development to help you excel in your career. Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs.

About Us and Our Teams

Please visit our company website for further information: https://www.db.com/company/company.htm

We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
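
On the BigQuery side of the stack this role describes, querying from Python with the google-cloud-bigquery client is a common building block. A small sketch under assumed project, dataset, and table names – purely illustrative, not from the posting:

```python
# Run an aggregation query against BigQuery and print the results.
# Project, dataset, and table names are illustrative placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")
query = """
    SELECT customer_id, COUNT(*) AS logins
    FROM `my-project.banking.login_events`
    GROUP BY customer_id
    ORDER BY logins DESC
    LIMIT 10
"""
for row in client.query(query).result():  # result() waits for the job
    print(row.customer_id, row.logins)
```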

Posted 3 weeks ago

Apply

4.0 years

0 Lacs

India

On-site


Description

The Position

We are seeking a seasoned engineer with a passion for changing the way millions of people save energy. You'll work within the Engineering team to build and improve our platforms to deliver flexible and creative solutions to our utility partners and end users, and help us achieve our ambitious goals for our business and the planet.

We are seeking a skilled and passionate Data Engineer with expertise in Python to join our development team. As a Data Engineer, you will play a crucial role in developing different components, harnessing the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage and processing, and identify the crucial data required for insightful analysis. You'll tackle obstacles related to database integration and untangle complex, unstructured data sets. You will coordinate with the rest of the team working on different layers of the infrastructure; therefore, a commitment to collaborative problem solving, sophisticated design, and a quality product is important. You will own development and its quality independently and be responsible for high-quality deliverables. And you will work with a great team with excellent benefits.

Responsibilities & Skills

You should: Be excited to work with talented, committed people in a fast-paced environment. Use a data-driven approach and actively work on the product & technology roadmap at both the strategy level and the day-to-day tactical level. Have proven experience as a Data Engineer with a focus on Python. Design, build, and maintain high-performance solutions with reusable and reliable code. Use a rigorous approach for product improvement and customer satisfaction. Love developing great software as a seasoned product engineer. Be ready, able, and willing to jump onto a call with a partner or customer to help solve problems. Be able to deliver against several initiatives simultaneously. Have a strong eye for detail and quality of code. Have an agile mindset. Have strong problem-solving skills and attention to detail.

Required Skills (Data Engineer)

You are an experienced developer – ideally with 4 or more years of professional experience. Design, build, and maintain scalable data pipelines and ETL processes to support business analytics and reporting needs. Strong proficiency in Python for building and automating data pipelines, ETL processes, and data integration workflows. Strong experience with SQL for querying and transforming large datasets, and optimizing query performance in relational databases. Familiarity with big data frameworks such as Apache Spark or PySpark for distributed data processing. Hands-on experience with data pipeline orchestration tools like Apache Airflow or Prefect for workflow automation. Strong understanding of data modeling principles for building scalable and efficient data architectures (e.g., star schema, snowflake schema). Good to have: experience with Databricks for managing and processing large datasets, implementing Delta Lake, and leveraging its collaborative environment. Knowledge of Google Cloud Platform (GCP) services like BigQuery, Dataflow, Pub/Sub, and Cloud Storage for end-to-end data engineering solutions. Familiarity with version control systems such as Git and CI/CD pipelines for managing code and deploying workflows. Awareness of data governance and security best practices, including access control, data masking, and compliance with industry standards. Exposure to monitoring and logging tools like Datadog, Cloud Logging, or the ELK stack for maintaining pipeline reliability. Ability to understand business requirements and translate them into technical requirements. Expertise in solution design. Demonstrable experience writing unit and functional tests. Ability to deliver against several initiatives simultaneously as a multiplier.

Required Skills (Python)

You are an experienced developer with a minimum of 4+ years of professional Python experience, preferably with both 2.7 and 3.x. Strong Python knowledge – familiar with OOP, data structures and algorithms. Work experience and strong proficiency in Python and its associated frameworks (like Flask, FastAPI, etc.). Experience in designing and implementing scalable microservice architectures. Familiarity with RESTful APIs and integration of third-party APIs. 2+ years building and managing APIs to industry-accepted RESTful standards. Demonstrable experience writing unit and functional tests. Application of industry security best practices to application and system development. Experience with database systems such as PostgreSQL, MySQL, or MongoDB.

Preferred

The following experiences are not required, but you'll stand out from other applicants if you have any of them, in our order of importance: experience with cloud infrastructure like AWS/GCP or another cloud service provider; serverless architecture, preferably AWS Lambda; solid CI/CD experience; you are a Git guru and revel in collaborative workflows; you work on the command line confidently and are familiar with all the goodies that the Linux toolkit can provide; knowledge of modern authorization mechanisms, such as JSON Web Tokens.

Qualifications

Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

Uplight provides equal employment opportunities to all employees and applicants and prohibits discrimination and harassment of any type without regard to race (including hair texture and hairstyles), color, religion (including head coverings), age, sex, national origin, caste, disability status, genetics, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
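
To illustrate the Spark/PySpark distributed-processing skill listed above, here is a minimal batch aggregation sketch in the spirit of the energy-usage domain; the bucket paths and column names are hypothetical, not from the posting:

```python
# Minimal PySpark batch sketch: read raw meter readings, roll them up to
# daily totals per meter, and write the result back out. All paths and
# column names are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("usage-rollup").getOrCreate()

readings = spark.read.parquet("gs://my-bucket/meter-readings/")  # placeholder

daily = (
    readings
    .withColumn("day", F.to_date("reading_ts"))       # truncate timestamp to date
    .groupBy("meter_id", "day")
    .agg(F.sum("kwh").alias("kwh_total"))
)

daily.write.mode("overwrite").parquet("gs://my-bucket/daily-usage/")
```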

Posted 3 weeks ago

Apply

0 years

0 Lacs

Kanayannur, Kerala, India

On-site


At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Business Consultant P&C (Property & Casualty - Personal and Commercial Insurance)

Candidates should have experience working in Property & Casualty lines (both Personal and Commercial Insurance) and should be familiar with one or more functional processes – PC, BC, CC (Guidewire/Duck Creek preferred).

LOBs – Lines of Business (Personal and Commercial Lines):

Must have: Property; Auto; General Liability. Good to have: Casualty lines – Professional Liability, Directors & Officers, Errors & Omissions, EPL, etc.; Inland Marine, Cargo; Workers Compensation; Umbrella, Excess Liability.

Roles and Responsibilities:

Work on multiple business transformation, upgrade and modernization programs. Requirements gathering and elicitation – writing BRDs and FSDs. Conduct JAD sessions and workshops to capture requirements, working closely with the Product Owner. Work with the client to define the most optimal future-state operational process and related product configuration. Define scope by providing innovative solutions and challenging all new client requirements and change requests, while simultaneously ensuring that the client gets the required business value. Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams. Work closely with the product design and development team to analyse and extract functional enhancements. Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle.

Product Experience/Other Skills:

Product knowledge – Guidewire, Duck Creek, Exigent, Majesco (Guidewire/Duck Creek preferred). Strong skills in stakeholder management and communication. Should know end-to-end processes in the P&C insurance domain. Should be ready to work in flexible shifts (a good amount of overlap with US/UK hours). Good organizational and time management skills required. Should have good written and verbal communication skills in English. Industry certifications AINS 21 - Property and Liability Insurance Principles, AINS 22 - Personal Insurance, AINS 23 - Commercial Insurance and AINS 24 - General Insurance for IT and Support Professionals will be an added advantage. Additional experience in Life or other insurance domains is an added advantage.

We expect you to work effectively as a team member and build good relationships with the client. You will have the opportunity to expand your domain knowledge and skills and will be able to collaborate frequently with other EY professionals with a wide variety of expertise.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


Site Reliability Engineering (SRE) at Equifax is a discipline that combines software and systems engineering for building and running large-scale, distributed, fault-tolerant systems. SRE ensures that internal and external services meet or exceed reliability and performance expectations while adhering to Equifax engineering principles. SRE is also an engineering approach to building and running production systems – we engineer solutions to operational problems. Our SREs are responsible for overall system operation, and we use a breadth of tools and approaches to solve a broad set of problems, relying on practices such as limiting time spent on operational work, blameless postmortems, and proactive identification and prevention of potential outages. Our SRE culture of diversity, intellectual curiosity, problem solving and openness is key to its success. Equifax brings together people with a wide variety of backgrounds, experiences and perspectives. We encourage them to collaborate, think big, and take risks in a blame-free environment. We promote self-direction to work on meaningful projects, while we also strive to build an environment that provides the support and mentorship needed to learn, grow and take pride in our work.

What You'll Do

Troubleshoot and support the dev teams with their continuous integration and continuous deployment (CI/CD) processes. Assist in resolving complex issues arising from product upgrades, installations and configurations. Design and improve automation tools that integrate with Docker, Kubernetes, Helm, Terraform, GitHub Actions and GCP. Develop and execute best practices, system hardening and security controls; contribute to providing solution architectures and strategy. Automate system scalability and continually work to improve system resiliency, performance and efficiency. Configure monitoring and APM tools such as Datadog, AppDynamics, Grafana and Prometheus. Partner with respective departments to develop practical automation solutions and participate in cross-functional team meetings to collaborate and ensure successful execution. Diagnose and deploy complex systems that may involve coordination with external teams. Maintain internal documentation that fully reflects all activity related to an application and environment, to be used by applicable teams. Respond to and work incident tickets in ServiceNow regarding items such as service outages, infrastructure issues, zero-day vulnerability patching, etc. Design and implement delivery pipelines, including test automation, security, and performance. Comply with all corporate and departmental privacy and data security policies and practices. You will influence and design infrastructure, architecture, standards and methods for large-scale systems. You will support services prior to production via infrastructure design, software platform development, load testing, capacity planning and launch reviews. You will maintain services during deployment and in production by measuring and monitoring key performance and service level indicators, including availability, latency, and overall system health.

What Experience You Need

Bachelor's degree in Computer Science or a related technical field involving coding (e.g., physics or mathematics), or equivalent job experience required. 5+ years of experience developing and/or administering software in public cloud. 5+ years of experience in languages such as Python, Bash, Java, Go, JavaScript and/or Node.js, or similar skills. 5+ years of experience in system administration skills, including automation and orchestration of Linux/Windows using Chef, Puppet and/or containers (Docker, Kubernetes, etc.), or similar skills. Experience with build systems such as GitHub Actions and Jenkins. Experience with configuration management tools such as Chef, Ansible and PowerShell DSC. Experience with infrastructure-as-code technologies (Terraform, GCP Deployment Manager). Experience with Kubernetes (GKE preferred).

What Could Set You Apart

Technical knowledge about monitoring tools, Splunk, security controls, and networking (firewalls, ingress/egress routing). Experience with GCP, such as autoscaling, Google Cloud Functions, Google Cloud Dataflow, Google Cloud Pub/Sub, and IAM. Experience with web servers such as Apache or Nginx. You have expertise designing, analyzing and troubleshooting large-scale distributed systems. You take a systems problem-solving approach, coupled with strong communication skills and a sense of ownership and drive. You are passionate about automation, with a desire to eliminate toil whenever possible. You've built software or maintained systems in a highly secure, regulated or compliant industry. You thrive in, and have experience and passion for, working within a DevOps culture and as part of a team.

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


WPP is the creative transformation company. We use the power of creativity to build better futures for our people, planet, clients, and communities. Working at WPP means being part of a global network of more than 100,000 talented people dedicated to doing extraordinary work for our clients. We operate in over 100 countries, with corporate headquarters in New York, London and Singapore. WPP is a world leader in marketing services, with deep AI, data and technology capabilities, global presence and unrivalled creative talent. Our clients include many of the biggest companies and advertisers in the world, including approximately 300 of the Fortune Global 500. Our people are the key to our success. We're committed to fostering a culture of creativity, belonging and continuous learning, attracting and developing the brightest talent, and providing exciting career opportunities that help our people grow.

Why we're hiring:

WPP is at the forefront of the marketing and advertising industry's largest transformation. Our Global CIO is leading a significant evolution of our Enterprise Technology capabilities, bringing together over 2,500 technology professionals into an integrated global team. This team will play a crucial role in enabling the ongoing transformation of our agencies and functions. GroupM is the world's leading media investment company, responsible for more than $63B in annual media investment through agencies Mindshare, MediaCom, Wavemaker, Essence and m/SIX, as well as the results-driven programmatic audience company Xaxis and the data and technology company Choreograph. GroupM's portfolio includes Data & Technology, Investment and Services, all united in a vision to shape the next era of media where advertising works better for people. By leveraging all the benefits of scale, the company innovates, differentiates and generates sustained value for our clients wherever they do business. The GroupM IT team in WPP IT is the technology solutions partner for the GroupM group of agencies and is accountable for coordinating and assuring end-to-end change delivery, managing the GroupM IT technology life cycle and innovation pipeline. This role will work as part of the Business Platform Team for EMEA. You will be part of a new Data team in Chennai that will support our existing and future BI setup for EMEA markets. You will be responsible for delivering the solutions formulated by product owners and key stakeholders for different EMEA markets. In collaboration with the Data development team, you will update the architecture and data models to reflect new data needs and changes in source systems.

What you'll be doing:

Design, develop, and maintain robust and scalable data pipelines using Google Cloud Platform services (such as Dataflow, Pub/Sub, and Cloud Composer) to extract, transform, and load data from various sources. Communicate technical concepts and benefits of GCP solutions to non-technical stakeholders, aiding them in understanding changes necessary for project goals. Monitor and optimize data processing and storage resources on GCP. Troubleshoot and resolve data pipeline issues and performance bottlenecks. Document data engineering processes, best practices, and technical specifications. Adhere to agile development practices, including evolutionary design, refactoring, continuous integration/delivery, and test-driven development. Collaborate with the Business Partner Team and stakeholders during project scoping and feasibility phases as an SME, identifying technical risks and proposing mitigations. Provide production support for data load jobs. Automate data workflows and create queries for periodic report generation. Collaborate with development teams to implement data manipulation queries, ensuring alignment with business requirements. Maintain and upgrade existing applications as necessary. Participate in key design meetings and provide technical support. Perform additional tasks related to data migrations, cloud resource audits, and cross-functional support activities.

What you'll need:

Education: a combination of education and experience that would enable the incumbent to meet the fundamental duties and required competencies. A bachelor's degree in computer science, engineering, mathematics or another technical field is highly preferred.

Personality and working practice: a team player with good communication skills, analytical thinking, attention to detail and the ability to see the big picture.

Required experience and knowledge: 5+ years' experience with GCP services such as BigQuery, Dataflow, Pub/Sub, Cloud Storage, Cloud Composer and Cloud Functions. Understanding of GCP security best practices, including IAM roles and service accounts. 1-2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). Experience with CI/CD pipelines, containerization (Docker, Kubernetes), and infrastructure as code (Terraform, Cloud Deployment Manager).

Desirable experience and knowledge: knowledge of and some experience with dbt; experience designing and implementing data pipelines using tools like Apache Beam, Apache Airflow, or similar.

Languages: very good English skills; any other language is a plus.

Who you are:

You're open: We are inclusive and collaborative; we encourage the free exchange of ideas; we respect and celebrate diverse views. We are open-minded: to new ideas, new partnerships, new ways of working.

You're optimistic: We believe in the power of creativity, technology and talent to create brighter futures for our people, our clients and our communities. We approach all that we do with conviction: to try the new and to seek the unexpected.

You're extraordinary: We are stronger together: through collaboration we achieve the amazing. We are creative leaders and pioneers of our industry; we provide extraordinary every day.

What we'll give you:

Passionate, inspired people – we aim to create a culture in which people can do extraordinary work. Scale and opportunity – we offer the opportunity to create, influence and complete projects at a scale that is unparalleled in the industry. Challenging and stimulating work – unique work and the opportunity to join a group of creative problem solvers.

Are you up for the challenge?

We believe the best work happens when we're together, fostering creativity, collaboration, and connection. That's why we've adopted a hybrid approach, with teams in the office around four days a week. If you require accommodations or flexibility, please discuss this with the hiring team during the interview process. WPP is an equal opportunity employer and considers applicants for all positions without discrimination or regard to particular characteristics. We are committed to fostering a culture of respect in which everyone feels they belong and has the same opportunities to progress in their careers.

Please read our Privacy Notice (https://www.wpp.com/people/wpp-privacy-policy-for-recruitment) for more information on how we process the information you provide.

Posted 3 weeks ago

Apply

Exploring Dataflow Jobs in India

The dataflow job market in India is currently experiencing a surge in demand for skilled professionals. With the increasing reliance on data-driven decision-making in various industries, the need for individuals proficient in managing and analyzing dataflow is on the rise. This article aims to provide job seekers with valuable insights into the dataflow job landscape in India.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Pune
  4. Hyderabad
  5. Delhi

These cities are known for their thriving tech ecosystems and are home to numerous companies actively hiring for dataflow roles.

Average Salary Range

The average salary range for dataflow professionals in India varies based on experience levels. Entry-level positions can expect to earn between INR 4-6 lakhs per annum, while experienced professionals can command salaries upwards of INR 12-15 lakhs per annum.

Career Path

In the dataflow domain, a typical career path may involve starting as a Junior Data Analyst or Data Engineer, progressing to roles such as Senior Data Scientist or Data Architect, and eventually reaching positions like Tech Lead or Data Science Manager.

Related Skills

In addition to expertise in dataflow tools and technologies, dataflow professionals are often expected to have proficiency in programming languages such as Python or R, knowledge of databases like SQL, and familiarity with data visualization tools like Tableau or Power BI.

Interview Questions

  • What is dataflow and how is it different from data streaming? (basic)
  • Explain the difference between batch processing and real-time processing. (medium)
  • How do you handle missing or null values in a dataset? (basic; a short pandas sketch follows this list)
  • Can you explain the concept of data lineage? (medium)
  • What is the importance of data quality in dataflow processes? (basic)
  • How do you optimize dataflow pipelines for performance? (medium)
  • Describe a time when you had to troubleshoot a dataflow issue. (medium)
  • What are some common challenges faced in dataflow projects? (medium)
  • How do you ensure data security and compliance in dataflow processes? (medium)
  • What are the key components of a dataflow architecture? (medium)
  • Explain the concept of data partitioning in dataflow. (advanced)
  • How would you handle a sudden increase in data volume in a dataflow pipeline? (advanced)
  • What role does data governance play in dataflow processes? (medium)
  • Can you discuss the advantages and disadvantages of using cloud-based dataflow solutions? (medium)
  • How do you stay updated with the latest trends and technologies in dataflow? (basic)
  • What is the significance of metadata in dataflow management? (medium)
  • Walk us through a dataflow project you have worked on from start to finish. (medium)
  • How do you ensure data quality and consistency across different data sources in a dataflow pipeline? (medium)
  • What are some best practices for monitoring and troubleshooting dataflow pipelines? (medium)
  • How do you handle data transformations and aggregations in a dataflow process? (basic)
  • What are the key performance indicators you would track in a dataflow project? (medium)
  • How do you collaborate with cross-functional teams in a dataflow project? (basic)
  • Can you explain the concept of data replication in dataflow management? (advanced)
  • How do you approach data modeling in a dataflow project? (medium)
  • Describe a challenging dataflow problem you encountered and how you resolved it. (advanced)
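
As an example of how the missing-values question above is often answered in practice, here is a short pandas sketch; the DataFrame, column names, and the choice of mean imputation are illustrative – the right strategy always depends on the data:

```python
# Two common ways to handle missing values: impute a numeric column with a
# statistic, or drop rows missing a key field. Data here is hypothetical.
import pandas as pd

df = pd.DataFrame({"sensor": ["a", "a", "b"], "value": [1.0, None, 3.0]})

df["value"] = df["value"].fillna(df["value"].mean())  # impute with the mean
df = df.dropna(subset=["sensor"])                     # drop rows missing a key
print(df)
```

Dropping rows, imputing with a statistic, or flagging missingness as its own feature are all defensible answers; interviewers usually care most about whether you can justify the choice for the dataset at hand.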

Closing Remark

As you navigate the dataflow job market in India, remember to showcase your skills and experiences confidently during interviews. Stay updated with the latest trends in dataflow and continuously upskill to stand out in a competitive job market. Best of luck in your job search journey!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies