
904 Dataflow Jobs - Page 30

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

15.0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

Job Description: We are seeking an experienced Director to lead a team responsible for the development and maintenance of our Connected Vehicle Data. The ideal candidate will have a strong technical background in data and/or software engineering, along with proven leadership and management skills. This role requires the ability to design and code streaming solutions, prioritize team tasks, make timely decisions, and guide the team to deliver high-quality results. The leader must be knowledgeable in data governance, customer consent, and security standards.

Responsibilities: Lead and mentor a high-performing team of local and remote data engineers. Prioritize team workload, allocate tasks effectively, and ensure team members have the resources to succeed. Provide technical expertise and guidance to the team. Evaluate and mentor the team on adherence to coding standards, best practices, and architectural guidelines. Oversee the design, development, maintenance, scalability, reliability, and performance of the connected vehicle data platform pipelines and architecture. Contribute to the long-term strategic direction of the Connected Vehicle Data Platform with a focus on enterprise use. Enforce and ensure data quality, data governance, and security standards. Collaborate with Data Program Management to prioritize and implement various business customers’ requests and logic into data assets with optimized design and code development. Collaborate to identify and consolidate common tasks across teams to improve efficiency and reduce redundancy. Communicate decisions effectively and transparently to internal and external customers. Stay updated on industry trends and emerging technologies to inform technical decisions.

Qualifications (Required): Minimum – Bachelor’s Degree in Computer Science, Information Technology, Information Systems, or Data Analytics. Preferred – Master’s Degree in a highly technical field such as computer science, mathematics, or physics. 15+ years of experience in data engineering, cloud platforms, or enterprise-scale data management, with a minimum of 5 years in connected/streaming vehicle platforms. 5+ years' experience leading a software/data engineering team. Expertise in one of the following public cloud environments: Amazon Web Services, Google Cloud Platform, or Microsoft Azure. Expert knowledge and hands-on experience in DevOps and the SDLC. Monitor and optimize cost and compute for processes in GCP technologies (e.g., BigQuery, Dataflow, Cloud Run, Dataproc). Manage and scale serverless applications and clusters, optimizing resource utilization and implementing monitoring and logging strategies. Expertise in streaming technologies (Kafka, Pub/Sub) and OpenShift, managing high-throughput topics, message ordering, and ensuring data consistency and durability.

Why Join Ford? Be at the forefront of Ford’s data and AI transformation, influencing how data drives business decisions. Work in a fast-paced, innovation-driven environment with cutting-edge technology and industry-leading experts. Enjoy a culture that values collaboration, inclusion, and career development. Competitive compensation, benefits, and opportunities for professional growth. Join us in shaping the future of data at Ford!
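As a rough illustration of the ordered, high-throughput streaming work this posting references, the sketch below publishes vehicle telemetry to Google Cloud Pub/Sub with message ordering enabled; the project, topic, endpoint, and vehicle IDs are placeholder assumptions, not details from the role.

```python
from google.cloud import pubsub_v1

# Message ordering requires a publisher with ordering enabled and a regional endpoint.
publisher = pubsub_v1.PublisherClient(
    publisher_options=pubsub_v1.types.PublisherOptions(enable_message_ordering=True),
    client_options={"api_endpoint": "us-east1-pubsub.googleapis.com:443"},  # placeholder region
)
topic_path = publisher.topic_path("my-project", "vehicle-telemetry")  # placeholder names

def publish_reading(vehicle_id: str, payload: bytes) -> None:
    # Using the vehicle ID as the ordering key keeps each vehicle's readings
    # in publish order for subscribers that enable ordered delivery.
    future = publisher.publish(topic_path, payload, ordering_key=vehicle_id)
    future.result()  # block until the service acknowledges the message

publish_reading("vehicle-123", b'{"speed_kph": 62, "soc": 0.81}')
```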

Posted 1 month ago

Apply

5.0 years

0 Lacs

Noida

On-site

As a Data Engineer with a focus on migrating on-premises databases to Google Cloud SQL, you will play a critical role in solving complex problems and creating value for our business by ensuring reliable, scalable, and efficient data migration processes. You will be responsible for architecting, designing, and implementing custom pipelines on the GCP stack to facilitate seamless migration.

Required Skills: 5+ years of industry experience in data engineering, business intelligence, or a related field, with experience in manipulating, processing, and extracting value from datasets. Expertise in architecting, designing, building, and deploying internal applications to support technology life cycle management, service delivery management, data, and business intelligence. Experience in developing modular code for versatile pipelines or complex ingestion frameworks aimed at loading data into Cloud SQL and managing data migration from multiple on-premises sources. Strong collaboration with analysts and business process owners to translate business requirements into technical solutions. Proficiency in coding with scripting languages (shell scripting, Python, SQL). Deep understanding and hands-on experience with Google Cloud Platform (GCP) technologies, especially in data migration and warehousing, including Database Migration Service (DMS), Cloud SQL, BigQuery, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage (GCS), IAM, Compute Engine, Cloud Data Fusion, and optionally Dataproc. Adherence to best development practices, including technical design, solution development, systems configuration, test documentation/execution, issue identification and resolution, and writing clean, modular, self-sustaining code. Familiarity with CI/CD processes using GitHub, Cloud Build, and the Google Cloud SDK.
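As a minimal sketch of the kind of custom load step such a migration pipeline might include, the example below writes extracted rows into Cloud SQL using the Cloud SQL Python Connector with SQLAlchemy; the instance connection name, credentials, and table are invented for illustration.

```python
import sqlalchemy
from google.cloud.sql.connector import Connector

connector = Connector()

def getconn():
    # Placeholder instance connection name and credentials.
    return connector.connect(
        "my-project:asia-south1:migration-target",
        "pymysql",
        user="migration_user",
        password="change-me",
        db="inventory",
    )

# SQLAlchemy engine that opens connections through the connector.
engine = sqlalchemy.create_engine("mysql+pymysql://", creator=getconn)

def load_rows(rows):
    # Append a batch of extracted rows into the target table inside one transaction.
    with engine.begin() as conn:
        conn.execute(
            sqlalchemy.text(
                "INSERT INTO products (sku, name, price) VALUES (:sku, :name, :price)"
            ),
            rows,  # list of dicts produced by the extract step
        )

load_rows([{"sku": "A-1", "name": "Widget", "price": 9.5}])
```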

Posted 1 month ago

Apply

0 years

0 Lacs

Noida

On-site

As a Data Engineer, you will design, develop, and support data pipelines and related data products and platforms. Your primary responsibilities include designing and building data extraction, loading, and transformation pipelines across on-prem and cloud platforms. You will perform application impact assessments, requirements reviews, and develop work estimates. Additionally, you will develop test strategies and site reliability engineering measures for data products and solutions, participate in agile development "scrums" and solution reviews, mentor junior Data Engineering Specialists, lead the resolution of critical operations issues, and perform technical data stewardship tasks, including metadata management, security, and privacy by design.

Required Skills: Design, develop, and support data pipelines and related data products and platforms. Design and build data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms. Perform application impact assessments, requirements reviews, and develop work estimates. Develop test strategies and site reliability engineering measures for data products and solutions. Participate in agile development "scrums" and solution reviews. Mentor junior Data Engineers. Lead the resolution of critical operations issues, including post-implementation reviews. Perform technical data stewardship tasks, including metadata management, security, and privacy by design. Design and build data extraction, loading, and transformation pipelines using Python and other GCP data technologies. Demonstrate SQL and database proficiency in various data engineering tasks. Automate data workflows by setting up DAGs in tools like Control-M, Apache Airflow, and Prefect. Develop Unix scripts to support various data operations. Model data to support business intelligence and analytics initiatives. Utilize infrastructure-as-code tools such as Terraform, Puppet, and Ansible for deployment automation. Expertise in GCP data warehousing technologies, including BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion, and Dataproc (good to have).
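A minimal sketch of the Airflow DAG automation mentioned above, assuming a hypothetical daily extract-and-load workflow; the DAG ID, task names, and callables are illustrative, not taken from the role.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # placeholder: pull yesterday's records from a source system into staging
    print("extracting source data")

def load():
    # placeholder: load the staged records into the warehouse
    print("loading into the warehouse")

with DAG(
    dag_id="daily_extract_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # run load only after extract succeeds
```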

Posted 1 month ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Required Skills & Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or a related discipline. Minimum of 5 years of practical experience in a data engineering or comparable position. Demonstrated expertise in SQL and Python (or similar languages such as Scala/Java). Extensive experience with data pipeline orchestration tools (e.g., Airflow, dbt). Proficiency in cloud data platforms, including AWS (Redshift, S3, Glue), GCP (BigQuery, Dataflow), or Azure (Data Factory, Synapse). Familiarity with big data technologies (e.g., Spark, Kafka, Hive) and other data tools. Solid grasp of data warehousing principles, data modeling techniques, and performance tuning (e.g., Erwin Data Modeler, MySQL Workbench). Exceptional problem-solving abilities coupled with a proactive and team-oriented approach.

Posted 1 month ago

Apply

5.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Our organization is seeking a skilled Senior Data Engineer to become an integral part of our team. In this role, you will focus on projects related to data integration and ETL on cloud-based platforms. You will take charge of creating and executing sophisticated data solutions, ensuring data accuracy, dependability, and accessibility.

Responsibilities: Create and execute sophisticated data solutions on cloud-based platforms. Build ETL processes utilizing SQL, Python, and other applicable technologies. Maintain data accuracy, reliability, and accessibility for all stakeholders. Work with cross-functional teams to comprehend data integration needs and specifications. Produce and sustain documentation, including technical specifications, data flow diagrams, and data mappings. Enhance and tune data integration processes for optimal performance and efficiency, guaranteeing data accuracy and integrity.

Requirements: Bachelor’s degree in Computer Science, Electrical Engineering, or a related field. 5-8 years of experience in data engineering. Proficiency in cloud-native or Spark-based ETL tools such as AWS Glue, Azure Data Factory, or GCP Dataflow. Strong knowledge of SQL for data querying and manipulation. Experience with Snowflake for data warehousing. Familiarity with cloud platforms like AWS, GCP, or Azure for data storage and processing. Excellent problem-solving skills and attention to detail. Good verbal and written communication skills in English at a B2 level.

Nice to have: Background in ETL using Python.

Posted 1 month ago

Apply

7.0 years

0 Lacs

Gurugram, Haryana, India

On-site

The Technical Lead / Technical Consultant is a core role and the focal point of the project team, responsible for the whole technical solution and managing the day-to-day delivery. The role focuses on the technical solution architecture, detailed technical design, coaching of the development/implementation team, and governance of the technical delivery. It carries technical ownership of the solution from bid inception through implementation to client delivery, followed by after-sales support and best-practice advice, along with interactions with internal stakeholders and clients to explain technology solutions and build a clear understanding of the client’s business requirements through which to guide optimal design to meet their needs.

Job Description:

Must-Have Skills: Database (one or more of MS SQL Server, Oracle, Cloud SQL, Cloud Spanner, etc.). Data warehouse (one or more of BigQuery, Snowflake, etc.). ETL tools (two or more of Cloud Data Fusion, Dataflow, Dataproc, Pub/Sub, Composer, Cloud Functions, Cloud Run, etc.). Experience in cloud platforms - GCP. Python, PySpark, project and resource management. SVN, JIRA, automation workflow (Composer, Cloud Scheduler, Apache Airflow, Tidal, Tivoli, or similar).

Good-to-Have Skills: UNIX shell scripting, Snowflake, Redshift. Familiarity with NoSQL databases such as MongoDB. ETL tools (Databricks / AWS Glue / AWS Lambda / Amazon Kinesis / Amazon Firehose / Azure Data Factory (ADF) / DBT / Talend / Informatica / IICS (Informatica Cloud)). Experience in cloud platforms - AWS / Azure. Client-facing skills.

Key Responsibilities: Ability to design simple to medium data solutions for clients using GCP cloud architecture. Strong understanding of DW, data marts, data modeling, data structures, databases, and data ingestion and transformation. Working knowledge of ETL as well as database skills. Working knowledge of data modeling, data structures, databases, and ETL processes. Strong understanding of relational and non-relational databases and when to use them. Leadership and communication skills to collaborate with local leadership as well as our global teams. Translating technical requirements into ETL/SQL application code. Document project architecture, explain the detailed design to the team, and create low-level to high-level designs. Create technical documents for ETL and SQL developments using Visio, PowerPoint, and other MS Office packages. Engage with Project Managers, Business Analysts, and Application DBAs to implement ETL solutions. Perform mid- to complex-level tasks independently. Support clients, Data Scientists, and Analytical Consultants working on marketing solutions. Work with cross-functional internal teams and external clients. Strong project management and organization skills; ability to lead 1-2 projects with a team size of 2-3 members. Manage code management systems, including code review and deployment. Work closely with the QA/Testing team to help identify and implement defect-reduction initiatives. Work closely with the Architecture team to make sure architecture standards and principles are followed during development. Perform proofs of concept on new platforms and validate proposed solutions. Work with the team to ensure disciplined software development processes, standards, and error-recovery procedures are established and followed. Must understand software development methodologies, including waterfall and agile. Distribute and manage SQL development work across the team. The candidate must be willing to work during overlapping hours with US-based teams to ensure effective collaboration and communication, typically between [e.g., 6:00 PM to 11:00 PM IST], depending on project needs.

Qualifications: Bachelor’s or Master's Degree in Computer Science with >= 7 years of IT experience.

Location: Bangalore
Brand: Merkle
Time Type: Full time
Contract Type: Permanent

Posted 1 month ago

Apply

7.0 years

0 Lacs

New Delhi, Delhi, India

On-site

The Technical Lead / Technical Consultant is a core role and the focal point of the project team, responsible for the whole technical solution and managing the day-to-day delivery. The role focuses on the technical solution architecture, detailed technical design, coaching of the development/implementation team, and governance of the technical delivery. It carries technical ownership of the solution from bid inception through implementation to client delivery, followed by after-sales support and best-practice advice, along with interactions with internal stakeholders and clients to explain technology solutions and build a clear understanding of the client’s business requirements through which to guide optimal design to meet their needs.

Job Description:

Must-Have Skills: Database (one or more of MS SQL Server, Oracle, Cloud SQL, Cloud Spanner, etc.). Data warehouse (one or more of BigQuery, Snowflake, etc.). ETL tools (two or more of Cloud Data Fusion, Dataflow, Dataproc, Pub/Sub, Composer, Cloud Functions, Cloud Run, etc.). Experience in cloud platforms - GCP. Python, PySpark, project and resource management. SVN, JIRA, automation workflow (Composer, Cloud Scheduler, Apache Airflow, Tidal, Tivoli, or similar).

Good-to-Have Skills: UNIX shell scripting, Snowflake, Redshift. Familiarity with NoSQL databases such as MongoDB. ETL tools (Databricks / AWS Glue / AWS Lambda / Amazon Kinesis / Amazon Firehose / Azure Data Factory (ADF) / DBT / Talend / Informatica / IICS (Informatica Cloud)). Experience in cloud platforms - AWS / Azure. Client-facing skills.

Key Responsibilities: Ability to design simple to medium data solutions for clients using GCP cloud architecture. Strong understanding of DW, data marts, data modeling, data structures, databases, and data ingestion and transformation. Working knowledge of ETL as well as database skills. Working knowledge of data modeling, data structures, databases, and ETL processes. Strong understanding of relational and non-relational databases and when to use them. Leadership and communication skills to collaborate with local leadership as well as our global teams. Translating technical requirements into ETL/SQL application code. Document project architecture, explain the detailed design to the team, and create low-level to high-level designs. Create technical documents for ETL and SQL developments using Visio, PowerPoint, and other MS Office packages. Engage with Project Managers, Business Analysts, and Application DBAs to implement ETL solutions. Perform mid- to complex-level tasks independently. Support clients, Data Scientists, and Analytical Consultants working on marketing solutions. Work with cross-functional internal teams and external clients. Strong project management and organization skills; ability to lead 1-2 projects with a team size of 2-3 members. Manage code management systems, including code review and deployment. Work closely with the QA/Testing team to help identify and implement defect-reduction initiatives. Work closely with the Architecture team to make sure architecture standards and principles are followed during development. Perform proofs of concept on new platforms and validate proposed solutions. Work with the team to ensure disciplined software development processes, standards, and error-recovery procedures are established and followed. Must understand software development methodologies, including waterfall and agile. Distribute and manage SQL development work across the team. The candidate must be willing to work during overlapping hours with US-based teams to ensure effective collaboration and communication, typically between [e.g., 6:00 PM to 11:00 PM IST], depending on project needs.

Qualifications: Bachelor’s or Master's Degree in Computer Science with >= 7 years of IT experience.

Location: Bangalore
Brand: Merkle
Time Type: Full time
Contract Type: Permanent

Posted 1 month ago

Apply

4.0 years

0 Lacs

India

Remote

Job Post: AI/ML Engineer
Experience: 4+ years
Location: Remote

Key Responsibilities: Design, build, and maintain ML infrastructure on GCP using tools such as Vertex AI, GKE, Dataflow, BigQuery, and Cloud Functions. Develop and automate ML pipelines for model training, validation, deployment, and monitoring using tools like Kubeflow Pipelines, TFX, or Vertex AI Pipelines. Work with Data Scientists to productionize ML models and support experimentation workflows. Implement model monitoring and alerting for drift, performance degradation, and data quality issues. Manage and scale containerized ML workloads using Kubernetes (GKE) and Docker. Set up CI/CD workflows for ML using tools like Cloud Build, Bitbucket, Jenkins, or similar. Ensure proper security, versioning, and compliance across the ML lifecycle. Maintain documentation, artifacts, and reusable templates for reproducibility and auditability. A GCP Professional Machine Learning Engineer certification is a plus.
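As a rough sketch of the pipeline automation this role describes, here is a minimal Kubeflow Pipelines (KFP v2) definition that could be compiled and then run on Vertex AI Pipelines; the component bodies, names, and paths are placeholders, not details from the posting.

```python
from kfp import compiler, dsl

@dsl.component(base_image="python:3.10")
def preprocess(raw_uri: str) -> str:
    # placeholder: read raw data, write prepared features, return their location
    return raw_uri + "/features"

@dsl.component(base_image="python:3.10")
def train(features_uri: str) -> str:
    # placeholder: train a model on the prepared features, return a model URI
    return features_uri + "/model"

@dsl.pipeline(name="training-pipeline")
def training_pipeline(raw_uri: str):
    features = preprocess(raw_uri=raw_uri)
    train(features_uri=features.output)

if __name__ == "__main__":
    # Produces a pipeline spec that Vertex AI Pipelines can execute.
    compiler.Compiler().compile(training_pipeline, "training_pipeline.json")
```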

Posted 1 month ago

Apply

6.0 years

0 Lacs

Gurugram, Haryana, India

Remote

What would a typical day at your work be like? You will lead and manage the delivery of projects and be responsible for the delivery of project and team goals. Build and support data ingestion and processing pipelines; this entails extracting, loading, and transforming data from a wide variety of sources using the latest data frameworks and technologies. Design, build, test, and maintain machine learning infrastructure and frameworks to empower data scientists to rapidly iterate on model development. Own and lead client engagement and communication on technical projects. Define project scopes and track project progress and delivery. Plan and execute project architecture and allocate work to the team. Keep up to date with advances in big data technologies and run pilots to design the data architecture to scale with increased data volume. Partner with software engineering teams to drive completion of multi-functional projects.

What do we expect? Minimum 6 years of overall experience in data engineering, including 2+ years leading a team as team lead and handling project management. Experience working with a global team and remote clients. Hands-on experience in building data pipelines on various infrastructures. Knowledge of statistical and machine learning techniques. Hands-on experience in integrating machine learning into data pipelines. Ability to work hands-on with the data engineers in the team on the design and development of solutions using the relevant big data technologies and data warehouse concepts. Strong knowledge of advanced SQL, data warehousing concepts, and data mart design. Strong experience with modern data platform components such as Spark, Python, etc. Experience with setting up and maintaining data warehouses (Google BigQuery, Redshift, Snowflake) and data lakes (GCS, AWS S3, etc.) for an organization. Experience in building data pipelines with AWS Glue, Azure Data Factory, and Google Dataflow. Experience with relational SQL and NoSQL databases, including Postgres and Cassandra/MongoDB. Strong problem-solving and communication skills.

Posted 1 month ago

Apply

0.0 - 5.0 years

0 Lacs

Gurugram, Haryana

On-site

Senior Data Engineer (GCP, Python)
Gurgaon, India | Information Technology | 314204

Job Description

About The Role: Grade Level (for internal use): 10

S&P Global Mobility. The Role: Senior Data Engineer.

Department overview: Automotive Insights at S&P Mobility leverages technology and data science to provide unique insights, forecasts and advisory services spanning every major market and the entire automotive value chain—from product planning to marketing, sales and the aftermarket. We provide the most comprehensive data spanning the entire automotive lifecycle—past, present and future. With over 100 years of history, unmatched credentials, and a larger customer base than any other provider, we are the industry benchmark for clients around the world, helping them make informed decisions to capitalize on opportunity and avoid risk. Our solutions are used by nearly every major OEM, 90% of the top 100 tier-one suppliers, media agencies, governments, insurance companies, and financial stakeholders to provide actionable insights that enable better decisions and better results.

Position summary: S&P Global is seeking an experienced and driven Senior Data Engineer who is passionate about delivering high-value, high-impact solutions to the world’s most demanding, high-profile clients. The ideal candidate must have at least 5 years of experience in developing and deploying data pipelines on Google Cloud Platform (GCP). They should be passionate about building high-quality, reusable pipelines using cutting-edge technologies. This role involves designing, building, and maintaining scalable data pipelines, optimizing workflows, and ensuring data integrity across multiple systems. The candidate will collaborate with data scientists, analysts, and software engineers to develop robust and efficient data solutions.

Responsibilities: Design, develop, and maintain scalable ETL/ELT pipelines. Optimize and automate data ingestion, transformation, and storage processes. Work with structured and unstructured data sources, ensuring data quality and consistency. Develop and maintain data models, warehouses, and databases. Collaborate with cross-functional teams to support data-driven decision-making. Ensure data security, privacy, and compliance with industry standards. Troubleshoot and resolve data-related issues in a timely manner. Monitor and improve system performance, reliability, and scalability. Stay up to date with emerging data technologies and recommend improvements to our data architecture and engineering practices.

What you will need: Strong programming skills using Python. 5+ years of experience in data engineering, ETL development, or a related role. Proficiency in SQL and experience with relational (PostgreSQL, MySQL, etc.) and NoSQL (DynamoDB, MongoDB, etc.) databases. Proficiency building data pipelines on Google Cloud Platform (GCP) using services like Dataflow, Cloud Batch, BigQuery, Bigtable, Cloud Functions, Cloud Workflows, Cloud Composer, etc. Strong understanding of data modeling, data warehousing, and data governance principles. Should be capable of mentoring junior data engineers and assisting them with technical challenges. Familiarity with orchestration tools like Apache Airflow. Familiarity with containerization and orchestration. Experience with version control systems (Git) and CI/CD pipelines. Excellent problem-solving skills and ability to work in a fast-paced environment. Excellent communication skills. Hands-on experience with Snowflake is a plus. Experience with big data technologies is a plus. Experience with AWS is a plus. Should be able to convert business queries into technical documentation.

Education and Experience: Bachelor’s degree in Computer Science, Information Systems, Information Technology, or a similar major, or a Certified Development Program. 5+ years of experience building data pipelines using Python and GCP (Google Cloud Platform).

About Company Statement: S&P Global delivers essential intelligence that powers decision making. We provide the world’s leading organizations with the right data, connected technologies and expertise they need to move ahead. As part of our team, you’ll help solve complex challenges that equip businesses, governments and individuals with the knowledge to adapt to a changing economic landscape. S&P Global Mobility turns invaluable insights captured from automotive data to help our clients understand today’s market, reach more customers, and shape the future of automotive mobility.

About S&P Global Mobility: At S&P Global Mobility, we provide invaluable insights derived from unmatched automotive data, enabling our customers to anticipate change and make decisions with conviction. Our expertise helps them to optimize their businesses, reach the right consumers, and shape the future of mobility. We open the door to automotive innovation, revealing the buying patterns of today and helping customers plan for the emerging technologies of tomorrow. For more information, visit www.spglobal.com/mobility.

What’s In It For You?

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global.
Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. - Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf - 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 314204 Posted On: 2025-05-30 Location: Gurgaon, Haryana, India

Posted 1 month ago

Apply

4.0 years

0 Lacs

India

Remote

Job Description: KLDiscovery, a leading global provider of electronic discovery, information governance and data recovery services, is currently seeking a Senior Software Engineer for an exciting new opportunity. This person will develop core parts of our eDiscovery offerings, including software development, testing, and systems automation. They will collaborate with team members, product owners, designers, architects, and other development teams to research relevant technologies and build innovative solutions that enhance our offerings and exceed customer needs. If you like working in a creative, technology-driven, high-energy, collaborative, casual environment, and you have strong software development abilities, this is the opportunity for you! Hybrid or remote, work-from-home opportunity.

Responsibilities: Create, validate, and review program code per specifications. Develop automated unit and API tests. Support bug fixes and implement enhancements to applications in production. Create, design, and review software documentation. Utilize, communicate, and enforce coding standards. Provide technical support to applications in production within defined SLAs. Adhere to development processes and workflows. Assist and mentor the team, demonstrating technical excellence. Detect problems and areas that need improvement early and raise issues.

Qualifications: Fluent English (C1). At least 4 years of commercial, hands-on software development experience in C#/.NET and C++. Experience with ASP.NET Core Blazor. Experience with desktop applications (WinForms preferred). Experience with background jobs and workers (e.g., Hangfire). Experience with Angular is a plus. Creating dataflow/sequence/C4 diagrams. Good understanding of at least one architectural/design pattern: MVC/MVP/MVVM/Clean/Screaming/Hexagonal architectures. .NET memory model and performance optimization solutions. Writing functional tests. Writing structure tests. Understanding modularity and vertical slices. Data privacy and securing desktop apps. Ability to design functionalities based on requirements. Experience with Entity Framework Core.

Our Cultural Values: Entrepreneurs at heart, we are a customer-first team sharing one goal and one vision. We seek team members who are: Humble - no one is above another; we all work together to meet our clients’ needs and we acknowledge our own weaknesses. Hungry - we all are driven internally to be successful and to continually expand our contribution and impact. Smart - we use emotional intelligence when working with one another and with clients. Our culture shapes our actions, our products, and the relationships we forge with our customers.

Who We Are: KLDiscovery provides technology-enabled services and software to help law firms, corporations, government agencies and consumers solve complex data challenges. The company, with offices in 26 locations across 17 countries, is a global leader in delivering best-in-class eDiscovery, information governance and data recovery solutions to support the litigation, regulatory compliance, internal investigation and data recovery and management needs of our clients. Serving clients for over 30 years, KLDiscovery offers data collection and forensic investigation, early case assessment, electronic discovery and data processing, application software and data hosting for web-based document reviews, and managed document review services. In addition, through its global Ontrack Data Recovery business, KLDiscovery delivers world-class data recovery, email extraction and restoration, data destruction and tape management. KLDiscovery has been recognized as one of the fastest growing companies in North America by both Inc. Magazine (Inc. 5000) and Deloitte (Deloitte’s Technology Fast 500), and CEO Chris Weiler has been honored as a past Ernst & Young Entrepreneur of the Year™. Additionally, KLDiscovery is an Orange-level Relativity Best in Service Partner, a Relativity Premium Hosting Partner, and maintains ISO/IEC 27001 certified data centers. KLDiscovery is an Equal Opportunity Employer.

Posted 1 month ago

Apply

5.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

What you’ll do: Design, develop, and operate high-scale applications across the full engineering stack. Design, develop, test, deploy, maintain, and improve software. Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.). Work across teams to integrate our systems with existing internal systems, Data Fabric, and the CSA Toolset. Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality. Participate in a tight-knit, globally distributed engineering team. Triage product or system issues and debug/track/resolve them by analyzing the sources of issues and their impact on network or service operations and quality. Research, create, and develop software applications to extend and improve on Equifax solutions. Manage sole project priorities, deadlines, and deliverables. Collaborate on scalability issues involving access to data and information. Actively participate in Sprint planning, Sprint retrospectives, and other team activities.

What experience you need: Bachelor's degree or equivalent experience. 5+ years of software engineering experience. 5+ years' experience writing, debugging, and troubleshooting code in mainstream Java, Spring Boot, TypeScript/JavaScript, HTML, CSS. 5+ years' experience with cloud technology: GCP, AWS, or Azure. 5+ years' experience designing and developing cloud-native solutions. 5+ years' experience designing and developing microservices using Java, Spring Boot, GCP SDKs, GKE/Kubernetes. 5+ years' experience deploying and releasing software using Jenkins CI/CD pipelines; understanding of infrastructure-as-code concepts, Helm charts, and Terraform constructs.

What could set you apart: Self-starter who identifies and responds to priority shifts with minimal supervision. Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others. UI development (e.g., HTML, JavaScript, Angular, and Bootstrap). Experience with backend technologies such as Java/J2EE, Spring Boot, SOA, and microservices. Source code control management systems (e.g., SVN/Git, GitHub) and build tools like Maven and Gradle. Agile environments (e.g., Scrum, XP). Relational databases (e.g., SQL Server, MySQL). Atlassian tooling (e.g., JIRA, Confluence, and GitHub). Developing with a modern JDK (v1.7+). Automated testing: JUnit, Selenium, LoadRunner, SoapUI. Cloud certification strongly preferred.
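To give a flavor of the Dataflow/Apache Beam work listed as a differentiator above, here is a minimal Beam pipeline in Python that runs locally by default and could target Dataflow with the appropriate runner options; the bucket paths and parsing logic are illustrative assumptions.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_event(line: str):
    # placeholder parsing: each input line is a small JSON event
    event = json.loads(line)
    return event["user_id"], 1

# Pass --runner=DataflowRunner plus project/region/temp_location to run on Dataflow.
options = PipelineOptions()

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadEvents" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
        | "Parse" >> beam.Map(parse_event)
        | "CountPerUser" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda user, count: f"{user},{count}")
        | "Write" >> beam.io.WriteToText("gs://example-bucket/output/user_counts")
    )
```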

Posted 1 month ago

Apply

10.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

We are looking for an experienced Integration Technical Lead with over 10 years of in-depth experience in Oracle Fusion Middleware technologies such as SOA Suite, Oracle Service Bus (OSB), and Oracle Data Integrator (ODI). The candidate will be responsible for leading integration initiatives, including custom development, platform customization, and day-to-day operational support. A strong interest in Google Cloud Platform (GCP) is highly desirable, with clear opportunities for training and skill development.

ShyftLabs is a growing data product company founded in early 2020 that works primarily with Fortune 500 companies. We deliver digital solutions built to help accelerate the growth of businesses in various industries by focusing on creating value through innovation.

Job Responsibilities:
1. Integration Leadership & Development: Lead end-to-end integration design and development across on-premise and cloud systems using Oracle SOA, OSB, and ODI. Drive new integration projects, from requirements gathering through to deployment and support. Develop, customize, and maintain reusable integration components and templates. Translate complex business processes into scalable, secure, and performant integration solutions.
2. Platform Customization & Optimization: Customize Oracle Fusion Middleware components to meet specific business needs and performance objectives. Evaluate existing integrations and enhance them for greater efficiency and lower latency. Implement best practices in integration design, error handling, and performance tuning.
3. Operational Excellence & Support: Own the operational stability of integration platforms, including monitoring, incident resolution, and root cause analysis. Manage daily operations such as deployments, patches, backups, and performance reviews. Collaborate with IT support teams to maintain integration SLAs, uptime, and reliability.
4. Cloud Integration & GCP Adoption: Contribute to the design of hybrid and cloud-native integration architectures using GCP. Learn and eventually implement integration patterns using tools like Apigee, Pub/Sub, Cloud Functions, and Dataflow. Participate in GCP migration initiatives for legacy integration assets.

Basic Qualifications: 10+ years of hands-on experience with Oracle SOA Suite, OSB, and ODI in enterprise environments. Expertise in web services (REST/SOAP), XML, XSD, XSLT, XPath, and service orchestration. Strong skills in platform customization, new integration development, integration monitoring, alerting, and troubleshooting processes, and long-term system maintenance. Experience with performance optimization, fault tolerance, and secure integrations. Excellent communication and team leadership skills.

Preferred Qualifications: Exposure to Google Cloud Platform (GCP), or a strong interest in and ability to learn it. Familiarity with GCP services for integration (Pub/Sub, Cloud Storage, Cloud Functions). Understanding of containerized deployments using Docker and Kubernetes. Experience with DevOps tools and CI/CD pipelines for integration delivery.

We are proud to offer a competitive salary alongside a strong insurance package. We pride ourselves on the growth of our employees, offering extensive learning and development resources.

Posted 1 month ago

Apply

2.0 years

0 Lacs

India

On-site

Description

The Position: We are seeking a seasoned engineer with a passion for changing the way millions of people save energy. You’ll work within the Engineering team to build and improve our platforms to deliver flexible and creative solutions to our utility partners and end users, and help us achieve our ambitious goals for our business and the planet. We are seeking a skilled and passionate Data Engineer - Business Intelligence with expertise in data engineering and BI reporting to join our development team. As a Data Engineer, you will play a crucial role in developing different components, harnessing the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and data processing, and identify the crucial data required for insightful analysis. You'll also tackle obstacles related to database integration and untangle complex, unstructured data sets. You will work on creating BI reports as well as the development of a Business Intelligence platform that will enable users to create reports and dashboards based on their requirements. You will coordinate with the rest of the team working on different layers of the infrastructure; therefore, a commitment to collaborative problem solving, sophisticated design, and a quality product is important. You will own the development and its quality independently and be responsible for high-quality deliverables. And you will work with a great team with excellent benefits.

Responsibilities & Skills: You should: Be excited to work with talented, committed people in a fast-paced environment. Have proven experience as a Data Engineer with a focus on BI reporting. Design, build, and maintain high-performance solutions with reusable and reliable code. Use a rigorous approach to product improvement and customer satisfaction. Love developing great software as a seasoned product engineer. Be ready, able, and willing to jump onto a call with stakeholders to help solve problems. Be able to deliver against several initiatives simultaneously. Have a strong eye for detail and quality of code. Have an agile mindset. Have strong problem-solving skills and attention to detail.

Required Skills (Data Engineer): You ideally have 2 or more years of professional experience. Design, build, and maintain scalable data pipelines and ETL processes to support business analytics and reporting needs. Strong experience with SQL for querying and transforming large datasets, and optimizing query performance in relational databases. Proficiency in Python for building and automating data pipelines, ETL processes, and data integration workflows. Familiarity with big data frameworks such as Apache Spark or PySpark for distributed data processing. Strong understanding of data modeling principles for building scalable and efficient data architectures (e.g., star schema, snowflake schema). Good to have: experience with Databricks for managing and processing large datasets, implementing Delta Lake, and leveraging its collaborative environment. Knowledge of Google Cloud Platform (GCP) services like BigQuery, Dataflow, Pub/Sub, and Cloud Storage for end-to-end data engineering solutions. Familiarity with version control systems such as Git and CI/CD pipelines for managing code and deploying workflows. Awareness of data governance and security best practices, including access control, data masking, and compliance with industry standards. Exposure to monitoring and logging tools like Datadog, Cloud Logging, or the ELK stack for maintaining pipeline reliability. Ability to understand business requirements and translate them into technical requirements. Inclination to design solutions for complex data problems. Ability to deliver against several initiatives simultaneously as a multiplier. Demonstrable experience with writing unit and functional tests.

Required Skills (BI Reporting): Strong experience in developing Business Intelligence reports and dashboards via tools such as Tableau, Power BI, Sigma, etc. Ability to analyse and deeply understand the data, relate it to the business application, and derive meaningful insights from the data.

Preferred: The following experiences are not required, but you'll stand out from other applicants if you have any of the following, in our order of importance: You are an experienced developer with a minimum of 2+ years of professional experience. Work experience and strong proficiency in Python, SQL, and BI reporting and associated frameworks (like Flask, FastAPI, etc.). Experience with cloud infrastructure like AWS/GCP or another cloud service provider. CI/CD experience. You are a Git guru and revel in collaborative workflows. You work on the command line confidently and are familiar with all the goodies that the Linux toolkit can provide. Familiarity with Apache Spark and PySpark.

Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

Uplight provides equal employment opportunities to all employees and applicants and prohibits discrimination and harassment of any type without regard to race (including hair texture and hairstyles), color, religion (including head coverings), age, sex, national origin, caste, disability status, genetics, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.
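A minimal PySpark sketch of the kind of distributed processing mentioned in the skills above, assuming hypothetical energy-usage CSVs in Cloud Storage; the paths and column names are placeholders, not details from the posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-usage-rollup").getOrCreate()

# Placeholder input: raw smart-meter readings with household_id, reading_date, kwh columns.
readings = spark.read.option("header", True).csv("gs://example-bucket/raw/usage/*.csv")

daily_usage = (
    readings
    .withColumn("kwh", F.col("kwh").cast("double"))
    .groupBy("household_id", "reading_date")
    .agg(F.sum("kwh").alias("total_kwh"))
)

# Placeholder output location for the curated layer.
daily_usage.write.mode("overwrite").parquet("gs://example-bucket/curated/daily_usage/")
```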

Posted 1 month ago

Apply

0 years

0 Lacs

Trivandrum, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

L&A Business Consultant

Working as part of the Consulting team, you will take part in engagements related to a wide range of topics. Some examples of domains in which you will support our clients include the following:

Proficient in Individual and Group Life Insurance concepts, different types of annuity products, etc. Proficient in different insurance plans - Qualified/Non-Qualified Plans, IRA, Roth IRA, CRA, SEP. Solid knowledge of the policy life cycle: Illustrations/Quote/Rating, New Business & Underwriting, Policy Servicing and Administration, Billing & Payment, Claims Processing, Disbursement (systematic withdrawals, RMD, surrenders), Regulatory Changes & Taxation. Understanding of business rules for pay-out. Understanding of upstream and downstream interfaces for the policy lifecycle. Experience in DXC platforms - Vantage, wmA, nbA, CSA, Cyber-life, Life70, Life Asia, PerformancePlus.

Consulting Skills - Experience in creating business process maps for future-state architecture, creating the WBS for the overall conversion strategy, and the requirement refinement process in multi-vendor engagements. Requirements gathering and elicitation - writing BRDs and FSDs, conducting JAD sessions and workshops to capture requirements, and working closely with the Product Owner. Work with the client to define the most optimal future-state operational process and related product configuration. Define scope by providing innovative solutions and challenging all new client requirements and change requests, while ensuring that the client gets the required business value. Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams. Work closely with the product design development team to analyse and extract functional enhancements. Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle.

Technology Skills - Experienced in data migration projects, ensuring seamless transfer of data between systems while maintaining data integrity and security. Skilled in data analytics, utilizing various tools and techniques to extract insights and drive informed decision-making. Strong understanding of data governance principles and best practices, ensuring data quality and compliance. Collaborative team player, able to work closely with stakeholders and technical teams to define requirements and implement effective solutions. Industry certifications (AAPA/LOMA) will be an added advantage. Experience with these COTS products is preferable: FAST, ALIP, OIPA, wmA.

We expect you to work effectively as a team member and build good relationships with the client. You will have the opportunity to expand your domain knowledge and skills and will be able to collaborate frequently with other EY professionals with a wide variety of expertise.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 month ago

Apply

0 years

0 Lacs

Kolkata, West Bengal, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

L&A Business Consultant

Working as part of the Consulting team, you will take part in engagements related to a wide range of topics. Some examples of domains in which you will support our clients include the following:

Proficient in Individual and Group Life Insurance concepts, different types of annuity products, etc. Proficient in different insurance plans - Qualified/Non-Qualified Plans, IRA, Roth IRA, CRA, SEP. Solid knowledge of the policy life cycle: Illustrations/Quote/Rating, New Business & Underwriting, Policy Servicing and Administration, Billing & Payment, Claims Processing, Disbursement (systematic withdrawals, RMD, surrenders), Regulatory Changes & Taxation. Understanding of business rules for pay-out. Understanding of upstream and downstream interfaces for the policy lifecycle. Experience in DXC platforms - Vantage, wmA, nbA, CSA, Cyber-life, Life70, Life Asia, PerformancePlus.

Consulting Skills - Experience in creating business process maps for future-state architecture, creating the WBS for the overall conversion strategy, and the requirement refinement process in multi-vendor engagements. Requirements gathering and elicitation - writing BRDs and FSDs, conducting JAD sessions and workshops to capture requirements, and working closely with the Product Owner. Work with the client to define the most optimal future-state operational process and related product configuration. Define scope by providing innovative solutions and challenging all new client requirements and change requests, while ensuring that the client gets the required business value. Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams. Work closely with the product design development team to analyse and extract functional enhancements. Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle.

Technology Skills - Experienced in data migration projects, ensuring seamless transfer of data between systems while maintaining data integrity and security. Skilled in data analytics, utilizing various tools and techniques to extract insights and drive informed decision-making. Strong understanding of data governance principles and best practices, ensuring data quality and compliance. Collaborative team player, able to work closely with stakeholders and technical teams to define requirements and implement effective solutions. Industry certifications (AAPA/LOMA) will be an added advantage. Experience with these COTS products is preferable: FAST, ALIP, OIPA, wmA.

We expect you to work effectively as a team member and build good relationships with the client. You will have the opportunity to expand your domain knowledge and skills and will be able to collaborate frequently with other EY professionals with a wide variety of expertise.

EY | Building a better working world

EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 month ago

Apply

7.0 years

0 Lacs

Andhra Pradesh

On-site

About the Role: We are seeking experienced Data Analysts to join our growing team. The ideal candidate will have a strong background in data analysis, complex SQL queries, and experience working within large-scale data warehouse environments. Familiarity with cloud technologies such as GCP or AWS is mandatory, and prior exposure to AWS EMR and Apache Airflow is highly desirable.

Key Responsibilities: Perform deep data analysis to support business decision-making, reporting, and strategic initiatives. Write and optimize complex SQL queries for data extraction, transformation, and reporting across large, distributed datasets. Work extensively within data warehouse environments to design, test, and deliver data solutions. Collaborate with data engineers, business analysts, and stakeholders to understand requirements and translate them into technical deliverables. Analyze large, complex datasets to identify trends, patterns, and opportunities for business growth. Develop, maintain, and optimize ETL/ELT pipelines; familiarity with Apache Airflow for workflow orchestration is a plus. Work with cloud-native tools on GCP or AWS to manage and analyze data effectively. Support the development of data quality standards and ensure data integrity across all reporting platforms. Document data models, queries, processes, and workflows for knowledge sharing and scalability.

Required Skills & Experience: Minimum 7 years of professional experience in data analysis. Strong, demonstrable expertise in SQL, including writing, debugging, and optimizing complex queries. Solid experience working within a data warehouse environment (e.g., BigQuery, Redshift, Snowflake, etc.). Hands-on experience with GCP (BigQuery, Dataflow) or AWS (Redshift, Athena, S3, EMR). Knowledge of data modeling concepts, best practices, and data architecture principles. Understanding of ETL processes and tools; hands-on experience with Apache Airflow is a strong plus. Strong analytical thinking, attention to detail, and problem-solving skills. Ability to work in a fast-paced environment and manage multiple priorities.

About Virtusa: Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state of the art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
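A small illustration of the complex-SQL-on-BigQuery work this posting centers on, using the Python client with a parameterized query; the dataset, table, and column names are invented for the example, not taken from the role.

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses the active project from the environment

# Hypothetical analysis: top customers by order value over a trailing window.
query = """
    SELECT customer_id, SUM(order_value) AS total_value
    FROM `example_dataset.orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL @days DAY)
    GROUP BY customer_id
    ORDER BY total_value DESC
    LIMIT 10
"""

# Parameterize the lookback window instead of string-formatting it into the SQL.
job_config = bigquery.QueryJobConfig(
    query_parameters=[bigquery.ScalarQueryParameter("days", "INT64", 90)]
)

for row in client.query(query, job_config=job_config).result():
    print(row.customer_id, row.total_value)
```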

Posted 1 month ago

Apply

7.0 - 12.0 years

25 - 30 Lacs

Coimbatore

Work from Office

Job Summary: We are seeking a Senior Data & AI/ML Engineer (Lead) with strong expertise in Google Cloud Platform (GCP) and hands-on experience in building, deploying, and managing machine learning solutions at scale. The ideal candidate will lead AI/ML initiatives, mentor a team of engineers, and collaborate cross-functionally to drive data-driven innovation and business value. Key Responsibilities: Lead the end-to-end design and implementation of scalable AI/ML models and data pipelines on GCP. Drive architecture and design discussions for AI/ML solutions across cloud-native environments. Collaborate with data scientists, analysts, and business stakeholders to define requirements and deliver intelligent solutions. Manage and optimize data pipelines using tools such as Dataflow, Pub/Sub, BigQuery, Cloud Functions, etc. Deploy ML models using Vertex AI, AI Platform, or custom CI/CD pipelines. Monitor model performance and manage model retraining, versioning, and lifecycle. Ensure best practices in data governance, security, and compliance. Mentor junior engineers and data scientists; provide leadership in code reviews and project planning. Required Skills: 8+ years of experience in Data Engineering, Machine Learning, or AI application development. Strong programming skills in Python (preferred) and/or Java/Scala. Hands-on experience with GCP services: BigQuery, Vertex AI, Cloud Functions, Dataflow, Pub/Sub, GCS, etc. Proficient in ML libraries/frameworks like TensorFlow, PyTorch, Scikit-learn. Deep understanding of data modeling, feature engineering, and model evaluation techniques. Experience with Docker, Kubernetes, and MLOps tools. Strong background in SQL and data warehousing concepts. Exposure to data security and compliance best practices (GDPR, HIPAA, etc.). Nice to Have: GCP Certification (e.g., Professional Machine Learning Engineer, Data Engineer). Experience with streaming data architectures. Familiarity with AI ethics, explainability, and bias mitigation techniques. Education: Bachelor’s or Master’s degree in Computer Science, Data Science, Artificial Intelligence, or a related field.

Posted 1 month ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description Industrial System Analytics (ISA) is a product group that develops cloud analytic solutions using GCP tools and techniques. This Product Manager position within the Vehicle Product Management Product Line is ideal for a technically and product-oriented individual who has experience managing products and building roadmaps, adhering to timelines, enhancing customer satisfaction, and designing, building, deploying, and supporting cloud applications, and who is interested in working with a portfolio of strategic analytic solutions. Tech Anchor/Solution Architect Job Description: We are seeking a highly technical and experienced individual to fill the role of Tech Anchor/Solution Architect within our Industrial System Analytics (ISA) team. As a Tech Anchor, you will provide technical leadership and guidance to the development team, driving the design and implementation of cloud analytic solutions using GCP tools and techniques. Responsibilities: Provide technical guidance, mentorship, and code-level support to the development team. Work with the team to develop and implement solutions using GCP tools (BigQuery, GCS, Dataflow, Dataproc, etc.) and APIs/Microservices. Ensure adherence to security, legal, and Ford standard/policy compliance. Drive effective and efficient delivery from the team, focusing on speed. Identify risks and implement mitigation/contingency plans. Assess the overall health of the product and raise key decisions. Manage onboarding of new resources. Lead the design and architecture of complex systems, ensuring scalability, reliability, and performance. Participate in code reviews and contribute to improving code quality. Champion Agile software processes, culture, best practices, and techniques. Ensure effective usage of Rally and derive meaningful insights. Ensure implementation of DevSecOps and software craftsmanship practices (CI/CD, TDD, Pair Programming). Responsibilities Technical Requirements: Bachelor’s/Master’s/PhD in Engineering, Computer Science, or a related field. Senior-level experience (8+ years) as a software engineer. Deep and broad knowledge of: Programming Languages: Java, JavaScript, Python, SQL. Front-End Technologies: React, Angular, HTML, CSS. Back-End Technologies: Node.js, Python frameworks (Django, Flask), Java frameworks (Spring). Cloud Technologies: GCP (BigQuery, GCS, Dataflow, Dataproc, etc.). Deployment Practices: Docker, Kubernetes, CI/CD pipelines. Experience with Agile software development methodologies. Understanding of/exposure to CI, CD, and Test-Driven Development (GitHub, Terraform/Tekton, 42Crunch, SonarQube, FOSSA, Checkmarx, etc.). Qualifications Good to Have: Experience with GCP services such as Cloud Run, Cloud Build, Cloud Source Repositories, and Cloud Workflows. Knowledge of containerization using Docker and Kubernetes. Experience with serverless architecture and event-driven design patterns. Familiarity with machine learning and data science concepts. Experience with data engineering and data warehousing. Certification in GCP or other cloud platforms. Soft Skills: Strong communication and collaboration skills. Ability to work in a fast-paced, agile environment. Proactive attitude and start-up mindset.

Posted 1 month ago

Apply

8.0 - 13.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

We are looking for a skilled Lead Data Engineer to become an integral part of our vibrant team. In this role, you will take charge of designing, developing, and maintaining data integration solutions tailored to our clients' needs. You will oversee a team of engineers, ensuring the delivery of high-quality, scalable, and efficient data integration solutions. This role presents a thrilling challenge for a seasoned data integration expert who is passionate about technology and excels in a fast-paced, dynamic setting. Responsibilities Design, develop, and maintain client-specific data integration solutions Oversee a team of engineers to guarantee high-quality, scalable, and efficient delivery of data integration solutions Work with cross-functional teams to comprehend business requirements and create suitable data integration solutions Ensure the security, reliability, and efficiency of data integration solutions Create and update documentation, including technical specifications, data flow diagrams, and data mappings Stay up to date with the latest data integration methods and tools Requirements Bachelor’s degree in Computer Science, Information Systems, or a related field 8-13 years of experience in data engineering, data integration, or related fields Proficiency in cloud-native or Spark-based ETL tools such as AWS Glue, Azure Data Factory, or GCP Dataflow Strong knowledge of SQL for data querying and manipulation Background in Snowflake for cloud data warehousing Familiarity with at least one cloud platform such as AWS, Azure, or GCP Experience in leading a team of engineers on data integration projects Good verbal and written communication skills in English at a B2 level Nice to have Background in ETL using Python

Posted 1 month ago

Apply

5.0 - 8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Our organization is seeking a skilled Senior Data Engineer to become an integral part of our team. In this role, you will focus on projects related to data integration and ETL on cloud-based platforms. You will take charge of creating and executing sophisticated data solutions, ensuring data accuracy, dependability, and accessibility. Responsibilities Create and execute sophisticated data solutions on cloud-based platforms Build ETL processes utilizing SQL, Python, and other applicable technologies Maintain data accuracy, reliability, and accessibility for all stakeholders Work with cross-functional teams to comprehend data integration needs and specifications Produce and sustain documentation, including technical specifications, data flow diagrams, and data mappings Enhance and tune data integration processes for optimal performance and efficiency, guaranteeing data accuracy and integrity Requirements Bachelor’s degree in Computer Science, Electrical Engineering, or a related field 5-8 years of experience in data engineering Proficiency in cloud-native or Spark-based ETL tools such as AWS Glue, Azure Data Factory, or GCP Dataflow Strong knowledge of SQL for data querying and manipulation Experience with Snowflake for data warehousing Familiarity with cloud platforms like AWS, GCP, or Azure for data storage and processing Excellent problem-solving skills and attention to detail Good verbal and written communication skills in English at a B2 level Nice to have Background in ETL using Python

Posted 1 month ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

JD for a Databricks Data Engineer Key Responsibilities: Design, develop, and maintain high-performance data pipelines using Databricks and Apache Spark. Implement medallion architecture (Bronze, Silver, Gold layers) for efficient data processing. Optimize Delta Lake tables, partitioning, Z-ordering, and performance tuning in Databricks. Develop ETL/ELT processes using PySpark, SQL, and Databricks Workflows. Manage Databricks clusters, jobs, and notebooks for batch and real-time data processing. Work with Azure Data Lake, AWS S3, or GCP Cloud Storage for data ingestion and storage. Implement CI/CD pipelines for Databricks jobs and notebooks using DevOps tools. Monitor and troubleshoot performance bottlenecks, cluster optimization, and cost management. Ensure data quality, governance, and security using Unity Catalog, ACLs, and encryption. Collaborate with Data Scientists, Analysts, and Business Teams to deliver insights. Required Skills & Experience: 5+ years of hands-on experience in Databricks, Apache Spark, and Delta Lake. Strong SQL, PySpark, and Python programming skills. Experience in Azure Data Factory (ADF), AWS Glue, or GCP Dataflow. Expertise in performance tuning, indexing, caching, and parallel processing. Hands-on experience with Lakehouse architecture and Databricks SQL. Strong understanding of data governance, lineage, and cataloging (e.g., Unity Catalog). Experience with CI/CD pipelines (Azure DevOps, GitHub Actions, or Jenkins). Familiarity with Airflow, Databricks Workflows, or orchestration tools. Strong problem-solving skills with experience in troubleshooting Spark jobs. Nice to Have: Hands-on experience with Kafka, Event Hubs, or real-time streaming in Databricks. Certifications in Databricks, Azure, AWS, or GCP.
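
As a purely illustrative aside on the medallion architecture this posting describes, the sketch below shows one Bronze-to-Silver step on Databricks with PySpark and Delta Lake. It assumes a Spark session with Delta Lake available (on Databricks the session is already provided); the table and column names are hypothetical.

```python
# Minimal sketch only: one Bronze-to-Silver step of a medallion pipeline.
# Assumes a Spark session with Delta Lake available; on Databricks, `spark`
# already exists. Table and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Read raw ingested events from the Bronze layer.
bronze_df = spark.read.table("bronze.vehicle_events")

# Cleanse and conform: deduplicate, enforce types, drop bad records,
# and derive a partition column.
silver_df = (
    bronze_df
    .dropDuplicates(["event_id"])
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .withColumn("event_date", F.to_date("event_ts"))
    .filter(F.col("vehicle_id").isNotNull())
)

# Write the conformed data to the Silver layer as a partitioned Delta table.
(
    silver_df.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("event_date")
    .saveAsTable("silver.vehicle_events")
)
```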

Posted 1 month ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Join Us About VOIS VOIS (Vodafone Intelligent Solutions) is a strategic arm of Vodafone Group Plc, creating value and enhancing quality and efficiency across 28 countries, and operating from 7 locations: Albania, Egypt, Hungary, India, Romania, Spain and the UK. Over 29,000 highly skilled individuals are dedicated to being Vodafone Group’s partner of choice for talent, technology, and transformation. We deliver the best services across IT, Business Intelligence Services, Customer Operations, Business Operations, HR, Finance, Supply Chain, HR Operations, and many more. Established in 2006, VOIS has evolved into a global, multi-functional organisation, a Centre of Excellence for Intelligent Solutions focused on adding value and delivering business outcomes for Vodafone. What You’ll Do We are seeking a highly experienced Solutions Architect to lead the design and implementation of end-to-end solutions across Employee Data, HR Dashboards, and People Analytics. This individual will play a key role in integrating data from SAP SuccessFactors into our Google Cloud-based data lake, designing ML-driven analytics models, and enabling data-driven decisions through Qlik dashboards, with a roadmap to transition into SAP Analytics Cloud (SAC). The ideal candidate will have deep expertise in both the HR domain and enterprise technology architecture, with the ability to connect business needs with scalable, efficient, and secure data and analytics solutions. Who You Are Key accountabilities and decision ownership: Strong expertise in SAP SuccessFactors (Employee Central, Talent modules, Workforce Analytics). Deep understanding of people data models, HR metrics, and employee lifecycle data. Proven experience with Google Cloud Platform, including BigQuery, Vertex AI, Dataflow, etc. Hands-on experience in designing and deploying machine learning models for HR use cases. Experience with Qlik for dashboarding and SAC (SAP Analytics Cloud) is preferred. Lead end-to-end solution design for employee data and people analytics use cases, ensuring alignment with business and IT strategy. Architect data flows from SAP SuccessFactors into the Google Cloud Platform for advanced analytics. Define and implement ML/AI-based models to derive actionable insights on workforce trends, talent, and engagement. Deliver analytics and dashboard solutions on time, with thought leadership on end-to-end systems architecture. Not a perfect fit? Worried that you don’t meet all the desired criteria exactly? At Vodafone we are passionate about empowering people and creating a workplace where everyone can thrive, whatever their personal or professional background. If you’re excited about this role but your experience doesn’t align exactly with every part of the job description, we encourage you to still apply as you may be the right candidate for this role or another opportunity. What's In It For You VOIS Equal Opportunity Employer Commitment India VOIS is proud to be an Equal Employment Opportunity Employer. We celebrate differences and we welcome and value diverse people and insights. We believe that being authentically human and inclusive powers our employees’ growth and enables them to create a positive impact on themselves and society.
We do not discriminate based on age, colour, gender (including pregnancy, childbirth, or related medical conditions), gender identity, gender expression, national origin, race, religion, sexual orientation, status as an individual with a disability, or other applicable legally protected characteristics. As a result of living and breathing our commitment, our employees have helped us get certified as a Great Place to Work in India for four years running. We have also been highlighted among the Top 10 Best Workplaces for Millennials, Equity, and Inclusion, Top 50 Best Workplaces for Women, Top 25 Best Workplaces in IT & IT-BPM and 10th Overall Best Workplaces in India by the Great Place to Work Institute in 2024. These achievements position us among a select group of trustworthy and high-performing companies which put their employees at the heart of everything they do. By joining us, you are part of our commitment. We look forward to welcoming you into our family which represents a variety of cultures, backgrounds, perspectives, and skills! Who We Are We are a leading international Telco, serving millions of customers. At Vodafone, we believe that connectivity is a force for good. If we use it for the things that really matter, it can improve people's lives and the world around us. Through our technology we empower people, connecting everyone regardless of who they are or where they live, and we protect the planet whilst helping our customers do the same. Belonging at Vodafone isn't a concept; it's lived, breathed, and cultivated through everything we do. You'll be part of a global and diverse community, with many different minds, abilities, backgrounds and cultures. We're committed to increasing diversity, ensuring equal representation, and making Vodafone a place where everyone feels safe, valued and included. If you require any reasonable adjustments or have an accessibility request as part of your recruitment journey, for example, extended time or breaks in between online assessments, please refer to https://careers.vodafone.com/application-adjustments/ for guidance. Together we can.

Posted 1 month ago

Apply

4.0 - 6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

_VOIS Intro About _VOIS: _VOIS (Vodafone Intelligent Solutions) is a strategic arm of Vodafone Group Plc, creating value and enhancing quality and efficiency across 28 countries, and operating from 7 locations: Albania, Egypt, Hungary, India, Romania, Spain and the UK. Over 29,000 highly skilled individuals are dedicated to being Vodafone Group’s partner of choice for talent, technology, and transformation. We deliver the best services across IT, Business Intelligence Services, Customer Operations, Business Operations, HR, Finance, Supply Chain, HR Operations, and many more. Established in 2006, _VOIS has evolved into a global, multi-functional organisation, a Centre of Excellence for Intelligent Solutions focused on adding value and delivering business outcomes for Vodafone. _VOIS Centre Intro About _VOIS India: In 2009, _VOIS started operating in India and now has established global delivery centres in Pune, Bangalore and Ahmedabad. With more than 14,500 employees, _VOIS India supports global markets and group functions of Vodafone, and delivers best-in-class customer experience through multi-functional services in the areas of Information Technology, Networks, Business Intelligence and Analytics, Digital Business Solutions (Robotics & AI), Commercial Operations (Consumer & Business), Intelligent Operations, Finance Operations, Supply Chain Operations and HR Operations and more. Job Role Related Content (Role specific) Design, develop, and maintain scalable data pipelines and ETL processes using GCP services such as BigQuery, Cloud Data Fusion, Dataflow, Pub/Sub, Cloud Storage, Composer, Cloud Functions, Cloud Run. Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver high-quality data solutions. Implement data integration solutions to ingest, process, and store large volumes of structured and unstructured data from various sources. Optimize and tune data pipelines for performance, reliability, and cost-efficiency. Ensure data quality and integrity through data validation, cleansing, and transformation processes. Develop and maintain data models, schemas, and metadata to support data analytics and reporting. Monitor and troubleshoot data pipeline issues, ensuring timely resolution and minimal disruption to data workflows. Stay up-to-date with the latest GCP technologies and best practices, and provide recommendations for continuous improvement. Mentor and guide junior data engineers, fostering a culture of knowledge sharing and collaboration. 4 to 6 years of experience in data engineering, with a strong focus on GCP. Proficiency in GCP services such as BigQuery, Cloud Data Fusion, Dataflow, Pub/Sub, Cloud Storage, Composer, Cloud Functions, Cloud Run. Strong programming skills in Python and PL/SQL. Experience with SQL and NoSQL databases. Knowledge of data warehousing concepts and best practices. Familiarity with data integration tools and frameworks. Excellent problem-solving and analytical skills. _VOIS Equal Opportunity Employer Commitment India _VOIS is proud to be an Equal Employment Opportunity Employer. We celebrate differences and we welcome and value diverse people and insights. We believe that being authentically human and inclusive powers our employees’ growth and enables them to create a positive impact on themselves and society.
We do not discriminate based on age, colour, gender (including pregnancy, childbirth, or related medical conditions), gender identity, gender expression, national origin, race, religion, sexual orientation, status as an individual with a disability, or other applicable legally protected characteristics. As a result of living and breathing our commitment, our employees have helped us get certified as a Great Place to Work in India for four years running. We have also been highlighted among the Top 5 Best Workplaces for Diversity, Equity, and Inclusion, Top 10 Best Workplaces for Women, Top 25 Best Workplaces in IT & IT-BPM and 14th Overall Best Workplaces in India by the Great Place to Work Institute in 2023. These achievements position us among a select group of trustworthy and high-performing companies which put their employees at the heart of everything they do. By joining us, you are part of our commitment. We look forward to welcoming you into our family which represents a variety of cultures, backgrounds, perspectives, and skills! Apply now, and we’ll be in touch!

Posted 1 month ago

Apply

10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description This role is critical in bringing all customer-related data sources into the Data Platform and delivering transformed customer knowledge and insights to the enterprise through real-time interfaces and data solutions. The ideal candidate will have deep expertise in software engineering, data engineering, cloud platforms, and product-centric thinking, coupled with strong leadership and execution capabilities. The candidate should also have experience in, and an understanding of, customer data management and its activation for business use cases. Responsibilities Strategic Leadership: Define and drive the roadmap for Ford’s Customer 360 Data Platform, aligning with business and technology goals. Design and oversee data ingestion strategies for both batch and streaming data from various sources, including Oracle, MySQL, PostgreSQL, DB2, Mainframe, Kafka, file systems and third-party vendor sources. Ensure seamless integration of structured and unstructured data into the enterprise data platform. Lead efficient and scalable ingestion operations, guiding a group of Data Stewards and Product Owners in close collaboration with EDP ingestion teams. Demand intake, review, and prioritization based on business drivers, along with running efficient operations, are key deliverables. Web Services Development & Adoption: Oversee the design, development, and deployment of scalable web services, ensuring broad enterprise adoption. Data Governance & Compliance: Work with the Enterprise Data Platform team to implement best-in-class data governance, security, and privacy frameworks to ensure compliance with industry regulations. AI/ML Enablement: Lead efforts to integrate AI-driven capabilities such as SQL-to-code conversion, LLM-powered debugging, and self-service ingestion frameworks. Engineering & Operational Excellence: Establish best practices for data engineering, DevSecOps, reliability, and observability. Talent & Organization Building: Attract, develop, and mentor top talent in data engineering and platform development, fostering a high-performance team culture. Cross-functional Collaboration: Partner with business units, product teams, and other technology leaders to ensure seamless data accessibility and usability across Ford. Data Ingestion and Integration into Enterprise Data Platform (EDP): Qualifications Bachelor’s degree in Computer Science, Engineering, Data Science, or a related field; an advanced degree is a plus. 10+ years of experience in data engineering, cloud platforms, or enterprise-scale data management, with at least 3 years in a leadership role. Proven track record of delivering large-scale data platforms, metadata catalogs, operations, services, and real-time and streaming architectures. Expertise in GCP (Cloud Run, Apigee, PostgreSQL, Dataflow, Pub/Sub) and other modern data stack technologies. Experience in implementing data governance, classification, and security policies at an enterprise level. Exceptional leadership skills with a demonstrated ability to drive alignment, influence stakeholders, and lead diverse global teams. Excellent communication and executive presence, capable of engaging with senior leadership and business partners.
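
As a purely illustrative aside on the streaming ingestion this role describes (Pub/Sub feeding the enterprise data platform), the sketch below publishes a customer-change event with the google-cloud-pubsub client. The project, topic, attribute, and payload fields are hypothetical.

```python
# Minimal sketch only: publishing a change event to Pub/Sub for streaming
# ingestion. Assumes the google-cloud-pubsub client library; project, topic,
# attribute, and payload fields are hypothetical.
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "customer-profile-updates")

event = {"customer_id": "12345", "field": "email", "source": "crm"}

# Pub/Sub payloads are bytes; string attributes can carry routing metadata.
future = publisher.publish(
    topic_path,
    json.dumps(event).encode("utf-8"),
    source_system="crm",
)
print(f"Published message id: {future.result()}")
```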

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
