Home
Jobs

607 Dataflow Jobs - Page 3

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

7.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology. What You’ll Do Demonstrate a deep understanding of cloud native, distributed micro service based architectures Deliver solutions for complex business problems through software standard SDLC Build strong relationships with both internal and external stakeholders including product, business and sales partners Demonstrate excellent communication skills with the ability to both simplify complex problems and also dive deeper if needed Build and manage strong technical teams that deliver complex software solutions that scale Manage teams with cross functional skills that include software, quality, reliability engineers, project managers and scrum masters Provide deep troubleshooting skills with the ability to lead and solve production and customer issues under pressure Leverage strong experience in full stack software development and public cloud like GCP and AWS Mentor, coach and develop junior and senior software, quality and reliability engineers Lead with a data/metrics driven mindset with a maniacal focus towards optimizing and creating efficient solutions Ensure compliance with EFX secure software development guidelines and best practices and responsible for meeting and maintaining QE, DevSec, and FinOps KPIs Define, maintain and report SLA, SLO, SLIs meeting EFX engineering standards in partnership with the product, engineering and architecture teams Collaborate with architects, SRE leads and other technical leadership on strategic technical direction, guidelines, and best practices Drive up-to-date technical documentation including support, end user documentation and run books Lead Sprint planning, Sprint Retrospectives, and other team activity Responsible for implementation architecture decision making associated with Product features/stories, refactoring work, and EOSL decisions Create and deliver technical presentations to internal and external technical and non-technical stakeholders communicating with clarity and precision, and present complex information in a concise format that is audience appropriate What Experience You Need Bachelor's degree or equivalent experience 7+ years of software engineering experience 7+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS 7+ years experience with Cloud technology: GCP, AWS, or Azure 7+ years experience designing and developing cloud-native solutions 7+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes 7+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs What could set you apart Self-starter that identifies/responds to priority shifts with minimal supervision. 
- Strong communication and presentation skills
- Strong leadership qualities
- Demonstrated problem-solving skills and the ability to resolve conflicts
- Experience creating and maintaining product and software roadmaps
- Experience overseeing yearly as well as product/project budgets
- Working in a highly regulated environment
- Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others
- UI development (e.g. HTML, JavaScript, Angular and Bootstrap)
- Experience with backend technologies such as Java/J2EE, SpringBoot, SOA and microservices
- Source code control management systems (e.g. SVN/Git, GitHub) and build tools like Maven and Gradle
- Agile environments (e.g. Scrum, XP)
- Relational databases (e.g. SQL Server, MySQL)
- Atlassian tooling (e.g. JIRA, Confluence, and GitHub)
- Developing with a modern JDK (v1.7+)
- Automated testing: JUnit, Selenium, LoadRunner, SoapUI
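The listing highlights big data processing with Dataflow/Apache Beam, BigQuery, PubSub and GCS. As an illustrative sketch only (the role itself is Java-centric, and every project, bucket, dataset and field name below is a placeholder rather than a detail from the posting), a minimal batch Beam pipeline submitted to Dataflow might look like this:

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Placeholder project, region, and bucket names.
    options = PipelineOptions(
        runner="DataflowRunner",
        project="my-gcp-project",
        region="us-central1",
        temp_location="gs://my-bucket/tmp",
    )
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadJsonLines" >> beam.io.ReadFromText("gs://my-bucket/input/events-*.json")
            | "Parse" >> beam.Map(json.loads)
            | "KeyByUser" >> beam.Map(lambda event: (event["user_id"], 1))
            | "CountPerUser" >> beam.CombinePerKey(sum)
            | "ToRow" >> beam.Map(lambda kv: {"user_id": kv[0], "event_count": kv[1]})
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-gcp-project:analytics.user_event_counts",
                schema="user_id:STRING,event_count:INTEGER",
                write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE,
            )
        )


if __name__ == "__main__":
    run()
```

A Java pipeline would follow the same read/transform/write shape using PCollection transforms and BigQueryIO.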

Posted 4 days ago

Apply

2.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

What you’ll do: As a Senior Developer at Equifax, you will oversee and steer the delivery of innovative batch and data products, primarily leveraging Java and the Google Cloud Platform (GCP). Your expertise will ensure the efficient and timely deployment of high-quality data solutions that support our business objectives and client needs.

Project Development: Development and deployment of batch and data products, ensuring alignment with business goals and technical requirements.
Technical Oversight: Provide technical direction and oversee the implementation of solutions using Java and GCP, ensuring best practices in coding, performance, and security.
Team Management: Mentor and guide junior developers and engineers, fostering a collaborative and high-performance environment.
Stakeholder Collaboration: Work closely with cross-functional teams including product managers, business analysts, and other stakeholders to gather requirements and translate them into technical solutions.
Documentation and Compliance: Maintain comprehensive documentation for all technical processes, ensuring compliance with internal and external standards.
Continuous Improvement: Advocate for and implement continuous improvement practices within the team, staying abreast of emerging technologies and methodologies.

What experience you need:
Education: Bachelor's degree in Computer Science, Information Technology, or a related field.
Experience: Minimum of 2-5 years of relevant experience in software development, with a focus on batch processing and data solutions.
Technical Skills: Proficiency in Java, with a strong understanding of its ecosystems. 2+ years of relevant Java, Spring, Spring Boot, REST, Microservices, Hibernate, JPA and RDBMS experience. Minimum 2 years with Git, CI/CD pipelines, and Jenkins. Experience with Google Cloud Platform, including BigQuery, Cloud Storage, Dataflow, Bigtable and other GCP tools. Familiarity with ETL processes, data modeling, and SQL.
Soft Skills: Strong problem-solving abilities and a proactive approach to project management. Effective communication and interpersonal skills, with the ability to convey technical concepts to non-technical stakeholders.

What could set you apart: Certification in Google Cloud (e.g., Associate Cloud Engineer). Experience with other cloud platforms (AWS, Azure) is a plus. Understanding of data privacy regulations and compliance frameworks.
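The role centres on batch data products over BigQuery in GCP. Purely as a hedged sketch (the posting's own stack is Java/Spring Boot, and the dataset, table and column names below are invented), a parameterised BigQuery query from Python would look roughly like this:

```python
from google.cloud import bigquery

# Placeholder project, dataset, table, and column names.
client = bigquery.Client(project="my-gcp-project")

query = """
    SELECT account_id, SUM(amount) AS total_amount
    FROM `my-gcp-project.finance.daily_transactions`
    WHERE load_date = @load_date
    GROUP BY account_id
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[bigquery.ScalarQueryParameter("load_date", "DATE", "2025-06-01")]
)

# Run the batch query and iterate over the result rows.
for row in client.query(query, job_config=job_config).result():
    print(row.account_id, row.total_amount)
```

The Java client exposes the same job and parameter model through QueryJobConfiguration.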

Posted 4 days ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

🚀 We’re Hiring: Senior Software Engineer – GCP | Python | Angular

We’re looking for a highly skilled and passionate Software Engineer to join our fast-paced, product-focused engineering team. In this role, you’ll be involved in end-to-end development, from design and implementation to testing, deployment, and support. If you thrive in a modern cloud-native, CI/CD-driven development environment and enjoy working on impactful features across the stack, we’d love to hear from you!

📍 Location: Chennai
💼 Join a cutting-edge digital team powering innovation for a globally renowned automotive and mobility leader.

🔧 Key Skills Required:
Languages & Frameworks: Python, Java, JavaScript (Node.js), Angular, RESTful APIs
Cloud & DevOps: Google Cloud Platform (GCP), Cloud Run, BigQuery, Git, Jenkins, CI/CD
Data & Infrastructure: Dataflow, Terraform, Airflow
Testing & Best Practices: Jest, Mocha, TDD, Clean Code, Design Patterns

👤 Experience & Qualifications:
5+ years of professional software development experience
Bachelor’s degree in Computer Science, Engineering, or a related field
Experience building scalable full-stack solutions in a cloud environment
Strong understanding of Agile, CI/CD, and DevOps practices

✨ Why Join Us?
Work on cutting-edge tech including LLM integrations
Be part of a team that values quality, ownership, and innovation
Collaborate across product, engineering, and DevOps in a cloud-native setup

📩 Interested? Drop your profile or DM for a quick conversation.
📌 Immediate to 30 days joiners preferred

#Hiring #SoftwareEngineer #FullStackDeveloper #Python #Angular #GCP #CloudRun #BigQuery #DevOps #LLM #CI/CD #ImmediateJoiners #ChennaiJobs #GoogleCloud
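The stack lists Cloud Run for deploying services. As a hedged illustration only (the route and payload are invented), a minimal Python service of the shape Cloud Run expects, listening on the PORT environment variable, could look like this:

```python
import os

from flask import Flask, jsonify

app = Flask(__name__)


@app.route("/health")
def health():
    # A trivial readiness endpoint; real handlers would sit alongside it.
    return jsonify(status="ok")


if __name__ == "__main__":
    # Cloud Run injects the serving port through the PORT environment variable.
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))
```

In production the container would typically run behind a WSGI server such as gunicorn rather than the Flask development server.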

Posted 4 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

We are seeking a skilled Database Migration Specialist with deep expertise in mainframe modernization and data migration to cloud platforms such as AWS, Azure, or GCP. The ideal candidate will have hands-on experience migrating legacy systems (COBOL, DB2, IMS, VSAM, etc.) to modern cloud-native databases like PostgreSQL, Oracle, or NoSQL.

What will your job look like?
Lead and execute end-to-end mainframe-to-cloud database migration projects.
Analyze legacy systems (z/OS, Unisys) and design modern data architectures.
Extract, transform, and load (ETL) complex datasets, ensuring data integrity and taxonomy alignment.
Collaborate with cloud architects and application teams to ensure seamless integration.
Optimize performance and scalability of migrated databases.
Document migration processes, tools, and best practices.

Required Skills & Experience
5+ years in mainframe systems (COBOL, CICS, DB2, IMS, JCL, VSAM, Datacom).
Proven experience in cloud migration (AWS DMS, Azure Data Factory, GCP Dataflow, etc.).
Strong knowledge of ETL tools, data modeling, and schema conversion.
Experience with PostgreSQL, Oracle, or other cloud-native databases.
Familiarity with data governance, security, and compliance in cloud environments.
Excellent problem-solving and communication skills.
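To make the extract/transform step concrete, here is a small, hypothetical sketch of turning a fixed-width legacy extract into CSV ready for loading into PostgreSQL. The record layout, field offsets and cents-encoded balance are invented stand-ins for a real copybook, and it assumes the extract has already been unloaded as line-delimited EBCDIC text:

```python
import csv
from datetime import datetime

# Hypothetical fixed-width layout for a legacy customer extract; real offsets
# would come from the source copybook documentation.
FIELDS = [("customer_id", 0, 10), ("name", 10, 40), ("balance", 40, 52), ("opened", 52, 60)]


def parse_record(line: str) -> dict:
    row = {name: line[start:end].strip() for name, start, end in FIELDS}
    row["balance"] = int(row["balance"]) / 100  # balance stored as integer cents in this example
    row["opened"] = datetime.strptime(row["opened"], "%Y%m%d").date().isoformat()
    return row


def convert(extract_path: str, csv_path: str) -> None:
    # cp500 is one common EBCDIC code page; the right one depends on the source system.
    with open(extract_path, encoding="cp500") as src, open(csv_path, "w", newline="") as dst:
        writer = csv.DictWriter(dst, fieldnames=[f[0] for f in FIELDS])
        writer.writeheader()
        for line in src:
            writer.writerow(parse_record(line))
```

The resulting CSV could then be bulk-loaded with PostgreSQL's COPY command or an equivalent managed migration service.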

Posted 4 days ago

Apply

1.0 years

0 Lacs

India

On-site

Job description:
1. Provide counselling and guidance to students and help them choose the most appropriate courses.
2. Maintain student records and admission details.
3. Handle student queries before and after the admission process.
4. Handle dataflow and exam registration for overseas medical licensure examinations.
5. Manage the daily activity report.
6. Maintain master data of customers and clients.
7. Ensure the smooth flow of business activities.

Qualifications:
1. Freshers can apply.
2. Excellent communication skills (English & Malayalam).
3. Proficient with Microsoft Office, Excel.
4. Customer handling skills.
5. Excellent record-keeping skills.
6. Ability to multitask and prioritize daily workload.

Please share your updated resume with a photo.

Job Type: Full-time
Pay: From ₹12,000.00 per month
Schedule: Day shift
Education: Bachelor's (Required)
Experience: Admin: 1 year (Required)
Work Location: In person

Posted 4 days ago

Apply

7.0 - 9.0 years

0 Lacs

New Delhi, Delhi, India

On-site

The purpose of this role is to understand, model and facilitate change in a significant area of the business and technology portfolio, either by line of business, geography or specific architecture domain, whilst building the overall Architecture capability and knowledge base of the company.

Job Description:

Role Overview: We are seeking a highly skilled and motivated Cloud Data Engineering Manager to join our team. The role is critical to the development of a cutting-edge reporting platform designed to measure and optimize online marketing campaigns. The GCP Data Engineering Manager will design, implement, and maintain scalable, reliable, and efficient data solutions on Google Cloud Platform (GCP). The role focuses on enabling data-driven decision-making by developing ETL/ELT pipelines, managing large-scale datasets, and optimizing data workflows. The ideal candidate is a proactive problem-solver with strong technical expertise in GCP, a passion for data engineering, and a commitment to delivering high-quality solutions aligned with business needs.

Key Responsibilities:

Data Engineering & Development: Design, build, and maintain scalable ETL/ELT pipelines for ingesting, processing, and transforming structured and unstructured data. Implement enterprise-level data solutions using GCP services such as BigQuery, Dataform, Cloud Storage, Dataflow, Cloud Functions, Cloud Pub/Sub, and Cloud Composer. Develop and optimize data architectures that support real-time and batch data processing. Build, optimize, and maintain CI/CD pipelines using tools like Jenkins, GitLab, or Google Cloud Build. Automate testing, integration, and deployment processes to ensure fast and reliable software delivery.

Cloud Infrastructure Management: Manage and deploy GCP infrastructure components to enable seamless data workflows. Ensure data solutions are robust, scalable, and cost-effective, leveraging GCP best practices.

Infrastructure Automation and Management: Design, deploy, and maintain scalable and secure infrastructure on GCP. Implement Infrastructure as Code (IaC) using tools like Terraform. Manage Kubernetes clusters (GKE) for containerized workloads.

Collaboration and Stakeholder Engagement: Work closely with cross-functional teams, including data analysts, data scientists, DevOps, and business stakeholders, to deliver data projects aligned with business goals. Translate business requirements into scalable technical solutions while collaborating with team members to ensure successful implementation.

Quality Assurance & Optimization: Implement best practices for data governance, security, and privacy, ensuring compliance with organizational policies and regulations. Conduct thorough quality assurance, including testing and validation, to ensure the accuracy and reliability of data pipelines. Monitor and optimize pipeline performance to meet SLAs and minimize operational costs.

Qualifications and Certifications:
Education: Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or a related field.
Experience: Minimum of 7 to 9 years of experience in data engineering, with at least 4 years working on GCP cloud platforms. Proven experience designing and implementing data workflows using GCP services like BigQuery, Dataform, Cloud Dataflow, Cloud Pub/Sub, and Cloud Composer.
Certifications: Google Cloud Professional Data Engineer certification preferred.

Key Skills:

Mandatory Skills: Advanced proficiency in Python for data pipelines and automation. Strong SQL skills for querying, transforming, and analyzing large datasets. Strong hands-on experience with GCP services, including Cloud Storage, Dataflow, Cloud Pub/Sub, Cloud SQL, BigQuery, Dataform, Compute Engine and Kubernetes Engine (GKE). Hands-on experience with CI/CD tools such as Jenkins, GitHub or Bitbucket. Proficiency in Docker, Kubernetes, and Terraform or Ansible for containerization, orchestration, and infrastructure as code (IaC). Familiarity with workflow orchestration tools like Apache Airflow or Cloud Composer. Strong understanding of Agile/Scrum methodologies.

Nice-to-Have Skills: Experience with other cloud platforms like AWS or Azure. Knowledge of data visualization tools (e.g., Power BI, Looker, Tableau). Understanding of machine learning workflows and their integration with data pipelines.

Soft Skills: Strong problem-solving and critical-thinking abilities. Excellent communication skills to collaborate with technical and non-technical stakeholders. Proactive attitude towards innovation and learning. Ability to work independently and as part of a collaborative team.

Location: Bengaluru
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
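Since Cloud Composer/Airflow orchestrates these ETL/ELT pipelines, a minimal, hypothetical Airflow 2 DAG of that shape follows; the bucket, dataset, table and stored-procedure names are placeholders, not details from the posting:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

# All resource names below are placeholders.
with DAG(
    dag_id="daily_campaign_load",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_events",
        bucket="my-bucket",
        source_objects=["campaigns/{{ ds }}/*.json"],
        destination_project_dataset_table="my-gcp-project.marketing.raw_events",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_TRUNCATE",
    )
    build_report = BigQueryInsertJobOperator(
        task_id="build_daily_report",
        configuration={
            "query": {
                "query": "CALL `my-gcp-project.marketing.build_daily_report`('{{ ds }}')",
                "useLegacySql": False,
            }
        },
    )
    # Load the raw partition first, then refresh the reporting layer.
    load_raw >> build_report
```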

Posted 4 days ago

Apply

7.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you. Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology. What You’ll Do Demonstrate a deep understanding of cloud native, distributed micro service based architectures Deliver solutions for complex business problems through software standard SDLC Build strong relationships with both internal and external stakeholders including product, business and sales partners Demonstrate excellent communication skills with the ability to both simplify complex problems and also dive deeper if needed Build and manage strong technical teams that deliver complex software solutions that scale Manage teams with cross functional skills that include software, quality, reliability engineers, project managers and scrum masters Provide deep troubleshooting skills with the ability to lead and solve production and customer issues under pressure Leverage strong experience in full stack software development and public cloud like GCP and AWS Mentor, coach and develop junior and senior software, quality and reliability engineers Lead with a data/metrics driven mindset with a maniacal focus towards optimizing and creating efficient solutions Ensure compliance with EFX secure software development guidelines and best practices and responsible for meeting and maintaining QE, DevSec, and FinOps KPIs Define, maintain and report SLA, SLO, SLIs meeting EFX engineering standards in partnership with the product, engineering and architecture teams Collaborate with architects, SRE leads and other technical leadership on strategic technical direction, guidelines, and best practices Drive up-to-date technical documentation including support, end user documentation and run books Lead Sprint planning, Sprint Retrospectives, and other team activity Responsible for implementation architecture decision making associated with Product features/stories, refactoring work, and EOSL decisions Create and deliver technical presentations to internal and external technical and non-technical stakeholders communicating with clarity and precision, and present complex information in a concise format that is audience appropriate What Experience You Need Bachelor's degree or equivalent experience 7+ years of software engineering experience 7+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS 7+ years experience with Cloud technology: GCP, AWS, or Azure 7+ years experience designing and developing cloud-native solutions 7+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes 7+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs What could set you apart Self-starter that identifies/responds to priority shifts with minimal supervision. 
- Strong communication and presentation skills
- Strong leadership qualities
- Demonstrated problem-solving skills and the ability to resolve conflicts
- Experience creating and maintaining product and software roadmaps
- Experience overseeing yearly as well as product/project budgets
- Working in a highly regulated environment
- Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others
- UI development (e.g. HTML, JavaScript, Angular and Bootstrap)
- Experience with backend technologies such as Java/J2EE, SpringBoot, SOA and microservices
- Source code control management systems (e.g. SVN/Git, GitHub) and build tools like Maven and Gradle
- Agile environments (e.g. Scrum, XP)
- Relational databases (e.g. SQL Server, MySQL)
- Atlassian tooling (e.g. JIRA, Confluence, and GitHub)
- Developing with a modern JDK (v1.7+)
- Automated testing: JUnit, Selenium, LoadRunner, SoapUI

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Who is Equifax? At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best.

Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.

Posted 4 days ago

Apply

7.0 years

0 Lacs

India

On-site

Job Summary/Overview: We are seeking a highly experienced and skilled Senior GCP Data Engineer to design, develop, and maintain data pipelines and data warehousing solutions on the Google Cloud Platform (GCP). This role requires a strong understanding of data engineering principles and a proven track record of success in building and managing large-scale data solutions. The ideal candidate will be proficient in various GCP services and have experience working with large datasets.

Key Responsibilities:
* Design, develop, and implement robust and scalable data pipelines using GCP services.
* Develop and maintain data warehousing solutions on GCP.
* Perform data modeling, ETL processes, and data quality assurance.
* Optimize data pipeline performance and efficiency.
* Collaborate with other engineers and stakeholders to define data requirements and solutions.
* Troubleshoot and resolve data-related issues.
* Contribute to the development and improvement of data engineering best practices.
* Participate in code reviews and ensure code quality.
* Document technical designs and processes.

Required Qualifications:
* Bachelor's degree in Computer Science, Engineering, or a related field.
* 7+ years of experience as a Data Engineer.
* Extensive experience with GCP services, including BigQuery, Dataflow, Dataproc, Cloud Storage, and Cloud Pub/Sub.
* Proven experience designing and implementing data pipelines using ETL/ELT processes.
* Experience with data warehousing concepts and best practices.
* Strong SQL and data modeling skills.
* Experience working with large datasets.

Preferred Qualifications:
* Master's degree in Computer Science, Engineering, or a related field.
* Experience with data visualization tools.
* Experience with data governance and compliance.
* Experience with containerization technologies (e.g., Docker, Kubernetes).
* Experience with Apache Kafka or similar message queuing systems.
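The listing pairs Cloud Pub/Sub with Dataflow and BigQuery for pipeline ingestion. As a hedged sketch only (project, topic and payload fields are invented), publishing a JSON event to a Pub/Sub topic from Python looks roughly like this:

```python
import json

from google.cloud import pubsub_v1

# Placeholder project and topic names.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-gcp-project", "ingest-events")


def publish_event(event: dict) -> str:
    # Pub/Sub messages are raw bytes; attributes carry lightweight metadata.
    data = json.dumps(event).encode("utf-8")
    future = publisher.publish(topic_path, data, source="orders-service")
    return future.result()  # message ID once the broker has acknowledged it


if __name__ == "__main__":
    print(publish_event({"order_id": "A-1001", "amount": 249.0}))
```

A downstream Dataflow job or BigQuery subscription would then consume the topic.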

Posted 4 days ago

Apply

8.0 years

0 Lacs

India

Remote

🚀 We’re Hiring: AEP Data Engineer
📍 Location: Remote | 💼 Experience: 6–8 Years | 🕒 Contract: 6 Months (Extendable)

Prior experience with Adobe Experience Platform (AEP) is a plus! We’re looking for an experienced Data Engineer with strong GCP (Google Cloud Platform) skills and a background in ETL development and data warehouse migration.

🔧 What You’ll Do:
Design and build scalable ETL pipelines and data workflows in GCP
Migrate on-premise data warehouses to BigQuery and other GCP tools
Collaborate with architects, data scientists, and stakeholders to deliver reliable data solutions
Optimize performance, maintain data quality, and ensure smooth operations
Participate in code reviews, CI/CD workflows, and Agile ceremonies

🎯 What You Bring:
6–8 years in Data Engineering
3–4 years of hands-on experience in GCP tools: BigQuery, Dataflow, Cloud Composer, Pub/Sub
Strong in SQL and Python (or a similar language)
Solid experience with ETL frameworks and data migration strategies
Proficiency in version control (Git) and working in remote agile teams
Excellent communication and ability to work independently
AEP knowledge is a big plus

📩 Apply now or share your resume at Recruiter@systembender.com
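For the warehouse-migration work the posting describes, a hedged, minimal example of loading exported Parquet files from Cloud Storage into BigQuery with the Python client follows; the bucket, dataset and table names are placeholders:

```python
from google.cloud import bigquery

# Placeholder project, bucket, dataset, and table names.
client = bigquery.Client(project="my-gcp-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)
load_job = client.load_table_from_uri(
    "gs://legacy-dw-export/customers/*.parquet",
    "my-gcp-project.warehouse.customers",
    job_config=job_config,
)
load_job.result()  # block until the load job finishes

table = client.get_table("my-gcp-project.warehouse.customers")
print(f"Loaded {table.num_rows} rows")
```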

Posted 4 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Summary
Skill Name: Power BI with GCP developer
Experience: 7–10 years
Mandatory Skills: Power BI + GCP (BigQuery)

Required Skills & Qualifications:
Power BI Expertise: Strong hands-on experience in Power BI development, including report/dashboard creation, DAX, Power Query, and custom visualizations.
Semantic Model Knowledge: Proficiency in building and managing semantic models within Power BI to ensure consistency and user-friendly data exploration.
GCP Tools: Practical experience with Google Cloud Platform tools, particularly BigQuery, Dataflow, and Cloud Storage, for managing large datasets and data integration.
ETL Processes: Experience in designing and managing ETL (Extract, Transform, Load) processes using GCP services.
SQL & Data Modeling: Solid skills in SQL and data modeling, particularly for BI solutions and creating relationships between different data sources.
Cloud Data Integration: Familiarity with integrating cloud-based data sources into Power BI, including knowledge of best practices for handling cloud storage and data pipelines.
Data Analysis & Troubleshooting: Strong problem-solving abilities, including diagnosing and resolving issues in data models, reports, or data integration pipelines.
Communication & Collaboration: Excellent communication skills to work effectively with cross-functional teams.
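Since the role pairs Power BI semantic models with BigQuery, one common pattern is to pre-aggregate in BigQuery so the Power BI model stays small. A hedged sketch using the Python client follows; the table and column names are invented:

```python
from google.cloud import bigquery

# Placeholder project, dataset, table, and column names.
client = bigquery.Client(project="my-gcp-project")

# Materialize a compact aggregate that a Power BI semantic model can import
# or query directly, instead of scanning the full fact table on every refresh.
client.query(
    """
    CREATE OR REPLACE TABLE `my-gcp-project.reporting.sales_by_region` AS
    SELECT region,
           DATE_TRUNC(order_date, MONTH) AS month,
           SUM(net_amount) AS revenue
    FROM `my-gcp-project.warehouse.orders`
    GROUP BY region, month
    """
).result()
```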

Posted 4 days ago

Apply

6.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Role Overview As a Principal Software Engineer, you will be a key contributor to the design, development, and deployment of advanced AI and generative AI-based products. You will drive technical innovation, lead complex projects, and collaborate closely with cross-functional teams to deliver high-quality, scalable, and maintainable solutions. This role requires a strong background in software development, AI/ML techniques, and DevOps practices, along with the ability to mentor junior engineers and contribute to strategic technical decisions. Responsibilities Advanced Software Development: Design, develop, and optimize high-quality code for complex software applications and systems, maintaining high standards of performance, scalability, and maintainability. Drive best practices in code quality, documentation, and test coverage. GenAI Product Development: Lead end-to-end development of generative AI solutions, from data collection and model training to deployment and optimization. Experiment with cutting-edge generative AI techniques to enhance product capabilities and performance. Technical Leadership: Take ownership of architecture and technical decisions for AI/ML projects. Mentor junior engineers, review code for adherence to best practices, and ensure the team follows a high standard of technical excellence. Project Ownership: Lead execution and delivery of features, managing project scope, timelines, and priorities in collaboration with product managers. Proactively identify and mitigate risks to ensure successful, on-time project completion. Architectural Design: Contribute to the architectural design and planning of new features, ensuring solutions are scalable, reliable, and maintainable. Engage in technical reviews with peers and stakeholders, promoting a product suite mindset. Code Review & Best Practices: Conduct rigorous code reviews to ensure adherence to industry best practices in coding standards, maintainability, and performance optimization. Provide feedback that supports team growth and technical improvement. Testing & Quality Assurance: Design and implement robust test suites to ensure code quality and system reliability. Advocate for test automation and the use of CI/CD pipelines to streamline testing processes and maintain service health. Service Health & Reliability: Monitor and maintain the health of deployed services, utilizing telemetry and performance indicators to proactively address potential issues. Perform root cause analysis for incidents and drive preventive measures for improved system reliability. DevOps Ownership: Take end-to-end responsibility for features and services, working in a DevOps model to deploy and manage software in production. Ensure efficient incident response and maintain a high level of service availability. Documentation & Knowledge Sharing: Create and maintain thorough documentation for code, processes, and technical decisions. Contribute to knowledge sharing within the team, enabling continuous learning and improvement. Minimum Qualifications Educational Background: Bachelor’s degree in Computer Science, Engineering, or a related technical field; Master’s degree preferred. Experience: 6+ years of professional software development experience, including significant experience with AI/ML or GenAI applications. Demonstrated expertise in building scalable, production-grade software solutions. Technical Expertise: Advanced proficiency in Python, FastAPI, PyTest, Celery, and other Python frameworks. 
Deep knowledge of software design patterns, object-oriented programming, and concurrency.
Cloud & DevOps Proficiency: Extensive experience with cloud technologies (e.g., GCP, AWS, Azure), containerization (e.g., Docker, Kubernetes), and CI/CD practices. Strong understanding of version control systems (e.g., GitHub) and work tracking tools (e.g., JIRA).
AI/GenAI Knowledge: Familiarity with GenAI frameworks (e.g., LangChain, LangGraph), MLOps, and AI lifecycle management. Experience with model deployment and monitoring in cloud environments.

Preferred Qualifications
AI & Machine Learning: Hands-on experience with advanced ML algorithms, including generative models, NLP, and transformers. Knowledge of industry-standard AI frameworks (e.g., TensorFlow, PyTorch) and experience with data preprocessing and model evaluation.
Data & Analytics Tools: Proficiency with relational and NoSQL databases (e.g., MongoDB, MSSQL, PostgreSQL) and analytics platforms (e.g., BigQuery, Snowflake, Tableau). Experience with messaging systems (e.g., Kafka) is a plus.
Testing & Quality: Experience with test automation tools (e.g., PyTest, xUnit) and CI/CD tooling such as Terraform and GitHub Actions. Strong emphasis on building resilient and testable software.
Advanced Cloud Knowledge: Proficiency with GCP technologies such as Vertex AI, BigQuery, GKE, GCS, and Dataflow, with a focus on deploying AI models at scale.

Where we’re going
UKG is on the cusp of something truly special. Worldwide, we already hold the #1 market share position for workforce management and the #2 position for human capital management. Tens of millions of frontline workers start and end their days with our software, with billions of shifts managed annually through UKG solutions today. Yet it’s our AI-powered product portfolio designed to support customers of all sizes, industries, and geographies that will propel us into an even brighter tomorrow! UKG is proud to be an equal opportunity employer and is committed to promoting diversity and inclusion in the workplace, including the recruitment process.

Disability Accommodation
For individuals with disabilities that need additional assistance at any point in the application and interview process, please email UKGCareers@ukg.com
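Given the emphasis on Python, FastAPI and PyTest, a hedged, minimal sketch of a FastAPI endpoint with an in-process test follows; the route and payload are invented stand-ins for a real GenAI-backed service:

```python
from fastapi import FastAPI
from fastapi.testclient import TestClient
from pydantic import BaseModel

app = FastAPI()


class Prompt(BaseModel):
    text: str


@app.post("/summarize")
def summarize(prompt: Prompt) -> dict:
    # A real endpoint would call a model; this stub keeps the test hermetic.
    return {"summary": prompt.text[:50]}


def test_summarize():
    client = TestClient(app)
    response = client.post("/summarize", json={"text": "hello world"})
    assert response.status_code == 200
    assert response.json()["summary"] == "hello world"
```

Running pytest against this module exercises the endpoint without starting a server.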

Posted 4 days ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Company Description
Coders Brain is a global leader in IT services, digital and business solutions that partners with clients to simplify, strengthen, and transform their businesses. The company ensures high levels of certainty and satisfaction through deep industry expertise and a global network of innovation and delivery centers.

Job Title: Senior Data Engineer
Location: Hyderabad
Experience: 6+ Years
Employment Type: Full-Time

Job Summary:
We are looking for a highly skilled Senior Data Engineer to join our Data Engineering team. You will play a key role in designing, implementing, and optimizing robust, scalable data solutions that drive business decisions for our clients. This position involves hands-on development of data pipelines, cloud data platforms, and analytics tools using cutting-edge technologies.

Key Responsibilities:
Design and build reliable, scalable, and high-performance data pipelines to ingest, transform, and store data from various sources.
Develop cloud-based data infrastructure using platforms such as AWS, Azure, or Google Cloud Platform (GCP).
Optimize data processing and storage frameworks for cost efficiency and performance.
Ensure high standards for data quality, integrity, and governance across all systems.
Collaborate with cross-functional teams including data scientists, analysts, and product managers to translate requirements into technical solutions.
Troubleshoot and resolve issues with data pipelines and workflows, ensuring system reliability and availability.
Stay current with emerging trends and technologies in big data and cloud ecosystems and recommend improvements accordingly.

Required Qualifications:
Bachelor’s degree in Computer Science, Software Engineering, or a related field.
Minimum 6 years of professional experience in data engineering or a related discipline.
Proficiency in Python, Java, or Scala for data engineering tasks.
Strong expertise in SQL and hands-on experience with modern data warehouses (e.g., Snowflake, Redshift, BigQuery).
In-depth knowledge of big data technologies such as Hadoop, Spark, or Hive.
Practical experience with cloud-based data platforms such as AWS (e.g., Glue, EMR), Azure (e.g., Data Factory, Synapse), or GCP (e.g., Dataflow, BigQuery).
Excellent analytical, problem-solving, and communication skills.

Nice to Have:
Experience with containerization and orchestration tools such as Docker and Kubernetes.
Familiarity with CI/CD pipelines for data workflows.
Knowledge of data governance, security, and compliance best practices.
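The listing names Spark among the big data technologies. Purely as an illustration (paths and column names are invented), a small PySpark aggregation of the kind such pipelines run could look like this:

```python
from pyspark.sql import SparkSession, functions as F

# Placeholder paths and column names.
spark = SparkSession.builder.appName("daily-orders").getOrCreate()

orders = spark.read.parquet("gs://raw-zone/orders/dt=2025-06-01/")

daily = (
    orders
    .filter(F.col("status") == "COMPLETE")
    .groupBy("customer_id")
    .agg(
        F.sum("amount").alias("daily_spend"),
        F.count("*").alias("order_count"),
    )
)

# Write the curated aggregate back out for downstream consumers.
daily.write.mode("overwrite").parquet("gs://curated-zone/daily_spend/dt=2025-06-01/")
```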

Posted 4 days ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Your potential, unleashed.
India’s impact on the global economy has increased at an exponential rate and Deloitte presents an opportunity to unleash and realize your potential amongst cutting edge leaders, and organizations shaping the future of the region, and indeed, the world beyond. At Deloitte, you can bring your whole self to work, every day. Combine that with our drive to propel with purpose and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.

The Team
Deloitte’s Technology & Transformation practice can help you uncover and unlock the value buried deep inside vast amounts of data. Our global network provides strategic guidance and implementation services to help companies manage data from disparate sources and convert it into accurate, actionable information that can support fact-driven decision-making and generate an insight-driven advantage. Our practice addresses the continuum of opportunities in business intelligence & visualization, data management, performance management and next-generation analytics and technologies, including big data, cloud, cognitive and machine learning.

Your work profile:
As a Consultant/Senior Consultant/Manager in our Technology & Transformation practice you’ll build and nurture positive working relationships with teams and clients with the intention to exceed client expectations. To do this, the following are the desired qualifications and required skills:
Good hands-on experience in GCP services including BigQuery, Cloud Storage, Dataflow, Cloud Dataproc, Cloud Composer/Airflow, and IAM.
Proficient experience in GCP databases: Bigtable, Spanner, Cloud SQL and AlloyDB.
Proficiency in either SQL, Python, Java, or Scala for data processing and scripting.
Experience in development and test automation processes through the CI/CD pipeline (Git, Jenkins, SonarQube, Artifactory, Docker containers).
Experience in orchestrating data processing tasks using tools like Cloud Composer or Apache Airflow.
Strong understanding of data modeling, data warehousing and big data processing concepts.
Solid understanding and experience of relational database concepts and technologies such as SQL, MySQL, PostgreSQL or Oracle.
Design and implement data migration strategies for various database types (PostgreSQL, Oracle, AlloyDB, etc.).
Deep understanding of at least one database type with the ability to write complex SQL.
Experience with NoSQL databases such as MongoDB, Scylla, Cassandra, or DynamoDB is a plus.
Optimize data pipelines for performance and cost-efficiency, adhering to GCP best practices.
Implement data quality checks, data validation, and monitoring mechanisms to ensure data accuracy and integrity.
Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and translate them into technical solutions.
Ability to work independently and manage multiple priorities effectively.
Preferably having expertise in end-to-end DW implementation.
UG: B.Tech/B.E. in Any Specialization.

Location and way of working:
Base location: Bengaluru/Hyderabad/Mumbai/Bhubaneshwar/Coimbatore/Delhi. This profile involves occasional travelling to client locations. Hybrid is our default way of working. Each domain has customized the hybrid approach to their unique needs.

Your role as a Consultant/Senior Consultant/Manager:
We expect our people to embrace and live our purpose by challenging themselves to identify issues that are most important for our clients, our people, and for society.
In addition to living our purpose, Consultants/Senior Consultants/Managers across our organization must strive to be:
Inspiring - Leading with integrity to build inclusion and motivation
Committed to creating purpose - Creating a sense of vision and purpose
Agile - Achieving high-quality results through collaboration and team unity
Skilled at building diverse capability - Developing diverse capabilities for the future
Persuasive / Influencing - Persuading and influencing stakeholders
Collaborating - Partnering to build new solutions
Delivering value - Showing commercial acumen
Committed to expanding business - Leveraging new business opportunities
Analytical Acumen - Leveraging data to recommend impactful approaches and solutions through the power of analysis and visualization
Effective communication - Must be able to hold well-structured and well-articulated conversations to achieve win-win possibilities
Engagement Management / Delivery Excellence - Effectively managing engagement(s) to ensure timely and proactive execution as well as course correction for the success of engagement(s)
Managing change - Responding to a changing environment with resilience
Managing Quality & Risk - Delivering high quality results and mitigating risks with utmost integrity and precision
Strategic Thinking & Problem Solving - Applying a strategic mindset to solve business issues and complex problems
Tech Savvy - Leveraging ethical technology practices to deliver high impact for clients and for Deloitte
Empathetic leadership and inclusivity - Creating a safe and thriving environment where everyone is valued for who they are, using empathy to understand others and adapting our behaviors and attitudes to become more inclusive

How you’ll grow
Connect for impact: Our exceptional team of professionals across the globe are solving some of the world’s most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report.
Empower to lead: You can be a leader irrespective of your career level. Our colleagues are characterized by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership.
Inclusion for all: At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude and potential each and every one of us brings to the table to make an impact that matters.
Drive your career: At Deloitte, you are encouraged to take ownership of your career. We recognize there is no one-size-fits-all career path, and global, cross-business mobility and up/re-skilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte.

Everyone’s welcome… entrust your happiness to us
Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here’s a glimpse of things that are in store for you.

Interview tips
We want job seekers exploring opportunities at Deloitte to feel prepared, confident and comfortable. To help you with your interview, we suggest that you do your research, know some background about the organization and the business area you’re applying to. Check out recruiting tips from Deloitte professionals.

Posted 4 days ago

Apply

15.0 years

0 Lacs

Delhi, India

Remote

Educational Qualifications: BE/B.Tech/M.E/M

To lead the operations of UIDAI's critical infrastructure, primarily hosted on an OpenStack on-premise private cloud architecture, ensuring 24/7 availability of Aadhaar services. Manage a team of experts to design application deployment architecture to ensure high availability. Manage a team of experts to provide infra-deployment guidelines to bake into app design. Ensure robust security, scalability, and reliability of UIDAI's data centres and networks. Participate in architectural design review sessions, develop proof of concepts/pilots, implement projects, and deliver ongoing upgrades and enhancements. Revamp applications for Aadhaar's private cloud deployment in today's constantly shifting digital landscape to increase operational efficiency and reduce infrastructure costs.

Role & Responsibilities:

Innovation & Technology Transformation: Align with the Vision, Mission and Core Values of UIDAI while working closely with inter-disciplinary teams. Lead the Cloud Operations/Infra team in fine-tuning and optimization of cloud-native platforms to improve performance and achieve cost efficiency. Drive solution design for RFPs, POCs, and pilots for new and upcoming projects or R&D initiatives, using open-source cloud and infrastructure to build a scalable and elastic data center. Encourage and create an environment for knowledge sharing within and outside the UIDAI. Interact/partner with leading institutes/R&D establishments/educational institutions to stay up to date with new technologies and trends in cloud computing. Be a thought leader in architecture design and development of complex operational data analytics solutions to monitor various metrics related to infrastructure and applications.

Architecture Design & Deployment: Lead the design, implementation, and deployment of OpenStack-based on-premise private cloud infrastructure. Develop scalable, secure, and highly available cloud architectures to meet business and operational needs. Architect and design infrastructure solutions that support both virtualized and containerized workloads.

Solution Integration, Performance Monitoring & Optimization: Integrate OpenStack with existing on-premise data centre systems, network infrastructure, and storage platforms. Work with cross-functional teams to ensure seamless integration of cloud solutions in UIDAI. Monitor cloud infrastructure performance and ensure efficient use of resources. Identify areas for improvement and implement optimizations to reduce costs and improve performance.

Security & Compliance: Implement security best practices for on-premise cloud environments, ensuring data protection and compliance with industry standards. Regularly perform security audits and vulnerability assessments to maintain a secure cloud environment.

Collaboration & Communication: Collaborate with internal teams (app development and security) to align cloud infrastructure with UIDAI's requirements and objectives, and manage seamless communication within tech teams and across the organization. Maintain detailed live documentation of cloud architecture, processes, and configurations to establish trails of decision-making and ensure transparency and accountability.

Role Requirements:
More than 15 years of experience in technical, infra and app solutioning, with at least 7+ years of experience spearheading large multi-disciplinary technology teams working across various domains in a leadership position.
Excellent problem-solving and troubleshooting skills. Must have demonstrable experience in application performance analysis through low-level debugging.
Experience on transformation projects for on-premise data solutions and open-source CMPs - OpenStack, CloudStack.
Should be well versed with Site Reliability Engineering (SRE) concepts with a focus on extreme automation and infrastructure-as-code (IaC) methodologies, and have led such teams before, including experience with GitOps and platform automation tools like Terraform, Ansible, etc.
Strong knowledge of Linux-based operating systems (Ubuntu, CentOS, RedHat, etc.).
Strong understanding of HTTP/1.1, HTTP/2 with gRPC, and QUIC protocol functioning.
Experience in system administration, server storage, networking, virtualization, data warehouse, data integration, data migration and business intelligence/artificial intelligence solutions on the cloud.
Proficient in technology administration, remote infrastructure management, cloud assessment, QA, monitoring, and DevOps practices.
Extensive experience in cloud platform architecture, private cloud deployment, and large-scale transformation or migration of applications to cloud-native platforms.
Should have experience in building cloud-native platforms on Kubernetes, including awareness and experience of service mesh, cloud-native storage, integration with SAN & NAS, Kubernetes operators, CNI, CSI, CRI, etc.
Should have a strong networking background in terms of routing, switching, BGP, and technologies like TRILL, MP-BGP, EVPN, etc.
Preferably, should have experience in SAN networking and Linux networking concepts like network namespaces, route tables, and the ss utility.
Experience with cloud and on-premise databases like Cloud SQL, Cloud Spanner, Bigtable, RDS, Aurora, DynamoDB, Oracle, Teradata, MySQL, DB2, SQL Server.
Exposure to any of the NoSQL databases like MongoDB, CouchDB, Cassandra, graph databases, etc.
Experience with MLOps pipelines is preferable.
Experience with distributed computing platforms and enterprise environments like Hadoop and GCP/AWS/Azure Cloud is preferred.
Experience with various data integration and ETL technologies on the cloud like Spark, PySpark/Scala, and Dataflow is preferred. (ref:iimjobs.com)
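Given the OpenStack private-cloud focus and the emphasis on automation, a hedged illustration of scripted access through the openstacksdk Python client follows; the cloud name refers to a hypothetical clouds.yaml entry, not a real UIDAI environment:

```python
import openstack

# "onprem-private" is a placeholder entry in clouds.yaml, not a real cloud name.
conn = openstack.connect(cloud="onprem-private")

# A quick operational check: list instances and their current status.
for server in conn.compute.servers():
    print(server.name, server.status)

# The same connection object also exposes network, block storage, and image APIs,
# which is what IaC tooling ultimately drives underneath.
```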

Posted 4 days ago

Apply

10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Kyndryl Software Engineering, IT, Data Science
Bengaluru, Karnataka, India / Chennai, Tamil Nadu, India
Posted on Jun 12, 2025

Who We Are
At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.

The Role
Are you ready to join a team that is passionate about solving complex business problems with cutting-edge technology? Kyndryl is seeking a talented Data Architect who will take charge of all things data and information, transforming them into remarkable solutions that drive our customers’ success. Get ready to unleash your creative prowess and shape the future of data management.

As a Data Architect with Kyndryl, you will play a vital role in managing all aspects of data and information, from understanding business requirements to translating them into data models, schema designs and data architectures. Your expertise will be critical in guiding the full IT data lifecycle governance, including acquisition, transformation, classification, storage, presentation, distribution, security, privacy, and archiving, to ensure data is accurate, complete and secure. You will work closely with our customers to design solutions and data architectures that address their unique business problems, considering their needs and constraints while applying your industry knowledge and expertise. You will have the opportunity to work with diverse technologies, like databases (relational, hierarchical and object-oriented), file systems and storage management, document imaging, knowledge and content management, taxonomies and business intelligence. All are relevant for designing business-driven IT solutions that meet data requirements and incorporate cloud solutions for different types of storage needs.

As a Data Architect, you will have the chance to develop and design centralized or distributed systems that both address user requirements and perform efficiently and effectively. You will ensure the viability of proposed solutions by conducting solution assurance assessments and work closely with the project team and key stakeholders to ensure that the final solution meets all customer requirements and expectations. At Kyndryl, you'll be part of a dynamic, forward-thinking team where creativity knows no bounds. Together, we'll shape the future of data architecture and revolutionize the way businesses thrive. Apply now and take the first step towards an exciting and rewarding career with Kyndryl.

Your Future at Kyndryl
Every position at Kyndryl offers a way forward to grow your career, from a Junior Architect to Principal Architect – we have opportunities that you won’t find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms.

Who You Are
You’re good at what you do and possess the required experience to prove it. However, equally as important – you have a growth mindset; keen to drive your own personal and professional development. You are customer-focused – someone who prioritizes customer success in their work. And finally, you’re open and borderless – naturally inclusive in how you work with others.

Required Technical And Professional Expertise
10+ years of experience in data modeling, database design, and data management best practices.
Proven expertise in architecting data and AI platforms using GCP.
Hands-on experience with BigQuery, Dataflow, Vertex AI, and Gemini.
Proficiency in Python and SQL for data processing, transformation, and automation.
Deep understanding of Generative AI, Agentic AI, and related architectures.
Experience implementing GenAI-based applications using tools such as LangChain and LangGraph.
Solid background in building scalable and secure data pipelines and storage systems in cloud environments.

Preferred Technical And Professional Experience
Experience with cloud-based data platforms, integration, and governance frameworks across Azure, AWS, or GCP.
Knowledge of machine learning lifecycle management, model monitoring, and MLOps best practices.
Bachelor’s or Master’s degree in Computer Science, Information Systems, Data Engineering, or a related field.

Being You
Diversity is a whole lot more than what we look like or where we come from, it’s how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we’re not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That’s the Kyndryl Way.

What You Can Expect
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.

Get Referred!
If you know someone that works at Kyndryl, when asked ‘How Did You Hear About Us’ during the application process, select ‘Employee Referral’ and enter your contact's Kyndryl email address.
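The expertise list pairs BigQuery and Dataflow with Vertex AI and Gemini for GenAI work. As a hedged sketch only (project, region, model name and prompt are placeholders, and the exact model identifier varies by release), calling a Gemini model through the Vertex AI Python SDK looks roughly like this:

```python
import vertexai
from vertexai.generative_models import GenerativeModel

# Placeholder project and region; the model name depends on what is available.
vertexai.init(project="my-gcp-project", location="us-central1")

model = GenerativeModel("gemini-1.5-pro")
response = model.generate_content(
    "Summarize why a star schema suits BI reporting workloads."
)
print(response.text)
```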

Posted 5 days ago

Apply

10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Kyndryl Software Engineering, IT, Data Science Bengaluru, Karnataka, India Chennai, Tamil Nadu, India Posted on Jun 12, 2025 Apply now Who We Are At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities. The Role Are you ready to join a team that is passionate about solving complex business problems with cutting-edge technology? Kyndryl is seeking a talented Data Architect who will take charge of all things data and information, transforming them into remarkable solutions that drive our customers’ success. Get ready to unleash your creative prowess and shape the future of data management. As a Data Architect with Kyndryl, you will play a vital role in managing all aspects of data and information, from understanding business requirements to translating them into data models scheme design and data architectures. Your expertise will be critical in guiding the full IT data lifecycle governance , including acquisition, transformation, classification, storage, presentation, distribution, security, privacy, and archiving to ensure data is accurate, complete and secure. You will work closely with our customers to design solutions and data architectures that address their unique business problems, considering their needs and constraints while applying your industry knowledge and expertise. You will have the opportunity to work with diverse technologies, like databases (relational, hierarchical and object-oriented), file systems and storage management, document imaging, knowledge and content management, taxonomies and business intelligence. All relevant for designing business-driven IT solutions that meet data requirements and incorporate cloud solutions for different types of storage needs. As a Data Architect, you will have the chance to develop and design centralized or distributed systems that both address user requirements and perform efficiently and effectively. You will ensure the viability of proposed solutions by conducting solution assurance assessments and work closely with the project team and key stakeholders to ensure that the final solution meets all customer requirements and expectations. At Kyndryl, you'll be part of a dynamic, forward-thinking team where creativity knows no bounds. Together, we'll shape the future of data architecture and revolutionize the way businesses thrive. Apply now and take the first step towards an exciting and rewarding career with Kyndryl. Your Future at Kyndryl Every position at Kyndryl offers a way forward to grow your career, from a Junior Architect to Principal Architect – we have opportunities for that you won’t find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Who You Are You’re good at what you do and possess the required experience to prove it. However, equally as important – you have a growth mindset; keen to drive your own personal and professional development. You are customer-focused – someone who prioritizes customer success in their work. And finally, you’re open and borderless – naturally inclusive in how you work with others. Required Technical And Professional Expertise 10+ Years experience in data modeling, database design, and data management best practices. 
Proven expertise in architecting data and AI platforms using GCP. Hands-on experience with BigQuery, Dataflow, Vertex AI, and Gemini. Proficiency in Python and SQL for data processing, transformation, and automation. Deep understanding of Generative AI, Agentic AI, and related architectures. Experience implementing GenAI-based applications using tools such as LangChain and LangGraph. Solid background in building scalable and secure data pipelines and storage systems in cloud environments. Preferred Technical And Professional Experience Experience with cloud-based data platforms, integration, and governance frameworks across Azure, AWS, or GCP. Knowledge of machine learning lifecycle management, model monitoring, and MLOps best practices. Bachelor’s or Master’s degree in Computer Science, Information Systems, Data Engineering, or a related field. Being You Diversity is a whole lot more than what we look like or where we come from, it’s how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we’re not doing it single-handedly: Our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That’s the Kyndryl Way. What You Can Expect With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed. Get Referred! If you know someone who works at Kyndryl, when asked ‘How Did You Hear About Us’ during the application process, select ‘Employee Referral’ and enter your contact's Kyndryl email address.
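By way of illustration for the Python and SQL proficiency called out above, the sketch below runs a simple data-quality profiling query through the google-cloud-bigquery client; the project, dataset, table, and column names are placeholders invented for this example, not details from the posting.

```python
# Minimal sketch: profile a table with the google-cloud-bigquery client.
# All identifiers below (project, dataset, table, column) are hypothetical.
from google.cloud import bigquery


def profile_orders(project: str, table: str) -> None:
    client = bigquery.Client(project=project)
    sql = f"""
        SELECT
          COUNT(*) AS row_count,
          COUNTIF(customer_id IS NULL) AS missing_customer_ids
        FROM `{table}`
    """
    for row in client.query(sql).result():
        print(f"rows={row.row_count}, missing customer ids={row.missing_customer_ids}")


if __name__ == "__main__":
    profile_orders("example-project", "example-project.sales.orders")
```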

Posted 5 days ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About the Job The Director Data Engineering will lead the development and implementation of a comprehensive data strategy that aligns with the organization’s business goals and enables data-driven decision making. Roles and Responsibilities Build and manage a team of talented data managers and engineers with the ability not only to keep up with, but also to pioneer in, this space Collaborate with and influence leadership to directly impact company strategy and direction Develop new techniques and data pipelines that will enable various insights for internal and external customers Develop deep partnerships with client implementation teams, engineering and product teams to deliver on major cross-functional measurements and testing Communicate effectively to all levels of the organization, including executives Partner successfully with teams of dramatically varying backgrounds, from the highly technical to the highly creative Design a data engineering roadmap and execute the vision behind it Hire, lead, and mentor a world-class data team Partner with other business areas to co-author and co-drive strategies on our shared roadmap Oversee the movement of large amounts of data into our data lake Establish a customer-centric approach and synthesize customer needs Own end-to-end pipelines and destinations for the transfer and storage of all data Manage 3rd-party resources and critical data integration vendors Promote a culture that drives autonomy, responsibility, perfection and mastery. Maintain and optimize software and cloud expenses to meet financial goals of the company Provide technical leadership to the team in design and architecture of data products and drive change across process, practices, and technology within the organization Work with engineering managers and functional leads to set direction and ambitious goals for the Engineering department Ensure data quality, security, and accessibility across the organization Skills You Will Need 10+ years of experience in data engineering 5+ years of experience leading data teams of 30 or more resources, including selecting talent and planning/allocating resources across multiple geographies and functions. 5+ years of experience with GCP tools and technologies, specifically Google BigQuery, Google Cloud Composer, Dataflow, Dataform, etc. Experience creating large-scale data engineering pipelines, data-based decision-making and quantitative analysis tools and software Hands-on experience with code version control systems (Git) Experience with CI/CD, data architectures, pipelines, quality, and code management Experience with complex, high-volume, multi-dimensional data, based on unstructured, structured, and streaming datasets Experience with SQL and NoSQL databases Experience creating, testing, and supporting production software and systems Proven track record of identifying and resolving performance bottlenecks for production systems Experience designing and developing data lake, data warehouse, ETL and task orchestration systems Strong leadership, communication, time management and interpersonal skills Proven architectural skills in data engineering Experience leading teams developing production-grade data pipelines on large datasets Experience designing large data lakes and lakehouses, managing data flows that integrate information from various sources into a common pool, and implementing data pipelines based on the ETL model Experience with common data languages (e.g. Python, Scala) and data warehouses (e.g. Redshift, BigQuery, Snowflake, Databricks) Extensive experience with cloud tools and technologies - GCP preferred Experience managing real-time data pipelines Successful track record of demonstrated thought-leadership and cross-functional influence and partnership within an agile/waterfall development environment. Experience in regulated industries or with compliance frameworks (e.g., SOC 2, ISO 27001). Nice to have: HR services industry experience Experience in data science, including predictive modeling Experience leading teams across multiple geographies
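As a concrete illustration of the BigQuery and data-lake skills listed above, here is a minimal, hypothetical sketch of a batch load from Cloud Storage into BigQuery with the google-cloud-bigquery client; the bucket, project, dataset, and table names are placeholders.

```python
# Minimal sketch: load Parquet files from a data-lake bucket into BigQuery.
# Bucket path and table id are placeholders, not real resources.
from google.cloud import bigquery


def load_parquet(uri: str, table_id: str) -> None:
    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.PARQUET,
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )
    job = client.load_table_from_uri(uri, table_id, job_config=job_config)
    job.result()  # block until the load job finishes
    print(f"{client.get_table(table_id).num_rows} rows now in {table_id}")


if __name__ == "__main__":
    load_parquet(
        "gs://example-data-lake/events/dt=2025-06-01/*.parquet",
        "example-project.analytics.events",
    )
```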

Posted 5 days ago

Apply

5.0 years

0 Lacs

India

Remote

Job Title: Senior Data Engineer Experience: 5+ Years Location: Remote Contract Duration: Short Term Work Time: IST Shift Job Description We are seeking a skilled and experienced Senior Data Engineer to develop scalable and optimized data pipelines using the Databricks Lakehouse platform. The role requires proficiency in Apache Spark, PySpark, cloud data services (AWS, Azure, GCP), and solid programming knowledge in Python and Java. The engineer will collaborate with cross-functional teams to design and deliver high-performing data solutions. Responsibilities Data Pipeline Development Build efficient ETL/ELT workflows using Databricks and Spark for batch and streaming data Utilize Delta Lake and Unity Catalog for structured data management Optimize Spark jobs using tuning techniques such as caching, partitioning, and serialization Cloud-Based Implementation Develop and deploy data workflows on AWS (S3, EMR, Glue), Azure (ADLS, ADF, Synapse), and/or GCP (GCS, Dataflow, BigQuery) Manage and optimize data storage, access control, and orchestration using native cloud tools Implement data ingestion and querying with Databricks Auto Loader and SQL Warehousing Programming and Automation Write clean, reusable, and production-grade code in Python and Java Automate workflows using orchestration tools like Airflow, ADF, or Cloud Composer Implement testing, logging, and monitoring mechanisms Collaboration and Support Work closely with data analysts, scientists, and business teams to meet data requirements Support and troubleshoot production workflows Document solutions, maintain version control, and follow Agile/Scrum methodologies Required Skills Technical Skills Databricks: Experience with notebooks, cluster management, Delta Lake, Unity Catalog, and job orchestration Spark: Proficient in transformations, joins, window functions, and tuning Programming: Strong in PySpark and Java, with data validation and error handling expertise Cloud: Experience with AWS, Azure, or GCP data services and security frameworks Tools: Familiarity with Git, CI/CD, Docker (preferred), and data monitoring tools Experience 5–8 years in data engineering or backend development Minimum 1–2 years of hands-on experience with Databricks and Spark Experience with large-scale data migration, processing, or analytics projects Certifications (Optional but Preferred) Databricks Certified Data Engineer Associate Working Conditions Full-time remote work with availability during IST hours Occasional on-site presence may be required during client visits No regular travel required On-call support expected during deployment phases Show more Show less
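The Spark optimization techniques the description mentions (caching, partitioning, and avoiding shuffles with broadcast joins) look roughly like the following PySpark sketch; the paths, column names, and partition count are illustrative assumptions only.

```python
# Minimal PySpark sketch of common tuning techniques: repartitioning on the
# join key, caching a reused DataFrame, and broadcasting a small dimension.
# Paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("order-enrichment").getOrCreate()

orders = spark.read.parquet("/mnt/raw/orders")        # large fact table
customers = spark.read.parquet("/mnt/raw/customers")  # small dimension table

# Balance downstream shuffles and cache because the result is reused.
orders = orders.repartition(200, "customer_id").cache()

# Broadcast the small table to avoid a shuffle join.
enriched = orders.join(broadcast(customers), "customer_id", "left")

daily = enriched.groupBy("order_date").agg(F.sum("amount").alias("revenue"))
daily.write.mode("overwrite").partitionBy("order_date").parquet("/mnt/curated/daily_revenue")
```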

Posted 5 days ago

Apply

5.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

What You’ll Do Design, develop, and operate high-scale applications across the full engineering stack. Design, develop, test, deploy, maintain, and improve software. Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.) Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset. Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality. Participate in a tight-knit, globally distributed engineering team. Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network or service operations and quality. Research, create, and develop software applications to extend and improve on Equifax Solutions. Manage individual project priorities, deadlines, and deliverables. Collaborate on scalability issues involving access to data and information. Actively participate in Sprint planning, Sprint Retrospectives, and other team activities What Experience You Need Bachelor's degree or equivalent experience 5+ years of software engineering experience 5+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS 5+ years experience with Cloud technology: GCP, AWS, or Azure 5+ years experience designing and developing cloud-native solutions 5+ years experience designing and developing microservices using Java, Spring Framework, GCP SDKs, GKE/Kubernetes 5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs Big Data technologies: Spark/Scala/Hadoop What could set you apart Experience designing and developing big data processing solutions using Dataproc, Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others Cloud certification, especially in GCP Self-starter that identifies/responds to priority shifts with minimal supervision. You have excellent leadership and motivational skills You have an inquisitive and innovative mindset with a demonstrated ability to recognize opportunities to create distinctive value You can successfully evaluate workload to drive efficiency
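For the Dataflow/Apache Beam item under "what could set you apart", the following is a minimal Beam (Python SDK) batch pipeline sketch; the bucket paths and record layout are hypothetical, and passing --runner=DataflowRunner (plus project and region options) would run the same code on Cloud Dataflow.

```python
# Minimal Apache Beam batch pipeline: read CSV lines from GCS, count records
# per status value, and write the counts back to GCS. Paths are placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run() -> None:
    options = PipelineOptions()  # --runner, --project, --region, etc. come from the CLI
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromText("gs://example-bucket/events/*.csv")
            | "ExtractStatus" >> beam.Map(lambda line: line.split(",")[0])
            | "CountPerStatus" >> beam.combiners.Count.PerElement()
            | "Format" >> beam.MapTuple(lambda status, n: f"{status},{n}")
            | "Write" >> beam.io.WriteToText("gs://example-bucket/output/status_counts")
        )


if __name__ == "__main__":
    run()
```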

Posted 5 days ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Reporting to the A/NZ DSE Chapter Manager India PEC within Decision Sciences & Engineering, this role will own and be responsible for the data & analytic engineering chapter in India PEC. The Data Engineer is an essential part of the business that enables the team to support the ongoing acquisition and internal purposing of data, through to the fulfilment of products, insights and systems. As a Data Engineer, you will be responsible for working with our internal customers to ensure that data and systems are being designed and built to move and manipulate data in a scalable, reusable and efficient manner to suit the environment, project, security and requirements. What You’ll Do Design, architect, and implement scalable and secure data pipelines on GCP, utilizing services like Dataflow, Pub/Sub, and Cloud Storage. Develop and maintain data models, ensuring data quality, consistency, and accessibility for various internal stakeholders. Automate data processes and workflows using scripting languages like Python, leveraging technologies like Spark and Airflow. Monitor and troubleshoot data pipelines, identifying and resolving performance issues proactively. Stay up-to-date with the latest trends and advancements in GCP and related technologies, actively proposing and evaluating new solutions. Implement data governance best practices, including data security, access control, and lineage tracking. Lead security initiatives, design and implement security architecture. Lead data quality initiatives, design and implement monitoring dashboards. Mentor and guide junior data engineers, sharing knowledge and best practices to foster a high-performing team. The role requires a solid educational foundation and the ability to develop a strategic vision and roadmap for D&A’s transition to the cloud while balancing delivery of near-term results that are aligned with execution. What Experience You Need BS degree in a STEM major or equivalent discipline; Master’s Degree strongly preferred 8+ years of experience as a data engineer or related role, with experience demonstrating leadership capabilities Cloud certification strongly preferred Expert-level skills using programming languages such as Python or SQL (BigQuery) and advanced-level experience with scripting languages Demonstrated proficiency in all Google Cloud services Experience building and maintaining complex data pipelines, troubleshooting complex issues, transforming and entering data into a data pipeline in order for the content to be digested and usable for future projects; proficiency in Airflow strongly desired Experience designing and implementing advanced to complex data models and experience enabling advanced optimization to improve performance Experience leading a team, with Git expertise strongly preferred Hands-on experience with Agile methodologies Working knowledge of CI/CD What could set you apart: Self-starter that identifies/responds to priority shifts with minimal supervision. Strong communication and presentation skills Strong leadership qualities A well-balanced view of resource management, thinking creatively and effectively to deploy the team whilst building skills for the future Skilled in internal networking, negotiating and proactively developing individuals and teams to be the best they can be Strong communicator & presenter, bringing everyone on the journey.
Knowledge of Big Data technology and tools with the ability to share ideas among a collaborative team and drive the team based on technical expertise and learning, sharing best practices Excellent communication skills to engage with senior management, internal customers and product management Sound understanding of regulations and security requirements governing access to data in Big Data systems Sound understanding of insight delivery systems for batch and online Should be able to run Agile Scrum-based projects Demonstrated problem-solving skills and the ability to resolve conflicts Experience creating and maintaining product and software roadmaps Experience overseeing yearly as well as product/project budgets Working in a highly regulated environment
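To make the Dataflow, Pub/Sub, and Cloud Storage pipeline work described above more concrete, here is a minimal streaming sketch using the Apache Beam Python SDK; the subscription and table names are placeholders and the one-minute window is an arbitrary choice for illustration.

```python
# Minimal streaming sketch: Pub/Sub -> fixed windows -> BigQuery with Apache Beam.
# Subscription and table identifiers are hypothetical.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms import window


def run() -> None:
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/example-project/subscriptions/clickstream-sub")
            | "Parse" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "Window" >> beam.WindowInto(window.FixedWindows(60))
            | "WriteBQ" >> beam.io.WriteToBigQuery(
                "example-project:analytics.clickstream_events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```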

Posted 5 days ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Position Overview Job Title: Associate - Production Support Engineer Location: Bangalore, India Role Description You will be operating within Corporate Bank Production as an Associate, Production Support Engineer in the Corporate Banking subdivisions. You will be accountable for driving a culture of proactive continual improvement into the Production environment through application and user request support, troubleshooting and resolving errors in the production environment, automation of manual work, monitoring improvements and platform hygiene, supporting the resolution of issues and conflicts, and preparing reports and meetings. The candidate should have experience in all relevant tools used in the Service Operations environment, have specialist expertise in one or more technical domains, and ensure that all associated Service Operations stakeholders are provided with an optimum level of service in line with Service Level Agreements (SLAs) / Operating Level Agreements (OLAs). Ensure all BAU support queries from the business are handled on priority and within the agreed SLA, and that all application stability issues are well taken care of. Support the resolution of incidents and problems within the team. Assist with the resolution of complex incidents. Ensure that the right problem-solving techniques and processes are applied Embrace a Continuous Service Improvement approach to resolve IT failings, drive efficiencies and remove repetition to streamline support activities, reduce risk, and improve system availability. Be responsible for your own engineering delivery plus, using data and analytics, drive a reduction in technical debt across the production environment with development and infrastructure teams. Act as a Production Engineering role model to enhance the technical capability of the Production Support teams to create a future operating model embedded with engineering culture. Deutsche Bank’s Corporate Bank division is a leading provider of cash management, trade finance and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support. What We’ll Offer You As part of our flexible scheme, here are just some of the benefits that you’ll enjoy Best in class leave policy Gender neutral parental leaves 100% reimbursement under childcare assistance benefit (gender neutral) Sponsorship for industry-relevant certifications and education Employee Assistance Program for you and your family members Comprehensive Hospitalization Insurance for you and your dependents Accident and Term life Insurance Complimentary health screening for 35 yrs.
and above Your Key Responsibilities Lead by example to drive a culture of proactive continual improvement into the Production environment through automation of manual work, monitoring improvements and platform hygiene. Carry out technical analysis of the Production platform to identify and remediate performance and resiliency issues. Engage in the Software Development Lifecycle (SDLC) to enhance Production Standards and controls. Update the RUN Book and KEDB as & when required Participate in all BCP and component failure tests based on the run books Understand flow of data through the application infrastructure. It is critical to understand the dataflow to best provide operational support Event monitoring and management via a 24x7 workbench that is both monitoring and regularly probing the service environment and acting on instruction of the run book. Drive knowledge management across the supported applications and ensure full compliance Works with team members to identify areas of focus, where training may improve team performance, and improve incident resolution. Your Skills And Experience Recent experience of applying technical solutions to improve the stability of production environments Working experience of some of the following technology skills: Technologies/Frameworks: Unix, Shell Scripting and/or Python SQL Stack Oracle 12c/19c - for pl/sql, familiarity with OEM tooling to review AWR reports and parameters ITIL v3 Certified (must) Control-M, CRON scheduling MQ- DBUS, IBM JAVA 8/OpenJDK 11 (at least) - for debugging Familiarity with Spring Boot framework Data Streaming – Kafka (Experience with Confluent flavor a plus) and ZooKeeper Hadoop framework Configuration Mgmt Tooling: Ansible Operating System/Platform: RHEL 7.x (preferred), RHEL6.x OpenShift (as we move towards Cloud computing and the fact that Fabric is dependent on OpenShift) CI/CD: Jenkins (preferred) APM Tooling: either or one of Splunk AppDynamics Geneos NewRelic Other platforms: Scheduling – Ctrl-M is a plus, Autosys, etc Search – Elastic Search and/or Solr+ is a plus Methodology: Micro-services architecture SDLC Agile Fundamental Network topology – TCP, LAN, VPN, GSLB, GTM, etc Familiarity with TDD and/or BDD Distributed systems Experience on cloud platforms such as Azure, GCP is a plus Familiarity with containerization/Kubernetes Tools: ServiceNow Jira Confluence BitBucket and/or GIT IntelliJ SQL Plus Familiarity with simple Unix Tooling – putty, mPutty, exceed (PL/)SQL Developer Good understanding of ITIL Service Management framework such as Incident, Problem, and Change processes. Ability to self-manage a book of work and ensure clear transparency on progress with clear, timely, communication of issues. Excellent communication skills, both written and verbal, with attention to detail. Ability to work in Follow the Sun model, virtual teams and in matrix structure Service Operations experience within a global operations context 6-9 yrs experience in IT in large corporate environments, specifically in the area of controlled production environments or in Financial Services Technology in a client-facing function Global Transaction Banking Experience is a plus. Experience of end-to-end Level 2,3,4 management and good overview of Production/Operations Management overall Experience of run-book execution Experience of supporting complex application and infrastructure domains Good analytical, troubleshooting and problem-solving skills Working knowledge of incident tracking tools (i.e., Remedy, Heat etc.) 
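Much of the run-book automation and proactive probing described above comes down to small scripts; the sketch below is a hypothetical example using only the Python standard library to check service health endpoints and flag anything that needs escalation (the service names and URLs are invented for illustration).

```python
# Hypothetical run-book style health probe: check endpoints, log the outcome,
# and list anything that should be escalated. Names and URLs are placeholders.
import logging
import urllib.request

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")

ENDPOINTS = {
    "payments-api": "https://payments.internal.example/health",
    "reporting-api": "https://reporting.internal.example/health",
}


def probe(name: str, url: str, timeout: float = 5.0) -> bool:
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            healthy = resp.status == 200
    except OSError:  # covers URLError, timeouts, and connection failures
        healthy = False
    logging.info("%s healthy=%s", name, healthy)
    return healthy


if __name__ == "__main__":
    failures = [name for name, url in ENDPOINTS.items() if not probe(name, url)]
    if failures:
        logging.warning("Escalate per run book: %s", ", ".join(failures))
```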
How We’ll Support You Training and development to help you excel in your career Coaching and support from experts in your team A culture of continuous learning to aid progression A range of flexible benefits that you can tailor to suit your needs About Us And Our Teams Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment. Show more Show less

Posted 5 days ago

Apply

5.0 years

0 Lacs

Trivandrum, Kerala, India

Remote

Role: Senior Data Engineer with Databricks. Experience: 5+ Years Job Type: Contract Contract Duration: 6 Months Budget: 1.0 lakh per month Location: Remote JOB DESCRIPTION: We are looking for a dynamic and experienced Senior Data Engineer – Databricks to design, build, and optimize robust data pipelines using the Databricks Lakehouse platform. The ideal candidate should have strong hands-on skills in Apache Spark, PySpark, cloud data services, and a good grasp of Python and Java. This role involves close collaboration with architects, analysts, and developers to deliver scalable and high-performing data solutions across AWS, Azure, and GCP. ESSENTIAL JOB FUNCTIONS 1. Data Pipeline Development • Build scalable and efficient ETL/ELT workflows using Databricks and Spark for both batch and streaming data. • Leverage Delta Lake and Unity Catalog for structured data management and governance. • Optimize Spark jobs by tuning configurations, caching, partitioning, and serialization techniques. 2. Cloud-Based Implementation • Develop and deploy data workflows on AWS (S3, EMR, Glue), Azure (ADLS, ADF, Synapse), and/or GCP (GCS, Dataflow, BigQuery). • Manage and optimize data storage, access control, and pipeline orchestration using native cloud tools. • Use tools like Databricks Auto Loader and SQL Warehousing for efficient data ingestion and querying. 3. Programming & Automation • Write clean, reusable, and production-grade code in Python and Java. • Automate workflows using orchestration tools (e.g., Airflow, ADF, or Cloud Composer). • Implement robust testing, logging, and monitoring mechanisms for data pipelines. 4. Collaboration & Support • Collaborate with data analysts, data scientists, and business users to meet evolving data needs. • Support production workflows, troubleshoot failures, and resolve performance bottlenecks. • Document solutions, maintain version control, and follow Agile/Scrum processes. Required Skills Technical Skills: • Databricks: Hands-on experience with notebooks, cluster management, Delta Lake, Unity Catalog, and job orchestration. • Spark: Expertise in Spark transformations, joins, window functions, and performance tuning. • Programming: Strong in PySpark and Java, with experience in data validation and error handling. • Cloud Services: Good understanding of AWS, Azure, or GCP data services and security models. • DevOps/Tools: Familiarity with Git, CI/CD, Docker (preferred), and data monitoring tools. Experience: • 5–8 years of data engineering or backend development experience. • Minimum 1–2 years of hands-on work in Databricks with Spark. • Exposure to large-scale data migration, processing, or analytics projects. Certifications (nice to have): Databricks Certified Data Engineer Associate Working Conditions Hours of work - Full-time hours; flexibility for remote work while ensuring availability during US timings. Overtime expectations - Overtime may not be required as long as the commitment is accomplished Work environment - Primarily remote; occasional on-site work may be needed only during client visits. Travel requirements - No travel required. On-call responsibilities - On-call duties during deployment phases. Special conditions or requirements - Not Applicable. Workplace Policies and Agreements Confidentiality Agreement: Required to safeguard client sensitive data. Non-Compete Agreement: Must be signed to ensure proprietary model security. Non-Disclosure Agreement: Must be signed to ensure client confidentiality and security.
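As an illustration of the Auto Loader and Delta Lake duties above, here is a minimal PySpark sketch of incremental ingestion into a Unity Catalog table; the storage paths and the main.bronze.orders table name are placeholders, and it assumes a Databricks runtime where the Auto Loader (cloudFiles) source and availableNow trigger are available.

```python
# Minimal sketch: incremental ingestion with Databricks Auto Loader into a
# Delta table registered in Unity Catalog. Paths and table name are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks

raw_path = "s3://example-landing/orders/"
checkpoint = "s3://example-checkpoints/orders/"

stream = (
    spark.readStream.format("cloudFiles")             # Auto Loader source
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", checkpoint)  # schema tracking/evolution
    .load(raw_path)
)

(
    stream.writeStream.format("delta")
    .option("checkpointLocation", checkpoint)
    .trigger(availableNow=True)   # process what is available, then stop
    .toTable("main.bronze.orders")
)
```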

Posted 5 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

When you join Verizon You want more out of a career. A place to share your ideas freely — even if they’re daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife. What you’ll be doing... We’re seeking a skilled Lead Senior Data Engineering Analyst to join our high-performing team and propel our telecom business forward. You’ll contribute to building cutting-edge data products and assets for our wireless and wireline operations, spanning areas like consumer analytics, network performance, and service assurance. In this role, you will develop deep expertise in various telecom domains. As part of the Data Architecture & Strategy team, you’ll collaborate closely with IT and business stakeholders to design and implement user-friendly, robust data product solutions. This includes defining data quality and incorporating data classification and governance principles. Your responsibilities encompass Collaborating with stakeholders to understand data requirements and translate them into efficient data models Defining the scope and purpose of data product solutions, collaborating with stakeholders to finalize project blueprints, and overseeing the design process through all phases of the release lifecycle. Designing, developing, and implementing data architecture solutions on GCP and Teradata to support our Telecom business. Designing data ingestion for both real-time and batch processing, ensuring efficient and scalable data acquisition for creating an effective data warehouse. Formulating End to End data solutions (Authoritative Data Source, Data Protection, Taxonomy Alignment) Maintaining meticulous documentation, including data design specifications, functional test cases, data lineage, and other relevant artifacts for all data product solution assets. Defining Data Architecture Strategy (Enterprise & Domain level) and Enterprise Data Model Standards & Ownership Proactively identifying opportunities for automation and performance optimization within your scope of work Collaborating effectively within a product-oriented organization, providing data expertise and solutions across multiple business units. Cultivating strong cross-functional relationships and establish yourself as a subject matter expert in data and analytics within the organization. Acting as a mentor to junior team members What we’re looking for... You’re curious about new technologies and the game-changing possibilities it creates. You like to stay up-to-date with the latest trends and apply your technical expertise to solve business problems. You thrive in a fast-paced, innovative environment working as a phenomenal teammate to drive the best results and business outcomes. You'll need to have… Bachelor’s degree or four or more years of work experience. Four or more years of relevant work experience. Four or more years of relevant work experience in data architecture, data warehousing, or a related role. Strong grasp of data architecture principles, best practices, and methodologies. 
Expertise in SQL for data analysis, data discovery, data profiling and solution design. Experience defining data standards and data quality, and implementing industry best practices for scalable and maintainable data models using data modeling tools like Erwin. Proven experience with ETL, data warehousing concepts, and the data management lifecycle. Skilled in creating technical documentation, including source-to-target mappings and SLAs. Experience with shell scripting and the Python programming language. Understanding of Git version control and basic Git commands. Hands-on experience with cloud services relevant to data engineering and architecture (e.g., BigQuery, Dataflow, Dataproc, Cloud Storage). Even better if you have one or more of the following… Master's degree in Computer Science. Experience in the Telecommunications industry, with knowledge of wireless and wireline business domains. Experience with stream-processing systems, APIs, events, etc. Certification in GCP-Data Engineer/Architect. Accuracy and attention to detail. Good problem solving, analytical, and research capabilities. Good verbal and written communication. Experience presenting to and influencing stakeholders. Experience with large clusters, databases, BI tools, data quality and performance tuning. Experience in driving one or more smaller teams for technical delivery. If Verizon and this role sound like a fit for you, we encourage you to apply even if you don’t meet every “even better” qualification listed above. #AI&D Where you’ll be working In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager. Scheduled Weekly Hours 40 Equal Employment Opportunity Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.
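Source-to-target mappings like those mentioned above tend to stay accurate when captured as data rather than prose, so they can drive both documentation and automated checks; the following is a small, hypothetical Python sketch with invented field names.

```python
# Hypothetical sketch: represent a source-to-target mapping as data and render
# it as a documentation table. Field names and rules are illustrative only.
from dataclasses import dataclass


@dataclass(frozen=True)
class FieldMapping:
    source: str   # column in the source system
    target: str   # column in the warehouse model
    rule: str     # plain-language or SQL-fragment transformation


ORDER_MAPPINGS = [
    FieldMapping("ord_id", "order_id", "cast to INT64"),
    FieldMapping("ord_ts", "order_timestamp", "parse as TIMESTAMP (UTC)"),
    FieldMapping("cust_no", "customer_id", "trim and upper-case"),
]


def as_table(mappings: list[FieldMapping]) -> str:
    """Render the mapping as a simple table for design documentation."""
    rows = [f"| {m.source} | {m.target} | {m.rule} |" for m in mappings]
    return "\n".join(["| Source | Target | Rule |", "| --- | --- | --- |", *rows])


if __name__ == "__main__":
    print(as_table(ORDER_MAPPINGS))
```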

Posted 5 days ago

Apply

8.0 years

2 - 7 Lacs

Hyderābād

On-site

Minimum qualifications: Bachelor's degree in Computer Science, a related technical field, or equivalent practical experience. 8 years of experience with software development in one or more programming languages (e.g., Python, C, C++, Java, JavaScript). 3 years of experience in a technical leadership role; overseeing projects, with 2 years of experience in a people management, supervision/team leadership role. Experience in one or more disciplines such as machine learning, recommendation systems, natural language processing, computer vision, pattern recognition, or artificial intelligence. Preferred qualifications: Understanding of agentic AI/ML and Large Language Model (LLM). Excellent coding skills. About the job Like Google's own ambitions, the work of a Software Engineer goes beyond just Search. Software Engineering Managers have not only the technical expertise to take on and provide technical leadership to major projects, but also manage a team of Engineers. You not only optimize your own code but make sure Engineers are able to optimize theirs. As a Software Engineering Manager you manage your project goals, contribute to product strategy and help develop your team. Teams work all across the company, in areas such as information retrieval, artificial intelligence, natural language processing, distributed computing, large-scale system design, networking, security, data compression, user interface design; the list goes on and is growing every day. Operating with scale and speed, our exceptional software engineers are just getting started - and as a manager, you guide the way. With technical and leadership expertise, you manage engineers across multiple teams and locations, a large product budget and oversee the deployment of large-scale projects across multiple sites internationally. At Corp Eng, we build world-leading business solutions that scale a more helpful Google for everyone. As Google’s IT organization, we provide end-to-end solutions for organizations across Google. We deliver the right tools, platforms, and experiences for all Googlers as they create more helpful products and services for everyone. In the simplest terms, we are Google for Googlers. Responsibilities Manage a team of AI software engineers, fostering a collaborative and high-performing environment. This includes hiring, mentoring, performance management, and career development. Drive the design, development, and deployment of scalable and reliable Artificial Intelligence/Machine Learning (AI/ML) systems and infrastructure relevant to HR applications (e.g., talent acquisition, performance management, employee engagement, workforce planning). Collaborate with Product Managers and HR stakeholders to understand business needs, define product requirements, and translate them into technical specifications and project plans. Oversee the architecture and implementation of data pipelines using Google's data processing infrastructure (e.g., Beam, Dataflow) to support AI/ML initiatives. Stay up-to-date of the latest advancements in AI/ML and related technologies, evaluating their potential application within human resources and guiding the team's adoption of relevant innovations. Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. 
We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.

Posted 5 days ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description The Service360 Senior Data Engineer will be the trusted data advisor in GDI&A (Global Data Insights & Analytics) supporting the following teams: Ford Pro, FCSA (Ford Customer Service Analytics) and FCSD (Ford Customer Service Division) Business. This is an exciting opportunity that provides the Data Engineer a well-rounded experience. The position requires translation of the customer’s analytical needs into specific data products that should be built in the GCP environment by collaborating with the Product Owners, Technical Anchor, and the customers. Responsibilities Work on a small agile team to deliver curated data products for the Product Organization. Work effectively with fellow data engineers, product owners, data champions and other technical experts. Minimum of 5 years of experience with progressive responsibilities in software development Minimum of 3 years of experience defining product vision, strategy, product roadmaps and creating and managing backlogs Experience wrangling, transforming and visualizing large data sets from multiple sources, using a variety of tools Proficiency in SQL is a must-have skill Excellent written and verbal communication skills Must be comfortable presenting to and interacting with cross-functional teams and customers Demonstrate technical knowledge and communication skills with the ability to advocate for well-designed solutions. Develop exceptional analytical data products using both streaming and batch ingestion patterns on Google Cloud Platform with solid data warehouse principles. Be the Subject Matter Expert in Data Engineering with a focus on GCP native services and other well integrated third-party technologies. Architect and implement sophisticated ETL pipelines, ensuring efficient data integration into BigQuery from diverse batch and streaming sources. Spearhead the development and maintenance of data ingestion and analytics pipelines using cutting-edge tools and technologies, including Python, SQL, and dbt/Dataform. Ensure the highest standards of data quality and integrity across all data processes. Manage data workflows using Astronomer and Terraform for cloud infrastructure, promoting best practices in Infrastructure as Code. Rich experience in application support in GCP. Experienced in data mapping, impact analysis, root cause analysis, and documenting data lineage to support robust data governance. Develop comprehensive documentation for data engineering processes, promoting knowledge sharing and system maintainability. Utilize GCP monitoring tools to proactively address performance issues and ensure system resilience, while providing expert production support. Provide strategic guidance and mentorship to team members on data transformation initiatives, championing data utility within the enterprise. Qualifications Experience working in GCP native (or equivalent) services like BigQuery, Google Cloud Storage, PubSub, Dataflow, Dataproc etc. Experience working with Airflow for scheduling and orchestration of data pipelines. Experience working with Terraform to provision Infrastructure as Code. 2+ years professional development experience in Java or Python. Bachelor’s degree in computer science or related scientific field. Experience in analysing complex data, organizing raw data, and integrating massive datasets from multiple data sources to build analytical domains and reusable data products.
Experience in working with architects to evaluate and productionalize data pipelines for data ingestion, curation, and consumption. Experience in working with stakeholders to formulate business problems as technical data requirements, identify and implement technical solutions while ensuring key business drivers are captured in collaboration with product management.
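The Airflow orchestration described above typically takes the shape of a small DAG; below is a minimal, hypothetical sketch (the DAG id, schedule, and task bodies are illustrative stubs, and it assumes Airflow 2.4 or newer).

```python
# Minimal, hypothetical Airflow DAG: extract, load to the warehouse, then run
# transformations. Task bodies are stubs; the DAG id and schedule are invented.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_source_files(**context) -> None:
    print("Pull the day's files from the source system to staging (stub).")


def load_to_warehouse(**context) -> None:
    print("Run a BigQuery load job for the staged files (stub).")


def run_transformations(**context) -> None:
    print("Trigger dbt/Dataform models to build curated tables (stub).")


with DAG(
    dag_id="daily_ingest_example",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_source_files)
    load = PythonOperator(task_id="load", python_callable=load_to_warehouse)
    transform = PythonOperator(task_id="transform", python_callable=run_transformations)

    extract >> load >> transform
```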

Posted 5 days ago

Apply

Exploring Dataflow Jobs in India

The dataflow job market in India is currently experiencing a surge in demand for skilled professionals. With the increasing reliance on data-driven decision-making in various industries, the need for individuals proficient in managing and analyzing dataflow is on the rise. This article aims to provide job seekers with valuable insights into the dataflow job landscape in India.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Pune
  4. Hyderabad
  5. Delhi

These cities are known for their thriving tech ecosystems and are home to numerous companies actively hiring for dataflow roles.

Average Salary Range

The average salary range for dataflow professionals in India varies with experience level. Entry-level professionals can expect to earn between INR 4-6 lakhs per annum, while experienced professionals can command salaries upwards of INR 12-15 lakhs per annum.

Career Path

In the dataflow domain, a typical career path may involve starting as a Junior Data Analyst or Data Engineer, progressing to roles such as Senior Data Scientist or Data Architect, and eventually reaching positions like Tech Lead or Data Science Manager.

Related Skills

In addition to expertise in dataflow tools and technologies, dataflow professionals are often expected to have proficiency in programming languages such as Python or R, knowledge of databases like SQL, and familiarity with data visualization tools like Tableau or Power BI.

Interview Questions

  • What is dataflow and how is it different from data streaming? (basic)
  • Explain the difference between batch processing and real-time processing. (medium)
  • How do you handle missing or null values in a dataset? (basic; see the sketch after this list)
  • Can you explain the concept of data lineage? (medium)
  • What is the importance of data quality in dataflow processes? (basic)
  • How do you optimize dataflow pipelines for performance? (medium)
  • Describe a time when you had to troubleshoot a dataflow issue. (medium)
  • What are some common challenges faced in dataflow projects? (medium)
  • How do you ensure data security and compliance in dataflow processes? (medium)
  • What are the key components of a dataflow architecture? (medium)
  • Explain the concept of data partitioning in dataflow. (advanced)
  • How would you handle a sudden increase in data volume in a dataflow pipeline? (advanced)
  • What role does data governance play in dataflow processes? (medium)
  • Can you discuss the advantages and disadvantages of using cloud-based dataflow solutions? (medium)
  • How do you stay updated with the latest trends and technologies in dataflow? (basic)
  • What is the significance of metadata in dataflow management? (medium)
  • Walk us through a dataflow project you have worked on from start to finish. (medium)
  • How do you ensure data quality and consistency across different data sources in a dataflow pipeline? (medium)
  • What are some best practices for monitoring and troubleshooting dataflow pipelines? (medium)
  • How do you handle data transformations and aggregations in a dataflow process? (basic)
  • What are the key performance indicators you would track in a dataflow project? (medium)
  • How do you collaborate with cross-functional teams in a dataflow project? (basic)
  • Can you explain the concept of data replication in dataflow management? (advanced)
  • How do you approach data modeling in a dataflow project? (medium)
  • Describe a challenging dataflow problem you encountered and how you resolved it. (advanced)
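For the questions above on missing values and on transformations/aggregations, here is a short PySpark sketch of one common approach; the column names and default values are illustrative only.

```python
# Illustrative PySpark answer sketch: drop rows missing the grouping key,
# impute a default for a missing measure, then aggregate.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("interview-sketch").getOrCreate()

df = spark.createDataFrame(
    [("A", 10.0), ("A", None), ("B", 7.5), (None, 3.0)],
    ["region", "amount"],
)

cleaned = (
    df.dropna(subset=["region"])   # rows without a region cannot be grouped
      .fillna({"amount": 0.0})     # impute a neutral default for missing amounts
)

summary = cleaned.groupBy("region").agg(
    F.count("*").alias("orders"),
    F.sum("amount").alias("total_amount"),
)
summary.show()
```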

Closing Remark

As you navigate the dataflow job market in India, remember to showcase your skills and experiences confidently during interviews. Stay updated with the latest trends in dataflow and continuously upskill to stand out in a competitive job market. Best of luck in your job search journey!

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies