
607 Dataflow Jobs - Page 20

JobPe aggregates these listings for easy access; applications are submitted directly on the original job portal.

15.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Minimum 15+ years of IT experience, with 3 years of development on GCP projects and 5 projects implemented. In-depth knowledge and hands-on development experience in building distributed data solutions, including ingestion, processing, and consumption (Must Have). Experience developing winning themes and writing technical responses to bids (RFPs and RFIs). Strong development experience in at least one distributed big data (bulk) processing engine, preferably Spark on Dataproc or Dataflow (Must Have). Strong understanding of and experience with cloud storage infrastructure and operationalizing GCP-based storage services and solutions, preferably GCP buckets or related (Must Have). Strong experience with one or more MPP data warehouse platforms, preferably BigQuery, Cloud SQL, Cloud Spanner, Datastore, Firestore, or similar (Must Have). Strong development experience on at least one or more event-driven streaming platforms, preferably Pub/Sub, Kafka, or related (Must Have). Strong development experience with networking on GCP (Must Have). Strong data orchestration experience using tools such as Cloud Functions, Dataflow, Cloud Composer, Apache Airflow, or related (Must Have). Strong development experience building data pipelines on Kubernetes (Must Have). Strong development experience with IAM, KMS, and Container Registry. Assess use cases for various teams within the client company, evaluate pros and cons, and justify recommended tooling and component solution options using GCP services, third-party, and open-source solutions (Must Have). Strong technical communication skills and the ability to engage a variety of business and technical audiences, explaining features and metrics of big data technologies based on experience with previous solutions (Must Have). Strong data cataloging experience, preferably using Data Catalog (Must Have). Strong understanding of and experience with logging and Cloud Monitoring solutions (Must Have). Strong understanding of at least one or more cluster managers (YARN, Hive, Pig, etc.) (Must Have). Strong knowledge and understanding of CI/CD processes and tools (Must Have). Interface with client project sponsors to gather, assess, and interpret client needs and requirements. Advise on database performance, altering the ETL process, providing SQL transformations, discussing API integration, and deriving business and technical KPIs. Develop a data model around stated use cases to capture the client's KPIs and data transformations. Assess, document, and translate goals, objectives, problem statements, etc. to our offshore team and onshore management.
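The pipeline skills this posting asks for (Pub/Sub ingestion, Dataflow/Beam processing, BigQuery serving) can be illustrated with a short sketch. This is not code from the employer, just a minimal, hypothetical Apache Beam streaming pipeline in Python; the project, subscription, bucket, and table names are placeholders.

```python
# Illustrative sketch only: a minimal streaming pipeline of the kind this role
# describes (Pub/Sub -> transform -> BigQuery), runnable on Dataflow.
# All resource names below are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    """Decode a Pub/Sub message into a flat dict ready for BigQuery."""
    event = json.loads(message.decode("utf-8"))
    return {"event_id": event["id"], "payload": json.dumps(event)}


def run() -> None:
    options = PipelineOptions(
        streaming=True,
        project="example-project",                 # hypothetical
        region="us-central1",
        runner="DataflowRunner",
        temp_location="gs://example-bucket/tmp",   # hypothetical
    )
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/example-project/subscriptions/events-sub")
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",   # hypothetical table
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```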

Posted 3 weeks ago

Apply

5.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Job Description We're seeking a highly skilled and experienced Full Stack Data Engineer to play a pivotal role in the development and maintenance of our Enterprise Data Platform. In this role, you'll be responsible for designing, building, and optimizing scalable data pipelines within our Google Cloud Platform (GCP) environment. You'll work with GCP Native technologies like BigQuery, Dataflow, and Pub/Sub, ensuring data governance, security, and optimal performance. This is a fantastic opportunity to leverage your full-stack expertise, collaborate with talented teams, and establish best practices for data engineering at Ford. Responsibilities Data Pipeline Architect & Builder: Spearhead the design, development, and maintenance of scalable data ingestion and curation pipelines from diverse sources. Ensure data is standardized, high-quality, and optimized for analytical use. Leverage cutting-edge tools and technologies, including Python, SQL, and DBT/Dataform, to build robust and efficient data pipelines. End-to-End Integration Expert: Utilize your full-stack skills to contribute to seamless end-to-end development, ensuring smooth and reliable data flow from source to insight. GCP Data Solutions Leader : Leverage your deep expertise in GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Functions, etc.) to build and manage data platforms that not only meet but exceed business needs and expectations. Data Governance & Security Champion : Implement and manage robust data governance policies, access controls, and security best practices, fully utilizing GCP's native security features to protect sensitive data. Data Workflow Orchestrator : Employ Astronomer and Terraform for efficient data workflow management and cloud infrastructure provisioning, championing best practices in Infrastructure as Code (IaC). Performance Optimization Driver : Continuously monitor and improve the performance, scalability, and efficiency of data pipelines and storage solutions, ensuring optimal resource utilization and cost-effectiveness. Collaborative Innovator : Collaborate effectively with data architects, application architects, service owners, and cross-functional teams to define and promote best practices, design patterns, and frameworks for cloud data engineering. Automation & Reliability Advocate : Proactively automate data platform processes to enhance reliability, improve data quality, minimize manual intervention, and drive operational efficiency. Effective Communicator : Clearly and transparently communicate complex technical decisions to both technical and non-technical stakeholders, fostering understanding and alignment. Continuous Learner : Stay ahead of the curve by continuously learning about industry trends and emerging technologies, proactively identifying opportunities to improve our data platform and enhance our capabilities. Business Impact Translator : Translate complex business requirements into optimized data asset designs and efficient code, ensuring that our data solutions directly contribute to business goals. Documentation & Knowledge Sharer : Develop comprehensive documentation for data engineering processes, promoting knowledge sharing, facilitating collaboration, and ensuring long-term system maintainability. Qualifications Bachelor's degree in Computer Science, Information Technology, Information Systems, Data Analytics, or a related field (or equivalent combination of education and experience). 
5-7 years of experience in Data Engineering or Software Engineering, with at least 2 years of hands-on experience building and deploying cloud-based data platforms (GCP preferred). Strong proficiency in SQL, Java, and Python, with practical experience in designing and deploying cloud-based data pipelines using GCP services like BigQuery, Dataflow, and DataProc. Solid understanding of Service-Oriented Architecture (SOA) and microservices, and their application within a cloud data platform. Experience with relational databases (e.g., PostgreSQL, MySQL), NoSQL databases, and columnar databases (e.g., BigQuery). Knowledge of data governance frameworks, data encryption, and data masking techniques in cloud environments. Familiarity with CI/CD pipelines, Infrastructure as Code (IaC) tools like Terraform and Tekton, and other automation frameworks. Excellent analytical and problem-solving skills, with the ability to troubleshoot complex data platform and microservices issues. Experience in monitoring and optimizing cost and compute resources for processes in GCP technologies (e.g., BigQuery, Dataflow, Cloud Run, DataProc). A passion for data, innovation, and continuous learning. Show more Show less
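As a rough illustration of the BigQuery-centric curation work described above (not Ford's actual implementation), the sketch below runs a parameterized transformation with the google-cloud-bigquery client; the project, dataset, and table names are hypothetical.

```python
# Hedged sketch: one way to run a parameterized curation query in BigQuery from
# Python. Dataset, table, and column names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

sql = """
    SELECT customer_id,
           COUNT(*) AS order_count
    FROM `example-project.raw.orders`
    WHERE order_date >= @start_date
    GROUP BY customer_id
"""

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01"),
    ],
    destination="example-project.curated.customer_order_counts",  # hypothetical
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

# Submit the query and block until the curated table has been (re)written.
client.query(sql, job_config=job_config).result()
```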

Posted 3 weeks ago

Apply

5.0 - 8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Our company is looking for an experienced Senior Data Engineer to join our team. As a Senior Data Engineer, you will be working on a project that focuses on data integration and ETL for cloud-based platforms. You will be responsible for designing and implementing complex data solutions, ensuring that the data is accurate, reliable, and easily accessible. Responsibilities Design and implement complex data solutions for cloud-based platforms Develop ETL processes using SQL, Python, and other relevant technologies Ensure that data is accurate, reliable, and easily accessible for all stakeholders Collaborate with cross-functional teams to understand data integration needs and requirements Develop and maintain documentation, including technical specifications, data flow diagrams, and data mappings Monitor and optimize data integration processes for performance and efficiency, ensuring data accuracy and integrity Requirements Bachelor's degree in Computer Science, Electrical Engineering, or a related field 5-8 years of experience in data engineering Experience with cloud-native or Spark-based ETL tools such as AWS Glue, Azure Data Factory, or GCP Dataflow Strong knowledge of SQL for data querying and manipulation Experience with Snowflake for data warehousing Experience with cloud platforms such as AWS, GCP, or Azure for data storage and processing Excellent problem-solving skills and attention to detail Good verbal and written communication skills in English at a B2 level Nice to have Experience with ETL using Python Show more Show less
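For context on the ETL stack this posting names (Python, SQL, Snowflake), here is a minimal, hypothetical load step using the Snowflake Python connector; the account, credentials, and table names are placeholders, and a real pipeline would source credentials from a secrets manager rather than hard-coding them.

```python
# Illustrative ETL step only: transform a CSV batch and stage it into Snowflake.
# Connection details and table names are hypothetical placeholders.
import csv

import snowflake.connector


def load_orders(csv_path: str) -> None:
    # Light transformation: keep only completed orders and normalise the amount.
    with open(csv_path, newline="") as handle:
        rows = [
            (row["order_id"], row["customer_id"], round(float(row["amount"]), 2))
            for row in csv.DictReader(handle)
            if row["status"] == "COMPLETED"
        ]

    conn = snowflake.connector.connect(
        account="example_account",   # hypothetical
        user="ETL_USER",
        password="***",              # placeholder; use a secrets manager in practice
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="STAGING",
    )
    try:
        cur = conn.cursor()
        cur.executemany(
            "INSERT INTO ORDERS_STG (ORDER_ID, CUSTOMER_ID, AMOUNT) "
            "VALUES (%s, %s, %s)",
            rows,
        )
        cur.close()
    finally:
        conn.close()
```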

Posted 3 weeks ago

Apply

5.0 - 8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Our organization is in search of a seasoned Senior Data Engineer to enhance our team. In this role, you will focus on projects involving data integration and ETL for cloud environments. Your primary duties will include the design and execution of intricate data solutions, maintaining data accuracy, dependability, and accessibility. Responsibilities Design and execute intricate data solutions for cloud environments Develop ETL processes utilizing SQL, Python, and other applicable technologies Maintain data accuracy, dependability, and accessibility for all stakeholders Work collaboratively with cross-functional teams to comprehend data integration necessities and specifications Create and uphold documentation such as technical specifications, data flow diagrams, and data mappings Optimize data integration processes for enhanced performance and efficiency while ensuring data accuracy and integrity Requirements Bachelor's degree in Computer Science, Electrical Engineering, or related field 5-8 years of experience in data engineering Proficiency in cloud-native or Spark-based ETL tools like AWS Glue, Azure Data Factory, or GCP Dataflow Strong knowledge of SQL for data querying and manipulation Familiarity with Snowflake for data warehousing Background in cloud platforms such as AWS, GCP, or Azure for data storage and processing Excellent problem-solving skills and meticulous attention to detail Good verbal and written communication skills in English at a B2 level Nice to have Background in ETL using Python Show more Show less

Posted 3 weeks ago

Apply

5.0 - 8.0 years

0 Lacs

Gurugram, Haryana, India

On-site


We are seeking a skilled Senior Data Engineer to become an integral part of our team. In this role, you will focus on projects involving data integration and ETL for cloud-based platforms. Your tasks will include creating and executing sophisticated data solutions, ensuring the integrity, reliability, and accessibility of data. Responsibilities Create and execute sophisticated data solutions for cloud-based platforms Construct ETL processes utilizing SQL, Python, and other applicable technologies Maintain data accuracy, reliability, and accessibility for all stakeholders Work with cross-functional teams to comprehend data integration needs and specifications Produce and uphold documentation, such as technical specifications, data flow diagrams, and data mappings Enhance data integration processes for performance and efficiency, upholding data accuracy and integrity Requirements Bachelor’s degree in Computer Science, Electrical Engineering, or a related field 5-8 years of experience in data engineering Proficiency in cloud-native or Spark-based ETL tools like AWS Glue, Azure Data Factory, or GCP Dataflow Strong understanding of SQL for data querying and manipulation Background in Snowflake for data warehousing Familiarity with cloud platforms such as AWS, GCP, or Azure for data storage and processing Exceptional problem-solving abilities and meticulous attention to detail Strong verbal and written communication skills in English at a B2 level Nice to have Background in ETL using Python Show more Show less

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


Role Description Key Responsibilities: Develop and Maintain Web Applications Build dynamic, user-centric web applications using React, React Hooks, and modern web technologies like HTML5 and CSS3. Ensure that the application is scalable, maintainable, and easy to navigate for end users. Collaborate With Cross-Functional Teams Work closely with designers and product teams to bring UI/UX designs to life, ensuring the design vision is executed effectively using HTML and CSS. Ensure the application is responsive, performing optimally across all devices and browsers. State Management Utilize Redux to manage and streamline complex application states, ensuring seamless data flow and smooth user interactions. Component Development Develop reusable, modular, and maintainable React components using React Hooks and CSS/SCSS to style components effectively. Build component libraries and implement best practices to ensure code maintainability and reusability. Role Proficiency This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks, and Dataproc, with coding expertise in Python, PySpark, and SQL. Works independently and has a deep understanding of data warehousing solutions, including Snowflake, BigQuery, Lakehouse, and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions. Outcomes Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, maintenance, and performance using design patterns and reusing proven solutions. Interpret requirements to create optimal architecture and design, developing solutions in accordance with specifications. Document and communicate milestones/stages for end-to-end delivery. Code adhering to best coding standards; debug and test solutions to deliver best-in-class quality. Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency. Validate results with user representatives, integrating the overall solution seamlessly. Develop and manage data storage solutions, including relational databases, NoSQL databases, and data lakes. Stay updated on the latest trends and best practices in data engineering, cloud technologies, and big data tools. Influence and improve customer satisfaction through effective data solutions. Measures Of Outcomes Adherence to engineering processes and standards Adherence to schedule/timelines Adherence to SLAs where applicable Number of defects post delivery Number of non-compliance issues Reduction of recurrence of known defects Quick turnaround of production bugs Completion of applicable technical/domain certifications Completion of all mandatory training requirements Efficiency improvements in data pipelines (e.g. reduced resource consumption, faster run times) Average time to detect, respond to, and resolve pipeline failures or data issues Number of data security incidents or compliance breaches Outputs Expected Code Development: Develop data processing code independently, ensuring it meets performance and scalability requirements. Define coding standards, templates, and checklists. Review code for team members and peers. Documentation Create and review templates, checklists, guidelines, and standards for design, processes, and development.
Create and review deliverable documents, including design documents, architecture documents, infrastructure costing, business requirements, source-target mappings, test cases, and results. Configuration Define and govern the configuration management plan. Ensure compliance within the team. Testing Review and create unit test cases, scenarios, and execution plans. Review the test plan and test strategy developed by the testing team. Provide clarifications and support to the testing team as needed. Domain Relevance Advise data engineers on the design and development of features and components, demonstrating a deeper understanding of business needs. Learn about customer domains to identify opportunities for value addition. Complete relevant domain certifications to enhance expertise. Project Management Manage the delivery of modules effectively. Defect Management Perform root cause analysis (RCA) and mitigation of defects. Identify defect trends and take proactive measures to improve quality. Estimation Create and provide input for effort and size estimation for projects. Knowledge Management Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team. Release Management Execute and monitor the release process to ensure smooth transitions. Design Contribution Contribute to the creation of high-level design (HLD), low-level design (LLD), and system architecture for applications, business components, and data models. Customer Interface Clarify requirements and provide guidance to the development team. Present design options to customers and conduct product demonstrations. Team Management Set FAST goals and provide constructive feedback. Understand team members' aspirations and provide guidance and opportunities for growth. Ensure team engagement in projects and initiatives. Certifications Obtain relevant domain and technology certifications to stay competitive and informed. Skill Examples Proficiency in SQL, Python, or other programming languages used for data manipulation. Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g. AWS Glue, BigQuery). Conduct tests on data pipelines and evaluate results against data quality and performance specifications. Experience in performance tuning of data processes. Expertise in designing and optimizing data warehouses for cost efficiency. Ability to apply and optimize data models for efficient storage, retrieval, and processing of large datasets. Capacity to clearly explain and communicate design and development aspects to customers. Ability to estimate time and resource requirements for developing and debugging features or components. Knowledge Examples Knowledge of various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP Dataproc/Dataflow, Azure ADF, and ADLF. Proficiency in SQL for analytics, including windowing functions. Understanding of data schemas and models relevant to various business contexts. Familiarity with domain-related data and its implications. Expertise in data warehousing optimization techniques. Knowledge of data security concepts and best practices. Familiarity with design patterns and frameworks in data engineering.
Additional Comments Frontend developer Required Skills and Experience: React.js Proficiency: - In-depth knowledge of React.js, JSX, React Hooks, and React Router. - Experience with state management using Redux or similar libraries. - Familiar with React performance optimization techniques, including lazy loading, memoization, and code splitting. - Experience with tools like react-testing-library, NPM (vite, Yup, Formik). CSS Expertise: - Strong proficiency in CSS, including the use of third-party frameworks like Material-UI (MUI) and Tailwind CSS for styling. - Ability to create responsive, visually appealing layouts with modern CSS practices. JavaScript/ES6+ Expertise: - Strong command of modern JavaScript (ES6+), including async/await, destructuring, and class-based components. - Familiarity with other JavaScript frameworks and libraries such as TypeScript is a bonus. Version Control: - Proficient with Git and platforms like GitHub or GitLab, including managing pull requests and version control workflows. API Integration: - Experienced in interacting with RESTful APIs. - Understanding of authentication mechanisms like JWT. Testing: - Able to write unit, integration, and end-to-end tests using tools such as react-testing-library. Basic Qualifications: - At least 3 years of experience working with JavaScript frameworks, particularly React.js. - Experience working in cloud environments (AWS, Azure, Google Cloud) is a plus. - Basic understanding of backend technologies such as Python is advantageous. Skills: Cloud Services, Backend Systems, CSS
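The data-pipeline proficiency described in this role (PySpark on Dataproc or Databricks, wrangling and joining data from multiple sources) can be sketched briefly. This is an illustrative example only, not the employer's code; the bucket paths and column names are hypothetical.

```python
# A minimal PySpark sketch of the pipeline work this role lists: filter, join,
# aggregate, and write curated data. Paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-curation").getOrCreate()

orders = spark.read.parquet("gs://example-bucket/raw/orders/")        # hypothetical path
customers = spark.read.parquet("gs://example-bucket/raw/customers/")  # hypothetical path

curated = (
    orders
    .filter(F.col("status") == "COMPLETED")                 # basic data-quality filter
    .join(customers, on="customer_id", how="left")
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("customer_id", "order_date")
    .agg(F.sum("amount").alias("daily_spend"))
)

# Partitioned write keeps downstream queries cheap to scan.
curated.write.mode("overwrite").partitionBy("order_date").parquet(
    "gs://example-bucket/curated/daily_spend/"
)
```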

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


We are seeking a highly skilled and motivated Lead GCP Data Engineer to join our team. The role is critical to the development of a cutting-edge enterprise data products and solutions. The GCP Data Engineer will design, implement, and maintain scalable, reliable, and efficient data solutions on Google Cloud Platform (GCP). The role focuses on enabling data-driven decision-making by developing ETL/ELT pipelines, managing large-scale datasets, and optimizing data workflows. The ideal candidate is a proactive problem-solver with strong technical expertise in GCP, a passion for data engineering, and a commitment to delivering high-quality solutions aligned with business needs. Job Description: Key Responsibilities : Data Engineering & Development : Design, build, and maintain scalable ETL/ELT pipelines for ingesting, processing, and transforming structured and unstructured data. Implement enterprise-level data solutions using GCP services such as BigQuery, Cloud Storage, Dataflow, Cloud Functions, Cloud Pub/Sub, and Cloud Composer. Develop and optimize data architectures that support real-time and batch data processing. Cloud Infrastructure Management : Manage and deploy GCP infrastructure components to enable seamless data workflows. Ensure data solutions are robust, scalable, and cost-effective, leveraging GCP best practices. Collaboration and Stakeholder Engagement : Work closely with cross-functional teams, including data analysts, data scientists, DevOps, and business stakeholders, to deliver data projects aligned with business goals. Translate business requirements into scalable, technical solutions while collaborating with team members to ensure successful implementation. Quality Assurance & Optimization : Implement best practices for data governance, security, and privacy, ensuring compliance with organizational policies and regulations. Conduct thorough quality assurance, including testing and validation, to ensure the accuracy and reliability of data pipelines. Monitor and optimize pipeline performance to meet SLAs and minimize operational costs. Qualifications and Certifications : Education: Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or a related field. Experience: Minimum of 5 years of experience in data engineering, with at least 3 years working on GCP cloud platforms. Proven experience designing and implementing data workflows using GCP services like BigQuery, Cloud Dataflow, Dataform, Cloud Pub/Sub, and Cloud Composer. Certifications: Google Cloud Professional Data Engineer certification preferred. Key Skills : Mandatory Skills: Advanced proficiency in Python for data pipelines and automation. Strong SQL skills for querying, transforming, and analyzing large datasets. Expertise in GCP services such as BigQuery, Cloud Functions, DBT, Cloud Storage, Dataflow, and Kubernetes (GKE). Hands-on experience with CI/CD tools such as Jenkins, Git, or Bitbucket. Familiarity with workflow orchestration tools like Apache Airflow or Cloud Composer. Nice-to-Have Skills: Experience with other cloud platforms like AWS or Azure. Knowledge of data visualization tools (e.g., Looker, Tableau). Understanding of machine learning workflows and their integration with data pipelines. Soft Skills : Strong problem-solving and critical-thinking abilities. Excellent communication skills to collaborate with technical and non-technical stakeholders. Proactive attitude towards innovation and learning. Ability to work independently and as part of a collaborative team. 
Location: Mumbai | Brand: Merkle | Time Type: Full time | Contract Type: Permanent
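To make the orchestration responsibilities above concrete, here is a minimal, hypothetical Cloud Composer (Apache Airflow) DAG that rebuilds a curated BigQuery table daily; the DAG id, SQL, and table names are placeholders, not part of the posting.

```python
# Hedged sketch of a small Cloud Composer / Airflow DAG. The BigQuery operator
# comes from the Google provider package; all names and SQL are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_sales_elt",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 3 * * *",   # run once a day at 03:00
    catchup=False,
) as dag:
    build_daily_summary = BigQueryInsertJobOperator(
        task_id="build_daily_summary",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE `example-project.curated.daily_sales` AS
                    SELECT sale_date, SUM(amount) AS total_amount
                    FROM `example-project.raw.sales`
                    GROUP BY sale_date
                """,
                "useLegacySql": False,
            }
        },
    )
```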

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Apply with updated CV to hr@bitstringit.com Main Task: Maintain & develop data platforms based on Microsoft Fabric for Business Intelligence & Databricks for real-time data analytics Design, implement and maintain standardized production-grade data pipelines using modern data transformation processes and workflows for SAP, MS Dynamics, on-premise or cloud. Develop an enterprise-scale cloud-based Data Lake for business intelligence solutions. Translate business and customer needs into data collection, preparation and processing requirements. Optimize the performance of algorithms developed by Data Scientists General administration and monitoring of the data platforms Competencies: working with structured & unstructured data experienced in various database technologies (RDBMS, OLAP, Timeseries, etc.) solid programming skills (Python, SQL; Scala is a plus) experience in Microsoft Fabric (incl. Warehouse, Lakehouse, Data Factory, DataFlow Gen2, Semantic Model) and/or Databricks (Spark) proficient in PowerBI experienced working with APIs proficient in security best practices data-centric Azure know-how is a plus (Storage, Networking, Security, Billing) Education / experience / language: • Bachelor's or Master's degree in business informatics, computer science, or equivalent • A background in software engineering (e.g., agile programming, project organization) and experience with human-centered design would be desirable • Extensive experience in handling large data sets • At least 5 years of experience working as a data engineer, preferably in an industrial company • Analytical problem-solving skills and the ability to assimilate complex information • Programming experience in modern data-oriented languages (SQL, Python) • Experience with Apache Spark and DevOps
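Purely as an illustration of the standardized, production-grade pipelines this posting describes (not the team's actual code), the sketch below shows a small PySpark transform promoting raw SAP-style extracts from a bronze to a silver Delta table; the lakehouse paths and columns are hypothetical.

```python
# Illustrative bronze-to-silver promotion on a Databricks- or Fabric-style
# lakehouse. Paths, table layout, and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sap-orders-bronze-to-silver").getOrCreate()

bronze = spark.read.format("delta").load("/lakehouse/bronze/sap_orders")  # hypothetical path

silver = (
    bronze
    .dropDuplicates(["order_id"])                 # enforce one row per order
    .filter(F.col("amount").isNotNull())          # basic data-quality gate
    .withColumn("ingested_at", F.current_timestamp())
)

silver.write.format("delta").mode("append").save("/lakehouse/silver/sap_orders")
```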

Posted 3 weeks ago

Apply

4.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Data Engineer (SaaS-Based) Location: Noida (In-office/Hybrid; Client site if required) Experience: 4-7 years Type: Full-Time | Immediate Joiners Preferred Shift: 3 PM to 12 AM IST Client: Leading Tech Company Good to have: GCP Certified Data Engineer Overview Of The Role As a GCP Data Engineer, you'll focus on solving problems and creating value for the business by building solutions that are reliable and scalable to work with the size and scope of the company. You will be tasked with creating custom-built pipelines as well as migrating on-prem data pipelines to the GCP stack. You will be part of a team tackling intricate problems by designing and deploying reliable and scalable solutions tailored to the company's data landscape. Required Skills 5+ years of industry experience in software development, data engineering, business intelligence, or a related field, with experience in manipulating, processing, and extracting value from datasets. Extensive experience in requirement discovery, analysis, and data pipeline solution design. Design, build, and deploy internal applications to support our technology life cycle, collaboration and spaces, service delivery management, and data and business intelligence, among others. Build modular code for reusable pipelines or complex ingestion frameworks that simplify loading data into a data lake or data warehouse from multiple sources. Work closely with analysts and business process owners to translate business requirements into technical solutions. Coding experience in scripting and languages (Python, SQL, PySpark). Expertise in Google Cloud Platform (GCP) technologies in the data warehousing space (BigQuery, GCP Workflows, Cloud Scheduler, Secret Manager, Batch, Cloud Logging, Cloud SDK, Google Cloud Storage, IAM). Exposure to Google Dataproc and Dataflow. Maintain the highest levels of development practices, including technical design, solution development, systems configuration, test documentation/execution, issue identification and resolution, and writing clean, modular, and self-sustaining code with repeatable quality and predictability. Understanding of CI/CD processes using Pulumi, GitHub, Cloud Build, Cloud SDK, and Docker. Experience with SAS/SQL Server/SSIS is an added advantage. Qualifications Bachelor's degree in Computer Science or a related technical field, or equivalent practical experience. GCP Certified Data Engineer (preferred) Excellent verbal and written communication skills with the ability to effectively advocate technical solutions to other engineering teams and business audiences.
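The "modular ingestion framework" idea mentioned above can be sketched as a small config-driven loader from Cloud Storage into BigQuery. This is a hypothetical illustration, not the client's framework; the bucket URIs and table IDs are placeholders.

```python
# Config-driven ingestion sketch: load landed files from GCS into raw BigQuery
# tables, one load job per source. All URIs and table IDs are hypothetical.
from google.cloud import bigquery

SOURCES = [
    {"uri": "gs://example-bucket/landing/customers/*.csv",
     "table": "example-project.raw.customers"},
    {"uri": "gs://example-bucket/landing/orders/*.csv",
     "table": "example-project.raw.orders"},
]


def load_all(client: bigquery.Client) -> None:
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )
    for source in SOURCES:
        # One load job per source keeps failures isolated and easy to retry.
        job = client.load_table_from_uri(source["uri"], source["table"],
                                         job_config=job_config)
        job.result()  # wait so an orchestrator (e.g. GCP Workflows) sees the status


if __name__ == "__main__":
    load_all(bigquery.Client(project="example-project"))
```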

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you. Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology. What You’ll Do Design, develop, and operate high scale applications across the full engineering stack Design, develop, test, deploy, maintain, and improve software. Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.) Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset. Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality. Participate in a tight-knit, globally distributed engineering team. Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network, or service operations and quality. Manage sole project priorities, deadlines, and deliverables. Research, create, and develop software applications to extend and improve on Equifax Solutions Collaborate on scalability issues involving access to data and information. Actively participate in Sprint planning, Sprint Retrospectives, and other team activity What Experience You Need Bachelor's degree or equivalent experience 5+ years of software engineering experience 5+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS 5+ years experience with Cloud technology: GCP, AWS, or Azure 5+ years experience designing and developing cloud-native solutions 5+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes 5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs What could set you apart Self-starter that identifies/responds to priority shifts with minimal supervision. Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others UI development (e.g. HTML, JavaScript, Angular and Bootstrap) Experience with backend technologies such as JAVA/J2EE, SpringBoot, SOA and Microservices Source code control management systems (e.g. SVN/Git, Github) and build tools like Maven & Gradle. Agile environments (e.g. Scrum, XP) Relational databases (e.g. SQL Server, MySQL) Atlassian tooling (e.g. JIRA, Confluence, and Github) Developing with modern JDK (v1.7+) Automated Testing: JUnit, Selenium, LoadRunner, SoapUI We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference! Who is Equifax? At Equifax, we believe knowledge drives progress. 
As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran. Show more Show less
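The posting above centres on Java and Spring Boot, but the event-driven pattern it references (publishing to Pub/Sub for downstream Dataflow/Beam pipelines) is shown below in Python for consistency with the other sketches on this page; the project, topic, and payload are hypothetical, not Equifax's code.

```python
# Minimal Pub/Sub publishing sketch; project, topic, and payload are hypothetical.
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("example-project", "order-events")  # hypothetical


def publish_order_event(order: dict) -> None:
    # Attributes let subscribers (or a Beam pipeline) filter without decoding the body.
    future = publisher.publish(
        topic_path,
        data=json.dumps(order).encode("utf-8"),
        event_type="order_created",
    )
    future.result()  # block until the broker acknowledges the message


if __name__ == "__main__":
    publish_order_event({"order_id": "123", "amount": 42.50})
```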

Posted 3 weeks ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Job Description Technical Skills: Proficient in Java, angular or any javascript technology with experience in designing and deploying cloud-based data pipelines and microservices using GCP tools like BigQuery, Dataflow, and Dataproc. Ability to leverage best in-class data platform technologies (Apache Beam, Kafka, …) to deliver platform features, and design & orchestrate platform services to deliver data platform capabilities. Service-Oriented Architecture and Microservices: Strong understanding of SOA, microservices, and their application within a cloud data platform context. Develop robust, scalable services using Java Spring Boot, Python, Angular, and GCP technologies. Full-Stack Development: Knowledge of front-end and back-end technologies, enabling collaboration on data access and visualization layers (e.g., React, Node.js). Design and develop RESTful APIs for seamless integration across platform services. Implement robust unit and functional tests to maintain high standards of test coverage and quality. Database Management: Experience with relational (e.g., PostgreSQL, MySQL) and NoSQL databases, as well as columnar databases like BigQuery. Data Governance and Security: Understanding of data governance frameworks and implementing RBAC, encryption, and data masking in cloud environments. CI/CD and Automation: Familiarity with CI/CD pipelines, Infrastructure as Code (IaC) tools like Terraform, and automation frameworks. Manage code changes with GitHub and troubleshoot and resolve application defects efficiently. Ensure adherence to SDLC best practices, independently managing feature design, coding, testing, and production releases. Problem-Solving: Strong analytical skills with the ability to troubleshoot complex data platform and microservices issues. Certifications (Preferred): GCP Data Engineer, GCP Professional Cloud Responsibilities Design and Build Data Pipelines: Architect, develop, and maintain scalable data pipelines and microservices that support real-time and batch processing on GCP. Service-Oriented Architecture (SOA) and Microservices: Design and implement SOA and microservices-based architectures to ensure modular, flexible, and maintainable data solutions. Full-Stack Integration: Leverage your full-stack expertise to contribute to the seamless integration of front-end and back-end components, ensuring robust data access and UI-driven data exploration. Data Ingestion and Integration: Lead the ingestion and integration of data from various sources into the data platform, ensuring data is standardized and optimized for analytics. GCP Data Solutions: Utilize GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Functions, etc.) to build and manage data platforms that meet business needs. Data Governance and Security: Implement and manage data governance, access controls, and security best practices while leveraging GCP’s native row- and column-level security features. Performance Optimization: Continuously monitor and improve the performance, scalability, and efficiency of data pipelines and storage solutions. Collaboration and Best Practices: Work closely with data architects, software engineers, and crossfunctional teams to define best practices, design patterns, and frameworks for cloud data engineering. Automation and Reliability: Automate data platform processes to enhance reliability, reduce manual intervention, and improve operational efficiency. Qualifications Bachelor’s degree in Computer Science, Data Engineering, Information Systems, or a related field. 
Master’s degree or equivalent experience preferred.
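As a hedged illustration of the "RESTful APIs over platform data" responsibility described above (the role itself leans on Java Spring Boot and Angular), here is a minimal Python endpoint that serves curated BigQuery data; the project, table, and route are hypothetical.

```python
# Sketch of a small REST endpoint over curated platform data.
# Project, dataset, table, and route names are hypothetical placeholders.
from fastapi import FastAPI
from google.cloud import bigquery

app = FastAPI()
client = bigquery.Client(project="example-project")  # hypothetical project


@app.get("/customers/{customer_id}/daily-spend")
def daily_spend(customer_id: str) -> list[dict]:
    sql = """
        SELECT order_date, daily_spend
        FROM `example-project.curated.daily_spend`
        WHERE customer_id = @customer_id
        ORDER BY order_date DESC
        LIMIT 30
    """
    job_config = bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("customer_id", "STRING", customer_id)
        ]
    )
    rows = client.query(sql, job_config=job_config).result()
    return [
        {"order_date": str(row["order_date"]), "daily_spend": float(row["daily_spend"])}
        for row in rows
    ]
```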

Posted 3 weeks ago

Apply

0 years

0 Lacs

Kolkata, West Bengal, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. L&A Business Consultant Working as part of the Consulting team, you will take part in engagements related to a wide range of topics. Some examples of domains in which you will support our clients include the following: Proficient in Individual and Group Life Insurance concepts, different type of Annuity products etc. Proficient in different insurance plans - Qualified/Non-Qualified Plans, IRA, Roth IRA, CRA, SEP Solid knowledge on the Policy Life cycle Illustrations/Quote/Rating New Business & Underwriting Policy Servicing and Administration Billing & Payment Claims Processing Disbursement (Systematic withdrawals, RMD, Surrenders) Regulatory Changes & Taxation Understanding of business rules of Pay-out Understanding on upstream and downstream interfaces for policy lifecycle Experience in DXC Platforms – Vantage, wmA, nbA, CSA, Cyber-life, Life70, Life Asia, PerformancePlus Consulting Skills – Experience in creating business process map for future state architecture, creating WBS for overall conversion strategy, requirement refinement process in multi-vendor engagement. Requirements Gathering, Elicitation –writing BRDs, FSDs. Conducting JAD sessions and Workshops to capture requirements and working close with Product Owner. Work with the client to define the most optimal future state operational process and related product configuration. Define scope by providing innovative solutions and challenging all new client requirements and change requests but simultaneously ensuring that client gets the required business value. Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams. Work closely with product design development team to analyse and extract functional enhancements. Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle. Technology Skills - Experienced in data migration projects, ensuring seamless transfer of data between systems while maintaining data integrity and security. Skilled in data analytics, utilizing various tools and techniques to extract insights and drive informed decision-making. Strong understanding of data governance principles and best practices, ensuring data quality and compliance. Collaborative team player, able to work closely with stakeholders and technical teams to define requirements and implement effective solutions. Industry certifications (AAPA/LOMA) will be added advantage. Experience on these COTS product is preferrable. FAST ALIP OIPA wmA We expect you to work effectively as a team member and build good relationships with the client. You will have the opportunity to expand your domain knowledge and skills and will be able to collaborate frequently with other EY professionals with a wide variety of expertise. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. 
Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 weeks ago

Apply

0 years

0 Lacs

Kolkata, West Bengal, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. JD for L&A Business Consultant Working as part of the Consulting team, you will take part in engagements related to a wide range of topics. Some examples of domains in which you will support our clients include the following: Proficient in Individual and Group Life Insurance concepts, different type of Annuity products etc. Proficient in different insurance plans - Qualified/Non-Qualified Plans, IRA, Roth IRA, CRA, SEP Solid knowledge on the Policy Life cycle Illustrations/Quote/Rating New Business & Underwriting Policy Servicing and Administration Billing & Payment Claims Processing Disbursement (Systematic withdrawals, RMD, Surrenders) Regulatory Changes & Taxation Understanding of business rules of Pay-out Demonstrated ability of Insurance Company Operations like Nonforfeiture option/ Face amount increase, decrease/ CVAT or GPT calculations /Dollar cost averaging and perform their respective transactions. Understanding on upstream and downstream interfaces for policy lifecycle Consulting Skills – Experience in creating business process map for future state architecture, creating WBS for overall conversion strategy, requirement refinement process in multi-vendor engagement. Worked on multiple Business transformation and modernization programs. Conducted multiple Due-Diligence and Assessment projects as part of Transformation roadmaps to evaluate current state maturity, gaps in functionalities and COTs solution features. Requirements Gathering, Elicitation –writing BRDs, FSDs. Conducting JAD sessions and Workshops to capture requirements and working close with Product Owner. Work with the client to define the most optimal future state operational process and related product configuration. Define scope by providing innovative solutions and challenging all new client requirements and change requests but simultaneously ensuring that client gets the required business value. Elaborate and deliver clearly defined requirement documents with relevant dataflow and process flow diagrams. Work closely with product design development team to analyse and extract functional enhancements. Provide product consultancy and assist the client with acceptance criteria gathering and support throughout the project life cycle. Technology Skills - Proficient in technology solution architecture, with a focus on designing innovative and effective solutions. Experienced in data migration projects, ensuring seamless transfer of data between systems while maintaining data integrity and security. Skilled in data analytics, utilizing various tools and techniques to extract insights and drive informed decision-making. Strong understanding of data governance principles and best practices, ensuring data quality and compliance. Collaborative team player, able to work closely with stakeholders and technical teams to define requirements and implement effective solutions. Industry certifications (AAPA/LOMA) will be added advantage. Experience on these COTS product is preferrable. FAST ALIP OIPA wmA We expect you to work effectively as a team member and build good relationships with the client. 
You will have the opportunity to expand your domain knowledge and skills and will be able to collaborate frequently with other EY professionals with a wide variety of expertise. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 weeks ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Chennai, Tamil Nadu

Work from Office


Duration: 12 Months Work Type: Onsite Position Description: We are seeking an experienced GCP Data Engineer who can build a cloud analytics platform to meet ever-expanding business requirements with speed and quality using lean Agile practices. You will work on analyzing and manipulating large datasets supporting the enterprise by activating data assets to support Enabling Platforms and Analytics in the Google Cloud Platform (GCP). You will be responsible for designing the transformation and modernization on GCP, as well as landing data from source applications to GCP. Experience with large-scale solutioning and operationalization of data warehouses, data lakes, and analytics platforms on Google Cloud Platform or another cloud environment is a must. We are looking for candidates who have a broad set of technology skills across these areas and who can demonstrate an ability to design the right solutions with the appropriate combination of GCP and 3rd-party technologies for deploying on Google Cloud Platform. Skills Required: Experience in working in an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment. Implement methods for automation of all parts of the pipeline to minimize labor in development and production. Experience in analyzing complex data, organizing raw data, and integrating massive datasets from multiple data sources to build subject areas and reusable data products. Experience in working with architects to evaluate and productionalize appropriate GCP tools for data ingestion, integration, presentation, and reporting. Experience in working with all stakeholders to formulate business problems as technical data requirements, and identify and implement technical solutions while ensuring key business drivers are captured in collaboration with product management. Proficient in Machine Learning model architecture, data pipeline interaction, and metrics interpretation. This includes designing and deploying a pipeline with automated data lineage. Identify, develop, evaluate, and summarize Proof of Concepts to prove out solutions. Test and compare competing solutions and report out a point of view on the best solution. Integration between GCP Data Catalog and Informatica EDC. Design and build production data engineering solutions to deliver pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine. Skills Preferred: Strong drive for results and ability to multi-task and work independently Self-starter with proven innovation skills Ability to communicate and work with cross-functional teams and all levels of management Demonstrated commitment to quality and project timing Demonstrated ability to document complex systems Experience in creating and executing detailed test plans Experience Required: 3 to 5 Yrs Education Required: BE or Equivalent

Posted 3 weeks ago

Apply

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Job Description As a Technical Lead, you will be working on both offshore and onsite client projects. You will be working on projects involving Oracle BI Applications/FAW or OBIEE/OAC/ODI implementation. You will be interacting with clients to understand and gather requirements. You will be responsible for technical design, development, and system/integration testing using Oracle methodologies. Desired Profile End-to-end ODI, OAC, and Oracle BI Applications/FAW implementation experience Expert knowledge of BI Applications/FAW, including basic and advanced configurations with Oracle eBS suite/Fusion as the source system Expert knowledge of OBIEE/OAC RPD design and reports design Expert knowledge of ETL (ODI) design/OCI DI/OCI Dataflow Mandatory to have one of these skills: PL/SQL, BI Publisher, or BI Apps Good to have EDQ and PySpark skills Architectural solution definition Any industry-standard certifications will be a plus Good knowledge of Oracle database and development Experience in database applications Creativity, personal drive, influencing and negotiating, problem solving Building effective relationships, customer focus, effective communication, coaching Ready to travel as and when required by the project Experience 8-12 yrs of data warehousing and business intelligence project experience Having 4-6 years of project experience on BI Applications/FAW and OBIEE/OAC/ODI/OCI DI with at least 2 complete lifecycle implementations 4-6 yrs of specialized BI Applications and OBIEE/OAC/ODI/OCI DI customization and solution architecture experience Worked on Financial, SCM, or HR Analytics recently in implementation and configuration Career Level - IC3 About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges.
We’ve partnered with industry-leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law. Show more Show less

Posted 3 weeks ago

Apply

4.0 years

0 Lacs

Greater Kolkata Area

On-site


Job Description As a Technical Lead, you will be working on both offshore and onsite client projects. You will be working on projects involving Oracle BI Applications/FAW or OBIEE/OAC/ODI implementation. You will be interacting with clients to understand and gather requirements. You will be responsible for technical design, development, and system/integration testing using Oracle methodologies. Desired Profile End-to-end ODI, OAC, and Oracle BI Applications/FAW implementation experience Expert knowledge of BI Applications/FAW, including basic and advanced configurations with Oracle eBS suite/Fusion as the source system Expert knowledge of OBIEE/OAC RPD design and reports design Expert knowledge of ETL (ODI) design/OCI DI/OCI Dataflow Mandatory to have one of these skills: PL/SQL, BI Publisher, or BI Apps Good to have EDQ and PySpark skills Architectural solution definition Any industry-standard certifications will be a plus Good knowledge of Oracle database and development Experience in database applications Creativity, personal drive, influencing and negotiating, problem solving Building effective relationships, customer focus, effective communication, coaching Ready to travel as and when required by the project Experience 8-12 yrs of data warehousing and business intelligence project experience Having 4-6 years of project experience on BI Applications/FAW and OBIEE/OAC/ODI/OCI DI with at least 2 complete lifecycle implementations 4-6 yrs of specialized BI Applications and OBIEE/OAC/ODI/OCI DI customization and solution architecture experience Worked on Financial, SCM, or HR Analytics recently in implementation and configuration Career Level - IC3 About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges.
We’ve partnered with industry-leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law. Show more Show less

Posted 3 weeks ago

Apply

4.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Linkedin logo

Job Description
As a Technical Lead, you will work on both offshore and onsite client projects. The projects will involve Oracle BI Applications/FAW or OBIEE/OAC/ODI implementation. You will interact with clients to understand and gather requirements, and you will be responsible for technical design, development, and system/integration testing using Oracle methodologies.

Desired Profile
End-to-end ODI, OAC, and Oracle BI Applications/FAW implementation experience. Expert knowledge of BI Applications/FAW, including basic and advanced configurations with Oracle eBS Suite/Fusion as the source system. Expert knowledge of OBIEE/OAC RPD design and report design. Expert knowledge of ETL (ODI) design, OCI DI, and OCI Dataflow. Mandatory to have at least one of these skills: PL/SQL, BI Publisher, or BI Apps. Good to have EDQ and PySpark skills. Architectural solution definition. Any industry-standard certifications will be a plus. Good knowledge of Oracle database and development, with experience in database applications. Creativity, personal drive, influencing and negotiating, problem solving, building effective relationships, customer focus, effective communication, and coaching. Ready to travel as and when required by the project.

Experience
8-12 years of data warehousing and business intelligence project experience. 4-6 years of project experience on BI Applications/FAW and OBIEE/OAC/ODI/OCI DI, with at least 2 complete lifecycle implementations. 4-6 years of specialized BI Applications and OBIEE/OAC/ODI/OCI DI customization and solution architecture experience. Worked on Financial, SCM, or HR Analytics recently in implementation and configuration.

Career Level - IC3

Responsibilities
As a Technical Lead, you will work on both offshore and onsite Oracle BI Applications/FAW or OBIEE/OAC/ODI projects, interacting with clients to gather requirements and owning technical design, development, and system/integration testing using Oracle methodologies, as detailed in the profile above.

About Us
As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges.
We’ve partnered with industry-leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law. Show more Show less

Posted 3 weeks ago

Apply

4.0 - 7.0 years

8 - 14 Lacs

Noida

Hybrid

Naukri logo

Data Engineer (L3) || GCP Certified
Employment Type: Full-Time | Work Mode: In-office/Hybrid | Notice: Immediate joiners

As a Data Engineer, you will design, develop, and support data pipelines and related data products and platforms. Your primary responsibilities include designing and building data extraction, loading, and transformation pipelines across on-prem and cloud platforms. You will perform application impact assessments and requirements reviews, and develop work estimates. Additionally, you will develop test strategies and site reliability engineering measures for data products and solutions, participate in agile development scrums and solution reviews, mentor junior Data Engineering Specialists, lead the resolution of critical operations issues, and perform technical data stewardship tasks, including metadata management, security, and privacy by design.

Required Skills: Design, develop, and support data pipelines and related data products and platforms. Design and build data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms. Perform application impact assessments, requirements reviews, and work estimates. Develop test strategies and site reliability engineering measures for data products and solutions. Participate in agile development scrums and solution reviews. Mentor junior Data Engineers. Lead the resolution of critical operations issues, including post-implementation reviews. Perform technical data stewardship tasks, including metadata management, security, and privacy by design. Design and build data extraction, loading, and transformation pipelines using Python and other GCP data technologies. Demonstrate SQL and database proficiency in various data engineering tasks. Automate data workflows by setting up DAGs in tools like Control-M, Apache Airflow, and Prefect. Develop Unix scripts to support various data operations. Model data to support business intelligence and analytics initiatives. Use infrastructure-as-code tools such as Terraform, Puppet, and Ansible for deployment automation. Expertise in GCP data warehousing technologies, including BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion, and Dataproc (good to have).

Key skills: data pipelines, agile development, scrums, GCP data technologies, Python, DAGs, Control-M, Apache Airflow, data solution architecture.

Qualifications: Bachelor's degree in Software Engineering, Computer Science, Business, Mathematics, or a related field. 4+ years of data engineering experience. 2 years of data solution architecture and design experience. GCP Certified Data Engineer (preferred).

Job Type: Full-time
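To give a concrete flavour of the DAG-based workflow automation this role asks for, here is a minimal Apache Airflow 2.x sketch of an extract-transform-load flow. The DAG id, schedule, and task logic are illustrative assumptions rather than anything from the posting; a production pipeline would typically load into BigQuery or Cloud SQL instead of printing.

```python
# Minimal sketch of an extract -> transform -> load DAG (Apache Airflow 2.x).
# All names and the task logic are placeholders, not from the posting.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Pull raw records from a source system (placeholder data).
    return [{"id": 1, "amount": 120.5}, {"id": 2, "amount": 87.0}]


def transform(ti, **context):
    # Standardize the records pulled by the extract task (via XCom).
    rows = ti.xcom_pull(task_ids="extract")
    return [{**r, "amount_inr": round(r["amount"] * 83, 2)} for r in rows]


def load(ti, **context):
    # A real pipeline would write to BigQuery / Cloud SQL here.
    print(ti.xcom_pull(task_ids="transform"))


with DAG(
    dag_id="daily_sales_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_transform >> t_load
```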

Posted 3 weeks ago

Apply

15.0 years

0 Lacs

India

Remote

Linkedin logo

Job Title: Lead Data Engineering Manager – GCP Cloud Migration Location: [Remote] Experience: 12–15 Years (5+ Years Leading Data Teams, 8+ Years in Data Engineering) Employment Type: Full-Time About the Role: We are seeking an experienced Lead Data Engineering Manager to drive the end-to-end migration of enterprise data platforms, ETL pipelines, and data warehouses to the cloud — with a focus on Google Cloud Platform (GCP) . This role will lead high-performing engineering teams and collaborate with cross-functional stakeholders to architect and execute scalable, secure, and modern data solutions using BigQuery, Dataform, Dataplex , and other cloud-native tools. A background in premium consulting or strategic technology advisory is highly preferred, as this role will engage with executive stakeholders and contribute to data transformation strategies at the enterprise level. Key Responsibilities: Lead and mentor Data Engineering teams across design, development, and deployment of modern cloud data architectures. Drive cloud migration initiatives including re-platforming legacy ETL workflows and on-prem DWHs to GCP-based solutions . Architect and implement scalable data pipelines using BigQuery, Dataform , and orchestration tools. Ensure robust data governance and cataloging practices leveraging Dataplex and other GCP services. Collaborate with data analysts, data scientists, and business stakeholders to enable advanced analytics and ML capabilities. Establish and enforce engineering best practices, CI/CD pipelines, and monitoring strategies. Provide technical leadership, project planning, and resource management to deliver projects on time and within scope. Represent the data engineering function in client or leadership meetings, especially in a consulting or multi-client context. Required Skills & Qualifications: 12–15 years of total experience, with 8+ years in data engineering and 5+ years in team leadership roles. Proven expertise in cloud-based data platforms, especially GCP (BigQuery, Dataflow, Dataform, Dataplex, Cloud Composer, Pub/Sub) . Strong knowledge of modern ETL/ELT practices, data modeling, and pipeline orchestration. Experience with data warehouse modernization and migration from platforms like Teradata, Oracle, or Hadoop to GCP. Familiarity with data governance , metadata management , and data cataloging . Background in consulting or strategic data advisory with Fortune 500 clients preferred. Hands-on skills in SQL, Python, and cloud infrastructure-as-code (e.g., Terraform). Strong communication, stakeholder engagement, and leadership presence. Preferred Qualifications: GCP Data Engineer or Architect Certification. Experience with agile methodologies and DevOps practices. Prior work with multi-cloud or hybrid environments is a plus. Experience in regulated industries (finance, healthcare, etc.) is advantageous. Show more Show less
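As a small illustration of the BigQuery-centred migration work described above, the sketch below loads a legacy warehouse extract staged as Parquet on Cloud Storage into BigQuery with the official Python client. The project, bucket, dataset, and table names are hypothetical, and a real migration would wrap this in orchestration and validation steps.

```python
# Minimal sketch: land a legacy DWH extract (Parquet on GCS) into BigQuery.
# Project, bucket, and table identifiers are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(
    "gs://legacy-dwh-extracts/sales/2024-01-01/*.parquet",
    "my-analytics-project.curated.sales",
    job_config=job_config,
)
load_job.result()  # Wait for completion and surface any load errors.

print(client.get_table("my-analytics-project.curated.sales").num_rows, "rows loaded")
```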

Posted 3 weeks ago

Apply

4.0 - 7.0 years

10 - 14 Lacs

Noida

Work from Office

Naukri logo

Location: Noida (In-office/Hybrid; client site if required) Type: Full-Time | Immediate Joiners Preferred Must-Have Skills: GCP (BigQuery, Dataflow, Dataproc, Cloud Storage) PySpark / Spark Distributed computing expertise Apache Iceberg (preferred), Hudi, or Delta Lake Role Overview: Be part of a high-impact Data Engineering team focused on building scalable, cloud-native data pipelines. You'll support and enhance EMR platforms using DevOps principles, helping deliver real-time health alerts and diagnostics for platform performance. Key Responsibilities: Provide data engineering support to EMR platforms Design and implement cloud-native, automated data solutions Collaborate with internal teams to deliver scalable systems Continuously improve infrastructure reliability and observability Technical Environment: Databases: Oracle, MySQL, MSSQL, MongoDB Distributed Engines: Spark/PySpark, Presto, Flink/Beam Cloud Infra: GCP (preferred), AWS (nice-to-have), Terraform Big Data Formats: Iceberg, Hudi, Delta Tools: SQL, Data Modeling, Palantir Foundry, Jenkins, Confluence Bonus: Stats/math tools (NumPy, PyMC3), Linux scripting Ideal for engineers with cloud-native, real-time data platform experience especially those who have worked with EMR and modern lakehouse stacks.
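For context on the PySpark side of this stack, here is a minimal batch-curation sketch: read raw events, aggregate daily health metrics, and write partitioned output. The bucket paths, column names, and aggregation are illustrative assumptions; targeting an Iceberg or Hudi table instead of plain Parquet would additionally require catalog configuration specific to the environment.

```python
# Minimal PySpark sketch of batch curation; paths and columns are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("platform-health-metrics").getOrCreate()

events = spark.read.parquet("gs://raw-zone/platform_events/")

daily_health = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "service")
    .agg(
        F.count("*").alias("event_count"),
        F.avg("latency_ms").alias("avg_latency_ms"),
    )
)

daily_health.write.mode("overwrite").partitionBy("event_date").parquet(
    "gs://curated-zone/daily_service_health/"
)
```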

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Job Description Participates in review and gathering of business requirements for various HR system updates and/or functionality; Develop specification documentation. Creates test plans and/or related documentation that would be used to manage the testing process Performs system troubleshooting. Lead review, testing and implementation of system enhancements. Collaborates with IT staff to coordinate updates and/or fixes and documents the process and results. Provides production support, including researching and resolving HR technology issues, unexpected results or process flaws. Conducts regular data audits and implement validation measures to ensure data integrity. Make recommendations to help re-engineer and improve effectiveness of various HR processes. Familiarity working with large data sets. Proficient in Excel Formulas and macros. Understanding of SQL and OTBI, BI Publisher in order to analyze data and set requirements for IT Ability to draft business requirement documents (BRD) based on Verisk standard format. Drafts and executes test scripts Ability to understand technical design requirements and communicate those requirements to both IT and business partners Qualifications 5+ years experience with Oracle HCM platforms. 3+ years experience with Oracle Fusion Experience with customizing pages with Oracle Visual Builder Studio (VBS) Oracle Redwood experience a plus Oracle Work Structure Configuration experience Coordinating with oracle and creating SR’s a plus Understanding and ability to create Oracle HDL load templates for business objects and employee data loads Experience with technologies supporting any combination of the following HR disciplines: HRIS (HR Job Management, HR Org Management, HR Master Data, Compensation, Benefits, etc.), Talent Acquisitions, Talent Development (Learning and Performance Management), Oracle Cloud Recruiting. Experience with and proficiency around Oracle data structures such as Approval Workflows, Setup and Maintenance of Data elements, locations, departments, department trees, jobs, legal employers, etc. Proficiency in Microsoft Productivity Suite Can project manage and follow through on deliverables independently with stakeholders Solid understanding of HR Functional areas and how downstream dataflow from HRIS supports each function Stakeholder Management, Managing Expectations, Prioritization Skills, Conflict Management, Decision Making Quality Strong experience in industry and/or similar positions supporting back end HR platforms About Us For over 50 years, Verisk has been the leading data analytics and technology partner to the global insurance industry by delivering value to our clients through expertise and scale. We empower communities and businesses to make better decisions on risk, faster. At Verisk, you'll have the chance to use your voice and build a rewarding career that's as unique as you are, with work flexibility and the support, coaching, and training you need to succeed. For the eighth consecutive year, Verisk is proudly recognized as a Great Place to Work® for outstanding workplace culture in the US, fourth consecutive year in the UK, Spain, and India, and second consecutive year in Poland. We value learning, caring and results and make inclusivity and diversity a top priority. 
In addition to our Great Place to Work® Certification, we’ve been recognized by The Wall Street Journal as one of the Best-Managed Companies and by Forbes as a World’s Best Employer and Best Employer for Women, testaments to the value we place on workplace culture. We’re 7,000 people strong. We relentlessly and ethically pursue innovation. And we are looking for people like you to help us translate big data into big ideas. Join us and create an exceptional experience for yourself and a better tomorrow for future generations. Verisk Businesses Underwriting Solutions — provides underwriting and rating solutions for auto and property, general liability, and excess and surplus to assess and price risk with speed and precision Claims Solutions — supports end-to-end claims handling with analytic and automation tools that streamline workflow, improve claims management, and support better customer experiences Property Estimating Solutions — offers property estimation software and tools for professionals in estimating all phases of building and repair to make day-to-day workflows the most efficient Extreme Event Solutions — provides risk modeling solutions to help individuals, businesses, and society become more resilient to extreme events. Specialty Business Solutions — provides an integrated suite of software for full end-to-end management of insurance and reinsurance business, helping companies manage their businesses through efficiency, flexibility, and data governance Marketing Solutions — delivers data and insights to improve the reach, timing, relevance, and compliance of every consumer engagement Life Insurance Solutions – offers end-to-end, data insight-driven core capabilities for carriers, distribution, and direct customers across the entire policy lifecycle of life and annuities for both individual and group. Verisk Maplecroft — provides intelligence on sustainability, resilience, and ESG, helping people, business, and societies become stronger Verisk Analytics is an equal opportunity employer. All members of the Verisk Analytics family of companies are equal opportunity employers. We consider all qualified applicants for employment without regard to race, religion, color, national origin, citizenship, sex, gender identity and/or expression, sexual orientation, veteran's status, age or disability. Verisk’s minimum hiring age is 18 except in countries with a higher age limit subject to applicable law. https://www.verisk.com/company/careers/ Unsolicited resumes sent to Verisk, including unsolicited resumes sent to a Verisk business mailing address, fax machine or email address, or directly to Verisk employees, will be considered Verisk property. Verisk will NOT pay a fee for any placement resulting from the receipt of an unsolicited resume. Verisk Employee Privacy Notice Show more Show less

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology. What You’ll Do Design, develop, and operate high scale applications across the full engineering stack Design, develop, test, deploy, maintain, and improve software. Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.) Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset. Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality. Participate in a tight-knit, globally distributed engineering team. Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network, or service operations and quality. Manage sole project priorities, deadlines, and deliverables. Research, create, and develop software applications to extend and improve on Equifax Solutions Collaborate on scalability issues involving access to data and information. Actively participate in Sprint planning, Sprint Retrospectives, and other team activity What Experience You Need Bachelor's degree or equivalent experience 5+ years of software engineering experience 5+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS 5+ years experience with Cloud technology: GCP, AWS, or Azure 5+ years experience designing and developing cloud-native solutions 5+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes 5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs What could set you apart Self-starter that identifies/responds to priority shifts with minimal supervision. Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others UI development (e.g. HTML, JavaScript, Angular and Bootstrap) Experience with backend technologies such as JAVA/J2EE, SpringBoot, SOA and Microservices Source code control management systems (e.g. SVN/Git, Github) and build tools like Maven & Gradle. Agile environments (e.g. Scrum, XP) Relational databases (e.g. SQL Server, MySQL) Atlassian tooling (e.g. JIRA, Confluence, and Github) Developing with modern JDK (v1.7+) Automated Testing: JUnit, Selenium, LoadRunner, SoapUI Show more Show less

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Job Description A platform software Engineer is a versatile developer with expertise in Java or Python and a strong foundation in cloud platforms to build and manage applications at scale. Generally, platform engineers fall into two categories: backend engineers, who design and implement microservices with robust APIs, and full-stack engineers, who deliver native UI/UX solutions, and ability to develop frameworks and service to enable an enterprise data platform. With a solid understanding of the SDLC and hands-on experience in Git and CI/CD, platform engineers can independently design, code, test, and release features to production efficiently. Responsibilities Design and Build Data Pipelines: Architect, develop, and maintain scalable data pipelines and microservices that support real-time and batch processing on GCP. Service-Oriented Architecture (SOA) and Microservices: Design and implement SOA and microservices-based architectures to ensure modular, flexible, and maintainable data solutions. Full-Stack Integration: Leverage your full-stack expertise to contribute to the seamless integration of front-end and back-end components, ensuring robust data access and UI-driven data exploration. Data Ingestion and Integration: Lead the ingestion and integration of data from various sources into the data platform, ensuring data is standardized and optimized for analytics. GCP Data Solutions: Utilize GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Functions, etc.) to build and manage data platforms that meet business needs. Data Governance and Security: Implement and manage data governance, access controls, and security best practices while leveraging GCP’s native row- and column-level security features. Performance Optimization: Continuously monitor and improve the performance, scalability, and efficiency of data pipelines and storage solutions. Collaboration and Best Practices: Work closely with data architects, software engineers, and cross-functional teams to define best practices, design patterns, and frameworks for cloud data engineering. Automation and Reliability: Automate data platform processes to enhance reliability, reduce manual intervention, and improve operational efficiency. Qualifications Education: Bachelor’s degree in Computer Science, Data Engineering, Information Systems, or a related field. Master’s degree or equivalent experience preferred. Experience: Minimum 5 years of experience as a Software Engineer Technical Skills: Proficient in Java, angular or any javascript technology with experience in designing and deploying cloud-based data pipelines and microservices using GCP tools like BigQuery, Dataflow, and Dataproc. Ability to leverage best in-class data platform technologies to deliver platform features, and design & orchestrate platform services to deliver data platform capabilities. Service-Oriented Architecture and Microservices: Strong understanding of SOA, microservices, and their application within a cloud data platform context. Develop robust, scalable services using Java Spring Boot, Python, Angular, and GCP technologies. Full-Stack Development: Knowledge of front-end and back-end technologies, enabling collaboration on data access and visualization layers (e.g., React, Node.js). Design and develop RESTful APIs for seamless integration across platform services. Implement robust unit and functional tests to maintain high standards of test coverage and quality. 
Database Management: Experience with relational (e.g., PostgreSQL, MySQL) and NoSQL databases, as well as columnar databases like BigQuery. Data Governance and Security: Understanding of data governance frameworks and implementing RBAC, encryption, and data masking in cloud environments. CI/CD and Automation: Familiarity with CI/CD pipelines, Infrastructure as Code (IaC) tools like Terraform, and automation frameworks. Manage code changes with GitHub and troubleshoot and resolve application defects efficiently. Ensure adherence to SDLC best practices, independently managing feature design, coding, testing, and production releases. Problem-Solving: Strong analytical skills with the ability to troubleshoot complex data platform and microservices issues. Certifications (Preferred): GCP Data Engineer, GCP Professional Cloud Show more Show less
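Since the role leans on Pub/Sub for event-driven data pipelines, here is a minimal sketch of a platform service publishing a standardized ingestion event. The project ID, topic name, payload, and message attribute are hypothetical, not part of the posting.

```python
# Minimal sketch: publish a standardized ingestion event to Pub/Sub.
# Project, topic, payload, and attribute values are placeholders.
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-data-platform", "ingestion-events")

event = {"dataset": "orders", "records": 1250, "status": "LANDED"}
future = publisher.publish(
    topic_path,
    data=json.dumps(event).encode("utf-8"),
    source="orders-ingestion-service",  # message attribute for routing/filtering
)
print("Published message id:", future.result())
```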

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

About Eucloid At Eucloid, innovation meets impact. As a leader in AI and Data Science, we create solutions that redefine industries—from Hi-tech and D2C to Healthcare and SaaS. With partnerships with giants like Databricks, Google Cloud, and Adobe, we’re pushing boundaries and building next-gen technology. Join our talented team of engineers, scientists, and visionaries from top institutes like IITs, IIMs, and NITs. At Eucloid, growth is a promise, and your work will drive transformative results for Fortune 100 clients. What You’ll Do Design, build, and optimize AI platforms and pipelines on Google Cloud Platform (GCP) . Implement AI Ops practices and DevOps workflows for end-to-end machine learning lifecycle management (model training, deployment, monitoring). Develop automation scripts for deployment and manage cluster deployment on Kubernetes (GKE) . Build and maintain scalable CI/CD pipelines using tools like Jenkins, Cloud Build, or GitLab CI/CD for AI workflows. Architect GCP-based infrastructure using Terraform (Infrastructure as Code) for AI workloads (Vertex AI, BigQuery, Dataflow). Collaborate with data scientists to operationalize AI models and ensure seamless integration into production environments. Optimize cloud infrastructure for cost, performance, and security, focusing on automation and scalability. What Makes You a Fit Academic Background: Bachelor’s or Master’s in Computer Science, Data Science, Engineering, or a related field. Technical Expertise: 5+ years of experience in AI engineering, DevOps, or cloud platforms , with a focus on Google Cloud (GCP) . Hands-on experience with GCP services : Vertex AI, Kubernetes Engine (GKE), BigQuery, Dataflow, and Terraform. Proficiency in CI/CD pipeline tools: Jenkins , GitLab CI/CD, or Cloud Build. Strong coding skills in Python for scripting and automation. Expertise in Kubernetes for cluster deployment and orchestration. Experience with Infrastructure as Code (IaC) using Terraform or Google Deployment Manager. Knowledge of automation scripts for deployment and containerization ( Docker ). Extra Skills: Professional Cloud DevOps Engineer or related certification. Familiarity with Gen-AI tools (LangChain, Vertex AI Generative AI). Exposure to monitoring tools like Cloud Monitoring, Prometheus, or Grafana . Why You’ll Love It Here Innovate with the Best Tech: Work on groundbreaking projects using AI, GenAI, LLMs, and massive-scale data platforms. Tackle challenges that push the boundaries of innovation. Impact Industry Giants: Deliver business-critical solutions for Fortune 100 clients across Hi-tech, D2C, Healthcare, SaaS, and Retail. Partner with platforms like Databricks, Google Cloud, and Adobe to create high-impact products. Collaborate with a World-Class Team: Join exceptional professionals from IITs, IIMs, NITs, and global leaders like Walmart, Amazon, Accenture, and ZS. Learn, grow, and lead in a team that values expertise and collaboration. Accelerate Your Growth: Access our Centres of Excellence to upskill and work on industry-leading innovations. Your professional development is a top priority. Work in a Culture of Excellence: Be part of a dynamic workplace that fosters creativity, teamwork, and a passion for building transformative solutions. Your contributions will be recognized and celebrated. About Our Leadership Anuj Gupta – Former Amazon leader with over 22 years of experience in building and managing large engineering teams. (B.Tech, IIT Delhi; MBA, ISB Hyderabad). 
Raghvendra Kushwah – Business consulting expert with 21+ years at Accenture and Cognizant (B.Tech, IIT Delhi; MBA, IIM Lucknow).

Key Benefits
Competitive salary and performance-based bonus. Comprehensive benefits package, including health insurance and flexible work hours. Opportunities for professional development and career growth.

Location: Gurugram
Submit your resume to saurabh.bhaumik@eucloid.com with the subject line “Application: Platform Engineer (GCP)”. Eucloid is an equal-opportunity employer. We celebrate diversity and are committed to creating an inclusive environment.


Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Job Title: Data Architect – Data Integration & Engineering Location: Hybrid Experience: 8+ years Job Summary: We are seeking an experienced Data Architect specializing in data integration, data engineering, and hands-on coding to design, implement, and manage scalable and high-performance data solutions. The ideal candidate should have expertise in ETL/ELT, cloud data platforms, big data technologies, and enterprise data architecture. Key Responsibilities: 1. Data Architecture & Design: Develop enterprise-level data architecture solutions, ensuring scalability, performance, and reliability. Design data models (conceptual, logical, physical) for structured and unstructured data. Define and implement data integration frameworks using industry-standard tools. Ensure compliance with data governance, security, and regulatory policies (GDPR, HIPAA, etc.). 2. Data Integration & Engineering: Implement ETL/ELT pipelines using Informatica, Talend, Apache Nifi, or DBT. Work with batch and real-time data processing tools such as Apache Kafka, Kinesis, and Apache Flink. Integrate and optimize data lakes, data warehouses, and NoSQL databases. 3. Hands-on Coding & Development: Write efficient and scalable code in Python, Java, or Scala for data transformation and processing. Optimize SQL queries, stored procedures, and indexing strategies for performance tuning. Build and maintain Spark-based data processing solutions in Databricks and Cloudera ecosystems. Develop workflow automation using Apache Airflow, Prefect, or similar tools. 4. Cloud & Big Data Technologies: Work with cloud platforms such as AWS (Redshift, Glue), Azure (Data Factory, Synapse), and GCP (BigQuery, Dataflow). Manage big data processing using Cloudera, Hadoop, HBase, and Apache Spark. Deploy containerized data services using Kubernetes and Docker. Automate infrastructure using Terraform and CloudFormation. 5. Governance, Security & Compliance: Implement data security, masking, and encryption strategies. Define RBAC (Role-Based Access Control) and IAM policies for data access. Work on metadata management, data lineage, and cataloging. Required Skills & Technologies: Data Engineering & Integration: ETL/ELT Tools: Informatica, Talend, Apache Nifi, DBT Big Data Ecosystem: Cloudera, HBase, Apache Hadoop, Spark Data Streaming: Apache Kafka, AWS Kinesis, Apache Flink Data Warehouses: Snowflake, AWS Redshift, Google Big Query, Azure Synapse Databases: PostgreSQL, MySQL, MongoDB, Cassandra Programming & Scripting: Languages: Python, Java, Scala Scripting: Shell, PowerShell, Bash Frameworks: PySpark, SparkSQL Cloud & DevOps: Cloud Platforms: AWS, Azure, GCP Containerization & Orchestration: Kubernetes, Docker CI/CD Pipelines: Jenkins, GitHub Actions, Terraform, CloudFormation Security & Governance: Compliance Standards: GDPR, HIPAA, SOC 2 Data Cataloging: Collibra, Alation Access Controls: IAM, RBAC, ABAC Preferred Certifications: AWS Certified Data Analytics – Specialty Microsoft Certified: Azure Data Engineer Associate Google Professional Data Engineer Databricks Certified Data Engineer Associate/Professional Cloudera Certified Data Engineer Informatica Certified Professional Education & Experience: Bachelor's/Master’s degree in Computer Science/ MCA, Data Engineering, or a related field. 8+ years of experience in data architecture, integration, and engineering. Proven expertise in designing and implementing enterprise-scale data solutions. Show more Show less
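To ground the streaming-integration portion of this role, the sketch below publishes a change-data-capture style event with the kafka-python client. The broker address, topic name, and payload schema are illustrative assumptions; Confluent or cloud-managed Kafka clients would look slightly different.

```python
# Minimal sketch: publish a CDC-style change event with kafka-python.
# Broker, topic, and payload are illustrative placeholders.
import json

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers=["broker-1:9092"],
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    key_serializer=lambda k: k.encode("utf-8"),
)

change_event = {"table": "customers", "op": "UPDATE", "id": 4217, "email": "a@b.com"}
producer.send("cdc.customers", key=str(change_event["id"]), value=change_event)
producer.flush()  # Block until the broker acknowledges the event.
```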

Posted 3 weeks ago

Apply

Exploring Dataflow Jobs in India

The dataflow job market in India is currently experiencing a surge in demand for skilled professionals. With the increasing reliance on data-driven decision-making in various industries, the need for individuals proficient in managing and analyzing dataflow is on the rise. This article aims to provide job seekers with valuable insights into the dataflow job landscape in India.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Pune
  4. Hyderabad
  5. Delhi

These cities are known for their thriving tech ecosystems and are home to numerous companies actively hiring for dataflow roles.

Average Salary Range

The average salary range for dataflow professionals in India varies with experience. Entry-level positions typically earn between INR 4-6 lakhs per annum, while experienced professionals can command INR 12-15 lakhs per annum or more.

Career Path

In the dataflow domain, a typical career path may involve starting as a Junior Data Analyst or Data Engineer, progressing to roles such as Senior Data Scientist or Data Architect, and eventually reaching positions like Tech Lead or Data Science Manager.

Related Skills

In addition to expertise in dataflow tools and technologies, dataflow professionals are often expected to have proficiency in programming languages such as Python or R, knowledge of databases like SQL, and familiarity with data visualization tools like Tableau or Power BI.

Interview Questions

  • What is dataflow and how is it different from data streaming? (basic)
  • Explain the difference between batch processing and real-time processing. (medium)
  • How do you handle missing or null values in a dataset? (basic) (see the pandas sketch after this list)
  • Can you explain the concept of data lineage? (medium)
  • What is the importance of data quality in dataflow processes? (basic)
  • How do you optimize dataflow pipelines for performance? (medium)
  • Describe a time when you had to troubleshoot a dataflow issue. (medium)
  • What are some common challenges faced in dataflow projects? (medium)
  • How do you ensure data security and compliance in dataflow processes? (medium)
  • What are the key components of a dataflow architecture? (medium)
  • Explain the concept of data partitioning in dataflow. (advanced)
  • How would you handle a sudden increase in data volume in a dataflow pipeline? (advanced)
  • What role does data governance play in dataflow processes? (medium)
  • Can you discuss the advantages and disadvantages of using cloud-based dataflow solutions? (medium)
  • How do you stay updated with the latest trends and technologies in dataflow? (basic)
  • What is the significance of metadata in dataflow management? (medium)
  • Walk us through a dataflow project you have worked on from start to finish. (medium)
  • How do you ensure data quality and consistency across different data sources in a dataflow pipeline? (medium)
  • What are some best practices for monitoring and troubleshooting dataflow pipelines? (medium)
  • How do you handle data transformations and aggregations in a dataflow process? (basic)
  • What are the key performance indicators you would track in a dataflow project? (medium)
  • How do you collaborate with cross-functional teams in a dataflow project? (basic)
  • Can you explain the concept of data replication in dataflow management? (advanced)
  • How do you approach data modeling in a dataflow project? (medium)
  • Describe a challenging dataflow problem you encountered and how you resolved it. (advanced)
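
For the null-handling question earlier in this list, a minimal pandas sketch of the usual profile-then-drop-or-impute approach is shown below; the column names and imputation choices are illustrative, and the right strategy always depends on how critical each field is.

```python
# Minimal sketch of profiling and handling missing values with pandas.
import pandas as pd

df = pd.DataFrame(
    {
        "order_id": [1, 2, 3, 4],
        "amount": [250.0, None, 410.0, None],
        "city": ["Pune", None, "Delhi", "Noida"],
    }
)

print(df.isna().sum())  # profile missingness per column

df = df.dropna(subset=["order_id"])                         # drop rows missing a mandatory key
df["amount"] = df["amount"].fillna(df["amount"].median())   # impute a numeric field
df["city"] = df["city"].fillna("UNKNOWN")                   # flag unknown categorical values
```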

Closing Remark

As you navigate the dataflow job market in India, remember to showcase your skills and experiences confidently during interviews. Stay updated with the latest trends in dataflow and continuously upskill to stand out in a competitive job market. Best of luck in your job search journey!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies