Jobs
Interviews

187 CDC Jobs - Page 4

Set up a job alert
JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

2.0 - 7.0 years

4 - 9 Lacs

Noida

Work from Office

Job Area: Engineering Group, Engineering Group > Hardware Engineering

General Summary: As a leading technology innovator, Qualcomm pushes the boundaries of what's possible to enable next-generation experiences and drives digital transformation to help create a smarter, connected future for all. As a Qualcomm Hardware Engineer, you will plan, design, optimize, verify, and test electronic systems, bring-up yield, circuits, mechanical systems, Digital/Analog/RF/optical systems, equipment and packaging, test systems, FPGA, and/or DSP systems that launch cutting-edge, world-class products. Qualcomm Hardware Engineers collaborate with cross-functional teams to develop solutions and meet performance requirements.

Minimum Qualifications:
- Bachelor's degree in Computer Science, Electrical/Electronics Engineering, Engineering, or related field and 2+ years of Hardware Engineering or related work experience; OR
- Master's degree in Computer Science, Electrical/Electronics Engineering, Engineering, or related field and 1+ year of Hardware Engineering or related work experience; OR
- PhD in Computer Science, Electrical/Electronics Engineering, Engineering, or related field.

Required Skills:
- Minimum of 3+ years' experience in DFT, ATPG, scan insertion, MBIST, and JTAG.
- In-depth knowledge of DFT concepts.
- In-depth knowledge and hands-on experience in DFT (scan/MBIST) insertion, ATPG pattern generation/verification, MBIST verification, and post-silicon bring-up/yield analysis.
- Expertise in test-mode timing constraint definition; ability to provide timing fixes/corrective actions for timing violations.
- Ability to analyze and devise new tests for new technologies, custom RAM designs, RMA, etc.
- Expertise in scripting languages such as Perl, shell, etc.
- Experience in simulating test vectors.
- Knowledge of equivalence checking and RTL lint tools (such as SpyGlass).
- Ability to work in an international team and a dynamic environment.
- Ability to learn and adapt to new tools and methodologies.
- Ability to multi-task and work on several high-priority designs in parallel.
- Excellent problem-solving skills.

Posted 3 weeks ago

Apply

0.0 - 3.0 years

2 - 6 Lacs

Mohali

Work from Office

We are looking for a highly skilled and experienced Analyst to join our team at eClerx Services Ltd. The ideal candidate will have a strong background in IT Services & Consulting, with excellent analytical skills and attention to detail.

Roles and Responsibilities:
- Collaborate with cross-functional teams to identify and prioritize project requirements.
- Develop and maintain complex data analysis systems and reports.
- Provide expert-level support for data analysis and reporting needs.
- Identify trends and patterns in large datasets to inform business decisions.
- Develop and implement process improvements to increase efficiency and productivity.
- Communicate findings and insights to stakeholders through clear and concise reports.

Job Requirements:
- Strong understanding of data analysis principles and techniques.
- Proficiency in data visualization tools and software.
- Excellent communication and interpersonal skills.
- Ability to work in a fast-paced environment with multiple priorities.
- Strong problem-solving skills and attention to detail.
- Experience working with large datasets and developing complex reports.

Title: Analyst, ref: 78642.

Posted 3 weeks ago

Apply

14.0 - 20.0 years

45 - 50 Lacs

Bengaluru

Work from Office

Job Description: Job Title: Data Technical Lead

As the Data Management Platform (DMP) Technical Lead, you will be responsible for embedding a world-class product development and engineering culture and organization. You will work with development, architecture, and operations as well as platform teams to ensure we are delivering a best-in-class technology solution. You will work closely with the Business Platform Owner to ensure an integrated end-to-end view across people and technology for the Business Platform. You will also defend the faith and work with stakeholders across the enterprise to ensure we are developing the right solutions. In parallel, you will focus on building a high-performing team that will thrive in a fast-paced continuous-delivery engineering environment. The role involves architecting, designing, and delivering solutions in a tool stack including Informatica MDM SaaS, Informatica Data Quality, Collibra Data Governance, and other data tools.

Key Responsibilities:
- Shape technical strategy (e.g., build vs. buy decisions, technical road-mapping) in collaboration with architects.
- Evaluate and identify appropriate technology stacks, platforms, and vendors, including web application frameworks and cloud providers, for solution development.
- Attend team ceremonies as required; in particular, feature refinement and cross-team iteration reviews/demos.
- Drive the resolution of technical impediments.
- Own the 'success' of foundational enablers.
- Champion research and innovation.
- Lead in scaled agile ceremonies and activities, such as quarterly reviews, quarterly increment planning, and OKR writing.
- Collaborate with the Platform Owner in the writing and prioritization of technical capabilities and enablers.
- Present platform delivery metrics, OKR health, and platform finance status to executive audiences.
- Collaborate with other Technical Leads.
- Create and maintain the technical roadmap for in-scope products and services at the platform/portfolio level.

Key Experience:
- B.E./B.Tech or equivalent Engineering professional; a Master's degree or equivalent experience in Marketing, Business, or Finance is an added advantage.
- 10+ years of experience in technical architecture, solution design, and platform engineering.
- Strong experience in MDM, Data Quality, and Data Governance practices, including tool stacks such as Informatica MDM SaaS, Informatica Data Quality, and Collibra, is a plus.
- Good experience with major cloud platforms and cloud data tools, including but not limited to AWS, Microsoft Azure, Kafka, CDC, Tableau, and data virtualization tools.
- Good experience in ETL and BI solution development and tool stacks; Informatica ETL experience is a plus.
- Good experience in data architecture, SQL, NoSQL, REST APIs, data security, and AI concepts.
- Familiarity with agile methodologies and data factory operations processes, including tools such as Confluence, Jira, and Miro.
- Strong knowledge of industry standards and regulations related to data management, such as HIPAA, PCI-DSS, and GDPR.
- Proven knowledge of working in financial services, preferably the insurance space.
- Experience in senior engineering and technology roles, working with teams to build/deliver digital products.
- Experience in providing guidance and insight to establish governance processes, direction, and control, to ensure objectives are achieved and risks are managed appropriately for product development.
- A leader with a track record of onboarding and developing engineering and product teams.
- Experience as a technology leader who has defined and implemented technical strategies within complex organizations and is able to influence and contribute to the higher-level engineering strategy.
- Insight into the newest technologies and trends; an expert in product development with experience in code delivery and management of full-stack technology.
- Experience in digital capabilities such as DevSecOps, CI/CD, and agile release management.
- Wide experience and understanding of architecture in terms of solution, data, and integration.
- Can provide direct day-to-day engineering and technology problem-solving, coaching, direction, and guidance to Technical Leads and Senior Technical Leads within their platform.
- Strong leadership skills with an ability to influence a diverse group of stakeholders.
- Ability to influence technical strategy at the BG and enterprise level.
- Experience working in Agile teams with a strong understanding of agile ways of working.
- Experience managing technical priorities within an evolving product backlog.
- Understands how to decompose large technical initiatives into actionable technical enablers.
- Experience in the continuous improvement of software development workflows and pipelines.
- Proven leadership; ability to articulate ideas to both technical and non-technical audiences.
- Ability to communicate strategy and objectives, and align organizations to a common goal.
- Strong problem solver with the ability to lead the team to push the solution.
- Ability to empower teams and encourage collaboration.
- Ability to inspire people and teams, building momentum around a vision.
- Critical thinker with a passion to challenge the status quo, find new solutions, and drive out-of-the-box ideas.
- Believes in a non-hierarchical culture of collaboration, transparency, and trust across the team.
- Experimental mindset to drive innovation and continuous improvement of the team.

Posted 3 weeks ago

Apply

3.0 - 5.0 years

5 - 6 Lacs

Bengaluru

Hybrid

Overview: TekWissen is a global workforce management provider operating throughout India and many other countries in the world.

Position: Design Verification Engineer
Location: Bangalore
Work Type: Hybrid
Job Type: Full time

Job Description:
- Over 5 years of experience in design verification.
- Good exposure to subsystem- and SoC-level verification.
- UVM, SystemVerilog, and C based verification environments.
- CDC and GLS exposure.
- DFX and DFT verification experience.
- Exposure to complex SoCs.
- Experience in high-speed protocols such as PCIe and Ethernet is an add-on.

TekWissen Group is an equal opportunity employer supporting workforce diversity.

Posted 4 weeks ago

Apply

4.0 - 8.0 years

4 - 6 Lacs

Hyderabad, Bengaluru

Work from Office

Key Responsibilities:
- Lead and manage RTL design activities for complex ASICs, ensuring high performance and low power consumption.
- Integrate RTL components into System-on-Chip (SoC) designs.
- Architect and implement RTL for digital circuits (such as processors, communication systems, or custom IP cores).
- Mentor and guide junior RTL engineers in best practices for design, coding standards, and optimization techniques.
- Develop and refine RTL code in Verilog/SystemVerilog for ASIC development.
- Collaborate with cross-functional teams (Verification, Physical Design, and Software) to ensure successful integration of the ASIC design.
- Perform RTL design reviews, debugging, and optimization to meet design targets such as area, speed, and power.
- Create micro-architectural specifications and ensure the design meets project requirements.
- Ensure designs are implemented with proper synchronization, timing constraints, and low-power techniques.
- Participate in top-level design, integrating IP blocks and ensuring design consistency across subsystems.
- Drive the design flow from architecture and specification through to implementation.
- Prepare and maintain technical documentation for designs and related processes.
- CDC, lint, and integration expertise is expected.

Required Skills & Experience:
- Bachelor's, Master's, or PhD in Electrical Engineering or related fields.
- 3-12 years of experience in RTL design for ASICs, with at least 3 years in a team lead role.
- Expertise in RTL design using Verilog or SystemVerilog.
- Solid understanding of digital design principles, including timing analysis, state machines, and pipelining.
- In-depth knowledge of the ASIC design flow, from RTL to tape-out.
- Experience with EDA tools for synthesis, simulation, and timing analysis (e.g., Synopsys, Cadence).
- Strong debugging and problem-solving skills.
- Good knowledge of scripting (Python, Perl, and shell scripting).
- Knowledge of power, performance, and area (PPA) optimization techniques.
- Experience with designing low-power, high-speed circuits is highly desirable.
- Excellent communication skills and the ability to work in a team environment.

Posted 1 month ago

Apply

4.0 - 9.0 years

15 - 30 Lacs

Noida, Ahmedabad, Bengaluru

Work from Office

Expertise and strong hands-on experience in RTL design using SystemVerilog or VHDL; digital system architecture; processor subsystem architecture and block definition; complex SoCs; and RTL design quality analysis (Lint, CDC, RDC, DFT, simulation).

Posted 1 month ago

Apply

4.0 - 9.0 years

12 - 17 Lacs

Gurugram, Coimbatore, Bengaluru

Work from Office

Your Role
- Good experience in administering and maintaining DB2 databases.
- Experience in installing and upgrading DB2 UDB and fix packs on Unix and Windows machines.
- Taking database backups and recovering the database using the db2 backup and restore utilities.
- Expertise in database upgrades from older to newer versions of LUW databases.
- Experience in database restores, including redirected restores, within production, test, and development environments.
- Experience in scheduling backup scripts using cron jobs in a Unix environment and in DB2 UDB command-line utilities.
- Experience in maintenance of databases, performance testing, archiving, and troubleshooting.

Your Profile
- 4-12 years of experience in DB2 database administration.
- Experience with snapshot/lock-wait issues.
- Preferred: knowledge of designing flows, sub-flows, and exception-handling strategies; DataWeave transformation; and Mule Expression Language (MEL).
- Experience in SQL tuning using the db2advis and db2expln tools.
- Knowledge of DB2 UDB DBA and of Mule ESB, CloudHub, and the Anypoint Platform is preferred.
- Knowledge of DB2 DPF environments is preferred.
- Preferable to have knowledge of moving databases across OS platforms, and of moving data from database objects to flat files and loading data from flat files to database objects using data movement utilities such as Export & Import.

What will you love about working at Capgemini?
- Keeping up with the latest DB2 features, best practices, and security updates.
- Clear career progression paths from L2 support to architecture and consulting roles.
- Be part of mission-critical projects that secure and optimize networks for Fortune 500 clients.
- Thrive in a diverse, inclusive, and respectful environment that values your voice and ideas, and work in agile, cross-functional teams with opportunities to lead and mentor.

Location: Bengaluru, Coimbatore, Gurugram, Hyderabad, Noida, Mumbai, Pune, Chennai

Posted 1 month ago

Apply

4.0 - 8.0 years

12 - 14 Lacs

Hyderabad

Work from Office

Required Skills
- Experience in logic design / RTL coding is a must.
- Experience in SoC design and integration for complex SoCs is a must.
- Experience in Verilog/SystemVerilog is a must.
- Experience in multi-clock designs and asynchronous interfaces is a must.
- Experience in using ASIC development tools such as Lint and CDC.
- Experience in synthesis and an understanding of timing concepts is a plus.
- Experience in ECO fixes and formal verification.
- Knowledge of AMBA protocols (AXI, AHB, APB) and SoC clocking/reset architecture.
- Excellent oral and written communication skills.
- Proactive, creative, curious, motivated to learn and contribute, with good collaboration skills.
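The "multi-clock designs, asynchronous interface" requirement above is about clock domain crossing (CDC): a single-bit signal entering another clock domain is conventionally passed through two back-to-back flip-flops clocked by the destination domain, so any metastability in the first stage can settle before downstream logic consumes the second. A behavioural Python model of that two-flop synchronizer (illustrative only; real CDC work is done in RTL and checked with tools such as SpyGlass CDC):

```python
class TwoFlopSynchronizer:
    """Behavioural model of a 2-FF synchronizer: the destination domain
    consumes the asynchronous input only after it has passed both stages."""

    def __init__(self):
        self.ff1 = 0  # first stage: may go metastable in real hardware
        self.ff2 = 0  # second stage: assumed settled, safe to consume

    def clock_edge(self, async_in: int) -> int:
        """One rising edge of the destination clock; both FFs capture together."""
        self.ff2 = self.ff1   # previously captured value moves to the output stage
        self.ff1 = async_in   # raw asynchronous input is captured into stage one
        return self.ff2       # downstream logic only ever sees the settled stage

sync = TwoFlopSynchronizer()
# An input edge presented before clock 1 is not visible until after clock 2:
outputs = [sync.clock_edge(x) for x in [1, 1, 1, 0]]  # -> [0, 1, 1, 1]
```

The model shows why a CDC tool flags unsynchronized crossings: the consumer never samples `ff1`, the stage that could be metastable, only `ff2`.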

Posted 1 month ago

Apply

7.0 - 11.0 years

15 - 20 Lacs

Bengaluru

Work from Office

Job Description: We are looking for an RTL Design Engineer with expertise in SoC and IP-level design and integration. The ideal candidate should have a strong background in RTL coding, architecture-level understanding, and industry-standard quality checks and tools.

Key Responsibilities:
- Develop RTL code in Verilog/SystemVerilog.
- Understand and apply top-level SoC architecture concepts.
- Perform SoC and IP-level integration.
- Implement RTL quality checks including CLP (mandatory), LINT, CDC, RDC, VSI.
- Work on design partitioning (tilification).
- Handle IORING, PHYs, GPIOs.
- Collaborate with verification and backend teams.

Required Skills:
- RTL coding in Verilog and SystemVerilog.
- IP-XACT knowledge.
- Clock Domain Crossing (CDC), Reset Domain Crossing (RDC).
- UPF and SDC concepts.
- Tools: VC Static, SpyGlass (Lint, CDC, RDC), 0-In, Formality, Conformal LEC.
- Scripting: Perl, Python, TCL.

Nice to Have:
- Experience with design quality metrics and standards.
- Exposure to physical-aware RTL design.

Posted 1 month ago

Apply

5.0 - 10.0 years

35 - 80 Lacs

Hyderabad/Secunderabad, Pune, Bangalore/Bengaluru

Hybrid

- Design methodology, micro-architecture, RTL.
- Work with the architecture team to develop the micro-architecture and subsequently write RTL.
- Develop design methodology, starting with the machine learning architecture.
- Synthesis, STA, equivalence checking.

Required Candidate Profile
- Experience in SoC design methodology, micro-architecture, emulation and back-end development, and chip bring-up.
- Experience in developing ARM CPU based SoCs, Network-on-Chip, and interfaces such as MIPI-CSI, Ethernet, and PCIe.

Posted 1 month ago

Apply

8.0 - 12.0 years

25 - 35 Lacs

Hyderabad

Remote

Role: Fivetran Developer
Role Type: Contractual
Location: Remote (working in the Dubai time zone)

Key Responsibilities:
- Configure and manage Fivetran connectors for data replication from SQL Server, Oracle, Salesforce, and APIs.
- Set up, monitor, and troubleshoot CDC (Change Data Capture) jobs on source systems to ensure real-time or near-real-time data replication.
- Ensure accurate and reliable data delivery to Snowflake, Azure Data Lake, and Iceberg formats/catalogs.
- Work closely with source system owners to validate CDC setup and permissions.
- Monitor Fivetran pipelines for performance, reliability, and data consistency.
- Resolve replication issues, data sync failures, and schema changes in coordination with business and technical stakeholders.
- Document all pipeline configurations and changes.

Required Experience:
- 2+ years of hands-on experience with Fivetran, specifically for data replication use cases.
- Strong expertise in configuring CDC for SQL Server, Oracle, Salesforce, and/or APIs.
- Experience replicating data into Snowflake and Azure Data Lake; familiarity with Iceberg formats/catalogs is a plus.
- Solid understanding of data replication, sync frequency, and incremental loads.
- Basic SQL skills for data validation and troubleshooting.
- Strong communication and documentation abilities.

Preferred:
- Experience with data security and compliance during data movement.
- Familiarity with cloud data ecosystems (Azure, AWS, GCP).

Position Overview: We are seeking a Fivetran Data Replication Engineer responsible for configuring and managing data replication pipelines using Fivetran. The primary focus will be on replicating data from SQL Server, Oracle, Salesforce, and various APIs into Snowflake, Azure Data Lake, and Fivetran Iceberg formats/catalogs. This role requires strong experience in setting up and troubleshooting Change Data Capture (CDC) jobs on source systems.

Posted 1 month ago

Apply

3.0 - 6.0 years

5 - 9 Lacs

Bengaluru

Work from Office

About the Opportunity
Job Type: Application
31 July 2025

Strategic Impact: As a Senior Data Engineer, you will directly contribute to our key organizational objectives.

Accelerated Innovation
- Enable rapid development and deployment of data-driven products through scalable, cloud-native architectures.
- Empower analytics and data science teams with self-service, real-time, and high-quality data access.
- Shorten time-to-insight by automating data ingestion, transformation, and delivery pipelines.

Cost Optimization
- Reduce infrastructure costs by leveraging serverless, pay-as-you-go, and managed cloud services (e.g., AWS Glue, Databricks, Snowflake).
- Minimize manual intervention through orchestration, monitoring, and automated recovery of data workflows.
- Optimize storage and compute usage with efficient data partitioning, compression, and lifecycle management.

Risk Mitigation
- Improve data governance, lineage, and compliance through metadata management and automated policy enforcement.
- Increase data quality and reliability with robust validation, monitoring, and alerting frameworks.
- Enhance system resilience and scalability by adopting distributed, fault-tolerant architectures.

Business Enablement
- Foster cross-functional collaboration by building and maintaining well-documented, discoverable data assets (e.g., data lakes, data warehouses, APIs).
- Support advanced analytics, machine learning, and AI initiatives by ensuring timely, trusted, and accessible data.
- Drive business agility by enabling rapid experimentation and iteration on new data products and features.

Key Responsibilities
- Design, develop, and maintain scalable data pipelines and architectures to support data ingestion, integration, and analytics.
- Be accountable for technical delivery and take ownership of solutions.
- Lead a team of senior and junior developers, providing mentorship and guidance.
- Collaborate with enterprise architects, business analysts, and stakeholders to understand data requirements, validate designs, and communicate progress.
- Drive technical innovation within the department to increase code reusability, code quality, and developer productivity.
- Challenge the status quo by bringing the very latest data engineering practices and techniques.

About You

Core Technical Skills
- Expert in leveraging cloud-based data platform (Snowflake, Databricks) capabilities to create an enterprise lakehouse.
- Advanced expertise with the AWS ecosystem and experience using a variety of core AWS data services such as Lambda, EMR, MSK, Glue, and S3.
- Experience designing event-based or streaming data architectures using Kafka.
- Advanced expertise in Python and SQL. Open to expertise in Java/Scala, but enterprise experience of Python is required.
- Expert in designing, building, and using CI/CD pipelines to deploy infrastructure (Terraform) and pipelines with test automation.

Data Security & Performance Optimization
- Experience implementing data access controls to meet regulatory requirements.
- Experience using both RDBMS (Oracle, Postgres, MSSQL) and NoSQL (Dynamo, OpenSearch, Redis) offerings.
- Experience implementing CDC ingestion.
- Experience using orchestration tools (Airflow, Control-M, etc.).
- Significant experience in software engineering practices using GitHub, code verification, validation, and use of copilots.

Bonus Technical Skills
- Strong experience in containerisation and deploying applications to Kubernetes.
- Strong experience in API development using Python-based frameworks like FastAPI.

Key Soft Skills
- Problem-solving: leadership experience in problem-solving and technical decision-making.
- Communication: strong in strategic communication and stakeholder engagement.
- Project management: experienced in overseeing project lifecycles, working with Project Managers to manage resources.
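The "CDC ingestion" and incremental-load skills listed above usually reduce to watermark-driven pipelines: each run pulls only rows whose change sequence is beyond the last recorded position, then advances that position. A minimal sketch of the idea (class and field names are hypothetical, not from the listing):

```python
from dataclasses import dataclass, field

@dataclass
class IncrementalLoader:
    """Watermark-driven incremental load: ingest only rows changed since last run."""
    watermark: int = 0                         # highest change-sequence already ingested
    target: dict = field(default_factory=dict) # stand-in for the warehouse table

    def run(self, source_rows) -> int:
        """Apply rows whose change-sequence exceeds the watermark; return count."""
        applied = 0
        for row in sorted(source_rows, key=lambda r: r["seq"]):
            if row["seq"] <= self.watermark:
                continue                        # already ingested in an earlier run
            self.target[row["id"]] = row["value"]
            self.watermark = row["seq"]         # advance the high-watermark
            applied += 1
        return applied

loader = IncrementalLoader()
rows = [{"seq": 1, "id": "a", "value": 10}, {"seq": 2, "id": "b", "value": 20}]
first = loader.run(rows)                        # first run ingests both rows
rows.append({"seq": 3, "id": "a", "value": 11}) # a new change arrives at the source
second = loader.run(rows)                       # second run ingests only the new change
```

Persisting the watermark between runs (rather than keeping it in memory) is what makes such a pipeline restartable, which is the property orchestration tools like Airflow rely on.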

Posted 1 month ago

Apply

2.0 - 4.0 years

4 - 7 Lacs

Kochi, Ernakulam, Thrissur

Work from Office

Role & Responsibilities
- Install, configure, and upgrade Microsoft SQL Server.
- Configure log shipping and troubleshoot synchronization issues to ensure data is consistently transferred between primary and secondary SQL Server instances.
- Perform manual failover in log shipping scenarios during planned maintenance or unplanned outages.
- Develop and manage backup and recovery strategies for point-in-time recovery.
- Monitor and optimize database performance using native tools and performance-tuning techniques.
- Hands-on experience with performance tuning, query optimization, and indexing.
- Implement and monitor Change Data Capture (CDC) to support data replication and auditing.
- Troubleshoot locking, deadlocks, and long-running queries to improve database performance and ensure smooth operations.
- Familiarity with database security and access control.
- Maintain accurate documentation for database procedures, environments, and failover steps.
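SQL Server's CDC feature, mentioned above, records every change in a per-table change table whose `__$operation` column encodes the change type (1 = delete, 2 = insert, 3 = update before-image, 4 = update after-image). A replication or audit consumer essentially replays those rows in log order against a target; a small sketch with made-up row data (the dict stands in for a real target table, no database connection implied):

```python
# Operation codes used in SQL Server CDC change tables (__$operation column).
DELETE, INSERT, UPDATE_BEFORE, UPDATE_AFTER = 1, 2, 3, 4

def apply_cdc_rows(target: dict, change_rows: list) -> dict:
    """Replay CDC change rows (assumed ordered by LSN) against a dict keyed by PK."""
    for row in change_rows:
        op, key = row["__$operation"], row["id"]
        if op == DELETE:
            target.pop(key, None)                 # remove the deleted row
        elif op in (INSERT, UPDATE_AFTER):
            target[key] = row["name"]             # upsert with the after-image
        # UPDATE_BEFORE carries the pre-update image; nothing to apply downstream
    return target

changes = [
    {"__$operation": INSERT,        "id": 1, "name": "alice"},
    {"__$operation": UPDATE_BEFORE, "id": 1, "name": "alice"},
    {"__$operation": UPDATE_AFTER,  "id": 1, "name": "alicia"},
    {"__$operation": INSERT,        "id": 2, "name": "bob"},
    {"__$operation": DELETE,        "id": 2, "name": "bob"},
]
state = apply_cdc_rows({}, changes)   # -> {1: "alicia"}
```

The before-image rows (operation 3) are what make CDC usable for auditing as well as replication: they record what the data looked like prior to each update.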

Posted 1 month ago

Apply

7.0 - 12.0 years

12 - 18 Lacs

Pune, Chennai

Work from Office

Key Responsibilities:
- Implement Confluent Kafka-based CDC solutions to support real-time data movement across banking systems.
- Implement event-driven and microservices-based data solutions for enhanced scalability, resilience, and performance.
- Integrate CDC pipelines with core banking applications, databases, and enterprise systems.
- Ensure data consistency, integrity, and security, adhering to banking compliance standards (e.g., GDPR, PCI-DSS).
- Lead the adoption of Kafka Connect, Kafka Streams, and Schema Registry for real-time data processing.
- Optimize data replication, transformation, and enrichment using CDC tools such as Debezium, GoldenGate, or Qlik Replicate.
- Collaborate with the infra team, data engineers, DevOps teams, and business stakeholders to align data streaming capabilities with business objectives.
- Provide technical leadership in troubleshooting, performance tuning, and capacity planning for CDC architectures.
- Stay updated with emerging technologies and drive innovation in real-time banking data solutions.

Required Skills & Qualifications:
- Extensive experience in Confluent Kafka and Change Data Capture (CDC) solutions.
- Strong expertise in Kafka Connect, Kafka Streams, and Schema Registry.
- Hands-on experience with CDC tools such as Debezium, Oracle GoldenGate, or Qlik Replicate.
- Hands-on experience with IBM Analytics.
- Solid understanding of core banking systems, transactional databases, and financial data flows.
- Knowledge of cloud-based Kafka implementations (AWS MSK, Azure Event Hubs, or Confluent Cloud).
- Proficiency in SQL and NoSQL databases (e.g., Oracle, MySQL, PostgreSQL, MongoDB) with CDC configurations.
- Strong experience in event-driven architectures, microservices, and API integrations.
- Familiarity with security protocols, compliance, and data governance in banking environments.
- Excellent problem-solving, leadership, and stakeholder communication skills.
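Debezium, named above, publishes each database change to Kafka as a JSON envelope whose payload carries `before`/`after` row images and an `op` code (`c` create, `u` update, `d` delete, `r` snapshot read). A consumer on the analytics side typically flattens that envelope before loading; a small illustrative parser (event shape follows Debezium's documented envelope; Kafka consumer wiring and the sample record are omitted/invented):

```python
import json

def flatten_debezium_event(raw: str) -> dict:
    """Reduce a Debezium change-event envelope to the operation and row image."""
    payload = json.loads(raw)["payload"]
    op = payload["op"]  # "c"=create, "u"=update, "d"=delete, "r"=snapshot read
    # Deletes carry only the before-image; all other ops carry an after-image.
    image = payload["before"] if op == "d" else payload["after"]
    return {"op": op, "row": image, "ts_ms": payload["ts_ms"]}

# A hypothetical event for a core-banking accounts table:
event = json.dumps({
    "payload": {
        "before": None,
        "after": {"account_id": 42, "balance": "1250.00"},
        "op": "c",
        "ts_ms": 1700000000000,
    }
})
change = flatten_debezium_event(event)
```

In a real pipeline the envelope would arrive via a Kafka consumer and the schema would be enforced by Schema Registry; the flattening logic stays the same.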

Posted 1 month ago

Apply

8.0 - 13.0 years

1 - 4 Lacs

Gurgaon, Haryana, India

On-site

Responsibilities
- More than 8 years of extensive experience in Salesforce/nCino.
- Analyse user stories, understand requirements, and raise concerns over technical and functional gaps or unaddressed scenarios (if any).
- Provide accurate estimates for user stories.
- Create a well-structured technical design for the assigned user story/task by carefully considering the existing implementation and thinking from the customer's perspective.
- Develop good-quality, simple, maintainable, highly optimized, and extendable Salesforce-platform-based solutions that are compliant with OakNorth's coding/config best practices.
- Perform unit testing of the developed functionality and make sure it has negligible functional and technical errors.
- Develop test classes with optimum code coverage as per Salesforce-defined best practices and OakNorth-defined specifications.
- Resolve discovered bugs efficiently, in the first iteration only.
- Proactively communicate progress and risks to relevant stakeholders.
- Create documentation such as design documents, specifications, etc.
- Help teammates on technical and other aspects, as and when needed.
- Should be flexible to extend working hours in critical phases.

Required Experience
- 8+ years of total and relevant experience.
- Experience working in an Agile/Scrum development process.
- Capable of working on multiple fronts and meeting strict timelines.
- Well versed with nCino and the commercial lending domain.
- Well versed with the Lightning Framework and Lightning App Builder.
- Well versed with Apex classes, Aura components, LWC, SOQL, Visualforce, and JavaScript.
- Well versed with Salesforce Platform Events, CDC, and REST- and SOAP-based API integrations.
- Well versed with Salesforce Flows, Approval Processes, Process Builder, and Workflows.
- Well versed with CI/CD - GitHub.
- Well versed with Salesforce Reports and Dashboards.
- Well versed with Salesforce data import and export.
- Working knowledge of JIRA and Confluence.
- Experience developing with VS Code.

Desired Skills
- Knowledge of Apex Enterprise Patterns and Apex Design Patterns.
- Working knowledge of Salesforce Digital Experience - Communities/Sites.
- Working knowledge of the Lightning Design System.
- Certifications: Salesforce Platform Developer I and II.

Posted 1 month ago

Apply

5.0 - 10.0 years

8 - 13 Lacs

Bengaluru

Work from Office

Experienced in RTL design using Verilog/SystemVerilog. ASIC designers with experience in all aspects of the RTL design flow, from specification/micro-architecture definition through design and verification, timing analysis, DFT, and implementation: integration, RTL signoff tools, UPF/low-power signoff, and CDC/RDC, lint. Strong domain knowledge of clocking, system modes, power management, debug, interconnect, safety, security, and other architectures.

Posted 1 month ago

Apply

7.0 - 12.0 years

20 - 35 Lacs

Dubai, Pune, Chennai

Hybrid

Job Title: Confluent CDC System Analyst

Role Overview: A leading bank in the UAE is seeking an experienced Confluent Change Data Capture (CDC) System Analyst / Tech Lead to implement real-time data streaming solutions. The role involves implementing robust CDC frameworks using Confluent Kafka, ensuring seamless data integration between core banking systems and analytics platforms. The ideal candidate will have deep expertise in event-driven architectures, CDC technologies, and cloud-based data solutions.

Key Responsibilities:
- Implement Confluent Kafka-based CDC solutions to support real-time data movement across banking systems.
- Implement event-driven and microservices-based data solutions for enhanced scalability, resilience, and performance.
- Integrate CDC pipelines with core banking applications, databases, and enterprise systems.
- Ensure data consistency, integrity, and security, adhering to banking compliance standards (e.g., GDPR, PCI-DSS).
- Lead the adoption of Kafka Connect, Kafka Streams, and Schema Registry for real-time data processing.
- Optimize data replication, transformation, and enrichment using CDC tools such as Debezium, GoldenGate, or Qlik Replicate.
- Collaborate with the infra team, data engineers, DevOps teams, and business stakeholders to align data streaming capabilities with business objectives.
- Provide technical leadership in troubleshooting, performance tuning, and capacity planning for CDC architectures.
- Stay updated with emerging technologies and drive innovation in real-time banking data solutions.

Required Skills & Qualifications:
- Extensive experience in Confluent Kafka and Change Data Capture (CDC) solutions.
- Strong expertise in Kafka Connect, Kafka Streams, and Schema Registry.
- Hands-on experience with CDC tools such as Debezium, Oracle GoldenGate, or Qlik Replicate.
- Hands-on experience with IBM Analytics.
- Solid understanding of core banking systems, transactional databases, and financial data flows.
- Knowledge of cloud-based Kafka implementations (AWS MSK, Azure Event Hubs, or Confluent Cloud).
- Proficiency in SQL and NoSQL databases (e.g., Oracle, MySQL, PostgreSQL, MongoDB) with CDC configurations.
- Strong experience in event-driven architectures, microservices, and API integrations.
- Familiarity with security protocols, compliance, and data governance in banking environments.
- Excellent problem-solving, leadership, and stakeholder communication skills.

Posted 1 month ago

Apply

10.0 - 13.0 years

12 - 15 Lacs

Bengaluru

Work from Office

About the Opportunity Job TypeApplication 31 July 2025 TitlePrincipal Data Engineer (Associate Director) DepartmentISS LocationBangalore Reports ToHead of Data Platform - ISS Grade 7 Department Description ISS Data Engineering Chapter is an engineering group comprised of three sub-chapters - Data Engineers, Data Platform and Data Visualisation that supports the ISS Department. Fidelity is embarking on several strategic programmes of work that will create a data platform to support the next evolutionary stage of our Investment Process.These programmes span across asset classes and include Portfolio and Risk Management, Fundamental and Quantitative Research and Trading. Purpose of your role This role sits within the ISS Data Platform Team. The Data Platform team is responsible for building and maintaining the platform that enables the ISS business to operate. This role is appropriate for a Lead Data Engineer capable of taking ownership and a delivering a subsection of the wider data platform. Key Responsibilities Design, develop and maintain scalable data pipelines and architectures to support data ingestion, integration and analytics.Be accountable for technical delivery and take ownership of solutions.Lead a team of senior and junior developers providing mentorship and guidance.Collaborate with enterprise architects, business analysts and stakeholders to understand data requirements, validate designs and communicate progress.Drive technical innovation within the department to increase code reusability, code quality and developer productivity.Challenge the status quo by bringing the very latest data engineering practices and techniques. 
Essential Skills and Experience. Core Technical Skills: Expert in leveraging cloud-based data platform (Snowflake, Databricks) capabilities to create an enterprise lakehouse. Advanced expertise with the AWS ecosystem and experience using a variety of core AWS data services such as Lambda, EMR, MSK, Glue and S3. Experience designing event-based or streaming data architectures using Kafka. Advanced expertise in Python and SQL; open to expertise in Java/Scala, but enterprise experience of Python is required. Expert in designing, building and using CI/CD pipelines to deploy infrastructure (Terraform) and pipelines with test automation. Data Security & Performance Optimization: Experience implementing data access controls to meet regulatory requirements. Experience using both RDBMS (Oracle, Postgres, MSSQL) and NoSQL (Dynamo, OpenSearch, Redis) offerings. Experience implementing CDC ingestion. Experience using orchestration tools (Airflow, Control-M, etc.).

Bonus Technical Skills: Strong experience in containerisation and in deploying applications to Kubernetes. Strong experience in API development using Python-based frameworks like FastAPI.

Key Soft Skills: Problem-Solving: Leadership experience in problem-solving and technical decision-making. Communication: Strong in strategic communication and stakeholder engagement. Project Management: Experienced in overseeing project lifecycles, working with Project Managers to manage resources.

Feel rewarded: For starters, we'll offer you a comprehensive benefits package. We'll value your wellbeing and support your development. And we'll be as flexible as we can about where and when you work, finding a balance that works for all of us. It's all part of our commitment to making you feel motivated by the work you do and happy to be part of our team. For more about our work, our approach to dynamic working and how you could build your future here, visit careers.fidelityinternational.com.
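The CDC-ingestion and orchestration requirements above often come down to incremental extraction: each scheduled run pulls only rows changed since the last recorded high watermark. A minimal sketch, where the row shape and `updated_at` field are illustrative assumptions:

```python
from datetime import datetime, timezone

def incremental_extract(source_rows, last_watermark):
    """Return rows newer than the watermark, plus the advanced watermark.

    A watermark-based CDC pull: only rows with updated_at strictly after
    last_watermark are extracted; the watermark advances to the newest row seen.
    """
    new_rows = [r for r in source_rows if r["updated_at"] > last_watermark]
    new_watermark = max(
        (r["updated_at"] for r in new_rows), default=last_watermark
    )
    return new_rows, new_watermark

# Hypothetical source table snapshot.
source = [
    {"id": 1, "updated_at": datetime(2024, 1, 1, tzinfo=timezone.utc)},
    {"id": 2, "updated_at": datetime(2024, 1, 2, tzinfo=timezone.utc)},
    {"id": 3, "updated_at": datetime(2024, 1, 3, tzinfo=timezone.utc)},
]

# With the watermark at 2 Jan, only the 3 Jan row is picked up.
rows, wm = incremental_extract(source, datetime(2024, 1, 2, tzinfo=timezone.utc))
print([r["id"] for r in rows])  # [3]
```

In practice an orchestrator such as Airflow would run this on a schedule and persist the watermark between runs; log-based CDC (e.g. via Kafka) replaces polling when lower latency is needed.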

Posted 1 month ago

Apply

2.0 - 7.0 years

3 - 6 Lacs

Pune

Work from Office

Educational Qualification: Bachelor of Engineering. Service Line: Enterprise Package Application Services.

Responsibilities: A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation, design and deployment. You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs. You will create requirement specifications from the business needs, and define the to-be processes and detailed functional designs based on requirements. You will support configuring solution requirements on the products; understand any issues, diagnose their root cause, seek clarifications, and then identify and shortlist solution alternatives. You will also contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities: Ability to work with clients to identify business challenges and contribute to client deliverables by refining, analyzing, and structuring relevant data. Awareness of the latest technologies and trends. Logical thinking and problem-solving skills, along with an ability to collaborate. Ability to assess current processes, identify improvement areas and suggest technology solutions. Knowledge of one or two industry domains.

Technical and Professional Requirements: 2+ years of total experience in SAP CDC / CDP solutioning, with hands-on coding experience as part of SAP CDC. Good experience in web technologies such as JavaScript, JSON and NodeJS; experience with HTML, CSS, etc. is good to have. Good experience in writing global scripts.
Knowledge of SOAP and REST APIs. Experience creating data flows and involvement in large-scale data migration activities from various systems. Good experience integrating the CDC / CDP system with other SAP / non-SAP systems using standard connectors such as G-Connectors or API-based integration. Preferred Skills: Technology - SAP Functional - SAP C4HANA - SAP Customer Data Cloud.

Posted 1 month ago

Apply

5.0 - 10.0 years

12 - 24 Lacs

Bengaluru

Work from Office

Responsibilities: Should have a good understanding of the SoC design flow. Hands-on expertise in writing RTL in Verilog and SystemVerilog (optionally VHDL), SoC-level RTL integration, linting, CDC checks, STA, constraints, and UPF.

Posted 1 month ago

Apply

5.0 - 10.0 years

6 - 9 Lacs

Hyderabad

Work from Office

Immediate Job Openings on #SAP Data Intelligence _ Pan India _ Contract. #Skill: SAP Data Intelligence. #Location: Pan India. #Notice Period: Immediate. #Employment Type: Contract.

Skills:
SAP Data Intelligence with data transfer to SAP and non-SAP systems - Mandatory: Yes.
Able to manage data pipelines for integration, data preparation, processing and orchestration - Mandatory: Yes.
SAP S/4, SLT, ABAP, CDC Views - Good to have.

Posted 1 month ago

Apply

4.0 - 9.0 years

0 - 2 Lacs

Hyderabad, Bengaluru

Hybrid

Position: Contract to Hire (C2H). Skill: SAP ABAP. Experience: 5+ years. Location: Hyderabad. Notice Period: Immediate to 15 days. Skills: SAP ABAP + OData + Adobe + CDC + Fiori. Design, develop, test, and deploy SAP ABAP programs using CDC (Central Data Control) principles. Collaborate with cross-functional teams to identify business requirements and implement solutions using Adobe Forms and Fiori UI technologies. Develop OData services using SAP Gateway for integration purposes. Ensure high-quality deliverables by following coding standards, best practices, and testing methodologies. Participate in project meetings to discuss progress updates and provide input on technical decisions. Candidates interested in the above position, please share your resume with bhargavi.maddela@kiya.ai

Posted 1 month ago

Apply

4.0 - 9.0 years

4 - 21 Lacs

Bengaluru, Karnataka, India

On-site

Responsibilities Build microarchitectures for various digital designs and components. Implement IP in RTL using industry-standard methodologies and tools. Perform CDC (Clock Domain Crossing) and Lint analysis to ensure design reliability and correctness. Utilize scripting languages such as TCL, Perl, and Python for automation and tool integration. Generate synthesis constraints to facilitate the synthesis process and ensure optimal performance. Collaborate with cross-functional teams to validate design and address any issues that arise during implementation. Qualifications Bachelor's or Master's degree in Electrical Engineering, Computer Engineering, or a related field. 4 to 8 years of experience in RTL design and verification. Strong knowledge of microarchitecture principles and RTL coding practices. Hands-on experience in UPF (Unified Power Format) for low power design. Proficiency in CDC and Lint analysis tools. Familiarity with scripting languages: TCL, Perl, and Python. Experience in generating synthesis constraints for various RTL designs.

Posted 1 month ago

Apply

0.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant - Snowflake Data Engineer (Python + Cloud)! In this role, the Snowflake Data Engineer is responsible for providing technical direction and leading a group of one or more developers to address a goal.

Job Description: Experience in the IT industry. Working experience building productionized data ingestion and processing pipelines in Snowflake. Strong understanding of Snowflake architecture. Well-versed in data warehousing concepts. Expertise in and excellent understanding of Snowflake features and the integration of Snowflake with other data processing systems. Able to create data pipelines for ETL/ELT. Excellent presentation and communication skills, both written and verbal. Ability to problem-solve and architect in an environment with unclear requirements. Able to create high-level and low-level design documents based on requirements. Hands-on experience in configuration, troubleshooting, testing and managing data platforms, on premises or in the cloud. Awareness of data visualisation tools and methodologies. Work independently on business problems and generate meaningful insights. Some experience/knowledge of Snowpark, Streamlit or GenAI is good to have but not mandatory. Should have experience implementing Snowflake best practices. Snowflake SnowPro Core Certification is a must.
Roles and Responsibilities: Requirement gathering, creating design documents, providing solutions to customers, working with the offshore team, etc. Writing SQL queries against Snowflake and developing scripts to extract, load, and transform data. Hands-on experience with Snowflake utilities such as SnowSQL, bulk copy, Snowpipe, Tasks, Streams, Time Travel, Cloning, Optimizer, Metadata Manager, data sharing, stored procedures and UDFs, and Snowsight. Experience with the Snowflake cloud data warehouse and AWS S3 buckets or Azure Blob Storage containers for integrating data from multiple source systems. Should have some experience with AWS services (S3, Glue, Lambda) or Azure services (Blob Storage, ADLS Gen2, ADF). Should have good experience in Python/PySpark integration with Snowflake and cloud (AWS/Azure), with the ability to leverage cloud services for data processing and storage. Proficiency in the Python programming language, including knowledge of data types, variables, functions, loops, conditionals, and other Python-specific concepts. Knowledge of ETL (Extract, Transform, Load) processes and tools, and the ability to design and develop efficient ETL jobs using Python and PySpark. Should have some experience with Snowflake RBAC and data security. Should have good experience implementing CDC or SCD Type-2. Should have good experience implementing Snowflake best practices. In-depth understanding of Data Warehouse and ETL concepts and Data Modelling. Experience in requirement gathering, analysis, design, development, and deployment. Should have experience building data ingestion pipelines. Optimize and tune data pipelines for performance and scalability. Able to communicate with clients and lead a team. Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs. Good to have: experience in deployment using CI/CD tools and experience with repositories like Azure Repos, GitHub, etc.

Qualifications we seek in you!
Minimum qualifications: B.E. / Master's in Computer Science, Information Technology, Computer Engineering, or any equivalent degree, with good IT experience and relevant experience as a Snowflake Data Engineer. Skill matrix: Snowflake, Python/PySpark, AWS/Azure, ETL concepts, and Data Warehousing concepts.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. For more information, visit . Follow us on Twitter, Facebook, LinkedIn, and YouTube. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
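The SCD Type-2 experience asked for above can be sketched in a few lines of plain Python: when a tracked attribute changes, the current row is closed out and a new version opened, preserving full history. Column names and the sentinel end date below are illustrative assumptions (in Snowflake this would typically be a `MERGE` against a history table):

```python
from datetime import date

OPEN_END = date(9999, 12, 31)  # sentinel meaning "current version"

def scd2_upsert(history, key, attrs, effective):
    """Close the open version of `key` if its attributes changed, then append
    a new version effective from `effective` (SCD Type-2 semantics)."""
    current = next(
        (r for r in history if r["key"] == key and r["end_date"] == OPEN_END),
        None,
    )
    if current is not None:
        if current["attrs"] == attrs:
            return history                  # no change: keep the open row
        current["end_date"] = effective     # close out the old version
    history.append(
        {"key": key, "attrs": attrs, "start_date": effective, "end_date": OPEN_END}
    )
    return history

# Example: a customer moves cities, producing two versioned rows.
history = []
scd2_upsert(history, 42, {"city": "Hyderabad"}, date(2024, 1, 1))
scd2_upsert(history, 42, {"city": "Bengaluru"}, date(2024, 6, 1))
print(len(history))            # 2
print(history[0]["end_date"])  # 2024-06-01 (old version closed)
```

The same pattern underlies CDC-driven warehouse loads: each change record drives one such upsert, so point-in-time queries can filter on the `start_date`/`end_date` range.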

Posted 1 month ago

Apply

8.0 - 10.0 years

5 - 15 Lacs

Hyderabad

Work from Office

Hiring for an RTL Design Lead for the Hyderabad location. Experience: 8+ years. RTL design, SoC integration, CDC/Lint, IP enhancement. If interested, kindly share your updated profile with anand.arumugam@modernchipsolutions.com

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies