
25 SCD Jobs

Set up a Job Alert
JobPe aggregates results for easy access, but you apply directly on the original job portal.

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

You will have the opportunity to work at Capgemini, a company that empowers you to shape your career according to your preferences. You will be part of a collaborative community of colleagues worldwide, where you can reimagine what is achievable and contribute to unlocking the value of technology for leading organizations to build a more sustainable and inclusive world.

Your Role:
- A very good understanding of the current work, tools, and technologies in use.
- Comprehensive knowledge of and clarity on BigQuery, ETL, GCS, Airflow/Composer, SQL, and Python (see the pipeline sketch below).
- Experience with Fact and Dimension tables and SCD.
- A minimum of 3 years of experience in GCP Data Engineering.
- Proficiency in Java, Python, or Spark on GCP, with programming experience in Python, Java, or PySpark, and SQL.
- Hands-on experience with GCS (Cloud Storage), Composer (Airflow), and BigQuery.
- Ability to handle big data efficiently.

Your Profile:
- Strong data engineering experience using Java or Python, or Spark on Google Cloud.
- Experience in pipeline development using Dataflow or Dataproc (Apache Beam, etc.).
- Familiarity with other GCP services and databases such as Datastore, Bigtable, Spanner, Cloud Run, and Cloud Functions.
- Proven analytical skills and a problem-solving attitude.
- Excellent communication skills.

What you'll love about working here:
- You can shape your career through a range of career paths and internal opportunities within the Capgemini group.
- Access to comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work.
- The opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.

About Capgemini: Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. With a diverse team of over 340,000 members in more than 50 countries, Capgemini leverages its over 55-year heritage to unlock the value of technology for clients across the entire breadth of their business needs. The company delivers end-to-end services and solutions, from strategy and design to engineering, fueled by market-leading capabilities in AI, generative AI, cloud, and data, along with deep industry expertise and a strong partner ecosystem.
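
For a concrete picture of the Composer/BigQuery stack this role centres on, here is a minimal Airflow DAG sketch that loads a daily CSV file from GCS into a BigQuery staging table. This is an illustration only, not Capgemini's pipeline; the bucket, dataset, and table names are hypothetical placeholders.

    # Minimal Airflow DAG sketch: load a CSV from GCS into BigQuery.
    # Bucket, dataset, and table names are hypothetical placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
        GCSToBigQueryOperator,
    )

    with DAG(
        dag_id="gcs_to_bigquery_daily",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        load_orders = GCSToBigQueryOperator(
            task_id="load_orders",
            bucket="example-landing-bucket",          # hypothetical bucket
            source_objects=["orders/{{ ds }}.csv"],   # one file per execution date
            destination_project_dataset_table="example_project.staging.orders",
            source_format="CSV",
            skip_leading_rows=1,
            write_disposition="WRITE_TRUNCATE",       # full reload of staging
        )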

Posted 2 days ago

Apply

0.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Inviting applications for the role of Principal Consultant - Data Engineer.

Responsibilities:
- Strong DWH experience.
- Strong in at least one query language such as MS SQL or PL/SQL.
- Good understanding of best practices for cloud database technologies.
- Experience in Snowflake and/or dbt is preferred but not mandatory.
- Experience with CI/CD tools such as GitHub.
- Understanding of batch orchestration tools such as Apache Airflow and Control-M.
- Experience with any kind of data migration project is nice to have.
- Python is nice to have, but DWH skills are a must.

Qualifications we seek in you!

Minimum Qualifications:
- Bachelor's degree in computer science, information technology, or a related field.
- IT experience with a major focus on data warehouse/database-related projects.

Preferred Qualifications/Skills:
- Experience in data warehousing, data modeling, and building data engineering pipelines.
- Well versed in data engineering methods such as ETL and ELT through scripting and/or tooling.
- Good at analyzing performance bottlenecks and providing enhancement recommendations; a passion for customer service and a desire to learn and grow as a professional and a technologist.
- Strong analytical skills for working with structured, semi-structured, and unstructured datasets.
- Collaborating with product owners to identify requirements, define desired outcomes, and deliver trusted results.
- Building processes supporting data transformation, data structures, metadata, dependency, and workload management.
- This role is heavily SQL-focused; an ideal candidate must have hands-on experience with SQL database design, plus Python.
- Demonstrably deep understanding of SQL (advanced level) and analytical data warehouses (Snowflake preferred).
- Demonstrated ability to write new code that is well documented and stored in a version control system (we use GitHub and Bitbucket).
- Extremely talented in applying SCD, CDC, and DQ/DV frameworks.
- Familiar with JIRA and Confluence.
- Must have exposure to technologies such as dbt, Apache Airflow, and Snowflake.
- Desire to continually keep up with advancements in data engineering practices.
- Knowledge of AWS cloud and Python is a plus.
- Experience with other data platforms: Oracle, SQL Server, MDM, etc.
- Expertise in writing SQL and database objects: stored procedures, functions, and views. Hands-on experience in ETL/ELT, data security, SQL performance optimization, and job orchestration tools and technologies (e.g., dbt, APIs, Apache Airflow).
- Experience in data modeling and relational database design.
- Good to have: experience with cloud platforms such as AWS, Azure, GCP, and Snowflake.
- Good to have: strong programming/scripting skills (Python, PowerShell, etc.).
- Experience working with agile methodologies (Scrum, Kanban) and Meta Scrum with cross-functional teams (Product Owners, Scrum Masters, Architects, and data SMEs).
- Excellent written and oral communication and presentation skills to present architecture, features, and solution recommendations to global functional product portfolio technical leaders (Finance, HR, Marketing, Legal, Risk, IT), product owners, functional area teams across levels, and Global Data Product Portfolio Management teams (Enterprise Data Model, Data Catalog, Master Data Management).

Posted 5 days ago

Apply

6.0 - 9.0 years

7 - 16 Lacs

Chennai, Bengaluru

Hybrid

Role & responsibilities: SAP BusinessObjects Data Services Consultant

Preferred candidate profile:
Job Title: Developer
Work Location: Bangalore / Chennai
Skills Required: Digital: SAP BusinessObjects Data Services
Experience Range Required: 6-8 Years

Job Description: SAP BODS Developer, SAP BusinessObjects Data Services
- Strong SAP BODS development experience.
- Strong understanding of dimensional modelling and Star/Snowflake schemas.
- Implementation experience of various SCD types using BODS flows.
- Builds Change Data Capture or computes deltas by various methods.
- Builds Change Data Capture/delta logic using the MD5 hash algorithm (see the sketch below).
- Enables CDC at source.
- Strong in SQL and PL/SQL programming.
- Experience with SAP BODS integration and migration.
- Unification of the data warehouse by consolidating and optimizing two existing data warehouses.
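
The MD5 delta logic named above is normally built inside BODS dataflows; purely to illustrate the idea, here is a hedged Python sketch that hashes the tracked columns of each row and flags rows whose hash differs from the previously stored one. The column names and business key are hypothetical.

    # Sketch of hash-based change detection (CDC/delta), the idea behind the
    # MD5 approach mentioned in the posting. Names are hypothetical.
    import hashlib

    TRACKED_COLUMNS = ["customer_name", "city", "segment"]

    def row_hash(row: dict) -> str:
        """MD5 over the tracked columns, joined with a delimiter."""
        payload = "|".join(str(row.get(col, "")) for col in TRACKED_COLUMNS)
        return hashlib.md5(payload.encode("utf-8")).hexdigest()

    def detect_changes(source_rows, target_hashes):
        """Split incoming rows into inserts (new keys) and updates (changed hashes).

        target_hashes maps the business key to the hash stored at the last load.
        """
        inserts, updates = [], []
        for row in source_rows:
            key = row["customer_id"]  # hypothetical business key
            h = row_hash(row)
            if key not in target_hashes:
                inserts.append(row)
            elif target_hashes[key] != h:
                updates.append(row)
        return inserts, updates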

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

The Reference Data Analyst position at Deutsche Bank in Pune, India focuses on managing strategic data utilized across the bank's systems and infrastructure. The role involves handling reference data, which includes crucial enterprise-wide information such as pricing, securities, financial products, clients, legal entities, and mandates. The Reference Data Analyst is responsible for ensuring the accuracy and quality of data, from its capture to validation and classification, and for implementing controls to optimize its quality and coverage. Additionally, the role involves maintaining processes related to data setup, storage, and distribution, as well as system configuration, and participating in projects to enhance infrastructure efficiency.

Collaboration with various stakeholders such as Front Office, Middle Office, Audit, Compliance, and Regulatory Reporting is crucial for delivering solutions that align with immediate business priorities and long-term strategic objectives. The role also involves supporting daily MIS activities, participating in reviews, updating SOP/KOP, and working closely with different teams and stakeholders in the securities services domain.

Candidates for this role should have knowledge of asset classes like Fixed Income, Equities & Derivatives, familiarity with applications such as SCD, PACE, and Aladdin, and awareness of data vendors like Refinitiv and Bloomberg. Strong collaboration and communication skills, the ability to work within tight deadlines, and meeting Key Performance Indicators (KPIs) are essential for success in this position.

Deutsche Bank offers a range of benefits including a leave policy, gender-neutral parental leaves, a childcare assistance benefit, sponsorship for industry certifications, an employee assistance program, hospitalization insurance, life insurance, and health screening. Training, coaching, and a culture of continuous learning are provided to support employees in their career progression. Deutsche Bank Group aims to create an inclusive work environment where employees are empowered to excel together, act responsibly, think commercially, take initiative, and collaborate effectively. Applicants from all backgrounds are encouraged to apply for positions within the organization.

For more information about Deutsche Bank, visit the company website: https://www.db.com/company/company.htm

Posted 2 weeks ago

Apply

9.0 - 14.0 years

9 - 16 Lacs

Bengaluru

Work from Office

Job Title: Tableau Desktop Developer + Power BI Developer
Location: Bangalore (work from office, at least 3 days a week)
Working Hours: General shift, 9 am to 6 pm IST
Experience: 9-15 years
Employment Type: Full-time
Preferred Industry: Healthcare, Financial Services, or Enterprise IT
No. of positions: 5

Interested candidates can mail CVs to SO00831295@TechMahindra.com. Candidates should be able to join by the end of September at the latest.

The candidate must be proficient in both Tableau and Power BI; SCD (slowly changing dimensions) is also a must-have skill.

Job Description:
- Experience range 9-15 years.
- Design, develop, unit test, and maintain complex Tableau reports.
- Power BI Desktop development.
- Knowledge of row-level security.
- Dashboard optimization.
- Good knowledge of action filters, and of the different filter types and their usage.
- Knowledge of creating different chart types.
- Knowledge of groups, sets, bins, etc.
- Provide technical expertise in architecture, design, and implementation.
- Knowledge of how data is stored in and retrieved from an SCD (slowly changing dimension) is good to have.
- A solid understanding of SQL and relational database management systems.

Posted 3 weeks ago

Apply

5.0 - 6.0 years

8 - 10 Lacs

Bengaluru

Work from Office

We seek a professional to develop ETL pipelines with PySpark, Airflow, and Python, work with large datasets, write Oracle SQL queries, manage schemas, optimize performance, and maintain data warehouses, while guiding the team on scalable solutions.
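
As a rough illustration of how the PySpark and Oracle pieces of this stack fit together, the sketch below reads an Oracle table into a Spark DataFrame over JDBC and writes an aggregated extract; the connection URL, credentials, and table names are placeholders, and the Oracle JDBC driver jar is assumed to be on the Spark classpath.

    # Hedged sketch: read an Oracle table into PySpark over JDBC.
    # URL, credentials, and names are hypothetical; assumes the Oracle
    # JDBC driver jar is available to Spark.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("oracle_extract").getOrCreate()

    orders = (
        spark.read.format("jdbc")
        .option("url", "jdbc:oracle:thin:@//db-host:1521/ORCLPDB1")
        .option("dbtable", "sales.orders")
        .option("user", "etl_user")
        .option("password", "****")
        .option("fetchsize", 10000)  # larger fetch size speeds up bulk extracts
        .load()
    )

    # Simple transformation before loading into the warehouse schema.
    daily_totals = orders.groupBy("order_date").sum("amount")
    daily_totals.write.mode("overwrite").parquet("/staging/daily_totals")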

Posted 3 weeks ago

Apply

6.0 - 10.0 years

22 - 30 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Curious about the role? What would your typical day look like?

As a Senior Data Engineer, you will work to solve organizational data management problems that enable clients to operate as data-driven organizations, seamlessly switching between the roles of individual contributor, team member, and Data Modeling lead as each project demands to define, design, and deliver actionable insights.

On a typical day, you might:
- Engage clients and understand the business requirements to translate them into data models.
- Create and maintain a Logical Data Model (LDM) and Physical Data Model (PDM) by applying best practices to provide business insights.
- Contribute to Data Modeling accelerators.
- Create and maintain the source-to-target data mapping document, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
- Gather and publish data dictionaries.
- Maintain data models, capture data models from existing databases, and record descriptive information.
- Use a data modelling tool to create appropriate data models.
- Contribute to building data warehouses and data marts (on cloud) while performing data profiling and quality analysis.
- Use version control to maintain versions of data models.
- Collaborate with data engineers to design and develop data extraction and integration code modules.
- Partner with data engineers to strategize ingestion logic and consumption patterns.

Job Requirement: Expertise and Qualifications

What do we expect?
- 6+ years of experience in the data space.
- Decent SQL skills.
- Significant experience in one or more RDBMS (Oracle, DB2, and SQL Server).
- Hands-on experience working with OLAP and OLTP database models (dimensional models).
- Good understanding of Star schema, Snowflake schema, and Data Vault modelling, as well as any ETL tool, data governance, and data quality.
- An eye for analyzing data and comfort with following agile methodology.
- An adept understanding of any of the cloud services is preferred (Azure, AWS, and GCP).

You are important to us, let's stay connected! Every individual comes with a different set of skills and qualities, so even if you don't tick all the boxes for the role today, we urge you to apply, as there might be a suitable/unique role for you tomorrow. We are an equal-opportunity employer. Our diverse and inclusive culture and values guide us to listen, trust, respect, and encourage people to grow the way they desire.

Note: The designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry.

Posted 3 weeks ago

Apply

10.0 - 14.0 years

30 - 37 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Curious about the role? What would your typical day look like?

As an Architect, you will work to solve some of the most complex and captivating data management problems that enable clients to operate as data-driven organizations, seamlessly switching between the roles of individual contributor, team member, and Data Modeling Architect as each project demands to define, design, and deliver actionable insights.

On a typical day, you might:
- Engage clients and understand the business requirements to translate them into data models.
- Analyze customer problems, propose solutions from a data-structure perspective, and estimate and deliver the proposed solutions.
- Create and maintain a Logical Data Model (LDM) and Physical Data Model (PDM) by applying best practices to provide business insights.
- Use a data modelling tool to create appropriate data models.
- Create and maintain the source-to-target data mapping document, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
- Gather and publish data dictionaries.
- Ideate, design, and guide the teams in building automations and accelerators.
- Maintain data models, capture data models from existing databases, and record descriptive information.
- Contribute to building data warehouses and data marts (on cloud) while performing data profiling and quality analysis.
- Use version control to maintain versions of data models.
- Collaborate with data engineers to design and develop data extraction and integration code modules.
- Partner with data engineers and testing practitioners to strategize ingestion logic, consumption patterns, and testing.
- Ideate, design, and develop the next-gen data platform by collaborating with cross-functional stakeholders.
- Work with the client to define, establish, and implement the right modelling approach as per the requirement.
- Help define standards and best practices.
- Monitor project progress to keep leadership teams informed of milestones, impediments, etc.
- Coach team members and review code artifacts.
- Contribute to proposals and RFPs.

Job Requirement

What do we expect?
- 10+ years of experience in the data space.
- Decent SQL knowledge.
- Able to suggest modeling approaches for a given problem.
- Significant experience in one or more RDBMS (Oracle, DB2, and SQL Server).
- Hands-on experience working with OLAP and OLTP database models (dimensional models).
- Comprehensive understanding of Star schema, Snowflake schema, and Data Vault modelling, as well as any ETL tool, data governance, and data quality.
- An eye for analyzing data and comfort with following agile methodology.
- An adept understanding of any of the cloud services is preferred (Azure, AWS, and GCP).
- Enthusiasm to coach team members, collaborate with various stakeholders across the organization, and take complete ownership of deliverables.
- Experience in contributing to proposals and RFPs.
- Good experience in stakeholder management.
- Decent communication skills and experience in leading a team.

You are important to us, let's stay connected! Every individual comes with a different set of skills and qualities, so even if you don't tick all the boxes for the role today, we urge you to apply, as there might be a suitable/unique role for you tomorrow. We are an equal-opportunity employer. Our diverse and inclusive culture and values guide us to listen, trust, respect, and encourage people to grow the way they desire.

Note: The designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry.

Posted 3 weeks ago

Apply

0.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Scope: Systems Engineer responsibilities include deploying product updates, identifying production issues, and implementing integrations that meet customer needs. The position works closely with internal teams and is responsible for configuration and troubleshooting of Karix products. Candidates should be familiar with Ruby or Python. Ultimately, you will execute and automate operational processes quickly, accurately, and securely.

Job Location: Hyderabad

What You'd Do:
- Implement integrations requested by internal teams/customers.
- Provide Level 2 technical support for queries coming from internal teams.
- Build tools to reduce occurrences of errors and improve customer experience.
- Perform root cause analysis for production errors.
- Investigate and resolve technical issues; deploy updates and fixes.
- Develop scripts to automate visualization.
- Design procedures for system troubleshooting and maintenance.
- 24x7 support; should be flexible about shifts.

What You'd Have:
- Basic knowledge of ITIL.
- Strong hands-on Linux administration and troubleshooting, scripting, VMware, cron scheduling for backups, and LVM; basics of Java, C++, Ansible, Nginx, Apache, and Tomcat.
- Networking fundamentals (TCP/IP, LAN, WAN, VPN, routing).
- Basic knowledge of UAT-DC-DR environments, VAPT, SCD, SCR, and SAN storage.
- Basic knowledge of MySQL, Oracle, MongoDB, Redis, backup and restore, and Commvault.
- Any other exposure to different technologies will be an advantage.
- Installation of packages, patch management, and upgrades.
- Basic understanding of scripting languages (shell scripting, Python, Perl).
- Knowledge of best practices and IT operations in an always-up, always-available service.

Posted 1 month ago

Apply

3.0 - 6.0 years

4 - 9 Lacs

Mumbai

Work from Office

We are seeking an experienced and detail-oriented System Administrator with a strong background in Vulnerability Assessment (VA) and Secure Configuration Document (SCD) implementation. The ideal candidate will be adept at managing and securing both Linux and Windows environments, applying industry best practices to ensure infrastructure integrity and compliance. Experience with automation tools like Ansible, and scripting in Bash or PowerShell, is essential.

Key Responsibilities:
- Administer and maintain Linux (Red Hat, CentOS, Ubuntu) and Windows Server environments.
- Conduct regular vulnerability assessments using industry-standard tools and remediate identified security issues.
- Apply and enforce secure configurations based on SCD principles, CIS Benchmarks, and other compliance frameworks.
- Perform system hardening, security patching, and routine maintenance to ensure a secure and stable infrastructure.
- Monitor system performance, availability, and security alerts; respond to incidents and anomalies as needed.
- Automate repetitive administrative tasks using Ansible, Bash, PowerShell, or similar tools.
- Collaborate with security, development, and IT teams to enhance the overall security posture.
- Maintain comprehensive documentation including system configurations, SOPs, remediation steps, and compliance reports.

Required Skills & Qualifications:
- Minimum 3 years of hands-on experience as a System Administrator.
- Proficient in Linux distributions (Red Hat, CentOS, Ubuntu) and Microsoft Windows Server environments.
- Practical experience with vulnerability assessment tools (e.g., Nessus, OpenVAS, Qualys).
- Sound knowledge of Secure Configuration Documents (SCD) and compliance frameworks (e.g., CIS, NIST).
- Strong scripting skills in Bash and PowerShell.
- Experience with automation/configuration management tools like Ansible, Puppet, or Chef.
- Good understanding of networking concepts, firewalls, access controls, and system monitoring.
- Ability to troubleshoot complex system issues and work under minimal supervision.
- Excellent communication and documentation skills.

Posted 2 months ago

Apply

3.0 - 6.0 years

4 - 9 Lacs

Mumbai

Work from Office

We are seeking a highly skilled and self-motivated Senior .NET Developer with deep expertise in secure application development. The ideal candidate will combine advanced .NET development skills with a strong foundation in application security, vulnerability assessments (VA), and secure coding practices (SCD). You will play a crucial role in leading a technical team, ensuring application security, and collaborating with stakeholders to deliver secure and high-performance applications.

Key Responsibilities:
- Lead the design, development, and deployment of enterprise-grade applications using C#, ASP.NET, MVC, and .NET Core.
- Ensure secure coding practices in line with OWASP, SANS Top 25, and organizational standards.
- Conduct and guide team members in Vulnerability Assessment (VA) and Secure Code Development (SCD).
- Review and implement modern security headers including CSP, HSTS, X-Content-Type-Options, X-Frame-Options, etc.
- Configure, optimize, and troubleshoot IIS web servers to ensure secure and high-performing hosting environments.
- Collaborate with application owners and conduct security-focused meetings with HODs to communicate risks, mitigation plans, and progress updates.
- Act as a subject matter expert during security audits, assessments, and compliance checks.
- Develop and maintain technical documentation, including security guidelines, server configurations, and incident reports.
- Mentor and lead a team of developers and security analysts to instill a culture of secure development practices.

Required Skills & Qualifications:
- Strong hands-on experience with C#, ASP.NET, MVC, and .NET Core.
- Deep understanding of IIS server architecture and web application deployment best practices.
- Expertise in application security, including VA tools, OWASP Top 10, SANS Top 25, and CWE.
- Proficient in applying HTTP security headers and other web application security mechanisms.
- Excellent analytical, troubleshooting, and problem-solving skills.
- Strong communication and stakeholder management skills.
- Experience in leading teams and working in cross-functional environments.

Preferred Qualifications:
- Certifications such as CEH, OSCP, CSSLP, or Microsoft security certifications.
- Exposure to DevSecOps practices and CI/CD security integration.
- Familiarity with cloud security (Azure/AWS) and containerized application security (Docker/Kubernetes).

Posted 2 months ago

Apply

1.0 - 5.0 years

2 - 5 Lacs

Bengaluru

Work from Office

PLC/DCS/Drives: Emerson Automation Solutions-IP (PAC Systems, GMR/TMR Systems, 90-30 Series, Micro/Nano PLC), DeltaV, Siemens, and ABB 800xA with PM864 controllers and Control Builder programming. Adept at managing any PLC/DCS.

Control Software: Proficy suite, iFIX, GMR - GEIP, ABB 800xA Control Builder, Siemens TIA Portal, Aveva Plant SCADA (with ASM standards), Schneider Unity Pro, IA Ignition SCADA, PTC VT SCADA.

Communication Protocols: Modbus, Profinet, IEC 61850, DNP, HART, OPC, etc.

Field Instruments and PLC Hardware: Proficient analysis of field instruments such as pressure/temperature/flow/level switches and transmitters, fire and gas detection systems, and limit switches; installation support and testing; PLC I/O wiring support and testing; configuring field instruments for the PLC I/Os; developing and testing logic for the same; FAT/SAT support; and review and rectification of customer queries during FAT/SAT. Proficient documentation skills for SCDs, C&Es, instrument and cable lists, I/O tag databases, control systems, instruments, and valves.

Posted 2 months ago

Apply

5.0 - 9.0 years

15 - 22 Lacs

Chennai

Work from Office

We are looking for a skilled and motivated Senior Data Engineer to join our data integration and analytics team. The ideal candidate will have hands-on experience with Informatica IICS, AWS Redshift, Python scripting, and Unix/Linux systems. You will be responsible for building and maintaining scalable ETL pipelines to support business intelligence and analytics needs. We value individuals who are passionate about continuous learning, problem-solving, and enabling data-driven decision-making.

Years of Experience: Min. 5 years (with 3+ years of hands-on experience in Informatica IICS: Cloud Data Integration and Application Integration)
Primary Skills: Informatica IICS, AWS (especially Redshift)
Secondary Skills: Python, Unix/Linux

Role Description: As a Senior Data Engineer, you will lead the design, development, and management of scalable data platforms and pipelines. This role demands a strong technical foundation in data architecture, big data technologies, and database systems (both SQL and NoSQL), along with the ability to collaborate across functional teams to deliver robust, secure, and high-performing data solutions.

Key Responsibilities:
- Design, develop, and maintain end-to-end data pipelines and infrastructure.
- Translate business and functional requirements into scalable, well-documented technical solutions.
- Build and manage data flows across structured and unstructured data sources, including streaming and batch integrations.
- Ensure data integrity and quality through automated validations, unit testing, and comprehensive documentation (see the validation sketch below).
- Optimize data processing performance and manage large datasets efficiently.
- Collaborate closely with stakeholders and project teams to align data solutions with business objectives.
- Implement and maintain security and privacy protocols to ensure safe data handling.
- Set up development environments and configure tools and services.
- Mentor junior data engineers and contribute to continuous improvement and automation initiatives.
- Coordinate with QA and UAT teams during testing and release phases.

Role Requirements:
- Strong proficiency in SQL, including procedures, performance tuning, and analytical functions.
- Solid understanding of data warehousing concepts, including dimensional modeling and slowly changing dimensions (SCDs).
- Hands-on experience with scripting languages (Shell/PowerShell).
- Proficiency in data profiling, validation, and testing practices.
- Excellent problem-solving, communication (written and verbal), and documentation skills.
- Exposure to Agile methodologies and CI/CD practices.

Additional Requirements:
- Overall 5+ years of experience, with 3+ years of hands-on experience in Informatica IICS (Cloud Data Integration, Application Integration).
- Strong proficiency in AWS Redshift and writing complex SQL queries.
- Solid programming experience in Python for scripting, data wrangling, and automation.
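
To make the automated-validation responsibility concrete, here is a hedged Python sketch of a row-count reconciliation against a Redshift target using psycopg2 (Redshift speaks the PostgreSQL wire protocol). The connection parameters, table names, and counts are illustrative placeholders, not details from the posting.

    # Hedged sketch: row-count reconciliation against a Redshift target.
    # Connection parameters and table names are hypothetical placeholders.
    import psycopg2

    def redshift_row_count(table: str) -> int:
        conn = psycopg2.connect(
            host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
            port=5439,
            dbname="analytics",
            user="validator",
            password="****",
        )
        try:
            with conn.cursor() as cur:
                # Table name is interpolated for brevity; a real check would
                # validate it against an allow-list first.
                cur.execute(f"SELECT COUNT(*) FROM {table}")
                return cur.fetchone()[0]
        finally:
            conn.close()

    def validate_load(source_count: int, table: str) -> None:
        """Fail loudly if the target row count drifts from the source."""
        target_count = redshift_row_count(table)
        if source_count != target_count:
            raise ValueError(
                f"Load validation failed for {table}: "
                f"source={source_count}, target={target_count}"
            )

    # Example: compare the count reported by the upstream job with the target.
    validate_load(source_count=125_000, table="staging.orders")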

Posted 2 months ago

Apply

1.0 - 4.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Job Title: Reconciliation Production Analyst, NCT
Location: Bangalore, India

Role Description: The role requires the individual to manage cash publishing and reconciliation (for cash, custody, and intersystem positions) for a set of portfolios. The cash publishing process ensures the right cash projections reach the front office for their investment decisions. Cash publishing is a sensitive process and requires transaction-based research on a list of portfolios where errors will most likely result in financial/operational risk; hence it is very important to fill the position so that we have adequate time to train the person and avoid any impact on the business. A good understanding of reconciliation across various product classes, such as equities and bonds, including end-to-end investigation of discrepancies between us and external parties, is required.

What we'll offer you:
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities:
- Conduct cash publishing and reconciliation on the breaks
- Timely follow-ups on open breaks
- Securities and OTC reconciliation
- Good team player
- Preparing daily MIS

Your skills and experience:
- Reconciliations on cash and positions
- Hands-on experience with TLM, Aladdin, and SCD
- Should be able to understand the accounting vs. investment book of records

Experience/Qualifications: Bachelor's degree with 1-4 years of experience.

About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm

Posted 2 months ago

Apply

0.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant - Snowflake Data Engineer (Python + Cloud)! In this role, the Snowflake Data Engineer is responsible for providing technical direction and leading a group of one or more developers to address a goal.

Job Description:
- Experience in the IT industry.
- Working experience building productionized data ingestion and processing pipelines in Snowflake.
- Strong understanding of Snowflake architecture.
- Fully well-versed in data warehousing concepts.
- Expertise and excellent understanding of Snowflake features and the integration of Snowflake with other data processing tools.
- Able to create data pipelines for ETL/ELT.
- Excellent presentation and communication skills, both written and verbal.
- Ability to problem-solve and architect in an environment with unclear requirements.
- Able to create high-level and low-level design documents based on requirements.
- Hands-on experience in configuring, troubleshooting, testing, and managing data platforms, on premises or in the cloud.
- Awareness of data visualisation tools and methodologies.
- Works independently on business problems and generates meaningful insights.
- Good to have some experience/knowledge of Snowpark, Streamlit, or GenAI, but not mandatory.
- Should have experience implementing Snowflake best practices.
- Snowflake SnowPro Core Certification is a must.

Roles and Responsibilities:
- Requirement gathering, creating design documents, providing solutions to customers, working with offshore teams, etc.
- Writing SQL queries against Snowflake and developing scripts to extract, load, and transform data.
- Hands-on experience with Snowflake utilities such as SnowSQL, bulk copy, Snowpipe, Tasks, Streams, Time Travel, cloning, the optimizer, Metadata Manager, data sharing, stored procedures and UDFs, and Snowsight.
- Experience with the Snowflake cloud data warehouse and AWS S3 buckets or Azure Blob Storage containers for integrating data from multiple source systems.
- Some experience with AWS services (S3, Glue, Lambda) or Azure services (Blob Storage, ADLS Gen2, ADF).
- Good experience in Python/PySpark integration with Snowflake and cloud (AWS/Azure), with the ability to leverage cloud services for data processing and storage.
- Proficiency in the Python programming language, including knowledge of data types, variables, functions, loops, conditionals, and other Python-specific concepts.
- Knowledge of ETL (extract, transform, load) processes and tools, and the ability to design and develop efficient ETL jobs using Python and PySpark.
- Some experience with Snowflake RBAC and data security.
- Good experience implementing CDC or SCD Type 2 (see the sketch below).
- In-depth understanding of data warehouse and ETL concepts and data modelling.
- Experience in requirement gathering, analysis, design, development, and deployment.
- Experience building data ingestion pipelines; able to optimize and tune data pipelines for performance and scalability.
- Able to communicate with clients and lead a team.
- Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs.
- Good to have experience in deployment using CI/CD tools and experience with repositories like Azure Repos, GitHub, etc.

Qualifications we seek in you! Minimum qualifications: B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or any equivalent degree with good IT experience relevant to a Snowflake Data Engineer role.

Skill Matrix: Snowflake, Python/PySpark, AWS/Azure, ETL concepts, and data warehousing concepts.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Follow us on Twitter, Facebook, LinkedIn, and YouTube. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
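
As an illustration of the SCD Type 2 requirement in a Snowflake context, the sketch below expires the current version of changed records and inserts new versions via the Python connector. Table and column names are hypothetical, and a real pipeline would typically drive this from Streams/Tasks or an orchestration tool rather than a standalone script.

    # Hedged sketch: SCD Type 2 upkeep in Snowflake from Python.
    # Step 1 expires current rows whose tracked attributes changed (via a
    # precomputed hash column); step 2 inserts new current versions.
    # Table and column names are hypothetical placeholders.
    import snowflake.connector

    EXPIRE_CHANGED_ROWS = """
    MERGE INTO dim_customer d
    USING stg_customer s
      ON d.customer_id = s.customer_id AND d.is_current = TRUE
    WHEN MATCHED AND d.attr_hash <> s.attr_hash THEN UPDATE
      SET d.is_current = FALSE, d.effective_to = CURRENT_TIMESTAMP()
    """

    INSERT_NEW_VERSIONS = """
    INSERT INTO dim_customer (customer_id, name, city, attr_hash,
                              effective_from, effective_to, is_current)
    SELECT s.customer_id, s.name, s.city, s.attr_hash,
           CURRENT_TIMESTAMP(), NULL, TRUE
    FROM stg_customer s
    LEFT JOIN dim_customer d
      ON d.customer_id = s.customer_id AND d.is_current = TRUE
    WHERE d.customer_id IS NULL
    """

    conn = snowflake.connector.connect(
        account="example_account", user="etl_user", password="****",
        warehouse="ETL_WH", database="ANALYTICS", schema="CORE",
    )
    cur = conn.cursor()
    try:
        cur.execute(EXPIRE_CHANGED_ROWS)   # close out changed current rows
        cur.execute(INSERT_NEW_VERSIONS)   # add new versions and brand-new keys
    finally:
        cur.close()
        conn.close()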

Posted 2 months ago

Apply

0.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Inviting applications for the role of Lead Consultant - Data Engineer!

Responsibilities:
- Design, document, and implement data pipelines to feed data models for subsequent consumption in Snowflake, using dbt and Airflow.
- Ensure correctness and completeness of the data being transformed via engineering pipelines for end consumption in analytical dashboards.
- Actively monitor and triage technical challenges in critical situations that require immediate resolution.
- Evaluate viable technical solutions and share MVPs or PoCs in support of the research.
- Develop relationships with external stakeholders to maintain awareness of data and security issues and trends.
- Review work from other tech team members and provide feedback for growth.
- Implement data performance and data security policies that align with governance objectives and regulatory requirements.
- Effectively mentor and develop your team members.

You have experience in data warehousing, data modeling, and building data engineering pipelines. You are well versed in data engineering methods such as ETL and ELT through scripting and/or tooling. You are good at analyzing performance bottlenecks and providing enhancement recommendations. You have a passion for customer service and a desire to learn and grow as a professional and a technologist.

- Strong analytical skills for working with structured, semi-structured, and unstructured datasets.
- Collaborating with product owners to identify requirements, define desired outcomes, and deliver trusted results.
- Building processes supporting data transformation, data structures, metadata, dependency, and workload management.
- This role is heavily SQL-focused; an ideal candidate must have hands-on experience with SQL database design, plus Python.
- Demonstrably deep understanding of SQL (advanced level) and analytical data warehouses (Snowflake preferred).
- Demonstrated ability to write new code that is well documented and stored in a version control system (we use GitHub and Bitbucket).
- Extremely talented in applying SCD, CDC, and DQ/DV frameworks.
- Familiar with JIRA and Confluence.
- Must have exposure to technologies such as dbt, Apache Airflow, and Snowflake.
- Desire to continually keep up with advancements in data engineering practices.

Qualifications we seek in you!

Minimum qualifications:
- Essential education: Bachelor's degree or an equivalent combination of education and experience. A Bachelor's degree in information science, data management, computer science, or a related field is preferred.

Essential experience and job requirements:
- IT experience with a major focus on data warehouse/database-related projects.
- Must have exposure to technologies such as dbt, Apache Airflow, and Snowflake.
- Experience with other data platforms: Oracle, SQL Server, MDM, etc.
- Expertise in writing SQL and database objects: stored procedures, functions, and views. Hands-on experience in ETL/ELT, data security, SQL performance optimization, and job orchestration tools and technologies (e.g., dbt, APIs, Apache Airflow).
- Experience in data modeling and relational database design.
- Well versed in applying SCD, CDC, and DQ/DV frameworks.
- Good to have experience with cloud platforms such as AWS, Azure, GCP, and Snowflake.
- Good to have strong programming/scripting skills (Python, PowerShell, etc.).
- Experience working with agile methodologies (Scrum, Kanban) and Meta Scrum with cross-functional teams (Product Owners, Scrum Masters, Architects, and data SMEs).
- Excellent written and oral communication and presentation skills to present architecture, features, and solution recommendations to global functional product portfolio technical leaders (Finance, HR, Marketing, Legal, Risk, IT), product owners, functional area teams across levels, and Global Data Product Portfolio Management teams (Enterprise Data Model, Data Catalog, Master Data Management).

Preferred qualifications: Knowledge of AWS cloud and Python is a plus.

Posted 3 months ago

Apply

7.0 - 10.0 years

10 - 16 Lacs

Bengaluru

Hybrid

Job Description: The candidate should be a result-oriented, self-starting, and self-motivated individual who can manage multiple time-sensitive assignments, with expert data modeling skills including conceptual, logical, and physical model design. The candidate should be able to participate in the development of Business Intelligence solutions to meet the needs of the commercial organizations (Finance, CRM, Sales, Sales Operations, Marketing); evaluate business requirements, conduct a PoC, and proactively call out data anomalies within the source system; perform end-to-end data validation for a solution, ensuring quality data delivery; and develop and performance-tune complex SQL (PL/SQL and/or T-SQL) queries and stored procedures. Should be well versed with project tracking tools like Jira and GitHub. Able to automate batch processes using any scripting language such as Perl, Unix shell, or Python. Experience with source systems like Salesforce (SOQL), NetSuite, or any other CRM is a plus. The candidate should be able to create strategic designs, map business requirements to system/technical requirements, and communicate, translate, and simplify business requirements to ensure buy-in from all stakeholders.

Requirements:
- 7-10 years of experience with database development using packages/stored procedures/triggers in SQL and PL/SQL.
- 7-10 years of experience in dimensional modeling, Star and Snowflake schema design, and maintaining dimensions using Slowly Changing Dimensions (SCD).
- 7+ years of experience in SQL performance tuning and optimization techniques.
- Ability to work on ad-hoc requests and on simultaneous projects or tasks.
- Effective and positive communication and teamworking skills.

Posted 3 months ago

Apply

1.0 - 2.0 years

4 - 9 Lacs

Pune, Chennai, Bengaluru

Work from Office

Job Title: Data Engineer
Experience: 12 to 20 months
Work Mode: Work from Office
Locations: Bangalore, Chennai, Kolkata, Pune, Gurgaon

Role Overview: We are seeking a driven and hands-on Data Engineer with 12 to 20 months of experience to support modern data pipeline development and transformation initiatives. The role requires solid technical skills in SQL, Python, and PySpark, with exposure to cloud platforms such as Azure Databricks or GCP. As a Data Engineer at Tredence, you will work on ingesting, processing, and modeling large-scale data, implementing scalable data pipelines, and applying foundational data warehousing principles. This role also includes direct collaboration with cross-functional teams and client stakeholders.

Key Responsibilities:
- Develop robust and scalable data pipelines using PySpark on cloud platforms like Azure Databricks or GCP Dataflow.
- Write optimized SQL queries for data transformation, analysis, and validation.
- Implement and support data warehouse models and principles, including fact and dimension modeling, Star and Snowflake schemas, Slowly Changing Dimensions (SCD), Change Data Capture (CDC), and the medallion architecture (see the sketch after this listing).
- Monitor, troubleshoot, and improve pipeline performance and data quality.
- Work with teams across analytics, business, and IT functions to deliver data-driven solutions.
- Communicate technical updates and contribute to sprint-level delivery.

Mandatory Skills:
- Strong hands-on experience with SQL and Python.
- Working knowledge of PySpark for data transformation.
- Exposure to at least one cloud platform: Azure Databricks or GCP.
- Good understanding of data engineering and warehousing fundamentals.
- Excellent debugging and problem-solving skills.
- Strong written and verbal communication skills.

Preferred Skills:
- Experience with the Databricks Community Edition or enterprise version.
- Familiarity with data orchestration tools like Airflow or Azure Data Factory.
- Exposure to CI/CD processes and version control (e.g., Git).
- Understanding of Agile/Scrum methodology and collaborative development.
- Basic knowledge of handling structured and semi-structured data (JSON, Parquet, etc.).
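
For the medallion principle listed above, here is a hedged PySpark sketch of a bronze-to-silver hop: raw JSON lands in bronze as-is, and silver keeps one deduplicated row per key. The paths and column names are illustrative, and Delta Lake support (as on Databricks) is assumed.

    # Hedged sketch: bronze-to-silver step in a medallion architecture.
    # Paths and column names are hypothetical; assumes Delta Lake is available.
    from pyspark.sql import SparkSession, functions as F, Window

    spark = SparkSession.builder.appName("medallion_demo").getOrCreate()

    # Bronze: ingest raw JSON unchanged, stamped with the load time.
    raw = spark.read.json("/landing/events/")
    raw.withColumn("_ingested_at", F.current_timestamp()) \
       .write.format("delta").mode("append").save("/bronze/events")

    # Silver: keep the latest record per business key (a simple dedup/CDC step).
    bronze = spark.read.format("delta").load("/bronze/events")
    latest = Window.partitionBy("event_id").orderBy(F.col("_ingested_at").desc())
    silver = (
        bronze.withColumn("_rn", F.row_number().over(latest))
        .filter("_rn = 1")
        .drop("_rn")
    )
    silver.write.format("delta").mode("overwrite").save("/silver/events")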

Posted 3 months ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Dear Candidate,

We are hiring a Data Warehouse Architect to design scalable, high-performance data warehouse solutions for analytics and reporting. Perfect for engineers experienced with large-scale data systems.

Key Responsibilities:
- Design and maintain enterprise data warehouse architecture.
- Optimize ETL/ELT pipelines and data modeling (star/snowflake schemas).
- Ensure data quality, security, and performance.
- Work with BI teams to support analytics and reporting needs.

Required Skills & Qualifications:
- Proficiency with SQL and data warehousing tools (Snowflake, Redshift, BigQuery, etc.).
- Experience with ETL frameworks (Informatica, Apache NiFi, dbt, etc.).
- Strong understanding of dimensional modeling and OLAP.
- Bonus: knowledge of cloud data platforms and orchestration tools (Airflow).

Note: If interested, please share your updated resume and a preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager
Integra Technologies

Posted 3 months ago

Apply

7.0 - 12.0 years

0 - 2 Lacs

Pune, Ahmedabad, Gurugram

Work from Office

Urgent Hiring: Azure Data Engineer (Strong PySpark + SCD II/III Expert)
Work Mode: Remote
Client-focused interview on PySpark + SCD II/III

Key Must-Haves:
- Very strong hands-on PySpark coding.
- Practical experience implementing Slowly Changing Dimensions (SCD) Type II and Type III (see the sketch below).
- Strong expertise in Azure data engineering (ADF, Databricks, Data Lake, Synapse).
- Proficiency in SQL and Python for scripting and transformation.
- Strong understanding of data warehousing concepts and ETL pipelines.

Good to Have:
- Experience with Microsoft Fabric.
- Familiarity with Power BI.
- Domain knowledge in Finance, Procurement, and Human Capital.

Note: This role is highly technical. The client will focus interviews on PySpark coding and SCD Type II/III implementation. Only share profiles that are hands-on and experienced in these areas.

Share strong, relevant profiles to: b.simrana@ekloudservices.com
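
Since interviews here focus on PySpark implementations of SCD Type II, below is a hedged sketch of one common DataFrame-based approach: detect changed keys via a hash column, close out their current rows, and append new current versions. All table and column names are hypothetical, and real Azure pipelines would typically run this against Delta tables in Databricks.

    # Hedged sketch of SCD Type II in PySpark. Assumes the staging table
    # carries the same business key and attribute/hash columns as the
    # dimension, minus the SCD bookkeeping columns added below.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("scd2_sketch").getOrCreate()

    dim = spark.table("dim_customer")      # history + current rows
    stg = spark.table("stg_customer")      # today's snapshot
    now = F.current_timestamp()

    current = dim.filter(F.col("is_current"))
    changed_keys = (
        current.alias("d").join(stg.alias("s"), "customer_id")
        .filter(F.col("d.attr_hash") != F.col("s.attr_hash"))
        .select("customer_id")
    )

    # Close out the current version of every changed key.
    flagged = dim.join(changed_keys.withColumn("_chg", F.lit(True)),
                       "customer_id", "left")
    closed = (
        flagged
        .withColumn("_close", F.coalesce("_chg", F.lit(False)) & F.col("is_current"))
        .withColumn("effective_to", F.when(F.col("_close"), now)
                                     .otherwise(F.col("effective_to")))
        .withColumn("is_current", F.when(F.col("_close"), F.lit(False))
                                   .otherwise(F.col("is_current")))
        .drop("_chg", "_close")
    )

    # New versions: changed keys plus keys never seen before.
    brand_new = stg.join(current, "customer_id", "left_anti")
    to_insert = stg.join(changed_keys, "customer_id", "left_semi").unionByName(brand_new)
    new_rows = (
        to_insert
        .withColumn("effective_from", now)
        .withColumn("effective_to", F.lit(None).cast("timestamp"))
        .withColumn("is_current", F.lit(True))
    )

    # Write to a new table so we never overwrite a table still being read.
    closed.unionByName(new_rows).write.mode("overwrite").saveAsTable("dim_customer_new")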

Posted 3 months ago

Apply

1.0 - 4.0 years

5 - 9 Lacs

Bengaluru

Work from Office

About The Role : Job TitleOperations Analyst, NCT LocationBangalore, India Role Description The Analyst / Sr. Analyst will be responsible for completion of day-to-day activity as per standards and ensure accurate and timely delivery of assigned production duties. Candidate needs to ensure adherence to all cut-off times and quality of processing as maintained in SLAs. Candidate should ensure that all queries/first level escalations related to routine activities are responded to within the time frames pre-specified. Should take responsibility and act as backup for the Peers in their absence and share best practices with the team. What we'll offer you As part of our flexible scheme, here are just some of the benefits that youll enjoy, Best in class leave policy. Gender neutral parental leaves 100% reimbursement under childcare assistance benefit (gender neutral) Sponsorship for Industry relevant certifications and education Employee Assistance Program for you and your family members Comprehensive Hospitalization Insurance for you and your dependents Accident and Term life Insurance Complementary Health screening for 35 yrs. and above Your key responsibilities Manage daily Reconciliation of Securities/Cash Internal book of records V/s Custodian Books. Basic knowledge of Daily Uploads of feeds and its Maintenance. Investigating Margin differences / Tax related differences. Research and bookings of Dummy forexes Manage cash reconciliation between Aladdin and the custodian feeds on trade and Currency level. Identify the cause and assign the cash/position break to correct team for further investigation & resolution. Perform primary investigation on the cash/position breaks on Aladdin escalate all issues properly, in time, to the appropriate level, to avoid any adverse impact on the business Responsible for understanding clients needs from a technical and operational perspective Ensure support for managing internal projects/initiatives, Timely response to all front office/ Internal queries Ensure strict adherence to all internal and external process guidelines including compliance and legal. Ensure candidate has assisted in creating proper backups through adequate cross training, within the department Your skills and experience Experience in handling Cash and Position reconciliation.(Preferred) Knowledge of Trade Life Cycle. Preferred Knowledge of Financial products like Debt, Equity, Derivatives etc. Functional Skills: Have Working knowledge of SSR/TLM/SCD/Aladdin reconciliation tool Cognos reporting Have basic knowledge of Reconciliation process and understand various (ledger and statement) feeds/swifts. Have experience of Bank Custody, FOBO reconciliation. Knowledge of Trade Life Cycle of various financial products will be an advantage. Have Working knowledge of SSR/TLM reconciliation tool. Attention to Details. Skills Needs to be a self-starter with significant ability to undertake initiatives. Strong interpersonal / good negotiations skills are required. Follow through skills, Effective communication skills, fluency in Microsoft Office skills, ability to confidently handle internal clients, futuristic and innovative approach will be expected. Ability and willingness to work in night shift is a must. Education / Certification Qualification Graduates with good academic records. Any certifications in securities such as NCFM modules, will be good but not compulsory. How we'll support you Training and development to help you excel in your career. Coaching and support from experts in your team. 
A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs. About us and our teams Please visit our company website for further information: https://www.db.com/company/company.htm

Posted 3 months ago

Apply

5.0 - 10.0 years

12 - 22 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Job Title:
=======
Senior MS BI Developer

Onsite Location:
=============
Dubai, UAE / Doha, Qatar / Riyadh, Saudi Arabia

Onsite Monthly Salary:
==============
10k AED - 15k AED, fully tax-free salary, depending on experience
Gulf work permit will be sponsored by our client

Project duration:
=============
2 years, extendable

Desired Experience Level Needed:
===========================
5 - 10 years

Qualification:
==========
B.Tech / M.Tech / MCA / M.Sc or equivalent

Experience Needed:
===============
Overall: 5 or more years of total IT experience
Solid 3+ years of experience as an MS BI Developer with the Microsoft stack / MS DWH Engineer

Job Responsibilities:
================
- Design and develop DWH data flows
- Able to build SCD-1 / SCD-2 / SCD-3 dimensions
- Build cubes
- Maintain SSAS / DWH data
- Design Microsoft DWH and its ETL packages
- Able to code T-SQL
- Able to create orchestrations
- Able to design batch jobs / orchestration runs
- Familiarity with data models
- Able to develop MDM (Master Data Management)

Experience:
================
- Experience as a DWH developer with Microsoft DWH data flows and cubes
- Exposure to and experience with Azure services, including Azure Data Factory
- Sound knowledge of BI practices and visualization tools such as Power BI / SSRS / QlikView
- Collecting/gathering data from multiple source systems
- Creating automated data pipelines
- Configuring Azure resources and services

Skills:
================
- Microsoft SSIS / SSAS / SSRS
- Informatica
- Azure Data Factory
- Spark
- SQL

Nice to have:
==========
- Any onsite experience is an added advantage, but not mandatory
- Microsoft certifications are an added advantage

Business Vertical:
==============
- Banking / Investment Banking
- Capital Markets
- Securities / Stock Market Trading
- Bonds / Forex Trading
- Credit Risk
- Payments Cards Industry (VISA / Master Card / Amex)

Job Code:
======
MSBI_DEVP_0525

No. of positions:
============
05

Email:
=====
spectrumconsulting1977@gmail.com

If you are interested, please email your CV as an attachment with the job ref. code [ MSBI_DEVP_0525 ] as the subject.

Posted 3 months ago

Apply

9 - 11 years

37 - 40 Lacs

Ahmedabad, Bengaluru, Mumbai (All Areas)

Work from Office

Dear Candidate,

We are hiring a Data Engineer to build scalable data pipelines and infrastructure to power analytics and machine learning. Ideal for those passionate about data integrity, automation, and performance.

Key Responsibilities:
- Design ETL/ELT pipelines using tools like Airflow or dbt.
- Build data lakes and warehouses (BigQuery, Redshift, Snowflake).
- Automate data quality checks and monitoring.
- Collaborate with analysts, data scientists, and backend teams.
- Optimize data flows for performance and cost.

Required Skills & Qualifications:
- Proficiency in SQL, Python, and distributed systems (e.g., Spark).
- Experience with cloud data platforms (AWS, GCP, or Azure).
- Strong understanding of data modeling and warehousing principles.
- Bonus: experience with Kafka, Parquet/Avro, or real-time streaming.

Soft Skills:
- Strong troubleshooting and problem-solving skills.
- Ability to work independently and in a team.
- Excellent communication and documentation skills.

Note: If interested, please share your updated resume and a preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager
Integra Technologies

Posted 4 months ago

Apply

5 - 8 years

14 - 18 Lacs

Hyderabad

Work from Office

Role & responsibilities:
- Plan, develop, and coordinate test activities, including creation and execution of test plans and test cases.
- Perform debugging, defect tracking, test analysis, and documentation.
- Understand the business functionality and application technology under test.
- Collaborate with on-site teams and other stream areas during release cycles.
- Utilize ESG QA tools, methodologies, and processes.
- Ensure low bug rates and high code quality during releases.
- Manage build deployments to QA and flag risks/issues proactively.

Skills Required:
- Experience with SQL and ETL testing, schema validation, and SCD types.
- Strong knowledge of data warehouse/BI testing and cloud-based services (Azure).
- Expertise in writing complex SQL queries and validating data during migration.
- Proficient in UFT, TFS, Microsoft tools, and peripheral technologies (SAP, PeopleSoft, Aderant).
- Strong communication, estimation, and project delivery skills.
- Team leadership, remote collaboration, and a focus on quality.

Interested candidates can share their resume at sarvani.j@ifinglobalgroup.com

Posted 4 months ago

Apply