
1055 ETL Processes Jobs - Page 28

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

As a data engineering professional at our leading organization in the data engineering sector, you will be instrumental in designing and developing scalable data pipelines using Python and PySpark. Your primary responsibilities will include ingesting, processing, and storing large volumes of structured and unstructured data to ensure data accuracy, integrity, and accessibility for analytical projects. Collaboration with cross-functional teams to understand data needs and deliver data-driven solutions that meet business objectives will be a key aspect of your role. Your expertise in optimizing data models and queries for performance, leveraging database technology, will be essential in enhancing data quality and performance. Additionally, maintaining version control using Git and adhering to best practices in coding and documentation will be crucial to streamlined processes. Monitoring and troubleshooting data workflows to meet reliability and performance standards will also be part of your responsibilities.

The ideal candidate must possess a minimum of 5 years of experience in data engineering with a strong focus on Python and PySpark. Proficiency in developing data pipelines and ETL processes, along with a solid understanding of SQL databases and data modeling concepts, is required. Experience with cloud technologies such as AWS, Azure, and GCP for data storage and processing is essential. Strong analytical and problem-solving skills will enable you to drive data quality and performance enhancements effectively.

Preferred qualifications include familiarity with containerization tools like Docker and orchestration frameworks, as well as knowledge of Big Data technologies such as Hadoop or Spark. Experience working in Agile development environments and using CI/CD practices will be advantageous.

Joining our dynamic work environment will offer opportunities for professional development and growth within the company. Our inclusive culture is centered around team success and respect for individual contributions, making it a rewarding place to work.

Key Skills: Python, PySpark, SQL, AWS, Azure, GCP, ETL, Data Modeling, Git, Docker, Hadoop, Spark, Cloud Technologies, Performance Tuning, CI/CD Practices.
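For a concrete flavor of the pipeline work this posting describes, here is a minimal, hypothetical PySpark sketch; the source and target paths, column names, and dedupe key are illustrative assumptions, not details from the listing.

```python
# Hypothetical sketch of a PySpark ingestion job of the kind described:
# read raw CSV, clean and deduplicate, then write partitioned Parquet.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_ingest").getOrCreate()

raw = (spark.read
       .option("header", True)
       .option("inferSchema", True)
       .csv("s3a://raw-bucket/orders/"))            # assumed source path

clean = (raw
         .dropDuplicates(["order_id"])              # assumed business key
         .withColumn("order_ts", F.to_timestamp("order_ts"))
         .withColumn("order_date", F.to_date("order_ts"))
         .filter(F.col("amount") > 0))              # basic integrity rule

(clean.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3a://curated-bucket/orders/"))     # assumed target path
```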

Posted 2 months ago

Apply

3.0 - 7.0 years

0 Lacs

chennai, tamil nadu

On-site

As a Data Engineer specializing in Databricks, your primary responsibility will be to develop, support, and drive end-to-end business intelligence solutions using Databricks. You will collaborate with business analysts and data architects to transform requirements into technical implementations. Your role will involve designing, developing, implementing, and maintaining PySpark code through the Databricks UI to facilitate data and analytics use cases for the client. Additionally, you will code, test, and document new or enhanced data systems to build robust and scalable applications for data analytics. You will also investigate performance, scalability, capacity, and reliability issues to identify and address any challenges that arise, and you will engage in research projects and proofs of concept to enhance data processing capabilities.

Key Requirements:
- 3+ years of hands-on experience with Databricks and PySpark.
- Proficiency in SQL and adept data manipulation skills.
- Sound understanding of data warehousing concepts and technologies.
- Familiarity with Google Pub/Sub, Kafka, or MongoDB is a plus.
- Knowledge of ETL processes and tools for data extraction, transformation, and loading would be beneficial.
- Experience with cloud platforms such as Databricks, Snowflake, or Google Cloud.
- Understanding of data governance and data quality best practices.

Qualifications:
- Bachelor's degree in computer science, engineering, or a related field.
- Continuous learning demonstrated through technical certifications or related methods.
- 3+ years of relevant experience in Data Analytics, preferably within the Retail domain.

Desired Qualities:
- Self-motivated and dedicated to achieving outcomes for a rapidly growing team and organization.
- Effective communication skills through verbal, written, and client presentations.

Location: India
Years of Experience: 3 to 5 years

In this role, your expertise in Databricks and data engineering will play a crucial part in driving impactful business intelligence solutions and contributing to the growth and success of the organization.
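As an illustration of the Databricks-style PySpark work mentioned above, the following hedged sketch registers a DataFrame as a temporary view and aggregates it with Spark SQL; the table and column names are assumptions for the example.

```python
# Hypothetical sketch: a notebook-style PySpark + Spark SQL aggregate.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()   # provided automatically in Databricks

sales = spark.read.table("retail.sales_transactions")   # assumed table
sales.createOrReplaceTempView("sales")

weekly = spark.sql("""
    SELECT store_id,
           date_trunc('week', sold_at) AS week,
           SUM(amount)                 AS revenue
    FROM   sales
    GROUP  BY store_id, date_trunc('week', sold_at)
""")
weekly.write.mode("overwrite").saveAsTable("retail.weekly_revenue")
```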

Posted 2 months ago

Apply

8.0 - 12.0 years

0 Lacs

pune, maharashtra

On-site

As a Data Analyst - Technical Data PM in Pune with 10-15 years of experience, your key responsibilities will include leading the end-to-end project management process, developing comprehensive project plans, ensuring project deliverables meet quality standards and are completed on time and within budget, and monitoring and reporting on project progress while identifying and mitigating risks as they arise. You will provide technical guidance and support to the project team, particularly in areas of Data Engineering, collaborate with technical leads and architects to ensure alignment of technical solutions with business objectives, and drive continuous improvement in project delivery processes and methodologies.

Engaging with stakeholders to gather requirements, define project scope, and establish clear project goals will be crucial. You will facilitate regular communication and reporting to stakeholders, ensuring transparency and alignment, manage stakeholder expectations, and address any issues or concerns promptly. Team management will also be a key aspect of your role, as you lead and mentor a multidisciplinary team of developers, engineers, and analysts, foster a collaborative and high-performance team environment, conduct regular performance reviews, and provide constructive feedback to team members.

Your technical competencies should include a strong understanding of data engineering principles, proficiency in SQL and experience with database management systems (e.g., MySQL, PostgreSQL, Oracle), experience in Java development, knowledge of software development best practices, and experience with version control systems and continuous integration/continuous deployment pipelines.

Qualifications for this role include a Bachelor's or Master's degree in Computer Science, Engineering, or a related field; 8 to 10 years of experience in technical project management; proven experience in managing large-scale, complex projects with multiple stakeholders; strong analytical and problem-solving skills; and excellent communication and interpersonal skills. A PMP, PRINCE2, or similar project management certification is a plus.

Posted 2 months ago

Apply

1.0 - 6.0 years

0 - 0 Lacs

pune, maharashtra

On-site

As a Senior Data Analyst specializing in Tableau and Databricks, you will be a key player in our data team, responsible for converting complex data into actionable insights through visually engaging dashboards and reports. Your role will involve creating and maintaining advanced Tableau visualizations, integrating Tableau with Databricks for efficient data access, collaborating with various stakeholders to understand business requirements, and ensuring data accuracy and security in visualizations.

Your responsibilities will include designing, developing, and optimizing Tableau dashboards to support business decision-making, integrating Tableau with Databricks to visualize data from various sources, and working closely with data engineers and analysts to translate business needs into effective visualization solutions. You will also be expected to document best practices for visualization design, data governance, and dashboard deployment, and to stay updated with the latest features of Tableau and Databricks.

To qualify for this role, you should hold a Bachelor's or Master's degree in Computer Science, Information Systems, Data Science, or a related field, along with at least 5 years of experience building Tableau dashboards and visualizations. Additionally, you should have hands-on experience integrating Tableau with Databricks, a strong understanding of data modeling and ETL processes, proficiency in writing optimized SQL queries, and experience with Tableau Server or Tableau Cloud deployment.

Preferred qualifications include experience with scripting or automation tools, familiarity with other BI tools and cloud platforms, Tableau certification, and knowledge of data privacy and compliance standards. Soft skills such as strong analytical abilities, excellent communication skills, attention to detail, and the ability to work independently and collaboratively in a fast-paced environment will also be beneficial in this role.

If you meet the requirements and are looking for a challenging opportunity to leverage your Tableau and Databricks expertise, we encourage you to share your CV with us at sathish.m@tekgence.com.
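For context on the Tableau-Databricks integration this listing emphasizes, here is a hypothetical sketch that validates dashboard figures directly against Databricks using the databricks-sql-connector package; the hostname, HTTP path, token, and table names are placeholders.

```python
# Hypothetical sketch: cross-check a Tableau KPI against its Databricks source.
from databricks import sql

with sql.connect(
    server_hostname="adb-1234567890.0.azuredatabricks.net",  # assumed workspace
    http_path="/sql/1.0/warehouses/abc123",                  # assumed warehouse
    access_token="dapi-...",                                 # assumed credential
) as conn:
    with conn.cursor() as cur:
        # The same aggregate the dashboard renders, run against the source table.
        cur.execute("""
            SELECT region, SUM(revenue) AS total_revenue
            FROM   analytics.sales_summary
            GROUP  BY region
        """)
        for region, total in cur.fetchall():
            print(region, total)
```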

Posted 2 months ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

Calix is evolving at a rapid pace to meet exciting market demand. The company's culture has won awards for executive leadership and employee diversity and received top ratings from engineering, sales, and product design teams, and Calix is a top-ranked company on Glassdoor.

The Calix Professional Services Team is seeking a dynamic and customer-minded Cloud Data Engineer to join the Calix Cloud Implementation Team. As a Cloud Delivery Engineer, you will be responsible for transforming CSPs' raw data into enriched, powerful data that allows them to enrich their business using the Calix Cloud product lines. You will work across the Calix organization and directly with CSPs to ensure their data is well represented in Calix Cloud. You will operate as part of an overall implementation team to implement Calix Cloud SaaS solutions requiring custom input data, and you will design data import and ETL processes based on customer-provided data sets and Calix Cloud data input requirements.

Your responsibilities will include validating and troubleshooting data transformation and loading into SQL and NoSQL databases. Documenting findings, test results, and as-built configurations for review by the implementation team and customer is a key part of your role. You will need to respond to requests for assistance, perform triage, prioritize, and escalate appropriately. Collaborating with customers to understand and document their business and technical requirements is also expected. Using regular expressions to search and extract text-based data, and querying SQL databases or modifying SQL statements to produce custom results, are some of the tasks you will handle.

As for qualifications, you are expected to have a working knowledge of cloud-based applications and complex data integration, knowledge of common database management tools, and REST/JSON API methodologies. Strong written and verbal communication skills are required to provide updates, explain data, and document findings. Strong analysis and organizational skills, the ability to work independently, and the capacity to manage multiple projects simultaneously are essential. You should excel at learning quickly and adapting to change, with a proven track record of providing a high level of technical or project support. Proficiency with Microsoft Visio, Word, and Excel is necessary. A resourceful disposition and the ability to work independently are also valuable traits.

Preferred qualifications include experience with BSS/OSS enterprise architectures; familiarity with the Calix product portfolio or similar telecom technologies; experience with REST/JSON tools such as Postman, cURL, and relevant programming language libraries like requests and httplib; knowledge of a scripting language such as Python, Perl, or Java; familiarity with Service Provider networking technologies; experience operating as part of an Agile/Scrum team; experience with Salesforce, TaskRay, JIRA, Confluence, and MS Teams; and a bachelor's or master's degree in computer science/engineering or related fields.

The location for this position is Bengaluru, and you should be willing to work US shift timings. If you are a person with a disability needing assistance with the application process, you can email calix.interview@calix.com or call +1 (408) 514-3000.

Calix delivers a broadband platform and managed services that enable customers to improve life one community at a time. The company is at the forefront of a once-in-a-generation change in the broadband industry. The mission of Calix is to enable CSPs of all sizes to Simplify, Innovate, Grow.
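The regex-driven extraction mentioned in the responsibilities might look like the following hedged Python sketch; the line format and field names are invented for illustration.

```python
# Hypothetical sketch: pulling account IDs and port fields out of
# semi-structured CSP export lines with a named-group regular expression.
import re

LINE = r"acct=(?P<account>\d+)\s+ont=(?P<ont>[A-Z0-9-]+)\s+port=(?P<port>\d+/\d+)"
pattern = re.compile(LINE)

raw_lines = [
    "acct=10042 ont=CXNK-00112233 port=1/2",
    "acct=10057 ont=CXNK-00998877 port=2/4",
]

records = [m.groupdict() for line in raw_lines if (m := pattern.search(line))]
print(records)
# [{'account': '10042', 'ont': 'CXNK-00112233', 'port': '1/2'}, ...]
```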

Posted 2 months ago

Apply

2.0 - 6.0 years

0 Lacs

karnataka

On-site

As a BI (Business Intelligence) Support Engineer at our company based in Bengaluru, KA, you will play a crucial role in maintaining and supporting our business intelligence systems and tools. Your primary responsibility will be to ensure that BI platforms run smoothly, troubleshoot issues, and assist end-users in maximizing the potential of BI tools for data analysis, reporting, and decision-making.

Your key responsibilities will include:
- System Maintenance and Monitoring: Overseeing the smooth operation of BI platforms, implementing updates, and monitoring the performance and health of BI systems and pipelines.
- Troubleshooting and Issue Resolution: Identifying and resolving issues related to BI reports, dashboards, or data connections.
- User Support and Training: Providing guidance to end-users on how to use BI tools, addressing their queries, and assisting in troubleshooting report issues.
- Data Integration and ETL Support: Assisting in integrating various data sources and ensuring error-free ETL (Extract, Transform, Load) processes.
- Collaboration with IT Teams: Working closely with developers, database administrators, and data engineers to ensure robust data pipelines and accurate reports.
- Documentation: Creating detailed documentation for troubleshooting procedures, system configurations, and user guides.

In terms of technical experience, you should possess:
- Proficiency in BI tools like Power BI, Tableau, etc.
- Expertise in writing and optimizing SQL queries using technologies such as SQL Server, Oracle, MySQL, PostgreSQL, Redshift, or Snowflake.
- Knowledge of ETL tools and processes (e.g., SSIS) for integrating data into BI systems.
- Understanding of data warehousing concepts and architecture using technologies like Snowflake, Azure Synapse, or Google BigQuery.
- Familiarity with cloud platforms such as AWS, Microsoft Azure, and Google Cloud.
- Experience with API development and integration using tools like Postman, OpenAPI specs, Swagger, and YAML.
- Hands-on experience in integrating Azure Functions with multiple services for serverless workflows.

Your problem-solving and analytical skills will be critical in:
- Troubleshooting data issues, system errors, and performance bottlenecks in BI tools.
- Identifying trends, anomalies, and issues in data and reporting systems.
- Diagnosing and resolving technical issues in data pipelines, BI tools, or databases.

Moreover, your soft skills should include:
- Clear communication skills for interacting with end-users, explaining technical issues, and providing training or support.
- Ability to collaborate effectively in a cross-functional team, including developers, data scientists, and business analysts.
- Strong time management skills to prioritize tasks efficiently, especially when supporting multiple users and systems.
- Customer service orientation with a focus on delivering high-quality support to end-users and resolving issues promptly and effectively.

Posted 2 months ago

Apply

5.0 - 10.0 years

0 Lacs

pune, maharashtra

On-site

You will be responsible for designing and developing logical and physical database models aligned with business needs. Your role will involve implementing and configuring databases, tables, views, and stored procedures. Monitoring and optimizing database performance, tuning queries, and resolving bottlenecks will be a key aspect of your responsibilities. Furthermore, you will need to implement database security measures, including access controls, encryption, and compliance measures. Developing and maintaining backup and disaster recovery strategies to ensure data continuity is crucial.

You will also play a significant role in designing and implementing data integration mechanisms across systems to ensure consistency. Planning and executing strategies for scalability and high availability, such as sharding, replication, and failover, will be part of your tasks. Leading data migration projects and validating data integrity post-migration will also be within your scope of work. Creating and maintaining detailed documentation for schemas, data flows, and specifications is essential. Collaboration with developers, system admins, and stakeholders to align database architecture is a key aspect of this role, as is troubleshooting and resolving database issues related to errors, performance, and data inconsistencies.

You should have strong knowledge of RDBMS (MySQL, PostgreSQL, SQL Server) and NoSQL (MongoDB, Cassandra) databases. Proficiency in SQL, database query optimization, indexing, and ETL processes is required. Experience with cloud database platforms like AWS RDS, Azure SQL, or Google Cloud SQL will be beneficial. Excellent problem-solving and communication skills are essential for this role, and any relevant database certifications will be considered a plus.

If you have a deep understanding of database technologies, a passion for data integrity and performance, and a knack for designing efficient data solutions, we encourage you to apply. Join our team and play a vital role in managing our data infrastructure to support the organization's data-driven initiatives. Please share an updated copy of your CV at hrdept@cstech.ai.

Posted 2 months ago

Apply

5.0 - 9.0 years

0 Lacs

thane, maharashtra

On-site

As the BI / BW Lead at DMart, you will lead and manage a dedicated SAP BW team to ensure the timely delivery of reports, dashboards, and analytics solutions. Your role will involve managing the team effectively and overseeing all SAP BW operational support tasks and development projects with a focus on high quality and efficiency. You will be responsible for maintaining the stability and performance of the SAP BW environment, managing daily support activities, and ensuring seamless data flow and reporting across the organization. Acting as the bridge between business stakeholders and your technical team, you will play a crucial role in enhancing DMart's data ecosystem.

You should possess a Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field. SAP BW certifications are preferred but not mandatory.

Key Responsibilities:
- Lead and manage the SAP BW & BOBJ team, ensuring efficient workload distribution and timely task completion.
- Oversee the daily operational support of the SAP BW & BOBJ environment to maintain stability and performance.
- Provide direction and guidance to the team for issue resolution, data loads, and reporting accuracy.
- Serve as the primary point of contact for business users and internal teams regarding SAP BW support and enhancements.
- Ensure the team follows best practices in monitoring, error handling, and performance optimization.
- Drive continuous improvement of support processes, tools, and methodologies.
- Proactively identify risks and bottlenecks in data flows and take corrective actions.
- Ensure timely delivery of data extracts, reports, and dashboards for critical business decisions.
- Provide leadership in system upgrades, patching, and data model improvements.
- Facilitate knowledge sharing and skill development within the SAP BW team.
- Maintain high standards of data integrity and security in the BW environment.

Professional Skills:
- Strong functional and technical understanding of SAP BW / BW on HANA & BOBJ.
- At least 5 years of working experience with SAP Analytics.
- Solid knowledge of ETL processes and data extraction.
- Experience with data lakes such as Snowflake, BigQuery, and Databricks, and dashboard tools like Power BI and Tableau, is advantageous.
- Experience in Retail, CPG, or SCM is a plus.
- Experience in managing SAP BW support activities and coordinating issue resolution.
- Strong stakeholder management skills with the ability to translate business needs into technical actions.
- Excellent problem-solving and decision-making abilities under pressure.

Posted 2 months ago

Apply

3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

As a Tableau Developer at Deutsche Bank Group, located in Pune, India, you will have a crucial role in converting data into actionable insights. Your primary responsibility will be to collaborate closely with the data analysis team to create interactive dashboards and reports that support data-driven decision-making throughout the organization.

In this role, you will design, develop, and deploy scalable Tableau dashboards and reports to meet the analytical requirements of various business units. Your tasks will include working with data analysts and stakeholders to gather requirements, translating them into technical solutions, maintaining and enhancing existing Tableau reports, and ensuring data accuracy through cleansing and preparation processes.

To excel in this position, you should possess strong analytical skills, a fundamental understanding of SQL and relational databases, and experience in developing visually appealing and user-friendly dashboards. Additionally, expertise in data modeling, ETL processes, and collaborative teamwork is essential. Preferred qualifications include experience with other BI tools, programming languages like Python, and familiarity with data warehousing concepts and Agile methodologies. Technical proficiency in Tableau, SQL, and database technologies such as PostgreSQL, MySQL, or Oracle is required. Experience with data cleaning techniques, relational databases, and cloud data warehousing solutions will be advantageous for your success in this role.

At Deutsche Bank Group, you will benefit from a comprehensive leave policy, gender-neutral parental leave, sponsorship for industry certifications, employee assistance programs, hospitalization and life insurance, and health screening. Moreover, you will receive training, coaching, and support to enhance your skills and advance your career within a culture of continuous learning and collaboration.

Join us at Deutsche Bank Group, where we strive to create a positive, fair, and inclusive work environment that empowers our employees to excel together every day. Visit our company website for more information: https://www.db.com/company/company.htm. We celebrate the successes of our people and welcome applications from individuals of all backgrounds.

Posted 2 months ago

Apply

2.0 - 6.0 years

0 Lacs

chennai, tamil nadu

On-site

You are a highly skilled Salesforce Developer with over 3 years of experience and comprehensive end-to-end business process knowledge. Your role involves working on enhancement and support projects. Your key responsibilities include managing the data migration process, developing best practices and protocols, evaluating different source systems, coordinating with clients to understand their data needs, establishing testing procedures, providing technical support for the data migration process, and creating documentation of the migration process for future projects.

To be successful in this role, you must have a minimum of 2 years of experience in data migration, expertise in Snowflake, knowledge of ETL processes and data deduplication, proficiency in SQL, XML, and JSON, experience with REST API and SOAP, strong problem-solving skills, attention to detail, and excellent communication and coordination skills. Knowledge of sales processes such as quoting and Opportunity management in Salesforce is an added advantage.
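The REST API experience this listing asks for could look like the following hypothetical sketch, which queries Salesforce's REST query endpoint with the requests library; the instance URL, API version, and token are placeholder assumptions.

```python
# Hypothetical sketch: extract Opportunity records over the Salesforce REST API
# as one step of a data migration.
import requests

INSTANCE = "https://example.my.salesforce.com"   # assumed org instance
TOKEN = "00D...access_token"                     # assumed OAuth access token

resp = requests.get(
    f"{INSTANCE}/services/data/v59.0/query",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"q": "SELECT Id, Name, StageName, Amount FROM Opportunity"},
    timeout=30,
)
resp.raise_for_status()
for rec in resp.json()["records"]:
    print(rec["Id"], rec["StageName"], rec["Amount"])
```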

Posted 2 months ago

Apply

5.0 - 9.0 years

0 Lacs

navi mumbai, maharashtra

On-site

As a Project Manager in the AML technology domain, you will be responsible for leading AML technology projects to ensure timely and cost-effective delivery. Your role will involve defining project scope, objectives, timelines, and deliverables in alignment with business goals. You will oversee the end-to-end Software Development Life Cycle (SDLC), from requirements gathering to deployment and post-implementation support. Collaboration with internal teams such as IT, Compliance, Risk, and Business, as well as external vendors, will be a key aspect of your responsibilities.

In addition, you will oversee the deployment, integration, and maintenance of AML systems, ensuring that AML solutions meet regulatory and compliance requirements. You will work with compliance teams to fine-tune transaction monitoring, customer screening, and reporting systems, and you will lead data mapping, transformation, and Extract, Transform, Load (ETL) processes for AML-related data. Your technical expertise will be crucial as you work with development teams on Application Programming Interfaces (APIs), automation, and enhancements to AML platforms. You will also oversee system testing, User Acceptance Testing (UAT), and deployment strategies to ensure smooth implementation.

The ideal candidate should have project management experience in the IT field, with proven expertise in SDLC, Agile, Waterfall, or other project management methodologies. Knowledge of databases such as Oracle and SQL Server, ETL processes, APIs, and cloud technologies is essential, and strong analytical, problem-solving, and stakeholder management skills are also required. Certifications such as PMP, PRINCE2, or Agile certification would be advantageous.

The educational qualification required for this role is an IT graduate degree (B.E. (IT), B.Sc. (IT), B.Tech, BCA, MCA, M.Tech). This is a permanent position in the AML - IT / Compliance Technology department, with an experience-level expectation of "Experienced". The posting date for this job opportunity is February 14, 2025.

Posted 2 months ago

Apply

5.0 - 9.0 years

0 Lacs

maharashtra

On-site

As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages.

Expected to be an SME, you will collaborate with and manage the team to perform efficiently. You will be responsible for team decisions and will engage with multiple teams to contribute to key decisions. Providing solutions to problems for your immediate team and across multiple teams will be part of your responsibilities. You will mentor junior team members to enhance their skills and knowledge in data engineering, and you will continuously evaluate and improve data processes to enhance efficiency and effectiveness.

In terms of professional and technical skills, proficiency in Snowflake Data Warehouse is a must, and experience with data modeling and database design is good to have. You should have a strong understanding of ETL processes and data integration techniques, familiarity with cloud platforms and services related to data storage and processing, and experience in performance tuning and optimization of data queries. The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.

This position is based in Mumbai and requires 15 years of full-time education.
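As a hedged illustration of the Snowflake work described, the sketch below connects with snowflake-connector-python and runs a simple data quality check; credentials and object names are assumptions.

```python
# Hypothetical sketch: connect to Snowflake and flag a basic integrity issue.
import snowflake.connector

conn = snowflake.connector.connect(
    user="etl_user",                 # assumed credentials
    password="***",
    account="xy12345.ap-south-1",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="CURATED",
)
try:
    cur = conn.cursor()
    # Count rows violating a not-null expectation before downstream use.
    cur.execute("SELECT COUNT(*) FROM orders WHERE customer_id IS NULL")
    null_rows = cur.fetchone()[0]
    print(f"orders rows missing customer_id: {null_rows}")
finally:
    conn.close()
```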

Posted 2 months ago

Apply

5.0 - 10.0 years

7 - 17 Lacs

Bengaluru

Work from Office

About this role: Wells Fargo is seeking a Lead Business Execution Consultant - Reconciliation Utility.

In this role, you will:
- Lead cross-functional teams to strategize, plan, and execute a variety of programs, services, and initiatives
- Drive accountability for assigned initiatives, limit risk exposure, and create efficiencies as appropriate
- Review strategic approaches and effectiveness of support function and business performance
- Perform assessments through fact finding and data requiring creative approaches to solving complex issues, and develop appropriate solutions or recommendations
- Make decisions in highly complex and multifaceted situations requiring solid understanding of the business group's functional area or products, facilitate decision making and issue resolution, and support implementation of developed solutions and plans
- Collaborate and consult with members of the Business Execution team and team leaders to drive strategic initiatives
- Influence, guide, and lead less experienced Strategy and Execution staff within the group
- Optimize IntelliMatch/Xceptor usage, enhancing automated matching logic and exception workflows to improve overall reconciliation efficiency
- Design and enforce reject management processes, ensuring failed transactions are swiftly investigated and resolved
- Manage SWIFT message processing, ensuring accurate interpretation of MT and MX messages across reconciliation platforms
- Strengthen ETL data integrity, ensuring seamless extraction, transformation, and loading of financial data into reconciliation engines
- Monitor missing statement controls, proactively addressing data gaps that impact reconciliation completeness, and work closely with external agent banks, tech teams, and relationship managers
- Develop feed management controls, ensuring accurate and timely ingestion of financial data into reconciliation systems
- Establish a robust static data maintenance framework, ensuring accuracy in reference data and system configurations
- Enhance reconciliation governance, aligning with regulatory and internal risk control standards
- Drive automation and process optimization, reducing manual intervention and operational risk
- Lead investigation of breaks, escalations, and remediation efforts, collaborating with risk, compliance, and technology teams
- Partner with senior stakeholders across operations, risk, technology, and finance to drive reconciliation improvements
- Collaborate with external vendors, ensuring optimal IntelliMatch/Xceptor configurations and reconciliation capabilities
- Measure, monitor, and ensure all source feeds (both internal and external) are received within the agreed SLA, identifying process improvements to gain synergies on an ongoing basis
- Monitor ledger and statement reject queues in IntelliMatch/Xceptor (extensive experience required)
- Run out-of-proof checks to ensure data integrity of ledger and statement feeds (extensive experience required)
- Escalate and resolve missing/incomplete feed data loads
- Apply strong knowledge and experience of reconciliations within Investment Banking/Corporate/Wealth businesses
- Lead the end-of-day control checks on rejects/out-of-proofs and ensure all are resolved; anything outstanding should be appropriately reported as per the regulatory framework
- Ensure all account static maintenance gets completed, including creation, amendment, and closure of account setup, department codes, match rules, and auto-coding rules after validation and checks
- Maintain IntelliMatch/Xceptor lookup tables and group filters (extensive knowledge required)
- Apply a fair understanding of securities merging for depot recs
- Apply a fair understanding of IntelliMatch/Xceptor static fields such as legal entity/client firm maintenance and Reconciliation Catalogue static updates
- Lead multiple tech/ops-related projects
- Maintain matching criteria in IntelliMatch/Xceptor, along with the facilitation of testing and setup
- Escalate any outstanding rejects, out-of-proofs, or missing statements to Operational Risk & Control and regulatory teams
- Act as the key point of contact with stakeholders (i.e., Technology, Change Manager) in order to implement changes

Required Qualifications:
- 5+ years of Business Execution, Implementation, or Strategic Planning experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education

Desired Qualifications:
- Minimum B.Com graduation degree / MBA (Finance) / post-graduation
- 15+ years of relevant experience in the fintech domain, which must include reconciliations
- Expertise in IntelliMatch, Xceptor, and other reconciliation tools, including configuration, automation, and exception management
- In-depth knowledge of SWIFT messages, including MT940, MT950, MT103, MT535/536, and ISO 20022 standards
- Strong understanding of ETL processes, data transformation techniques, and financial data integrity controls
- Proven track record in reconciliation risk management, static maintenance, and reject management controls
- Exceptional analytical skills, with the ability to assess complex data issues and drive resolution
- Experience in process automation, leveraging AI/ML or RPA solutions for reconciliation enhancement
- Leadership experience, managing global teams and stakeholder engagements
- Strong communication and problem-solving skills, with an ability to influence across levels
- Strong relationship/negotiation orientation with senior-level business partners globally
- Ability to work in a matrix environment and manage multiple competing priorities
- Strong stakeholder management, networking, and relationship management skills
- Ability to drive change and transformation, project management, and implementation in business operations
- A high degree of reliability, integrity, and trustworthiness in all areas
- Ownership and accountability for responsibilities, business outcomes, and management of risk exposure
- Advanced PC skills, including Microsoft Office applications
- Excellent communication, planning, and organizing skills
- Excellent verbal, written, and interpersonal communication skills

Posted 2 months ago

Apply

3.0 - 6.0 years

3 - 6 Lacs

Pune, Maharashtra, India

On-site

**Job Description:** You will be part of the data migration team, leading the testing team. You will be responsible for understanding the existing ETL process and data ingestion, and for the test strategy for the migration project, including one-time migration, daily sync-up, code change verification, defect reporting, the test plan, and QA sign-off.

**Key Responsibilities:**
- Understand the existing ecosystem of the data being migrated
- Prepare the test strategy and test plan for the migration (Data & ETL)
- Prepare an automated testing framework along with developers
- Define test scenarios and prepare test data
- Perform and guide the testing team through QA
- Collaborate with team members and clients for successful implementation and QA sign-off of the project
- Analyse any data or code related issues for root cause

**Requirements:**
- 5-6 years of hands-on experience working with Ab Initio and manual testing
- A>I GDE, Control Centre, EME, C>Op
- Strong understanding of generic frameworks in A>I, parameterization in graphs, psets, plans, etc.
- Strong understanding of xml, xfr, MD Hub for data lineage and data quality
- Understanding of SQL, ETL processes, and data migration
- Databases include Cloudera, Oracle, Snowflake, and other partitioned databases
- Familiarity with Unix shell scripts
- Familiarity with AWS
- Excellent problem-solving skills and the ability to work in a dynamic, fast-paced environment
- Strong communication skills and the ability to collaborate with cross-functional teams
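One automated check a migration test framework like the one described might include is a source-versus-target reconciliation; the sketch below is hypothetical, with connection handling and table names left as assumptions.

```python
# Hypothetical sketch: compare row counts and a simple column checksum between
# the source and target of a migration, one table at a time.
def table_profile(cursor, table: str) -> tuple:
    """Return (row_count, checksum) for a table; SUM over a stable numeric
    column stands in for a real checksum strategy."""
    cursor.execute(f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {table}")
    count, checksum = cursor.fetchone()
    return count, int(checksum)

def assert_migrated(src_cur, tgt_cur, table: str) -> None:
    src = table_profile(src_cur, table)
    tgt = table_profile(tgt_cur, table)
    assert src == tgt, f"{table}: source {src} != target {tgt}"
    print(f"{table}: OK {src}")

# Usage (cursors would come from e.g. Oracle and Snowflake connections):
# assert_migrated(oracle_cur, snowflake_cur, "customer_orders")
```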

Posted 2 months ago

Apply

4.0 - 6.0 years

5 - 7 Lacs

Chennai, Tamil Nadu, India

On-site

- At least 4-6 years of experience in SAP BI.
- Worked on at least 2 implementation projects with end-to-end design, build, test, and deployment of an SAP BI technical solution.
- Hands-on development experience with BW on HANA; experience with BW/4HANA would also work.
- Experience working with classical objects (e.g., InfoCube, Standard DSO, and MultiProvider) as well as newer objects (e.g., ADSO, CompositeProvider).
- Excellent knowledge of data extraction, data modelling, and reporting.
- Experience with extraction via ODP source systems (type ABAP CDS and ODP SAP).
- Ideally, experience with CDS-based extraction from S/4HANA using CDC (Change Data Capture).
- Good knowledge of SAP BW extractors like 2LIS and generic extractors based on a function module or view/table.
- Good understanding of delta functionality and data loads is a must.
- Should be able to implement start and end routines within transformations.
- Basic knowledge of ABAP object-oriented programming is a must, as transformation logic is encapsulated in a custom-built classes framework.
- Good knowledge of efficient code writing and debugging is required, especially performant handling of data in ABAP internal tables.
- Should be able to build, test, and publish BEx queries (with BW Modelling Tools (Eclipse)) and Analysis for Office (AFO) workbooks.
- Should have basic knowledge of Open Hub destinations.
- Should have basic knowledge of HANA native modelling objects (mainly Calculation Views) and their consumption within BW data models.
- Should have knowledge of/hands-on experience in the creation of customer exit variables, including BADIs and classes.
- Should have knowledge of performance improvement and data load tuning techniques.

Posted 2 months ago

Apply

4.0 - 6.0 years

5 - 7 Lacs

Kolkata, West Bengal, India

On-site

- At least 4-6 years of experience in SAP BI.
- Worked on at least 2 implementation projects with end-to-end design, build, test, and deployment of an SAP BI technical solution.
- Hands-on development experience with BW on HANA; experience with BW/4HANA would also work.
- Experience working with classical objects (e.g., InfoCube, Standard DSO, and MultiProvider) as well as newer objects (e.g., ADSO, CompositeProvider).
- Excellent knowledge of data extraction, data modelling, and reporting.
- Experience with extraction via ODP source systems (type ABAP CDS and ODP SAP).
- Ideally, experience with CDS-based extraction from S/4HANA using CDC (Change Data Capture).
- Good knowledge of SAP BW extractors like 2LIS and generic extractors based on a function module or view/table.
- Good understanding of delta functionality and data loads is a must.
- Should be able to implement start and end routines within transformations.
- Basic knowledge of ABAP object-oriented programming is a must, as transformation logic is encapsulated in a custom-built classes framework.
- Good knowledge of efficient code writing and debugging is required, especially performant handling of data in ABAP internal tables.
- Should be able to build, test, and publish BEx queries (with BW Modelling Tools (Eclipse)) and Analysis for Office (AFO) workbooks.
- Should have basic knowledge of Open Hub destinations.
- Should have basic knowledge of HANA native modelling objects (mainly Calculation Views) and their consumption within BW data models.
- Should have knowledge of/hands-on experience in the creation of customer exit variables, including BADIs and classes.
- Should have knowledge of performance improvement and data load tuning techniques.

Posted 2 months ago

Apply

7.0 - 11.0 years

0 Lacs

ahmedabad, gujarat

On-site

As a Database Developer lead, you will play a crucial role in designing, implementing, and maintaining database solutions to cater to our organization's data requirements. With 7-10 years of experience in database development, your expertise in database design principles, SQL programming, and data modeling will be instrumental in this role.

Your key responsibilities will include designing and developing relational databases for various applications, performing database tuning for optimal performance, creating SQL queries and stored procedures, and collaborating with cross-functional teams to ensure that database solutions align with business needs. Data modeling, database security implementation, and prompt resolution of database issues will also be part of your daily tasks. Documenting database design and processes will be essential to maintain transparency and efficiency.

To excel in this role, you should hold a Bachelor's degree in computer science or a related field, possess proficiency in SQL programming, and have hands-on experience with relational database management systems like MySQL, PostgreSQL, or SQL Server. Your strong understanding of database design principles, data modeling techniques, and database security best practices will be valuable assets. Furthermore, your problem-solving skills, attention to detail, and ability to work both independently and collaboratively will contribute to your success in this position.

Preferred qualifications include experience with NoSQL databases such as MongoDB or Cassandra, familiarity with ETL processes, and knowledge of cloud-based database services like AWS RDS or Azure SQL Database. Additionally, holding a certification in database administration or development would be advantageous.

If you are a motivated Database Developer lead with a passion for database development and a keen eye for detail, we invite you to join our team and contribute to our organization's data-driven success.

Posted 2 months ago

Apply

5.0 - 7.0 years

4 - 7 Lacs

Hyderabad, Telangana, India

On-site

Key Responsibilities:
- Design and implement PostgreSQL database schemas, tables, indexes, and constraints.
- Develop, test, and optimize complex SQL queries, stored procedures, functions, and triggers.
- Analyze and improve query performance using EXPLAIN plans, indexing, and tuning techniques.
- Maintain data integrity, security, and availability in PostgreSQL environments.
- Collaborate with development and data teams to support applications, ETL processes, and reporting needs.
- Perform data extraction, transformation, and loading (ETL) for business or analytics use cases.
- Assist in database backup, recovery, monitoring, and upgrades as needed.
- Document database structures, procedures, and best practices.

Mandatory Skills:
- Strong hands-on experience with PostgreSQL (version 12 or later preferred)
- Advanced proficiency in SQL (DDL, DML, joins, subqueries, window functions, etc.)
- Experience in writing and debugging stored procedures, functions, and triggers

Desirable Skills:
- Experience with PL/pgSQL scripting
- Knowledge of PostgreSQL performance tuning, indexing strategies, and partitioning
- Familiarity with ETL tools or data migration frameworks
- Experience with database monitoring tools (e.g., pgAdmin, pg_stat_statements)
- Exposure to Linux-based database administration
- Version control with Git
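The EXPLAIN-based tuning workflow in the responsibilities might look like this hedged psycopg2 sketch; the DSN, table, and index names are illustrative assumptions.

```python
# Hypothetical sketch: confirm an index is used for a hot query via EXPLAIN ANALYZE.
import psycopg2

conn = psycopg2.connect("dbname=appdb user=dba")   # assumed DSN
with conn, conn.cursor() as cur:
    # Supporting index for the predicate below (no-op if it already exists).
    cur.execute("CREATE INDEX IF NOT EXISTS idx_orders_status ON orders (status)")
    cur.execute("EXPLAIN ANALYZE SELECT * FROM orders WHERE status = 'OPEN'")
    for (line,) in cur.fetchall():
        print(line)   # expect an Index Scan rather than a Seq Scan
conn.close()
```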

Posted 2 months ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Chennai, Perungudi

Work from Office

Job Summary
A detail-oriented and results-driven Adobe Campaign Technical Consultant with experience in the banking or financial services domain. The ideal candidate will have 4+ years of experience designing, executing, and optimizing data-driven marketing campaigns using Adobe Campaign. The role requires strong proficiency in SQL and experience with ETL tools such as SSIS to manage end-to-end campaign workflows, including data extraction, transformation, and loading from complex data environments. The candidate will work closely with marketing, data, and compliance teams to deliver targeted communications, enhance customer engagement, and ensure regulatory adherence.

Key Responsibilities
1. Design and implement personalized 1:1 customer communication journeys using Adobe Campaign.
2. Build and optimize workflows and campaign automation to support acquisition, retention, and cross-sell initiatives for banking products.
3. Extend and customize the Adobe Campaign database schema to align with customer lifecycle and compliance requirements.
4. Work with SQL to extract, transform, and segment customer data from banking data warehouses and external sources (via FDA or ETL processes), as sketched below.
5. Design and maintain ETL processes using tools like SSIS to enable seamless campaign data integration.
6. Ensure compliance with banking regulations (e.g., data privacy, opt-out preferences) in all campaigns.
7. Coordinate the execution, testing, monitoring, and tracking of multichannel campaigns (Email, SMS, Push).
8. Manage the offer library and dynamic content blocks for various banking services (credit cards, loans, savings accounts).
9. Collaborate with marketing teams to deliver campaigns aligned with business goals and regulatory policies.
10. Resolve high-priority production issues and ensure minimal impact on critical campaigns.
11. Monitor delivery rates, bounce rates, and engagement metrics, and recommend data-driven improvements.

Required Skills and Experience
1. 4+ years of experience in Adobe Campaign development and end-to-end campaign execution. (Mandatory)
2. Strong proficiency in SQL for data extraction, segmentation, and transformation.
3. Experience with ETL processes and tools such as SSIS for handling campaign data flows. (Mandatory)
4. Hands-on experience with campaign workflow orchestration and Adobe Campaign Classic or Standard.
5. Hands-on experience with HTML/CSS for email template customization.
6. Experience working with banking or financial institutions is preferred.
7. Good understanding of data privacy laws (e.g., GDPR, local banking compliance standards).
8. Experience with FDA (Federated Data Access) integration for data extraction.
9. Strong communication skills and ability to work in cross-functional teams.

Nice to Have
1. Knowledge of other marketing automation platforms.
2. Exposure to reporting tools and dashboards to measure campaign performance.
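The SQL segmentation work in responsibility 4 might resemble the following hypothetical query, which selects a cross-sell audience while honoring opt-outs; all table and column names are invented for illustration, and in Adobe Campaign such logic would typically sit inside a workflow query activity.

```python
# Hypothetical sketch: an audience-segmentation query, kept as a constant so it
# could be reviewed, versioned, or passed to a database client for testing.
SEGMENT_SQL = """
SELECT c.customer_id, c.email
FROM   customers c
JOIN   accounts  a ON a.customer_id = c.customer_id
WHERE  a.product_type = 'SAVINGS'
  AND  a.avg_monthly_balance > 100000
  AND  c.email_opt_out = 0          -- preference/regulatory compliance
  AND  NOT EXISTS (                 -- exclude existing credit-card holders
         SELECT 1 FROM accounts x
         WHERE  x.customer_id = c.customer_id
           AND  x.product_type = 'CREDIT_CARD'
       )
"""
print(SEGMENT_SQL)   # e.g. feed to pandas.read_sql(...) against a test warehouse
```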

Posted 2 months ago

Apply

2.0 - 5.0 years

3 - 12 Lacs

Hyderabad, Telangana, India

On-site

We are seeking a Python/SQL professional with 2-5 years of experience to join our dynamic team in India. The ideal candidate will possess strong programming skills in Python and a solid understanding of SQL for data management. This role requires collaboration with various teams to develop solutions that meet business needs.

Responsibilities
- Develop and maintain Python applications to support data processing and analysis.
- Write and optimize SQL queries for data retrieval and manipulation.
- Collaborate with cross-functional teams to gather requirements and implement solutions.
- Troubleshoot and debug applications to ensure optimal performance.
- Create documentation for processes, code, and technical specifications.

Skills and Qualifications
- Proficient in the Python programming language.
- Strong knowledge of SQL and database management systems such as MySQL or PostgreSQL.
- Experience with data analysis and manipulation libraries in Python (e.g., Pandas, NumPy).
- Familiarity with version control systems, preferably Git.
- Understanding of software development methodologies and practices.
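A small, hypothetical example of the Python/SQL work described, using pandas and SQLAlchemy; the connection string and table names are placeholders.

```python
# Hypothetical sketch: load query results into pandas and build a monthly aggregate.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://app:***@db-host/appdb")  # assumed

orders = pd.read_sql(
    "SELECT customer_id, order_date, amount FROM orders WHERE amount > 0",
    engine,
    parse_dates=["order_date"],
)

monthly = (orders
           .assign(month=orders["order_date"].dt.to_period("M"))
           .groupby(["customer_id", "month"], as_index=False)["amount"]
           .sum())
print(monthly.head())
```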

Posted 2 months ago

Apply

6.0 - 9.0 years

5 - 7 Lacs

Hyderabad, Telangana, India

On-site

Candidate should have 6 to 9 years of experience and should currently be working as a Power BI developer (Reporting & Analytics). Candidate should have hands-on experience with the semantic layer in Power BI. Expertise in data warehousing and hands-on experience with data sets, delta loads, and transformations is required (the domain is DWH and data). Good expertise in interacting with multiple teams on data analysis, and good communication and client-interaction skills, including client status reporting.

Posted 2 months ago

Apply

4.0 - 5.0 years

3 - 12 Lacs

Hyderabad, Telangana, India

On-site

We are seeking an experienced ADF Pipeline Developer to join our team in India. The ideal candidate will have a strong background in designing and implementing data pipelines using Azure Data Factory, ensuring seamless data integration and transformation.

Responsibilities
- Design, develop, and maintain Azure Data Factory (ADF) pipelines.
- Collaborate with data engineers and analysts to understand data requirements.
- Optimize data workflows and ensure efficient data integration processes.
- Monitor and troubleshoot data pipeline issues, ensuring data quality and reliability.
- Document data pipeline processes and maintain updated project documentation.

Skills and Qualifications
- 4-5 years of experience in Azure Data Factory or similar ETL tools.
- Strong understanding of data integration concepts and best practices.
- Proficient in writing complex SQL queries and data manipulation.
- Experience with Azure Data Lake, Azure Blob Storage, and other Azure services.
- Familiarity with data modeling and data warehousing concepts.
- Knowledge of programming languages such as Python or C# is a plus.
- Excellent problem-solving skills and attention to detail.
- Effective communication skills and ability to work in a team environment.
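For orientation, an ADF pipeline with a single Copy activity is stored as JSON roughly like the hypothetical sketch below; the dataset names are invented, and real deployments would use the ADF authoring UI, ARM templates, or the Azure SDK rather than hand-built dictionaries.

```python
# Hypothetical sketch: the approximate JSON shape of an ADF pipeline with one
# Copy activity, expressed as a Python dict for readability.
import json

pipeline = {
    "name": "CopyOrdersToLake",                       # assumed pipeline name
    "properties": {
        "activities": [
            {
                "name": "CopyFromSqlToParquet",
                "type": "Copy",
                "inputs": [{"referenceName": "SqlOrdersDataset",   # assumed dataset
                            "type": "DatasetReference"}],
                "outputs": [{"referenceName": "LakeOrdersParquet", # assumed dataset
                             "type": "DatasetReference"}],
                "typeProperties": {
                    "source": {"type": "AzureSqlSource"},
                    "sink": {"type": "ParquetSink"},
                },
            }
        ]
    },
}
print(json.dumps(pipeline, indent=2))
```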

Posted 2 months ago

Apply

5.0 - 7.0 years

5 - 10 Lacs

Hyderabad, Telangana, India

On-site

Having 5 to 7 years of experience in Python, PySpark, and Azure Databricks development. Having good experience in Python coding. Good at Agile processes and communication skills, client status reporting, and client communication. Skills: Python, PySpark, Azure Databricks. Domain: Financial Services.

Posted 2 months ago

Apply

5.0 - 9.0 years

0 Lacs

haryana

On-site

As the Manager, Business Insights, you will play a crucial role in the Business Insights team, ensuring that Services can make data-driven decisions and operate effectively and efficiently. Your responsibilities will include partnering with leadership across various functions such as Sales, Delivery, Product, and others to enhance strategic decision-making through facts and data. You will be tasked with diagnosing strategic gaps and opportunities within the operations of a function and identifying corrective measures. Your experience in building data-driven infrastructure, configuring systems, managing data storage, and utilizing BI platforms will be essential in driving productivity enhancements through technology solutions that match business needs.

In this role, you will influence the decision-making process within a dedicated Services function by providing a fact base and thought partnership to functional leaders. You will establish measurement frameworks, KPIs, and analysis questions to assess the health of the business for the specific function you are supporting. Your focus will be on ensuring that team members can maximize their time on core activities by minimizing other efforts through automation, process simplification, and hands-on partnership.

Additionally, you will lead special projects that may not have a clear owner, building cross-functional teams for initiatives such as M&A integration and Agile projects. You will drive the development of the overall Data & Services analytic infrastructure, with an emphasis on optimizing system configurations and centrally aggregating data. Your problem-solving skills will be crucial in developing scalable and automated frameworks and processes.

Your strong business knowledge relevant to Services functions and comfort with data sets and analytic tools like SQL, ETL processes, Tableau, and Salesforce will be key assets in this role. Your technical orientation, experience in collaborating with internal developers, and experience configuring third-party technical systems will be beneficial. A generalist mentality with a well-rounded skill set is desired, and previous consulting experience would be a plus. Strong verbal and written communication skills at all levels of the organization will be essential for success in this role.

Posted 2 months ago

Apply

5.0 - 10.0 years

4 - 7 Lacs

Hyderabad, Telangana, India

On-site

We are currently looking for Azure Data Engineers to become a member of Neudesic's Data & AI team.

Minimum Experience: 5 years and above

Must Have Skills
- Prior experience in ETL, data pipelines, and data flow techniques using Azure Data Services
- Working experience in Python, Scala, PySpark, Azure Data Factory, Azure Data Lake Gen2, Databricks, Azure Synapse, and file formats like JSON & Parquet
- Experience in creating ADF pipelines to source and process data sets
- Experience in creating Databricks notebooks to cleanse, transform, and enrich data sets
- Good understanding of SQL, databases, NoSQL DBs, data warehouses, Hadoop, and various data storage options on the cloud
- Development experience in orchestration of pipelines
- Experience in deployment and monitoring techniques
- Working experience with Azure DevOps CI/CD pipelines to deploy Azure resources
- Experience in handling operations/integration with a source repository
- Must have good knowledge of data warehouse concepts and data warehouse modelling

Good To Have Skills
- Familiarity with DevOps, Agile Scrum methodologies, and CI/CD
- Domain-driven development exposure
- Analytical/problem-solving skills
- Strong communication skills
- Good experience with unit, integration, and UAT support
- Able to design and code reusable components and functions
- Should be able to review design and code, and provide review comments with justification
- Zeal to learn new tools/technologies and adopt them
- Power BI and Data Catalog experience

Be aware of phishing scams involving fraudulent career recruiting and fictitious job postings; visit our Phishing Scams page to learn more.
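As a hedged illustration of the JSON and Parquet handling listed under must-have skills, the sketch below reads raw JSON from ADLS Gen2, flattens one nested field, and writes curated Parquet; the storage account, container, and column names are assumptions.

```python
# Hypothetical sketch: JSON-to-Parquet curation in a Databricks/Synapse notebook.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()   # provided in Databricks/Synapse

raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/events/")

flat = raw.select(
    "event_id",
    F.col("payload.customer_id").alias("customer_id"),   # assumed nested field
    F.to_timestamp("event_time").alias("event_ts"),
)

(flat.write
     .mode("append")
     .parquet("abfss://curated@examplelake.dfs.core.windows.net/events/"))
```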

Posted 2 months ago

Apply