Jobs
Interviews

32 SQL Jobs

Set up a job alert
JobPe aggregates results for easy access, but you apply directly on the employer's job portal.

2.0 - 10.0 years

0 Lacs

Karnataka

On-site

As a Salesforce Marketing Cloud professional with 2-10 years of experience, you will be responsible for a range of Marketing Cloud tasks. Your role will involve hands-on work with Marketing Cloud components such as Email, Mobile, Automation Studio, and Content, Contact and Journey Builder. You will leverage your expertise in REST and SOAP APIs to facilitate seamless integration and data exchange. In addition to your technical skills, you are expected to have experience in JavaScript, HTML and CSS.

You will play a crucial role in business requirement gathering and analysis and in conceptualizing high-level frameworks and designs. Familiarity with SFDC and its integration with SFMC will be essential to ensure smooth operations. Your responsibilities will also include conceptualizing integration via API (inbound and outbound), SFTP, and middleware. Knowledge and experience of REST and SOAP APIs, along with an understanding of the Salesforce limits involved during integration, will be critical. Your proficiency in data modeling and data transformation using SQL will be key to driving effective decision-making. Moreover, the role requires strong experience in designing and working with large-scale multi-cloud applications.

Excellent consulting, oral and written communication, and analytical skills are essential for effective collaboration within the team and with stakeholders. Marketing Cloud Email Specialist and Marketing Cloud Developer certifications are an added advantage. If you possess the required skills and experience, and are looking for an opportunity in Bangalore, Pune, or Gurgaon, we encourage you to apply by sending your resume to soumya.srivastava@innovaesi.com.

Posted 2 days ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

Join us as a Data & Analytics Analyst

This is an opportunity to take on a purpose-led role in a cutting-edge Data & Analytics team. You'll be consulting with our stakeholders to understand their needs and identify suitable data and analytics solutions to meet them, along with business challenges, in line with our purpose. You'll bring advanced analytics to life through visualisation to tell powerful stories and influence important decisions for key stakeholders, giving you excellent recognition for your work. We're offering this role at associate vice president level.

What you'll do

As a Data & Analytics Analyst, you'll be driving the use of advanced analytics in your team to develop business solutions which increase the understanding of our business, including its customers, processes, channels and products. You'll be working closely with business stakeholders to define detailed, often complex and ambiguous business problems or opportunities which can be supported through advanced analytics, making sure that new and existing processes are designed to be efficient, simple and automated where possible.

As well as this, you'll be:
- Leading and coaching your colleagues to plan and deliver strategic project and scrum outcomes
- Planning and delivering data and analytics resource, expertise and solutions which bring commercial and customer value to business challenges
- Communicating data and analytics opportunities and bringing them to life in a way that business stakeholders can understand and engage with
- Adopting and embedding new tools, technologies and methodologies to carry out advanced analytics
- Developing strong stakeholder relationships to bring together advanced analytics, data science and data engineering work that is easily understandable and links back clearly to our business needs

The skills you'll need

We're looking for someone with a passion for data and analytics, together with knowledge of data architecture, key tooling and relevant coding languages. Along with advanced analytics knowledge, you'll bring an ability to simplify data into clear data visualisations and compelling insight using appropriate systems and tooling.

You'll also demonstrate:
- Strong knowledge of data management practices and principles
- Experience of translating data and insights for key stakeholders
- Good knowledge of data engineering, data science and decisioning disciplines
- Data analysis skills to identify control gaps and measure data quality against standards, plus automation and dashboarding of data quality metrics
- Practical knowledge of implementing data quality tools, and working experience of SQL and a Data Cloud environment
- Good understanding of risk and controls frameworks

Candidates must possess 8-10 years of experience.
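As a rough sketch of the data-quality measurement and automation work the listing describes, the snippet below computes completeness and validity metrics over a set of records. The field names, the non-negative-balance rule, and the sample data are illustrative assumptions, not part of the posting; real teams would typically run such checks in SQL or a dedicated DQ tool.

```python
# Minimal data-quality metrics sketch (illustrative assumptions throughout).
def dq_metrics(rows, required_fields):
    """Return completeness and validity ratios for a list of record dicts."""
    total = len(rows)
    metrics = {}
    for field in required_fields:
        filled = sum(1 for r in rows if r.get(field) not in (None, ""))
        metrics[f"{field}_completeness"] = round(filled / total, 2) if total else 0.0
    # Example validity rule (assumed standard): balances must be non-negative.
    valid = sum(1 for r in rows if r.get("balance", 0) >= 0)
    metrics["balance_validity"] = round(valid / total, 2) if total else 0.0
    return metrics

records = [
    {"customer_id": "C1", "balance": 120.0},
    {"customer_id": "C2", "balance": -5.0},   # fails the validity rule
    {"customer_id": "",   "balance": 40.0},   # fails the completeness rule
    {"customer_id": "C4", "balance": 0.0},
]
print(dq_metrics(records, ["customer_id"]))
```

Metrics like these are what would feed the "automation and dashboarding of Data Quality metrics" the role mentions.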

Posted 3 days ago

Apply

7.0 - 10.0 years

20 - 25 Lacs

Chennai

Work from Office

Critical skills to possess:
- Java 8
- Spring Boot and Hibernate
- Spring Batch
- Experience writing SQL queries and stored procedures in various environments
- Spring REST services / REST APIs
- Maven
- Azure background
- Angular/ReactJS, JavaScript, HTML

Preferred qualifications: Bachelor's degree in computer science or a related field (or equivalent work experience).

Roles and responsibilities:
- Design, develop, and maintain Java-based software applications throughout the entire software development lifecycle.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
- Develop user-facing features using Java frameworks (such as Spring, Hibernate, or JavaServer Faces) for web applications.
- Design and implement RESTful APIs for seamless integration with front-end components.
- Create and maintain database schemas, write efficient SQL queries, and optimize database performance.
- Develop and maintain scalable and high-performance back-end systems.
- Familiarity with continuous integration methodologies and tools, including Jenkins.
- Good to have: exposure to microservices, Docker, Kubernetes and cloud deployment.
- Ensure the security, performance, and reliability of applications by implementing best practices and industry standards.
- Conduct code reviews, identify and fix bugs, and troubleshoot production issues.
- Collaborate with front-end developers to integrate user interfaces with back-end systems.

Posted 4 days ago

Apply

7.0 - 10.0 years

20 - 25 Lacs

Hyderabad, Chennai

Work from Office

The candidate should have 7+ years of experience in Java + Python full stack.

Critical skills to possess:
- Java 8
- Python
- Spring Boot and Hibernate
- Spring Batch
- Experience writing SQL queries and stored procedures in various environments
- Spring REST services / REST APIs
- Maven
- Azure background
- Angular/ReactJS, JavaScript, HTML

Preferred qualifications: Bachelor's degree in computer science or a related field (or equivalent work experience).

Roles and responsibilities:
- Design, develop, and maintain Java-based software applications throughout the entire software development lifecycle.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
- Develop user-facing features using Java frameworks (such as Spring, Hibernate, or JavaServer Faces) for web applications.
- Design and implement RESTful APIs for seamless integration with front-end components.
- Create and maintain database schemas, write efficient SQL queries, and optimize database performance.
- Develop and maintain scalable and high-performance back-end systems.
- Familiarity with continuous integration methodologies and tools, including Jenkins.
- Good to have: exposure to microservices, Docker, Kubernetes and cloud deployment.
- Ensure the security, performance, and reliability of applications by implementing best practices and industry standards.
- Conduct code reviews, identify and fix bugs, and troubleshoot production issues.
- Collaborate with front-end developers to integrate user interfaces with back-end systems.

Posted 4 days ago

Apply

4.0 - 9.0 years

8 - 15 Lacs

Hyderabad, Telangana, India

On-site

Job description:
- Work closely with business process owners to understand data and solution requirements.
- Design and build data models on HANA to provide efficient business solutions and analytical capabilities.
- Design and build highly scalable data pipelines using native HANA tools like SLT and BODS.
- Translate complex business requirements into scalable technical solutions meeting HANA data modeling design standards.
- Build interactive business solutions using XSJS/XSA.
- Collaborate with multiple multi-functional teams and work on solutions which have a larger impact on Apple's business.

We seek a self-starter, forward-thinking person with strong leadership capabilities and the ability to communicate effectively with technical and non-technical multi-functional teams. You will interact with many other groups' internal teams to lead and deliver outstanding products in an exciting, fast-paced environment. Dynamic people and inspiring, innovative technologies are the norm here. Will you join us in crafting solutions that do not yet exist?

Key qualifications:
- 4+ years of experience in the Data Warehouse/Business Intelligence space
- In-depth understanding of SAP HANA data modeling, algorithms and version management
- Strong hands-on experience in data warehouse development in SAP HANA
- Good experience in core HANA data modeling
- Experience in designing and developing ETL data pipelines using SLT and BODS
- Experience in developing UI reporting solutions using XSJS/XSA
- Proficiency in writing advanced SQL, with expertise in performance tuning of SQL and scripting
- Excellent understanding of development processes and agile methodologies
- Strong analytical and interpersonal skills
- Enthusiastic, highly motivated and quick to learn
- Thrives in a dynamic, fast-paced environment with multiple competing priorities

Education & experience: Bachelor's degree or equivalent in data engineering, computer science or a similar field.

Posted 4 days ago

Apply

6.0 - 11.0 years

13 - 17 Lacs

Pune

Work from Office

Capco, a Wipro company, is a global technology and management consulting firm. It was awarded Consultancy of the Year in the British Bank Awards and has been ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial and energy sectors. We are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry, projects that will transform the financial services industry.

MAKE AN IMPACT Innovative thinking, delivery excellence and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION We believe that diversity of people and perspective gives us a competitive advantage.

Job Title: SRE
Experience: 6-12 years
Notice Period: 0-30 days
Mandatory Skills: DevOps, Linux and SQL, troubleshooting, Splunk & Dynatrace
Job Location: Pune, Chennai

Roles & responsibilities:
- Excellent hands-on Linux skills, as most of the application estate is on a Unix platform.
- Very good SQL skills; able to write SQL queries using joins, sub-queries, DML and DDL.
- Must have DevOps tooling experience (CI/CD pipelines in Jenkins), with hands-on deployment troubleshooting and RCA analysis.
- Monitoring tools knowledge and experience in Splunk and Dynatrace for batch job monitoring; well versed with infra and application KPIs.
- L2 production support experience on Java applications and awareness of the ITSM process (incident management, change management, problem tickets, and tools like Remedy, Jira and Confluence).
- Good, confident communication and presentation skills.
- Good to have: cloud knowledge and automation tools like Chef and Puppet/XLR.
- Willing to work in shifts and provide on-call support.
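As a rough illustration of the SQL skills the listing asks for (joins, sub-queries, DML and DDL), the snippet below exercises all four against an in-memory SQLite database. The tables and data are invented for the example and are not from any system the posting mentions.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# DDL: create two related tables.
cur.execute("CREATE TABLE apps (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE incidents (id INTEGER PRIMARY KEY, app_id INTEGER, severity INTEGER)")

# DML: insert sample rows.
cur.executemany("INSERT INTO apps VALUES (?, ?)", [(1, "payments"), (2, "ledger")])
cur.executemany("INSERT INTO incidents VALUES (?, ?, ?)",
                [(1, 1, 3), (2, 1, 1), (3, 2, 2)])

# JOIN + sub-query: apps that have an incident at the most critical severity.
rows = cur.execute("""
    SELECT a.name
    FROM apps a
    JOIN incidents i ON i.app_id = a.id
    WHERE i.severity = (SELECT MIN(severity) FROM incidents)
""").fetchall()
print(rows)   # [('payments',)]
```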

Posted 1 week ago

Apply

3.0 - 6.0 years

7 - 11 Lacs

Bengaluru

Work from Office

- Advanced programming skills in Python, Scala and Go.
- Strong expertise in developing and maintaining microservices in Go (or other similar languages), with the ability to lead and mentor others in this area.
- Extensive exposure to developing big data applications, data engineering, ETL and data analytics.
- Cloud expertise: in-depth knowledge of IBM Cloud or similar cloud platforms, with a proven track record of deploying and managing cloud-native applications.
- Leadership and collaboration: ability to lead cross-functional teams, work closely with product owners, and drive platform enhancements while mentoring junior team members.
- Security and compliance: strong understanding of security best practices and compliance standards, with experience ensuring that platforms meet or exceed these requirements.
- Analytical and problem-solving skills: excellent problem-solving abilities with a proven track record of resolving complex issues in a multi-tenant environment.

Required education: Bachelor's degree.

Required technical and professional expertise:
- 4-7 years' experience primarily using Apache Spark, Kafka and SQL, preferably in data engineering projects with a strong TDD approach.
- Advanced programming skills in languages like Python, Java and Scala, with proficiency in SQL.
- Extensive exposure to developing big data applications, data engineering, ETL tools and data analytics.
- Exposure to data modelling, data quality and data governance.
- Extensive exposure to creating and maintaining data pipelines: workflows to move data from various sources into data warehouses or data lakes.
- Cloud expertise: in-depth knowledge of IBM Cloud or similar cloud platforms, with a proven track record of developing, deploying and managing cloud-native applications.
- Good to have: front-end development experience in React, Carbon, and Node for managing and improving user-facing portals.

Preferred technical and professional experience:
- Hands-on experience with data analysis and querying using SQL, and considerable exposure to ETL processes.
- Expertise in developing cloud applications with high-volume data processing.
- Experience building scalable microservices components using various API development frameworks.
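The "strong TDD approach" this listing asks for in data engineering usually means writing pipeline transformations as pure, unit-testable functions before wiring them into Spark or Kafka. A minimal plain-Python sketch of that style is below; the event schema and amounts are invented for illustration, and the function stands in for what would be a Spark aggregation in a real project.

```python
# TDD-flavoured pipeline sketch: the aggregation is a pure function, so it can
# be asserted against before being ported to a Spark job (illustrative only).
from collections import defaultdict

def totals_by_key(events):
    """Sum event amounts per key (analogous to a reduce-by-key in Spark)."""
    acc = defaultdict(float)
    for key, amount in events:
        acc[key] += amount
    return dict(acc)

# Test-first: the expected result is asserted before any cluster is involved.
sample = [("EUR", 10.0), ("USD", 5.0), ("EUR", 2.5)]
assert totals_by_key(sample) == {"EUR": 12.5, "USD": 5.0}
print(totals_by_key(sample))
```

Keeping the logic framework-agnostic like this is what lets the same test suite guard a later Spark or Kafka Streams implementation.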

Posted 1 week ago

Apply

6.0 - 8.0 years

0 - 2 Lacs

Hyderabad, Telangana, India

On-site

Job description:
- Combine interface design concepts with digital design and establish milestones to encourage cooperation and teamwork.
- Develop overall concepts for improving the user experience within a business webpage or product, ensuring all interactions are intuitive and convenient for customers.
- Collaborate with back-end web developers and programmers to improve usability.
- Conduct thorough testing of user interfaces on multiple platforms to ensure all designs render correctly and systems function properly.
- Convert jobs from Talend ETL to Python and convert Lead SQLs to Snowflake.

Developers with Python and SQL skills: developers should be proficient in Python, especially Pandas, PySpark, or Dask for ETL scripting, with strong SQL skills to translate complex queries. They need expertise in Snowflake SQL for migrating and optimizing queries, as well as experience with data pipeline orchestration (e.g., Airflow) and cloud integration for automation and data loading. Familiarity with data transformation, error handling, and logging is also essential.

Python skills: Python programming; data processing libraries (Pandas, NumPy, PySpark and Dask); familiarity with Python scheduling and orchestration tools like Airflow or Luigi; API integration, i.e., integrating Python with external systems via APIs if needed for your ETL jobs.

SQL skills: proficiency in writing complex SQL queries, joins, aggregations and data transformations. This is important both for Talend SQL queries that need to be rewritten in Python or another system and for migrating SQL code to Snowflake SQL syntax. Database design and optimization, stored procedures and functions, and knowledge of Snowflake stored procedures and user-defined functions (UDFs) are essential.

Snowflake skills: Snowflake SQL; Snowflake data types, i.e., knowledge of how Snowflake handles various data types and how to map data types from the original ETL flow.

ETL concepts and data transformation: understanding of ETL best practices like batch vs. stream processing, handling slowly changing dimensions, data cleaning, and transformation. Data flow design: designing data pipelines in Python (using libraries like Pandas, PySpark, etc.) to replicate the ETL jobs previously done in Talend. Scheduling and orchestration: setting up and managing job scheduling/orchestration tools like Apache Airflow, Luigi, or Prefect to automate Python scripts and Snowflake tasks (if Talend orchestration functionality is also being replaced).

Formal training or certification in software engineering concepts and 5+ years of applied experience, with hands-on practical experience in system design, application development, testing, and operational stability.
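The Talend-to-Python migration pattern described above boils down to an extract-transform-load step rewritten as a script. A minimal stdlib sketch is below: SQLite stands in for Snowflake, the CSV layout and column names are invented for the example, and a real job would use pandas plus the Snowflake connector and be scheduled from Airflow.

```python
# Sketch of a Talend-job-to-Python migration (illustrative assumptions only):
# extract from CSV, transform in Python, load into a database.
import csv, io, sqlite3

raw = io.StringIO("order_id,amount\n1,10.50\n2,\n3,4.25\n")

# Extract + transform: parse, drop rows with missing amounts, cast types.
rows = [(int(r["order_id"]), float(r["amount"]))
        for r in csv.DictReader(raw) if r["amount"]]

# Load: write the cleaned rows to the target table (SQLite as stand-in).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)   # 14.75
```

The error-handling step here (skipping rows with missing amounts) is where the "data transformation, error handling, and logging" skills the listing mentions come in; production code would log the rejected rows rather than silently drop them.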

Posted 1 week ago

Apply

7.0 - 12.0 years

4 - 8 Lacs

Chennai

Work from Office

Project Role: Business Analyst
Project Role Description: Analyze an organization and design its processes and systems, assessing the business model and its integration with technology. Assess the current state, identify customer requirements, and define the future state and/or business solution. Research, gather and synthesize information.
Must-have skills: Finastra Fusion Global PAYplus (GPP)
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As a Business Analyst, you will analyze an organization and design its processes and systems, assessing the business model and its integration with technology. You will assess the current state, identify customer requirements, and define the future state and/or business solution. Your role involves researching, gathering, and synthesizing information to drive business decisions.

Roles & responsibilities:
- Define detailed business requirements through engagement with business and technology stakeholders, ensuring that traceability is maintained between requirements and solution elements throughout the project lifecycle.
- End-to-end management of requirements, including definition, review, approval and business acceptance via end-user testing.
- May be required to define and execute business acceptance tests, including live confidence tests, on behalf of business stakeholders, providing evidence of test planning and performance for review/approval by senior stakeholders.
- Validate solution proposals to ensure they meet business requirements.

Professional & technical skills:
- Must have core payments experience with ISO 20022 (MT2MX mapping for the CHIPS & FED ISO migration); prior payment platform experience with any of GPP, FIS, Fiserv, ACI etc. will be helpful.
- Hands-on experience with GPP business rule/profile configuration and reading GPP logs, plus exposure to the important database tables.
- Experience on Agile/Scrum teams; should have used JIRA/Confluence.
- Ability to gather business requirements and write user stories.
- Detailed understanding of end-to-end payments processing to/from the CHIPS/SWIFT/Fedwire schemes; experience capturing complex requirements and maintaining traceability is very useful.
- Will be expected to engage with and present to senior stakeholders.
- Responsible for coordinating and supporting business readiness for the implementation and live support of the solution.
- Hands-on experience with Oracle and basic SQL.

Additional information:
- The candidate should have a minimum of 7.5 years of experience in Finastra Fusion Global PAYplus (GPP).
- This position is based at our Pune office.
- 15 years of full-time education is required.

Posted 2 weeks ago

Apply

8.0 - 15.0 years

30 - 35 Lacs

Bengaluru

Work from Office

JD:
- Good experience in DB testing and writing complex SQL.
- Worked on ETL applications or big data technologies.
- Hands-on experience building automation frameworks for database testing and data pipelines using Python.
- Worked on Python + Pytest/Robot automation frameworks.
- Proficient in web testing, with working automation experience in UI testing.
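A minimal sketch of the database-testing automation this listing describes, written in pytest style against an in-memory SQLite database. The table, fixture data, and data-quality rule are invented for illustration; a real framework would point the connection at the ETL output database.

```python
# Pytest-style DB test sketch: load fixture data, then assert data rules the
# way an ETL output table might be validated (illustrative assumptions only).
import sqlite3

def make_db():
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, qty INTEGER)")
    conn.executemany("INSERT INTO trades (id, qty) VALUES (?, ?)",
                     [(1, 100), (2, 250), (3, 50)])
    return conn

def test_no_negative_quantities():
    conn = make_db()
    bad = conn.execute("SELECT COUNT(*) FROM trades WHERE qty < 0").fetchone()[0]
    assert bad == 0   # data-quality rule: quantities must be non-negative

def test_row_count_matches_source():
    conn = make_db()
    assert conn.execute("SELECT COUNT(*) FROM trades").fetchone()[0] == 3

# Under pytest these would be collected automatically; here we call directly.
test_no_negative_quantities()
test_row_count_matches_source()
print("all checks passed")
```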

Posted 2 weeks ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

Kolkata, Hyderabad, Pune

Work from Office

Data Analyst
- Very good knowledge of the Snowflake data warehouse, data infrastructure, data platforms, ETL implementation, and data modelling and design.
- Ability to gather, view, and analyze data; apply statistical and data analysis techniques to identify patterns, trends, correlations, and anomalies in large datasets.
- Utilize advanced analytical tools and programming languages (e.g., Python, R, SQL) to conduct exploratory data analysis.
- Develop and implement data models to support predictive and prescriptive analytics.
- Connect to data sources, import data and transform data for business intelligence.
- Strong hands-on experience writing SQL, basic and complex.
- Good knowledge of the Snowflake cloud data platform.
- Create clear and informative visualizations (charts, graphs, dashboards) to present insights to non-technical stakeholders; strong exposure to visualization, transformation, data analysis and formatting skills.
- Develop interactive dashboards and reports using data visualization tools (e.g., Snowsight, Tableau, Power BI) to facilitate data-driven decision-making.
- Good to have: knowledge of the finance and accounting domain; familiarity with cloud ecosystems.
- Able to back-track, do deep analysis of issues, and provide RCA.
- Good testing and documentation skills.
- Able to communicate in a fast-paced, dynamic, client-facing role where delivering solid work products that exceed high expectations is the measure of success.
- Creative and analytical in a problem-solving environment, with effective verbal and written communication skills.
- Adaptable to new environments, people, technologies, and processes; able to manage ambiguity and solve undefined problems.
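One common way to do the anomaly detection this listing mentions is a z-score rule: flag values that sit more than a chosen number of standard deviations from the mean. The sketch below uses the stdlib `statistics` module; the dataset and the 2.5-sigma threshold are illustrative assumptions (real analyses would typically use pandas/NumPy and tune the threshold to the data).

```python
# Z-score anomaly detection sketch (threshold and data are assumptions).
from statistics import mean, stdev

def zscore_anomalies(values, threshold=2.5):
    """Return values lying more than `threshold` sample stdevs from the mean."""
    mu, sigma = mean(values), stdev(values)
    return [v for v in values if abs(v - mu) / sigma > threshold]

daily_orders = [102, 98, 105, 101, 99, 97, 103, 100, 480]  # 480 is an outlier
print(zscore_anomalies(daily_orders))
```

Note the threshold matters: with only nine points the sample standard deviation is inflated by the outlier itself, so the classic 3-sigma cut would miss it here.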

Posted 1 month ago

Apply

8.0 - 13.0 years

3 - 7 Lacs

Hyderabad

Work from Office

Dev + BA profile with experience in Vanguard.
- 8-10 years of experience as a business analyst, data analyst, or developer, mainly Spark/Scala.
- Must be self-driven and individually own assignments end-to-end.
- Must have experience in a banking domain such as payments, liquidity, trade, or commercial cards.
- Must have experience in data analysis.
- Experience writing SQL.
- Experience with Agile projects and JIRA.

Posted 1 month ago

Apply

8.0 - 13.0 years

6 - 10 Lacs

Hyderabad

Work from Office

Review, analyse and evaluate business systems and user needs. Document requirements, define scope and objectives, and formulate systems to parallel overall business strategies. Rely on experience and judgment to plan and accomplish goals.
- Write reusable, testable, and efficient code.
- Work collaboratively with the design team to understand end-user requirements, provide technical solutions, and implement new software features.
- Determine operational objectives by studying business functions, gathering information, and evaluating output requirements and formats.
- Construct workflow charts and diagrams, study system capabilities, and write specifications.
- Improve systems by studying current practices and designing modifications.
- Recommend controls by identifying problems and writing improved procedures.
- Define project requirements by identifying project milestones, phases and elements, forming the project team, and establishing the project budget.
- Monitor project progress by tracking activity, resolving problems, publishing progress reports, and recommending actions.
- Maintain user confidence and protect operations by keeping information confidential.
- Prepare technical reports by collecting, analyzing and summarizing information and trends.
- Contribute to team effort by accomplishing related results as needed.
- Validate resource requirements and develop cost estimate models.
- Conduct and coordinate financial, product, market, operational and related research to support strategic and business planning within the various departments and programs of the client group.
- Interpret, evaluate and interrelate research data, and develop integrated business analyses and projections for incorporation into strategic decision-making.
- Plan and coordinate the development of primary and secondary market research studies in support of strategic planning and specific marketing initiatives, as required, and present findings to client committees.
- Perform daily, weekly and monthly reviews and analyses of current processes using operational metrics and reports.
- Review a variety of areas including operations, purchasing, inventory, distribution and facilities.
- Understand and communicate the financial and operational impact of any changes; suggest changes to senior management using analytics to support your recommendations, and actively participate in the implementation of approved changes.
- Create informative, actionable and repeatable reporting that highlights relevant business trends and opportunities for improvement. Conduct insightful, ad hoc analyses to investigate ongoing or one-time operational issues.

Requirements:
- 8-10 years of experience as a business analyst, data analyst, or developer, mainly Spark/Scala.
- Must be self-driven and individually own assignments end-to-end.
- Must have experience in a banking domain such as payments, liquidity, trade, or commercial cards.
- Must have experience in data analysis.
- Experience writing SQL.
- Experience with Agile projects and JIRA.

Posted 1 month ago

Apply

8.0 - 11.0 years

10 - 13 Lacs

Pune

Work from Office

Job Title: Lead Business Functional Analyst for Adjustments Acceleration, VP
Location: Pune, India

Role description: The Credit Risk Data Unit provides quality-assured and timely Finance-relevant risk information and analysis to key stakeholders, in a transparent and controlled manner, covering the end-to-end processes for all relevant metrics in an efficient and regulatory-compliant way. This role is in the Global Risk Data Control and Validation Group Function (RDV-GF) team, responsible for aggregating, quality assuring and submitting credit exposure data into FDW on time, as per BCBS standards. This data impacts all downstream regulatory and regional reporting of the bank, including key metrics like credit risk RWA, leverage exposure and regulatory capital. RDV-GF is part of the Credit Risk Data Unit (CRDU) team within Group Finance, and its key stakeholders include, but are not limited to, CRDU, Business Finance, Accounting Close, book runners, and the source & FDW IT support teams. This group process is centrally based out of Pune.

What we'll offer you:
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities:
- Proactively manage the resolution of data quality (DQ) issues relating to the sourcing of good-quality input data into FDW from various source systems, i.e. LS2, SUMMIT, RMS, Magellan etc. This includes strategic, non-strategic and manual data feeds.
- Support the change book of work as set out by the FRM KD workstream by engaging with business, finance, change and technology teams on initiatives for strategic implementations and data quality (DQ) remediation.
- Navigate the complex logic and algorithms built into the data enrichment layers, i.e. FCL, EOS, Kannon and the risk engine, to perform root cause analysis of data quality issues.
- Provide input into relevant governance processes relating to data quality issues, ensuring accurate monitoring, tracking and escalation.
- Provide subject matter expertise and analytics to support Finance and the Risk team on risk and regulatory topics or initiatives, e.g. optimization topics.
- Represent the team in relevant production and change forums, raising issues relating to month-end data quality and their resolution.

Your skills and experience:
- Minimum 8-9 years' experience in credit risk controls, banking operations, business process reengineering, change, audit or the finance industry.
- Good understanding of banking products (debt, SFT and derivatives), with working knowledge of global markets financial products.
- Good working knowledge of the front-to-back system architecture within an investment bank.
- Advanced skills in MS applications (Excel, Word, PowerPoint and Access); working knowledge of SQL a plus.
- Strong quantitative analysis skills.
- Strong stakeholder management skills; able to manage diverse stakeholders across regions.

Posted 1 month ago

Apply

6.0 - 11.0 years

5 - 8 Lacs

Bengaluru

Work from Office

Job Details: Skill: Java Backend Developer. Experience: 6+ Years. Notice Period: Immediate joiners or within 15 days. Job Description: Very good design and development knowledge of Java and J2EE with full stack. Design the solution in accordance with the requirements and organization standards. Develop Analysis and Design documents at a level easily understandable by developers. Support creating test cases for in-sprint testing. Support the development team in resolving all technical queries. Support the PM in developing estimates and schedules. Actively participate in SIT, UAT and implementation. Good knowledge of the Banking domain. Experience and expertise in developing a Test Harness in J2SE. Experience with a Continuous Integration tool (Jenkins/Hudson). Experience/knowledge of version control systems like Git, CVS, Subversion etc. Experience in writing SQLs/PL-SQL. Good knowledge of shell scripting & Unix commands. ROLE SPECIFIC TECHNICAL COMPETENCIES: Java/J2EE (Expert), PL/SQL (Expert), Spring Boot/Angular JS/React JS (Core), Agile Process (Expert), Unix (Core), Communication (Expert), Teamwork (Expert), Stakeholder Management (Expert), Kafka (Core). QUALIFICATIONS: Training, licenses, memberships and certifications: BE, MSc, MCA.

Posted 1 month ago

Apply

8.0 - 12.0 years

2 - 5 Lacs

Hyderabad

Work from Office

Role/Requirement: Oracle Retail Merchandising System (RMS) Development Lead. Grade: C2. Experience: 8 to 12 years. Main Responsibilities: 1. Responsible for the development, review and testing of code based on the technical designs; apply business and functional knowledge to prepare test scenarios. 2. Plan, execute and capture test results. 3. Prepare technical designs based on the functional requirements. 4. Maintain and check in developed code in the code repository. 5. Monitor and troubleshoot bugs and issues related to nightly batches, applications etc. 6. Identify bottleneck queries and suggest tuning ideas for long-running batches based on NFRs. 7. Actively participate in configuration discussions and defect triaging, and align with teams on the appropriate solution. 8. Develop custom extensions and integrations between RMS and other applications. 9. Provide technical assistance and support for queries and issues raised by the business and other IT functions related to Oracle solutions. Skills Needed: 1. Very client-focused with regard to service delivery. 2. Exposure to ORMS functionality in v14/v16 (Foundation Data, Item Supplier, PO, Transfer/Allocation, Shipment/Receipt, Stock Ledger etc.). 3. Worked on at least 2 implementation projects on RMS v14/v16. 4. At least 8-10 years of development experience in Oracle Retail applications. 5. Strong PL/SQL knowledge, with expertise in writing stored procedures, triggers, functions, dynamic SQLs, materialized views and PL/SQL collections. 6. Unix shell scripting skills and decent Pro*C knowledge. 7. Strong written and verbal communication skills. 8. Working knowledge of the complete RMS batch flow. Nice to have: 1. Good people/interpersonal skills. 2. Capable of working in a fast-paced and dynamic environment. 3. Overall understanding of all the Oracle Retail modules. 4. Hands-on ADF knowledge will be an added advantage. 5. Exposure to the RMS Deals module.

Posted 1 month ago

Apply

6.0 - 10.0 years

11 - 14 Lacs

Bengaluru

Work from Office

Notice Period: Immediate joiners or within 15 days. Employee type: Contract. Should have over 6 years of IT experience in analysis, design, development, implementation, reporting, testing and visualization with Business Intelligence tools. Extensive knowledge in creating data visualizations using BI desktop tools and regularly publishing and presenting dashboards. Familiar with SQL concepts: tables, views, indexes, joins, writing stored procedures, query optimization, performance tuning etc. Able to generate reports using BI tools to analyse data from multiple data sources like Oracle, SQL Server, Excel, Hive and Teradata. Good knowledge of various BI tool functionalities like Tableau Extracts, Parameters, Trends, Hierarchies, Sets, Groups, Data Blending, joins etc. Able to develop various customized charts. Work extensively with Actions, Calculations, Parameters, Background Images, Maps, Trend Lines, LOD functions and Table Calculations. Able to generate Context filters, Extract filters and Data Source filters while handling huge volumes of data. Hands-on building Groups, Hierarchies and Sets, and creating detail-level summary reports and dashboards using KPIs. Building and publishing customized interactive reports and dashboards, and report scheduling using Tableau Server. Handling user onboarding activities for BI tools and various servers. Able to manage user groups and access-related activities in BI tools and Tableau Server. Ability to handle data security. Good knowledge of the Banking domain. Experience and expertise in developing a Test Harness in J2SE. Experience with a Continuous Integration tool (Jenkins/Hudson). Experience/knowledge of version control systems like Git, CVS, Subversion etc. Experience in writing SQLs/PL-SQL. Good knowledge of shell scripting & Unix commands.
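The SQL concepts this listing names (tables, views, indexes, joins) can be sketched in a few lines with Python's built-in sqlite3; the schema, the city names, and the figures are illustrative only:

```python
import sqlite3

# Illustrative-only schema: sales facts joined to a store dimension,
# the kind of source a Tableau/BI extract typically sits on top of.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stores (store_id INTEGER PRIMARY KEY, city TEXT);
    CREATE TABLE sales  (store_id INTEGER, amount REAL);
    CREATE INDEX idx_sales_store ON sales(store_id);  -- supports the join

    INSERT INTO stores VALUES (1, 'Bengaluru'), (2, 'Pune');
    INSERT INTO sales  VALUES (1, 500.0), (1, 250.0), (2, 300.0);

    -- A view gives the BI tool one clean, reusable join to report from.
    CREATE VIEW city_sales AS
    SELECT s.city, SUM(f.amount) AS total_amount
    FROM sales f JOIN stores s ON s.store_id = f.store_id
    GROUP BY s.city;
""")

rows = conn.execute(
    "SELECT city, total_amount FROM city_sales ORDER BY city"
).fetchall()
print(rows)  # [('Bengaluru', 750.0), ('Pune', 300.0)]
```

A BI tool pointed at city_sales sees a single clean relation, which is the usual reason for pushing the join and aggregation down into a view rather than into each dashboard.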

Posted 1 month ago

Apply

4.0 - 9.0 years

6 - 16 Lacs

Tamil Nadu

Work from Office

Primary: Core Java, Selenium and Cucumber; Secondary: SQL. Good knowledge of the Banking domain. Experience with a Continuous Integration tool (Jenkins/Hudson). Experience and expertise in developing a Test Harness in J2SE/.NET. Experience/knowledge of version control systems like Git, CVS, Subversion etc. Experience in writing SQLs/PL-SQL. Knowledge of Test Automation design patterns like Page Object Model, Page Object Factory etc. Experience in any one industry-standard Test Automation tool like QTP, TestComplete etc. Good knowledge of shell scripting & Unix commands.

Posted 1 month ago

Apply

5.0 - 8.0 years

3 - 7 Lacs

Hyderabad

Work from Office

AS 400 Developer. Role Purpose: The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations and developing technical capability within the Production Specialists. The candidate should be strong in AS/400, with good customer interaction skills and the ability to individually handle multiple modules and complicated projects. 8+ years of development experience on core AS/400 technologies: DB2/400, CL, RPG III/RPG IV/RPG LE. In-depth knowledge of SQL/400, QUERY/400, and other AS/400 utilities. Excellent understanding of database technologies: DDS physical and logical files, SQL files, stored procedures and complex SQLs. Experience with advanced ILE concepts: procedures, service programs, functions. Good knowledge of RPG upgrade/migration from fixed format to free format. Good understanding of application and database performance best practices and tuning. Rich experience in creating the design and estimation documents needed for a development change proposed by business users. Interact with business stakeholders to get clarity on queries and prepare the Technical Specification Document. Design and develop applications to the given specifications using the Technical Specification Document.
Handle technical escalations through effective diagnosis and troubleshooting of client queries. Manage and resolve technical roadblocks/escalations as per SLA and quality requirements; if unable to resolve the issues, escalate them to TA & SES in a timely manner. Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions. Troubleshoot all client queries in a user-friendly, courteous and professional manner. Offer alternative solutions to clients (where appropriate) with the objective of retaining the customer's and client's business. Organize ideas and effectively communicate oral messages appropriate to listeners and situations. Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs. Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client. Mentor and guide Production Specialists on improving technical knowledge. Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists. Develop and conduct trainings (triages) within products for Production Specialists as per target, and inform the client about the triages being conducted. Undertake product trainings to stay current with product features, changes and updates; enroll in product-specific and any other trainings per client requirements/recommendations. Identify and document the most common problems and recommend appropriate resolutions to the team. Update job knowledge by participating in self-learning opportunities and maintaining personal networks. Performance parameters and measures: 1. Process: number of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT. 2. Team Management: productivity, efficiency, absenteeism. 3. Capability Development: triages completed, Technical Test performance. Mandatory Skills: AS400. Experience: 5-8 Years.

Posted 1 month ago

Apply

3.0 - 5.0 years

6 - 10 Lacs

Chennai

Work from Office

Design the automation framework, and develop and execute test scripts. Hands-on experience in developing a Test Automation suite using Java, Cucumber & Selenium, API, Rest Assured. Experience in the design and implementation of a Test Automation framework. Good knowledge of the Banking domain. Experience and expertise in developing a Test Harness in J2SE/Python. Experience with a Continuous Integration tool (Jenkins/Hudson). Experience/knowledge of version control systems like Git, CVS, Subversion etc. Experience in writing SQLs/PL-SQL. Knowledge of Test Automation design patterns like Page Object Model, Page Object Factory etc. Experience in any one industry-standard Test Automation tool like QTP, TestComplete etc. Good knowledge of shell scripting & Unix commands. Mandatory Skills: Selenium. Experience: 3-5 Years.
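A minimal sketch of the Page Object Model pattern mentioned above, with a stub driver standing in for Selenium's real WebDriver so the example stays self-contained; all class names, locators, and credentials are hypothetical:

```python
class StubDriver:
    """Stand-in recording the actions a real WebDriver would perform."""
    def __init__(self):
        self.actions = []

    def fill(self, locator, value):
        self.actions.append(("fill", locator, value))

    def click(self, locator):
        self.actions.append(("click", locator))


class LoginPage:
    """Page object: locators and interactions live here, not in the tests."""
    USER_FIELD = "#username"
    PASS_FIELD = "#password"
    SUBMIT_BTN = "#submit"

    def __init__(self, driver):
        self.driver = driver

    def login(self, user, password):
        self.driver.fill(self.USER_FIELD, user)
        self.driver.fill(self.PASS_FIELD, password)
        self.driver.click(self.SUBMIT_BTN)


driver = StubDriver()
LoginPage(driver).login("alice", "s3cret")
print(driver.actions[-1])  # ('click', '#submit')
```

Test scripts then depend only on LoginPage.login, so a changed locator is fixed in one place rather than in every script, which is the point of the pattern.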

Posted 1 month ago

Apply

7.0 - 12.0 years

4 - 8 Lacs

Pune

Work from Office

Project Role: Business Analyst. Project Role Description: Analyze an organization and design its processes and systems, assessing the business model and its integration with technology. Assess the current state, identify customer requirements, and define the future state and/or business solution. Research, gather and synthesize information. Must have skills: Finastra Fusion Global PAYplus (GPP). Good to have skills: NA. Minimum 7.5 year(s) of experience is required. Educational Qualification: 15 years full-time education. Summary: As a Business Analyst, you will analyze an organization and design its processes and systems, assessing the business model and its integration with technology. You will assess the current state, identify customer requirements, and define the future state and/or business solution. Your role involves researching, gathering, and synthesizing information to drive business decisions. Roles & Responsibilities: Responsible for defining detailed business requirements through engagement with business and technology stakeholders, ensuring that traceability is maintained between requirements and solution elements throughout the project lifecycle. End-to-end management of requirements, including definition, review, approval and business acceptance via end-user testing. The role may be required to define and execute business acceptance tests, including Live Confidence Tests, on behalf of the business stakeholders, providing evidence of test planning and performance for review/approval by senior stakeholders. Will be required to validate solution proposals to ensure they meet business requirements. Professional & Technical Skills: Candidate must have core Payments experience with ISO 20022 (MT2MX mapping for the CHIPS & FED ISO migration). Prior payment platform experience in any of the platforms like GPP, FIS, Fiserv, ACI etc. will be helpful. Should have hands-on experience with GPP Business Rule/Profile configuration and GPP log reading, and exposure to the important database tables. Should have experience on Agile/Scrum teams. Should have used JIRA/Confluence. Ability to gather business requirements and write user stories. A detailed understanding of end-to-end payments processing to/from the CHIPS/SWIFT/FedWire schemes is very useful, as is experience capturing complex requirements and maintaining traceability. Will be expected to engage with and present to senior stakeholders. Responsible for coordination and supporting business readiness for the implementation and live support of the solution. Should have hands-on experience in Oracle & basic SQLs. Additional Information: The candidate should have a minimum of 7.5 years of experience in Finastra Fusion Global PAYplus (GPP). This position is based at our Pune office. A 15 years full-time education is required. Qualification: 15 years full-time education.

Posted 1 month ago

Apply

5.0 - 8.0 years

5 - 8 Lacs

Hyderabad / Secunderabad, Telangana, Telangana, India

On-site

Job Description & Responsibilities: Work with business and technical leadership to understand requirements. Design to the requirements and document the designs. Ability to write product-grade, performant code for data extraction, transformation and loading using Spark and PySpark. Do data modeling as needed for the requirements. Write performant queries using Teradata SQL and Spark SQL against Teradata. Implement dev-ops pipelines to deploy code artifacts onto the designated platform/servers like AWS (preferred). Implement Hadoop job orchestration using shell scripting, CA7 Enterprise Scheduler and Airflow. Troubleshoot issues, provide effective solutions, and monitor jobs in the production environment. Participate in sprint planning sessions, refinement/story-grooming sessions, daily scrums, demos and retrospectives. Experience Required: Overall 5-8 years. Experience Desired: Experience in Jira and Confluence. Health care domain knowledge is a plus. Excellent work experience with Databricks or Teradata as a data warehouse. Experience in Agile and working knowledge of DevOps tools. Education and Training Required: Primary Skills: Spark, PySpark, shell scripting, Teradata SQLs (using Teradata SQL and Spark SQL) and stored procedures; Git, Jenkins, Artifactory; CA7 Enterprise Scheduler, Airflow; AWS data services (S3, EC2, SQS); AWS services (SNS, Lambda, ECS, Glue, IAM); CloudWatch monitoring tool; Databricks (Delta Lake, Notebooks, Pipelines, cluster management, Azure/AWS integration). Good to have: Unix/Linux shell scripting (KSH) and basic administration of Unix servers. Additional Skills: Exercises considerable creativity, foresight, and judgment in conceiving, planning, and delivering initiatives.
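The extract-transform-load flow described above can be sketched in plain Python (PySpark itself is assumed unavailable here); the filter-then-aggregate shape loosely mirrors a DataFrame filter/groupBy/agg pipeline, and the claims data is invented for illustration:

```python
from collections import defaultdict

# Extract: pretend these rows came from a source table (illustrative data).
source = [
    {"member_id": 1, "claim": 120.0, "state": "TX"},
    {"member_id": 2, "claim": 80.0,  "state": "TX"},
    {"member_id": 3, "claim": 200.0, "state": "CA"},
]

# Transform: drop invalid rows, then aggregate claim amounts per state --
# the same filter/groupBy/agg shape a Spark pipeline would use.
totals = defaultdict(float)
for row in source:
    if row["claim"] > 0:          # data-quality filter
        totals[row["state"]] += row["claim"]

# Load: here we simply materialize a sorted result set.
result = sorted(totals.items())
print(result)  # [('CA', 200.0), ('TX', 200.0)]
```

In PySpark the equivalent would be roughly df.filter(col("claim") > 0).groupBy("state").sum("claim"), with the work distributed across executors instead of a single loop.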

Posted 1 month ago

Apply

11.0 - 13.0 years

11 - 13 Lacs

Hyderabad / Secunderabad, Telangana, Telangana, India

On-site

Position Summary: Data engineer on the Data Integration team. Job Description & Responsibilities: Work with business and technical leadership to understand requirements. Design to the requirements and document the designs. Ability to write product-grade, performant code for data extraction, transformation and loading using Spark and PySpark. Do data modeling as needed for the requirements. Write performant queries using Teradata SQL, Hive SQL and Spark SQL against Teradata and Hive. Implement dev-ops pipelines to deploy code artifacts onto the designated platform/servers like AWS. Troubleshoot issues, provide effective solutions, and monitor jobs in the production environment. Participate in sprint planning sessions, refinement/story-grooming sessions, daily scrums, demos and retrospectives. Experience Required: Overall 11-13 years of experience. Experience Desired: Strong development experience in Spark, PySpark, shell scripting and Teradata. Strong experience in writing complex and effective SQLs (using Teradata SQL, Hive SQL and Spark SQL) and stored procedures. Health care domain knowledge is a plus. Primary Skills: Excellent work experience with Databricks for Data Lake implementations. Experience in Agile and working knowledge of DevOps tools (Git, Jenkins, Artifactory). AWS (S3, EC2, SNS, SQS, Lambda, ECS, Glue, IAM, and CloudWatch). Databricks (Delta Lake, Notebooks, Pipelines, cluster management, Azure/AWS integration). Additional Skills: Experience in Jira and Confluence. Exercises considerable creativity, foresight, and judgment in conceiving, planning, and delivering initiatives.

Posted 1 month ago

Apply

4.0 - 8.0 years

13 - 18 Lacs

Noida

Work from Office

Position Summary: To be a technology expert architecting solutions and mentoring people in BI/Reporting processes, with prior expertise in the Pharma domain. Job Responsibilities: Independently drive and deliver complex reporting and BI project assignments in Power BI on AWS/Azure Cloud. Design and deliver across Power BI services, Power Query, DAX, and data modelling concepts. Write complex SQLs focusing on data aggregation and the analytic calculations used in the reporting KPIs. Analyse the data and understand the requirements directly from the customer or from project teams across pharma commercial data sets. Drive the team on day-to-day tasks in alignment with the project plan and collaborate with the team to accomplish milestones as per plan. Be comfortable discussing and prioritizing work items in an onshore-offshore model. Think analytically, using a systematic and logical approach to analyse data, problems, and situations. Manage client communication and client expectations independently, and deliver results back to the client as per plan. Should have excellent communication skills. Education: BE/B.Tech, Master of Computer Application. Work Experience: Should have 4-8 years of experience in developing Power BI reports. Must have proficiency in Power BI services, Power Query, DAX, and data modelling concepts. Should have experience in design techniques such as UI design and creating mock-ups/intuitive visualizations for a seamless user experience. Should have expertise in writing complex SQLs focusing on data aggregation and the analytic calculations used for deriving the reporting KPIs. Strong understanding of data integration, ETL processes and data warehousing, preferably on AWS Redshift and/or Snowflake. Excellent problem-solving skills with the ability to troubleshoot and resolve technical issues.
Strong communication and interpersonal skills, with the ability to collaborate effectively with cross-functional teams. Good to have: experience with Pharma commercial data sets and related KPIs for Sales Performance, Managed Markets, Customer 360, Patient Journey etc.; experience and additional know-how with other reporting tools. Behavioural Competencies: Teamwork & Leadership, Motivation to Learn and Grow, Ownership, Cultural Fit, Talent Management. Technical Competencies: Problem Solving, Life Science Knowledge, Communication, Capability Building/Thought Leadership, Power BI, SQL, Business Intelligence (BI), Snowflake.
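The "analytic calculations used for deriving the reporting KPIs" mentioned above can be illustrated with a window function, here via Python's bundled sqlite3 (window functions need SQLite 3.25+, which ships with recent Python builds). The sales figures and the KPI itself, regional market share, are hypothetical:

```python
import sqlite3

# Hypothetical sales data; the KPI is each product's share of its
# region's total sales, computed with a windowed aggregate.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE rx_sales (region TEXT, product TEXT, sales REAL)")
conn.executemany("INSERT INTO rx_sales VALUES (?, ?, ?)", [
    ("North", "A", 60.0), ("North", "B", 40.0),
    ("South", "A", 25.0), ("South", "B", 75.0),
])

rows = conn.execute("""
    SELECT region, product,
           ROUND(sales * 100.0 / SUM(sales) OVER (PARTITION BY region), 1)
               AS share_pct
    FROM rx_sales
    ORDER BY region, product
""").fetchall()
print(rows)
```

The PARTITION BY clause scopes the denominator to each region, so the share is computed per row without a self-join; the same calculation maps naturally onto a DAX measure dividing a row's sales by an ALLEXCEPT-filtered total.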

Posted 1 month ago

Apply

10.0 - 15.0 years

4 - 9 Lacs

Bengaluru

Work from Office

Req ID: 322003. We are currently seeking Sr. ETL Developers to join our team in Bangalore, Karnataka (IN-KA), India (IN). Strong hands-on experience in SQLs and PL/SQL (procedures, functions). Expert-level knowledge of ETL flows & jobs (ADF pipeline experience preferred). Experience with MS-SQL (preferred), Oracle DB, PostgreSQL, MySQL. Good knowledge of Data Warehouse/Data Mart. Good knowledge of data structures/models, integrity constraints, performance tuning etc. Good knowledge of the Insurance domain (preferred). Total Exp: 7-10 Yrs.

Posted 1 month ago

Apply
Page 1 of 2