
25068 ETL Jobs - Page 47

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

The Software Engineer position is part of the Delivery Excellence team in the Strategic Operations Department. Your primary responsibility will be to support GSC with Power BI development, BI reporting, data analysis, and MIS across various systems.

Your key duties will include creating and managing Microsoft Power BI dashboards, working with DAX, and managing JIRA boards. You will write SQL queries for data extraction and analysis, with strong proficiency in complex SQL. You will design, develop, deploy, and maintain BI interfaces and containers, including data visualizations, dashboards, and reports in Power BI. Other responsibilities include organizing backlogs in the JIRA ticketing tool; monitoring and troubleshooting existing ETL jobs, BI models, dashboards, and reports; and fixing failures or errors in data or dashboards. You should have exposure to Azure Delta Lake and other cloud offerings, along with a solid understanding of relational databases and strong SQL skills.

You will assemble, analyze, and evaluate data to make appropriate recommendations and decisions in support of business and project teams, and manage 5-6 BI reporting projects simultaneously to ensure the KPIs aligned with each project are reported. Collaborating with cross-functional teams to populate data to BI boards and containers periodically is essential to this role, as is ensuring data accuracy and integrity through regular reviews, data validation, troubleshooting, and documentation. You will also be expected to improve processes and efficiency through Lean methodologies, automation, and digital integration.

Staying up to date with industry trends and advancements in reporting and analytics tools and techniques is necessary, along with fundamental knowledge of JIRA and ServiceNow.

Qualifications & Skills:
- Bachelor's degree in a relevant field; 3-4 years of experience in BI development, BI reporting, business analysis, DAX, and other MIS
- Intermediate to advanced skills in the MS suite of products
- Ability to work on multiple tasks and self-manage deliverables, meetings, and information gathering
- Excellent communication skills to present options and solutions in a way easily understood by business users
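The "complex SQL queries" this posting asks for typically combine aggregation with analytical functions. As a minimal, hedged sketch (the table and column names are invented for illustration, not taken from the posting), here is the shape of such a query run against an in-memory SQLite database:

```python
import sqlite3

# Hypothetical ticket data, standing in for the JIRA/BI extracts described above.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE tickets (project TEXT, status TEXT, hours REAL);
INSERT INTO tickets VALUES
  ('BI-Dash', 'Done', 5.0),
  ('BI-Dash', 'Open', 3.0),
  ('ETL-Fix', 'Done', 8.0),
  ('ETL-Fix', 'Done', 2.0);
""")

# An aggregate plus a window function: rank projects by total logged effort.
rows = conn.execute("""
SELECT project,
       SUM(hours) AS total_hours,
       RANK() OVER (ORDER BY SUM(hours) DESC) AS effort_rank
FROM tickets
GROUP BY project
ORDER BY total_hours DESC
""").fetchall()

for project, total, rank in rows:
    print(project, total, rank)
```

The same SQL pattern works unchanged against most relational databases; only the connection setup differs.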

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

Maharashtra

On-site

You will be working full-time from the office in Mumbai, Chennai, or Ahmedabad. As a PL/SQL DB Developer with 6 to 8 years of relevant experience, you are expected to have a solid understanding of database concepts, stored procedures, functions and triggers, Unix, and ETL tools such as DataStage, Informatica, or SSIS. The role requires hands-on experience in PL/SQL and Unix, along with strong communication skills and the ability to work well in a team. The key skills required are proficiency in PL/SQL, ETL, and Unix. The hiring process involves screening by HR, followed by two technical rounds and a final HR round.
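The ETL tools named above (DataStage, Informatica, SSIS) all implement the same extract-transform-load pattern. A minimal Python sketch of that pattern, with invented source data and an invented quality rule, purely for illustration:

```python
import sqlite3

# Raw rows as they might arrive from a source extract (names are illustrative).
source_rows = [("  alice ", "2024-01-03"), ("BOB", "2024-01-04"), (None, "2024-01-05")]

def transform(row):
    """Cleanse one row; return None to reject rows failing a basic quality rule."""
    name, day = row
    if not name:
        return None
    return (name.strip().title(), day)

# Load the cleansed rows into a target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (name TEXT, load_date TEXT)")
clean = [t for t in map(transform, source_rows) if t is not None]
conn.executemany("INSERT INTO customers VALUES (?, ?)", clean)

loaded = conn.execute("SELECT name FROM customers ORDER BY name").fetchall()
print(loaded)
```

In a real DataStage or Informatica job, the transform step would be a mapping stage and the rejected rows would land in a reject link rather than being silently dropped.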

Posted 1 week ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Where Data Does More. Join the Snowflake team. Snowflake’s Support team is expanding! We are looking for a Senior Cloud Support Engineer who likes working with data and solving a wide variety of issues, drawing on technical experience across operating systems, database technologies, big data, data integration, connectors, and networking. Snowflake Support is committed to providing high-quality resolutions to help deliver data-driven business insights and results. We are a team of subject matter experts collectively working toward our customers’ success. We form partnerships with customers by listening, learning, and building connections. Snowflake’s values are key to our approach and success in delivering world-class Support. Putting customers first, acting with integrity, owning initiative and accountability, and getting it done are Snowflake's core values, which are reflected in everything we do. As a Senior Cloud Support Engineer, your role is to delight our customers with your passion and knowledge of Snowflake Data Warehouse. Customers will look to you for technical guidance and expert advice with regard to their effective and optimal use of Snowflake. You will be the voice of the customer regarding product feedback and improvements for Snowflake’s product and engineering teams. You will play an integral role in building knowledge within the team and be part of strategic initiatives for organizational and process improvements. Based on business needs, you may be assigned to work with one or more Snowflake Priority Support customers. You will develop a strong understanding of the customer’s use case and how they leverage the Snowflake platform. You will deliver exceptional service, enabling them to achieve the highest levels of continuity and performance from their Snowflake implementation.
Ideally, you have worked in a 24x7 environment, handled technical case escalations and incident management, worked in technical support for an RDBMS, been on-call during weekends, and are familiar with database release management.

AS A SENIOR CLOUD SUPPORT ENGINEER AT SNOWFLAKE, YOU WILL:
- Drive technical solutions to complex problems, providing in-depth analysis and guidance to Snowflake customers and partners via email, web, and phone
- Adhere to response and resolution SLAs and escalation processes to ensure fast resolution of customer issues that exceeds expectations
- Demonstrate good problem-solving skills and be process-oriented
- Utilize the Snowflake environment, connectors, 3rd-party partner software, and tools to investigate issues
- Document known solutions to the internal and external knowledge base
- Report well-documented bugs and feature requests arising from customer-submitted requests
- Partner with engineering teams in prioritizing and resolving customer requests
- Participate in a variety of Support initiatives
- Provide support coverage during holidays and weekends based on business needs

OUR IDEAL SENIOR CLOUD SUPPORT ENGINEER WILL HAVE:
- Bachelor’s or Master’s degree in Computer Science or equivalent discipline
- 5+ years of experience in a Technical Support environment or a similar technical function in a customer-facing role
- Solid knowledge of at least one major RDBMS
- In-depth understanding of SQL data types, aggregations, and advanced functions, including analytical/window functions
- A deep understanding of resource locks and experience with managing concurrent transactions
- Proven experience with query lifecycle, profiles, and execution/explain plans
- Demonstrated ability to analyze and tune query performance and provide detailed recommendations for performance improvement
- Advanced skills in interpreting SQL queries and execution workflow logic
- Proven ability to rewrite joins for optimization while maintaining logical consistency
- In-depth knowledge of various caching mechanisms and the ability to take advantage of caching strategies to enhance performance
- Ability to interpret systems performance metrics (CPU, I/O, RAM, network stats)
- Proficiency with JSON, XML, and other semi-structured data formats
- Proficiency in database patch and release management

NICE TO HAVES:
- Knowledge of distributed computing principles and frameworks (e.g., Hadoop, Spark)
- Scripting/coding experience in any programming language
- Database migration and ETL experience
- Ability to monitor and optimize cloud spending using cost management tools and strategies

SPECIAL REQUIREMENTS:
- Participate in pager duty rotations during nights, weekends, and holidays
- Ability to work the 4th/night shift, which typically starts at 10 pm IST
- Applicants should be flexible with schedule changes to meet business needs

Snowflake is growing fast, and we’re scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake. How do you want to make your impact?
For jobs located in the United States, please visit the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com
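The explain-plan and query-tuning skills this role asks for can be previewed on any RDBMS. A hedged sketch using SQLite (not Snowflake; the table and index names are invented) showing how adding an index changes the plan from a full scan to an index search:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)")

# Without an index on customer_id, the planner must scan the whole table.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42").fetchall()

conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")

# With the index, the planner can seek directly to matching rows.
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42").fetchall()

print(plan_before[-1][-1])  # a SCAN step
print(plan_after[-1][-1])   # a SEARCH step using idx_orders_customer
```

Snowflake exposes the equivalent information through its query profile UI and `EXPLAIN` statement; the reasoning (scan vs. pruned/indexed access) carries over.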

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Join us as a Solution Architect at Barclays, where you'll take part in the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. As part of the team, you will deliver the technology stack, using strong analytical and problem-solving skills to understand the business requirements and deliver quality solutions. You'll be working on complex technical problems that require detailed analytical skills and analysis, in conjunction with fellow engineers, business analysts, and business stakeholders.

To be successful as a Solution Architect you should have experience with:
- A broad understanding of a wide variety of technologies pertinent to Barclaycard, including emerging technologies (e.g. AWS/Azure, Java, adaptive and responsive design)
- Awareness of IT security patterns, considerations, and best practice
- Designing secure, scalable, highly available, resilient, performant solutions
- Knowledge of software delivery and deployment patterns (e.g. continuous delivery, continuous integration) with a deep understanding of enterprise container platforms (e.g. Docker)
- Knowledge of different integration mechanisms (e.g. RESTful web services, ETL)
- Awareness of different data solutions and data architecture best practice (e.g. Mongo, data-driven design)
- Awareness of SCM, packaging, and build tools: Git, Jenkins, and Maven/Gradle

Some other highly valued skills include:
- Payments/acquiring domain knowledge and experience
- Good understanding of customer journeys in acquiring (authorisations, scheme clearing, scheme settlement, merchant payments, chargeback processing)
- Familiarity with integration and implementation issues and their architectural implications
- Excellent understanding of best-practice architectural and design methods with proven innovative and leading-edge thinking (e.g. domain-driven architecture, event-based architecture, building for resilience, scalability, performance, microservice design patterns)
- Project delivery: understands different project methodologies, project lifecycles, major phases, dependencies and milestones within a project, and the required documentation needs
- Service delivery: good understanding of concepts of service delivery and support and how these can be affected by technical delivery
- Appreciation of different infrastructure patterns (e.g. internet-facing environments, operational data stores, DMZ)

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based in Pune.

Purpose of the role
To design, develop, and implement solutions to complex business problems, collaborating with stakeholders to understand their needs and requirements, and design and implement solutions that meet those needs while balancing technology risks against business delivery, driving consistency.

Accountabilities
- Design and development of solutions as products that can evolve, meeting business requirements that align with modern software engineering practices and automated delivery tooling. This includes identification and implementation of the technologies and platforms.
- Targeted design activities that apply an appropriate workload placement strategy and maximise the benefit of cloud capabilities such as elasticity, serverless, and containerisation.
- Best-practice designs incorporating security principles (such as defence in depth and reduction of blast radius) that meet the Bank’s resiliency expectations.
- Solutions that appropriately balance risks and controls to deliver the agreed business and technology value.
- Adoption of standardised solutions where they fit; if no standard solutions fit, feed into their ongoing evolution where appropriate.
- Fault-finding and performance-issue support to operational support teams, leveraging available tooling.
- Solution design impact assessment in terms of risk, capacity, and cost impact, including estimation of project change and ongoing run costs.
- Development of the requisite architecture inputs required to comply with the bank’s governance processes, including design artefacts required for architecture, privacy, security, and records management governance processes.

Assistant Vice President Expectations
To advise and influence decision making, contribute to policy development, and take responsibility for operational effectiveness. Collaborate closely with other functions/business divisions. Lead a team performing complex tasks, using well-developed professional knowledge and skills to deliver work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, appraise performance relative to objectives, and determine reward outcomes. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others. OR for an individual contributor, they will lead collaborative assignments and guide team members through structured assignments, identifying the need for the inclusion of other areas of specialisation to complete assignments. They will identify new directions for assignments and/or projects, identifying a combination of cross-functional methodologies or practices to meet required outcomes. Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues. Identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda. Take ownership for managing risk and strengthening controls in relation to the work done. Perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function. Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategy. Engage in complex analysis of data from multiple sources of information, internal and external (in other areas, teams, companies, etc.), to solve problems creatively and effectively. Communicate complex information. 'Complex' information could include sensitive information or information that is difficult to communicate because of its content or its audience. Influence or convince stakeholders to achieve outcomes. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.

Posted 1 week ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Overview
TekWissen is a global workforce management provider with operations throughout India and many other countries. The client below is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place – one that benefits lives, communities, and the planet.

Job Title: Specialty Development Practitioner
Location: Chennai
Work Type: Hybrid

Position Description
At the client's Credit Company, we are modernizing our enterprise data warehouse in Google Cloud to enhance data, analytics, and AI/ML capabilities, improve customer experience, ensure regulatory compliance, and boost operational efficiencies. As a GCP Data Engineer, you will integrate data from various sources into novel data products. You will build upon existing analytical data, including merging historical data from legacy platforms with data ingested from new platforms. You will also analyze and manipulate large datasets, activating data assets to enable enterprise platforms and analytics within GCP. You will design and implement the transformation and modernization on GCP, creating scalable data pipelines that land data from source applications, integrate it into subject areas, and build data marts and products for analytics solutions. You will also conduct deep-dive analysis of current-state Receivables and Originations data in our data warehouse, performing impact analysis related to the client's Credit North America modernization and providing implementation solutions. Moreover, you will partner closely with our AI, data science, and product teams, developing creative solutions that build the future for the client's Credit. Experience with large-scale solutions and operationalizing data warehouses, data lakes, and analytics platforms on Google Cloud Platform or other cloud environments is a must. We are looking for candidates with a broad set of analytical and technology skills across these areas who can demonstrate an ability to design the right solutions with the appropriate combination of GCP and third-party technologies for deployment on Google Cloud Platform.

Skills Required
BigQuery, Dataflow, Dataform, Data Fusion, Dataproc, Cloud Composer/Airflow, Cloud SQL, Compute Engine, Google Cloud Platform

Experience Required
- GCP Data Engineer certification
- 5+ years designing and implementing data warehouses and ETL processes, delivering high-quality data solutions
- 5+ years of complex SQL development experience
- 2+ years of experience with programming languages such as Python, Java, or Apache Beam
- 3+ years of GCP expertise as a cloud engineer, specializing in managing cloud infrastructure and applications through to production-scale solutions
- Hands-on experience with BigQuery, Dataflow, Dataproc, Data Fusion, Terraform, Tekton, Cloud SQL, Airflow, Postgres, PySpark, Python, APIs, Cloud Build, App Engine, Apache Kafka, Pub/Sub, AI/ML, and Kubernetes

Experience Preferred
- In-depth understanding of GCP's underlying architecture and hands-on experience with crucial GCP services, especially those related to batch and real-time data processing: Terraform, BigQuery, Dataflow, Pub/Sub, Dataform, Astronomer, Data Fusion, Dataproc, PySpark, Cloud Composer/Airflow, Cloud SQL, Compute Engine, Cloud Functions, Cloud Run, Cloud Build, and App Engine, alongside storage including Cloud Storage
- DevOps tools such as Tekton, GitHub, Terraform, and Docker
- Expertise in designing, optimizing, and troubleshooting complex data pipelines
- Experience developing with microservice architecture on a container orchestration framework
- Experience in designing pipelines and architectures for data processing
- Passion and self-motivation to develop, experiment with, and implement state-of-the-art data engineering methods and techniques
- Self-directed; works independently with minimal supervision and adapts to ambiguous environments
- Evidence of a proactive problem-solving mindset and willingness to take the initiative
- Strong prioritization, collaboration, and coordination skills, and the ability to simplify and communicate complex ideas with cross-functional teams and all levels of management
- Proven ability to juggle multiple responsibilities and competing demands while maintaining a high level of productivity
- Data engineering or development experience gained in a regulated financial environment
- Experience coaching and mentoring data engineers
- Project management tools such as Atlassian JIRA
- Experience working on an implementation team from concept to operations, providing deep technical subject-matter expertise for successful deployment
- Experience with data security, governance, and compliance best practices in the cloud
- Experience with AI solutions or platforms that support AI solutions
- Experience using data science concepts on production datasets to generate insights

Experience Range: 5+ years
Education Required: Bachelor's Degree

TekWissen® Group is an equal opportunity employer supporting workforce diversity.
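The posting describes merging historical data from legacy platforms with data ingested from new platforms. A hedged, pure-Python sketch of one common rule for that merge (the newest record per business key wins; the account fields and dates are invented for illustration):

```python
# Legacy warehouse records and newly ingested records for the same entities.
legacy = [
    {"account_id": "A1", "balance": 100.0, "as_of": "2023-12-31"},
    {"account_id": "A2", "balance": 250.0, "as_of": "2023-12-31"},
]
incoming = [
    {"account_id": "A1", "balance": 120.0, "as_of": "2024-01-15"},
    {"account_id": "A3", "balance": 75.0,  "as_of": "2024-01-15"},
]

def merge_latest(*sources):
    """Keep the record with the most recent as_of date per account_id."""
    latest = {}
    for row in (r for src in sources for r in src):
        key = row["account_id"]
        if key not in latest or row["as_of"] > latest[key]["as_of"]:
            latest[key] = row
    return sorted(latest.values(), key=lambda r: r["account_id"])

merged = merge_latest(legacy, incoming)
print([(r["account_id"], r["balance"]) for r in merged])
```

In a real GCP pipeline this logic would typically live in a BigQuery `QUALIFY ROW_NUMBER() OVER (...) = 1` query or a Dataflow/Beam transform rather than in-process Python, but the deduplication rule is the same.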

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

The Applications Development Intermediate Programmer Analyst position at our organization involves participating in the establishment and implementation of new or revised application systems and programs in collaboration with the Technology team. Your primary goal in this role will be to contribute to application systems analysis and programming activities.

You will be responsible for hands-on ETL and Big Data testing, delivering high-quality solutions; you should be proficient in database and UI testing using automation tools and knowledgeable in performance, volume, and stress testing. A strong understanding of SDLC/STLC processes, different types of manual testing, and Agile methodology is essential. You will be skilled in designing and executing test cases, authoring user stories, tracking defects, and aligning with business requirements. Openness to learning and implementing new innovations in automation processes according to project needs will be crucial. Your role will also involve managing complex tasks and teams, fostering a collaborative, growth-oriented environment through strong technical and analytical skills. You will utilize your knowledge of applications development procedures, concepts, and other technical areas to identify and define necessary system enhancements, including using script tools and analyzing/interpreting code. Familiarity with the test management tool JIRA and automation tools such as Python, PySpark, Java, Spark, MySQL, Selenium, and Tosca is required, with experience in Hadoop/Ab Initio being a plus. Consulting with users, clients, and other technology groups on issues, recommending programming solutions, and installing and supporting customer exposure systems will also be part of your responsibilities.

Qualifications:
- 4-8 years of relevant experience in the financial services industry
- Intermediate-level experience in an Applications Development role
- Clear and concise written and verbal communication skills
- Demonstrated problem-solving and decision-making abilities
- Ability to work under pressure, manage deadlines, and adapt to unexpected changes in expectations or requirements

Education:
- Bachelor's degree/University degree or equivalent experience

Please note that this job description provides a high-level overview of the work performed, and additional job-related duties may be assigned as required.
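A core ETL-testing task like the one described above is source-to-target reconciliation: confirming the loaded data matches the extract. A minimal, hedged sketch (the `trades` table and its contents are invented) comparing row counts and a content checksum:

```python
import hashlib
import sqlite3

def table_fingerprint(conn, table):
    """Return (row_count, sha256 digest) of a table's deterministically ordered rows."""
    rows = conn.execute(f"SELECT * FROM {table} ORDER BY 1").fetchall()
    digest = hashlib.sha256(repr(rows).encode()).hexdigest()
    return len(rows), digest

# Two databases standing in for the source system and the ETL target.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for db in (src, tgt):
    db.execute("CREATE TABLE trades (id INTEGER, qty INTEGER)")
    db.executemany("INSERT INTO trades VALUES (?, ?)", [(1, 10), (2, 20)])

src_count, src_hash = table_fingerprint(src, "trades")
tgt_count, tgt_hash = table_fingerprint(tgt, "trades")
assert src_count == tgt_count and src_hash == tgt_hash, "ETL output drifted from source"
print("reconciliation passed:", src_count, "rows")
```

On Big Data volumes the same idea is usually applied per partition, with hashes computed inside the database engine rather than in Python.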

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Join us as a Sr Data Tester at Barclays, where you'll take part in the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. As part of the team, you will deliver the technology stack, using strong analytical and problem-solving skills to understand the business requirements and deliver quality solutions. You'll be working on complex technical problems that require detailed analytical skills and analysis, in conjunction with fellow engineers, business analysts, and business stakeholders.

To be successful as a Sr Data Tester you should have experience with:
- Leading end-to-end testing of complex ETL workflows, with a strong focus on Ab Initio
- Validating data transformations, integrations, and migrations across data warehousing environments
- Designing and executing test cases, test plans, and test strategies for Ab Initio and AWS-based data solutions, ensuring compliance with cloud best practices
- Writing and optimizing complex SQL queries for data validation and reconciliation
- Performing root cause analysis and troubleshooting issues across Unix-based systems and cloud platforms
- Collaborating with developers, analysts, and business stakeholders to ensure test coverage and traceability

Some other highly valued skills include:
- Graduate
- Excellent communication and analytical skills
- Skilled communicator at a wide variety of levels and capabilities
- Collaborative and able to share best practice at all levels

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based in Pune.
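A standard shape for the SQL-based data validation this role describes is an EXCEPT query (MINUS in Oracle): rows present in the source but missing or altered in the warehouse surface as mismatches. A small, hedged sketch with invented table names and data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_accounts (id INTEGER, status TEXT);
CREATE TABLE dw_accounts  (id INTEGER, status TEXT);
INSERT INTO src_accounts VALUES (1, 'OPEN'), (2, 'CLOSED'), (3, 'OPEN');
-- Warehouse copy: row 3 is missing, row 2 has the wrong status.
INSERT INTO dw_accounts  VALUES (1, 'OPEN'), (2, 'OPEN');
""")

# Rows in the source that have no exact match in the warehouse.
mismatches = conn.execute("""
SELECT id, status FROM src_accounts
EXCEPT
SELECT id, status FROM dw_accounts
ORDER BY id
""").fetchall()
print(mismatches)
```

Running the same query in the opposite direction catches rows the ETL invented; an empty result in both directions is the pass condition.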
Purpose of the role
To design, develop, and execute testing strategies to validate functionality, performance, and user experience, while collaborating with cross-functional teams to identify and resolve defects, and continuously improving testing processes and methodologies to ensure software quality and reliability.

Accountabilities
- Development and implementation of comprehensive test plans and strategies to validate software functionality and ensure compliance with established quality standards.
- Creation and execution of automated test scripts, leveraging testing frameworks and tools to facilitate early detection of defects and quality issues.
- Collaboration with cross-functional teams to analyse requirements, participate in design discussions, and contribute to the development of acceptance criteria, ensuring a thorough understanding of the software being tested.
- Root cause analysis for identified defects, working closely with developers to provide detailed information and support defect resolution.
- Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing.
- Staying informed of industry technology trends and innovations, and actively contributing to the organization's technology communities to foster a culture of technical excellence and growth.

Analyst Expectations
To perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement. Requires in-depth technical knowledge and experience in the assigned area of expertise and a thorough understanding of the underlying principles and concepts within that area. They lead and supervise a team, guiding and supporting professional development, allocating work requirements, and coordinating team resources. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others. OR for an individual contributor, they develop technical expertise in their work area, acting as an advisor where appropriate. Will have an impact on the work of related teams within the area. Partner with other functions and business areas. Take responsibility for the end results of a team’s operational processing and activities. Escalate breaches of policies/procedures appropriately. Take responsibility for embedding new policies/procedures adopted due to risk mitigation. Advise and influence decision making within own area of expertise. Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulations, and codes of conduct. Maintain and continually build an understanding of how your sub-function integrates with the function, alongside knowledge of the organisation's products, services, and processes within the function. Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function. Make evaluative judgements based on the analysis of factual information, paying attention to detail. Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents. Guide and persuade team members and communicate complex/sensitive information. Act as a contact point for stakeholders outside the immediate function, while building a network of contacts outside the team and external to the organisation.
All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Join us as a Test Automation Engineer at Barclays, where you will spearhead the evolution of our infrastructure and deployment pipelines, driving innovation and operational excellence. You will harness cutting-edge technology to build and manage robust, scalable, and secure infrastructure, ensuring seamless delivery of our digital solutions.

To be successful as a Test Automation Engineer you should have hands-on experience in one or more technical skills under any of the technology platforms below:
- Mainframe – COBOL, IMS, CICS, DB2, VSAM, JCL, TWS, File-AID, REXX
- Open systems and tools – Selenium, Java, Jenkins, J2EE, web services, APIs, XML, JSON, Parasoft SOAtest service virtualization
- API testing tools – SoapUI, Postman, Insomnia
- Mid-tier technology – MQ, WebSphere, UNIX, API third-party hosting platforms
- Data warehouse – ETL, Informatica, Ab Initio, Oracle, Hadoop

Good knowledge of API architecture and API concepts. Preferably, domain knowledge in retail banking and testing experience in one or more core banking product platforms/systems such as accounting and clearing/general ledger, savings and insurance products, online/mobile payments, customer and risk systems, mortgages, and payments. Experience in JIRA and similar test management tools.

Test automation skills:
- Hands-on experience of test automation using Java or any other object-oriented programming language
- Hands-on experience of automation framework creation and optimization
- Good understanding of Selenium, Appium, SeeTest, jQuery, JavaScript, and Cucumber
- Working experience with build tools such as Apache Ant, Maven, and Gradle
- Knowledge/previous experience of DevOps and continuous integration using Jenkins, Git, and Docker
- Experience in API automation frameworks such as REST Assured and Karate
- Experience in GitLab and/or GitLab Duo will be an added advantage

Some other highly valued skills may include:
- E2E integration testing and team-leading experience
- Previous Barclays experience
- Understanding of mainframes and Barclays systems
- Hands-on experience in Agile methodology
- Domain/testing/technical certification will be an advantage

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based out of Pune.

Purpose of the role
To design, develop, and execute testing strategies to validate functionality, performance, and user experience, while collaborating with cross-functional teams to identify and resolve defects, and continuously improving testing processes and methodologies to ensure software quality and reliability.

Accountabilities
- Development and implementation of comprehensive test plans and strategies to validate software functionality and ensure compliance with established quality standards.
- Creation and execution of automated test scripts, leveraging testing frameworks and tools to facilitate early detection of defects and quality issues.
- Collaboration with cross-functional teams to analyse requirements, participate in design discussions, and contribute to the development of acceptance criteria, ensuring a thorough understanding of the software being tested.
- Root cause analysis for identified defects, working closely with developers to provide detailed information and support defect resolution.
- Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing.
- Staying informed of industry technology trends and innovations, and actively contributing to the organization's technology communities to foster a culture of technical excellence and growth.

Assistant Vice President Expectations
To advise and influence decision making, contribute to policy development, and take responsibility for operational effectiveness. Collaborate closely with other functions/business divisions.
Lead a team performing complex tasks, using well-developed professional knowledge and skills to deliver work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, appraise performance relative to objectives and determine reward outcomes. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others. OR, for an individual contributor, they will lead collaborative assignments and guide team members through structured assignments, identifying the need for the inclusion of other areas of specialisation to complete assignments. They will identify new directions for assignments and/or projects, identifying a combination of cross-functional methodologies or practices to meet required outcomes. Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues. Identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda. Take ownership for managing risk and strengthening controls in relation to the work done. Perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function. Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategy. Engage in complex analysis of data from multiple sources of information, internal and external, such as procedures and practices (in other areas, teams, companies, etc.) to solve problems creatively and effectively. Communicate complex information.
'Complex' information could include sensitive information or information that is difficult to communicate because of its content or its audience. Influence or convince stakeholders to achieve outcomes. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
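The API testing this role calls for comes down to validating response payloads against an expected contract. A minimal sketch of that idea in Python (rather than the Java/RestAssured stack named above); the field names and currency rules are invented for illustration, not taken from any real banking API:

```python
import json

def validate_account_response(raw_body: str) -> list[str]:
    """Validate a hypothetical core-banking API response payload.

    Returns a list of human-readable failures (empty list = pass).
    """
    failures = []
    try:
        body = json.loads(raw_body)
    except json.JSONDecodeError as exc:
        return [f"response is not valid JSON: {exc}"]

    # Contract: required fields and their expected JSON types.
    for field, expected_type in [("accountId", str),
                                 ("balance", (int, float)),
                                 ("currency", str)]:
        if field not in body:
            failures.append(f"missing field: {field}")
        elif not isinstance(body[field], expected_type):
            failures.append(f"wrong type for {field}: {type(body[field]).__name__}")

    # A domain rule, checked only once the structure is sound.
    if not failures and body["currency"] not in {"GBP", "USD", "EUR"}:
        failures.append(f"unexpected currency: {body['currency']}")
    return failures

if __name__ == "__main__":
    good = '{"accountId": "AC-1", "balance": 120.5, "currency": "GBP"}'
    bad = '{"accountId": "AC-2", "balance": "oops"}'
    print(validate_account_response(good))  # []
    print(validate_account_response(bad))   # balance type + missing currency
```

In a real framework the raw body would come from an HTTP client and the checks would live in RestAssured/Karate matchers, but the assertion logic is the same.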

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

pune, maharashtra

On-site

Fulcrum Digital is an agile and next-generation digital accelerating company providing digital transformation and technology services across various industries such as banking & financial services, insurance, retail, higher education, food, healthcare, and manufacturing. As a Data Quality Assurance Engineer, your main objectives will include hands-on experience in EDW source to target testing, data transformation/manipulation testing, data quality/completeness validation, ETL processes, and running processes through schedulers. You will be responsible for developing and executing comprehensive test plans to validate data within on-prem and cloud data warehouses, conducting thorough testing of ETL processes, dimensional data models, and reporting outputs, identifying and tracking data quality issues, and ensuring data consistency and integrity across different data sources and systems. Collaboration with the Data Quality and Data Engineering team is essential to define quality benchmarks and metrics, improve QA testing strategies, and implement best practices for data validation and error handling. You will work closely with various stakeholders to understand data requirements and deliverables, design and support testing infrastructure, provide detailed reports on data quality findings, and contribute insights to enhance data quality and processing efficiency. To be successful in this role, you should have a Bachelor's or Master's degree in computer science or equivalent, 2 to 3 years of experience in data warehouse development/testing, strong understanding of Data Warehouse & Data Quality fundamentals, and experience in SQL Server, SSIS, SSAS, and SSRS testing. Additionally, you should possess a great attention to detail, a result-driven test approach, excellent written and verbal communication skills, and willingness to take on challenges and provide off-hour support as needed. 
If you have a minimum of 2 to 3 years of Quality Assurance experience with a proven track record of improving Data Quality; experience with SSIS, MSSQL, Snowflake, and DBT; knowledge of QA automation tools and ETL processes; and familiarity with cloud computing and the data ecosystem on Snowflake, you would be a great fit for this role. Desirable qualifications include knowledge of Insurance Data & its processes, data validation experience between on-prem & cloud architecture, and familiarity with hybrid data ecosystems.
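The source-to-target testing described above usually reduces to a few reconciliation queries: row counts, key coverage, and summed measures. A minimal sketch using Python's built-in sqlite3 as a stand-in for the actual SQL Server/Snowflake warehouses (table and column names are illustrative):

```python
import sqlite3

def reconcile(conn, source_table, target_table, key, amount_col):
    """Compare row counts, key coverage and a summed measure between tables.

    Returns a list of discrepancies (empty list = source and target agree).
    """
    cur = conn.cursor()
    issues = []
    src_count, = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()
    tgt_count, = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()
    if src_count != tgt_count:
        issues.append(f"row count mismatch: {src_count} vs {tgt_count}")
    # Keys present in source but dropped by the load.
    missing = cur.execute(
        f"SELECT {key} FROM {source_table} EXCEPT SELECT {key} FROM {target_table}"
    ).fetchall()
    if missing:
        issues.append(f"keys missing in target: {[k for (k,) in missing]}")
    src_sum, = cur.execute(f"SELECT COALESCE(SUM({amount_col}), 0) FROM {source_table}").fetchone()
    tgt_sum, = cur.execute(f"SELECT COALESCE(SUM({amount_col}), 0) FROM {target_table}").fetchone()
    if src_sum != tgt_sum:
        issues.append(f"sum({amount_col}) mismatch: {src_sum} vs {tgt_sum}")
    return issues

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE src(id INTEGER, amount REAL);
        CREATE TABLE tgt(id INTEGER, amount REAL);
        INSERT INTO src VALUES (1, 10.0), (2, 20.0), (3, 30.0);
        INSERT INTO tgt VALUES (1, 10.0), (2, 20.0);  -- row 3 lost in load
    """)
    for issue in reconcile(conn, "src", "tgt", "id", "amount"):
        print(issue)
```

Real EDW tests would add column-level transformation checks, but this count/key/sum pattern is the usual first pass.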

Posted 1 week ago

Apply

4.0 - 10.0 years

0 Lacs

thiruvananthapuram, kerala

On-site

At EY, you will have the opportunity to shape a career as unique as you are, supported by a global network, inclusive culture, and cutting-edge technology to help you reach your full potential. Your individual perspective and voice are valued to contribute to the continuous improvement of EY. By joining us, you can create an outstanding experience for yourself while contributing to a more efficient and inclusive working world for all. As a Data Engineering Lead, you will work closely with the Data Architect to design and implement scalable data lake architecture and data pipelines. Your responsibilities will include designing and implementing scalable data lake architectures using Azure Data Lake services, developing and maintaining data pipelines for data ingestion from various sources, optimizing data storage and retrieval processes for efficiency and performance, ensuring data security and compliance with industry standards, collaborating with data scientists and analysts to enhance data accessibility, monitoring and troubleshooting data pipeline issues to ensure reliability, and documenting data lake designs, processes, and best practices. You should have experience with SQL and NoSQL databases, as well as familiarity with big data file formats such as Parquet and Avro. **Roles and Responsibilities:** **Must Have Skills:** - Azure Data Lake - Azure Synapse Analytics - Azure Data Factory - Azure DataBricks - Python (PySpark, Numpy, etc.) 
- SQL - ETL - Data warehousing - Azure DevOps - Experience in developing streaming pipelines using Azure Event Hub, Azure Stream Analytics, Spark streaming - Experience in integrating with business intelligence tools such as Power BI **Good To Have Skills:** - Big Data technologies (e.g., Hadoop, Spark) - Data security **General Skills:** - Experience with Agile and DevOps methodologies and the software development lifecycle - Proactive and accountable for deliverables - Ability to identify and escalate dependencies and risks - Proficient in working with DevOps tools with limited supervision - Timely completion of assigned tasks and regular status reporting - Capability to train new team members - Desired knowledge of cloud solutions like Azure or AWS with DevOps/Cloud certifications - Ability to work effectively with multicultural global teams and virtually - Strong relationship-building skills with project stakeholders Join EY in its mission to build a better working world by creating long-term value for clients, people, and society, and fostering trust in the capital markets. Leveraging data and technology, diverse EY teams across 150+ countries provide assurance and support clients in growth, transformation, and operations across various sectors. Through its services in assurance, consulting, law, strategy, tax, and transactions, EY teams strive to address complex global challenges by asking insightful questions to discover innovative solutions.
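The streaming-pipeline experience listed above (Azure Event Hub, Stream Analytics, Spark streaming) largely comes down to patterns like windowed aggregation over an event stream. A dependency-free Python sketch of a tumbling window, with made-up event fields:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Group (timestamp, key) events into fixed, non-overlapping windows.

    This mimics what a Stream Analytics TumblingWindow or a Spark Structured
    Streaming groupBy(window(...)) produces, in plain Python: each event
    lands in exactly one window, identified by the window's start time.
    """
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_seconds) * window_seconds
        counts[(window_start, key)] += 1
    return dict(counts)

if __name__ == "__main__":
    # Timestamps in seconds; "click"/"view" are illustrative event types.
    events = [(5, "click"), (42, "click"), (61, "view"),
              (119, "click"), (120, "click")]
    for (start, key), n in sorted(tumbling_window_counts(events).items()):
        print(f"[{start}s-{start + window_end if False else start + 60}s) {key}: {n}")
```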

Posted 1 week ago

Apply

3.0 - 10.0 years

0 Lacs

hyderabad, telangana

On-site

You should have hands-on experience with Celonis EMS (Execution Management System) and possess strong SQL skills for data extraction, transformation, and modeling. Proficiency in PQL (Process Query Language) for custom process analytics is essential, along with experience in integrating Celonis with SAP, Oracle, Salesforce, or other ERP/CRM systems. Having knowledge of ETL, data pipelines, and APIs (REST/SOAP) is crucial for this role. You should also demonstrate expertise in Process Mining & Analytical Skills, including understanding of business process modeling, process optimization techniques, and at least one OCPM project experience. Your responsibilities will include analyzing event logs to identify bottlenecks, inefficiencies, and automation opportunities. With 6-10 years of experience in the IT industry, focusing on Data Architecture/Business Process, and specifically 3-4 years of experience in process mining, data analytics, or business intelligence, you should be well-equipped for this position. A Celonis certification (e.g., Celonis Data Engineer, Business Analyst, or Solution Consultant) would be a plus. Any additional OCPM experience is also welcomed. Candidates who can join within 30-45 days will be given priority consideration for this role.
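The event-log analysis this role centres on — finding bottlenecks between process steps — can be sketched without Celonis or PQL. A toy version on an invented order-to-cash log (case IDs, activities and timestamps are illustrative):

```python
from collections import defaultdict
from statistics import mean

def transition_durations(event_log):
    """Compute the average wait between consecutive activities across cases.

    event_log: iterable of (case_id, activity, timestamp) rows — a simplified
    stand-in for the event tables a process-mining tool extracts from
    SAP/Oracle/Salesforce. Returns {(activity_a, activity_b): avg_wait}.
    """
    by_case = defaultdict(list)
    for case_id, activity, ts in event_log:
        by_case[case_id].append((ts, activity))
    waits = defaultdict(list)
    for steps in by_case.values():
        steps.sort()  # order each case's events by timestamp
        for (t0, a0), (t1, a1) in zip(steps, steps[1:]):
            waits[(a0, a1)].append(t1 - t0)
    return {pair: mean(ds) for pair, ds in waits.items()}

if __name__ == "__main__":
    log = [
        ("c1", "Order Created", 0), ("c1", "Approved", 2), ("c1", "Shipped", 50),
        ("c2", "Order Created", 0), ("c2", "Approved", 4), ("c2", "Shipped", 70),
    ]
    # The hand-off with the longest average wait is the bottleneck candidate.
    bottleneck = max(transition_durations(log).items(), key=lambda kv: kv[1])
    print(bottleneck)
```

A PQL query in Celonis would express the same idea declaratively over the process graph; this just shows the underlying computation.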

Posted 1 week ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

We are Allvue Systems, the leading provider of software solutions for the Private Capital and Credit markets. Whether a client wants an end-to-end technology suite, or independently focused modules, Allvue helps eliminate the boundaries between systems, information, and people. We’re looking for ambitious, smart, and creative individuals to join our team and help our clients achieve their goals. Working at Allvue Systems means working with pioneers in the fintech industry. Our efforts are powered by innovative thinking and a desire to build adaptable financial software solutions that help our clients achieve even more. With our common goals of growth and innovation, whether you’re collaborating on a cutting-edge project or connecting over shared interests at an office happy hour, the passion is contagious. We want all of our team members to be open, accessible, curious and always learning. As a team, we take initiative, own outcomes, and have passion for what we do. With these pillars at the center of what we do, we strive for continuous improvement, excellent partnership and exceptional results. Come be a part of the team that’s revolutionizing the alternative investment industry. Define your own future with Allvue Systems! Design, implement, and maintain data pipelines that handle both batch and real-time data ingestion. Integrate various data sources (databases, APIs, third-party data) into Snowflake and other data systems. Work closely with data scientists and analysts to ensure data availability, quality, and performance. Troubleshoot and resolve issues related to data pipeline performance, scalability, and integrity. Optimize data processes for speed, scalability, and cost efficiency. Ensure data governance and security best practices are implemented. You should have 5 to 8 years of total experience, including 4+ years in data engineering or related roles. Strong experience with Snowflake, Kafka, and Debezium.
Proficiency in SQL, Python, and ETL frameworks. Experience with data warehousing, data modeling, and pipeline optimization. Strong problem-solving skills and attention to detail. Experience in the financial services or fintech industry is highly desirable.
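Since the stack above includes Kafka and Debezium, the heart of the real-time ingestion work is applying change-data-capture events to a target table. A toy in-memory version, with the event shape loosely modelled on Debezium's op/before/after envelope (the row fields are invented):

```python
def apply_cdc(table: dict, event: dict) -> None:
    """Apply one change event to an in-memory 'table' keyed by primary key.

    The event loosely follows Debezium's envelope: op is "c"(reate),
    "u"(pdate) or "d"(elete); "after"/"before" carry the row image.
    """
    op = event["op"]
    if op in ("c", "u"):
        row = event["after"]
        table[row["id"]] = row          # upsert the new row image
    elif op == "d":
        table.pop(event["before"]["id"], None)  # idempotent delete
    else:
        raise ValueError(f"unknown op: {op}")

if __name__ == "__main__":
    table = {}
    events = [
        {"op": "c", "after": {"id": 1, "name": "alice"}},
        {"op": "c", "after": {"id": 2, "name": "bob"}},
        {"op": "u", "after": {"id": 1, "name": "alicia"}},
        {"op": "d", "before": {"id": 2}},
    ]
    for ev in events:
        apply_cdc(table, ev)
    print(table)  # {1: {'id': 1, 'name': 'alicia'}}
```

In production the events would arrive via a Kafka consumer and land in Snowflake (e.g. via Snowpipe or a MERGE), but the apply logic is this same upsert/delete pattern.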

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

hyderabad, telangana

On-site

The role of Sr Specialist Visualization & Automation in Hyderabad, India involves defining and driving the platform engineering of Business Intelligence solutions with a focus on Power BI technology. As part of your responsibilities, you will use your strong Power BI skills to oversee the creation and management of BI and analytics solutions. You will be instrumental in driving the success of technology usage for solution delivery, best practices, standards definition, compliance, smooth transition to operations, improvements, and enablement of the business. Collaboration with the solution delivery lead and visualization lead on existing, new, and upcoming features, technology decisioning, and roadmap will be crucial. You will work closely with the solution architect and platform architect to define the visualization architecture pattern based on functional and non-functional requirements, considering available technical patterns. Additionally, you will define and drive the DevOps roadmap to enable Agile ways of working, the CI/CD pipeline, and automation for self-serve governance of the Power BI platform in collaboration with the platform lead. You will be accountable for ensuring adherence to security and compliance policies and procedures, including Information Security & Compliance (ISC), Legal, ethics, and other compliance policies and procedures, in defining architecture standards, patterns, and platform solutions. The role requires 8-10 years of IT experience in Data and Analytics and Visualization, with strong exposure to Power BI solution delivery and platform automation in a global matrix organization. An in-depth understanding of database management systems, ETL, OLAP, data lake technologies, and experience in Power BI is essential. Knowledge of other visualization technologies is a plus. A specialization in the Pharma domain and understanding of data usage across the end-to-end enterprise value chain are advantageous.
Good interpersonal, written and verbal communication skills, time management, and technical expertise aligned with Novartis Values & Behaviors are necessary. Join Novartis, a company committed to building an outstanding, inclusive work environment with diverse teams representative of the patients and communities served. Be a part of a mission to reimagine medicine and improve lives. If this role does not align with your career goals but you wish to stay connected with Novartis for future opportunities, join the Novartis Network. Explore the benefits, rewards, and the opportunity to create a brighter future together with Novartis.

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

pune, maharashtra

On-site

As a member of the infrastructure team at FIS, you will play a crucial role in troubleshooting and resolving technical issues related to Azure and SQL Server. Your responsibilities will include developing data solutions, understanding business requirements, and transforming data from different sources. You will design and implement ETL processes and collaborate with cross-functional teams to ensure that solutions meet business needs. To excel in this role, you should have a degree in Computer Science, a minimum of 4 years of experience, and proficient working knowledge of Azure, SQL, and ETL. Any programming language skills will be an asset, along with working knowledge of Data Warehousing, experience with JSON and XML data structures, and familiarity with working with APIs. At FIS, we offer a flexible and creative work environment where you can learn, grow, and make a real impact on your career. You will be part of a diverse and collaborative atmosphere, with access to professional and personal development resources. Additionally, you will have opportunities to volunteer and support charities, along with competitive salary and benefits. Please note that current and future sponsorship is not available for this position. FIS is committed to protecting the privacy and security of all personal information processed to provide services to our clients. For specific information on how FIS safeguards personal information online, please refer to the Online Privacy Notice. Recruitment at FIS primarily operates on a direct sourcing model, with a small portion of hires through recruitment agencies. FIS does not accept resumes from agencies not on the preferred supplier list and is not responsible for any related fees for resumes submitted to job postings.

Posted 1 week ago

Apply

7.0 - 11.0 years

0 Lacs

pune, maharashtra

On-site

Join us as a Senior Technical Lead at Barclays, where you'll have the opportunity to contribute to the evolution of our digital landscape, driving innovation and excellence. In this role, you will leverage cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. Your primary responsibility will be to deliver technology stack solutions, utilizing your strong analytical and problem-solving skills to understand business requirements and deliver high-quality solutions. Working collaboratively with fellow engineers, business analysts, and stakeholders, you will tackle complex technical issues that require detailed analytical skills and analysis. To excel as a Senior Technical Lead, you should possess experience in leading a team to perform complex tasks, using your professional knowledge and skills to deliver impactful work that influences the entire business function. You will be responsible for setting objectives, coaching employees to achieve those objectives, conducting performance appraisals, and determining reward outcomes. Whether you have leadership responsibilities or work as an individual contributor, you will lead collaborative assignments, guide team members, and identify new directions for projects to meet desired outcomes. Your role may involve consulting on complex issues, providing advice to leaders, identifying ways to mitigate risks, and developing new policies and procedures to enhance control and governance. You will take ownership of managing risks, strengthening controls, advising on decision-making, contributing to policy development, and ensuring operational effectiveness. Collaboration with other functions and business divisions will be essential to stay aligned with business strategies and activities. 
In addition to the above, some highly valued technical skills for this role include proficiency in Ab Initio and AWS, a strong ETL and Data Integration background, experience in building complex ETL data pipelines, knowledge of data warehousing principles, Unix, SQL, basic AWS/Cloud architecture, data modeling, ETL scheduling, and strong data analysis skills. As a Senior Technical Lead, you will be assessed on key critical skills such as risk management, change and transformation, business acumen, strategic thinking, digital and technology expertise, along with job-specific technical skills. This role is based in Pune. **Purpose of the Role:** The purpose of this role is to design, develop, and enhance software using various engineering methodologies to provide business, platform, and technology capabilities for our customers and colleagues. **Key Accountabilities:** - Develop and deliver high-quality software solutions using industry-aligned programming languages, frameworks, and tools, ensuring scalability, maintainability, and performance optimization of the code. - Collaborate with product managers, designers, and engineers to define software requirements, devise solution strategies, and integrate software solutions with business objectives. - Participate in code reviews, promote a culture of code quality and knowledge sharing, and stay informed about industry technology trends to contribute to technical excellence. - Adhere to secure coding practices, implement effective unit testing practices, and ensure proper code design, readability, and reliability. **Assistant Vice President Expectations:** As an Assistant Vice President, you are expected to advise on decision-making, contribute to policy development, and ensure operational effectiveness. Collaboration with other functions and business divisions is crucial.
Whether leading a team or working as an individual contributor, you will be accountable for delivering impactful work, coaching employees, and promoting a culture of excellence. **Barclays Values and Mindset:** All colleagues at Barclays are expected to embody the values of Respect, Integrity, Service, Excellence, and Stewardship, as well as demonstrate the Barclays Mindset of Empower, Challenge, and Drive in their behavior.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

eClinical Solutions helps life sciences organizations around the world accelerate clinical development initiatives with expert data services and the elluminate Clinical Data Cloud, the foundation of digital trials. Together, the elluminate platform and digital data services give clients self-service access to all their data from one centralized location, plus advanced analytics that help them make smarter, faster business decisions. The Senior Software Developer plays a crucial role in collaborating with the Product Manager, Implementation Consultants (ICs), and clients to understand requirements for meeting data analysis needs. This position requires good collaboration skills to provide guidance on analytics aspects to the team in various analytics-related activities. You should be experienced in Qlik Sense architecture design and proficient in load script implementation and best practices; have hands-on experience in Qlik Sense development, dashboarding, data modelling, and reporting techniques; be skilled in data integration through ETL processes from various sources and adept at data transformation, including the creation of QVD files and set analysis; and be capable of data modeling using Dimensional Modelling, Star schema, and Snowflake schema. The Senior Software Developer should possess strong SQL skills, particularly in SQL Server, to validate Qlik Sense dashboards and work on internal applications. Knowledge of deploying Qlik Sense applications using Qlik Management Console (QMC) is advantageous. Responsibilities include working with ICs, product managers, and clients to gather requirements, configuration, migration, and support of Qlik Sense applications, implementation of best practices, and staying updated on new technologies. Candidates for this role should hold a Bachelor of Science / BTech / MTech / Master of Science degree in Computer Science or equivalent work experience. Effective verbal and written communication skills are essential.
Additionally, candidates are required to have a minimum of 3 - 5 years of experience in implementing end-to-end business intelligence using Qlik Sense, with thorough knowledge of Qlik Sense architecture, design, development, testing, and deployment processes. Understanding of Qlik Sense best practices, relational database concepts, data modeling, SQL code writing, and ETL procedures is crucial. Technical expertise in Qlik Sense, SQL Server, data modeling, and experience with clinical trial data and SDTM standards is beneficial. This position offers the opportunity to accelerate skills and career growth within a fast-growing company while contributing to the future of healthcare. eClinical Solutions fosters an inclusive culture that values diversity and encourages continuous learning and improvement. The company is an equal opportunity employer committed to making employment decisions based on qualifications, merit, culture fit, and business needs.
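Set analysis — aggregating over a record set defined independently of the user's current selections — is central to the Qlik work described above. Its logic can be expressed in plain Python; the field names and data below are illustrative, not from any real model:

```python
def set_sum(rows, measure, current_selection, overrides):
    """Sum `measure` over rows matching the selection after applying overrides.

    Mirrors a Qlik expression like Sum({$<Year={2024}>} Sales): start from
    the current selection ($), then force Year to 2024 regardless of what
    the user picked. Selections map field name -> set of allowed values.
    """
    effective = {**current_selection, **overrides}  # overrides win, as in Qlik
    return sum(
        r[measure] for r in rows
        if all(r[field] in allowed for field, allowed in effective.items())
    )

if __name__ == "__main__":
    rows = [
        {"Year": 2023, "Region": "EU", "Sales": 100},
        {"Year": 2024, "Region": "EU", "Sales": 150},
        {"Year": 2024, "Region": "US", "Sales": 200},
    ]
    selection = {"Region": {"EU"}}  # the user has filtered to EU
    # Equivalent of Sum({$<Year={2024}>} Sales) under that selection:
    print(set_sum(rows, "Sales", selection, {"Year": {2024}}))  # 150
```

The same function with empty overrides reproduces the plain, selection-respecting Sum(Sales).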

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

karnataka

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Our Technology team builds innovative digital solutions rapidly and at scale to deliver the next generation of Financial and Non-Financial services across the globe. The Position is a senior technical, hands-on delivery role, requiring knowledge of data engineering, cloud infrastructure, platform engineering, platform operations, and production support using ground-breaking cloud and big data technologies. The ideal candidate with 6-8 years of experience will possess strong technical skills, an eagerness to learn, a keen interest in Financial Crime, Financial Risk, and Compliance technology transformation, the ability to work collaboratively in a fast-paced environment, and an aptitude for picking up new tools and techniques on the job, building on existing skillsets as a foundation. In this role, you will: - Ingest and provision raw datasets, enriched tables, and curated, re-usable data assets to enable a variety of use cases. - Drive improvements in the reliability and frequency of data ingestion, including increasing real-time coverage. - Support and enhance data ingestion infrastructure and pipelines. - Design and implement data pipelines to collect data from disparate sources across the enterprise and external sources and deliver it to the data platform. - Implement Extract Transform and Load (ETL) workflows, ensuring data availability at each stage in the data flow. - Identify and onboard data sources, conduct exploratory data analysis, and evaluate modern technologies, frameworks, and tools in the data engineering space. 
Core/Must-Have skills: - 3-8 years of expertise in designing and implementing data warehouses and data lakes using the Oracle tech stack (ETL: ODI, SSIS; DB: PL/SQL and AWS Redshift). - Experience in managing data extraction, transformation, and loading from various sources using Oracle Data Integrator and other tools like SSIS. - Database design and dimension modeling using Oracle PL/SQL and Microsoft SQL Server. - Advanced working SQL knowledge and experience with relational and NoSQL databases. - Strong analytical and critical thinking skills, expertise in data modeling and DB design, and experience building and optimizing data pipelines. Good to have: - Experience in Financial Crime, Financial Risk, and Compliance technology transformation domains. - Certification on any cloud tech stack, preferably Microsoft Azure. - In-depth knowledge and hands-on experience with data engineering, Data Warehousing, and Delta Lake on-prem and cloud platforms. - Ability to script, code, query, and design systems for maintaining Azure/AWS Lakehouse, ETL processes, business intelligence, and data ingestion pipelines. EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply

0.0 - 3.0 years

0 Lacs

pune, maharashtra

On-site

You are a highly skilled Technical Data Analyst with expertise in Oracle PL/SQL and Python, well-versed in data analysis tools and techniques. Your role involves leading and mentoring a team of data analysts to derive data-driven insights and contribute to key business decisions. Additionally, you will research and evaluate emerging AI tools for potential application in data analysis projects. Your responsibilities will include designing, developing, and maintaining complex Oracle PL/SQL queries and procedures for ETL processes. You will utilize Python scripting for data analysis, automation, and reporting, as well as perform in-depth data analysis to provide actionable insights for improving business performance. Collaborating with cross-functional teams, you will translate business requirements into technical specifications and maintain data quality standards across systems. Moreover, you will leverage data analysis and visualization tools like Tableau, Power BI, and Qlik Sense to create interactive dashboards and reports for stakeholders. It is essential to stay updated with the latest data analysis tools, techniques, and industry best practices, including advancements in AI/ML. You will also research and evaluate emerging AI/ML tools for potential application in data analysis projects. Preferred qualifications for this role include hands-on experience as a Technical Data Analyst with expertise in Oracle PL/SQL and Python, proficiency in Python scripting, and familiarity with data visualization tools like Tableau, Power BI, or Qlik Sense. Additionally, awareness of AI/ML tools and techniques in data analytics and practical experience applying these techniques in projects will be advantageous. Strong analytical, problem-solving, communication, and interpersonal skills are essential, along with experience in the financial services industry. 
Qualifications for this position include 0-2 years of relevant experience, programming/debugging skills used in business applications, knowledge of industry standards, specific business areas for application development, and program languages. Clear and concise written and verbal communication is consistently demonstrated. Education requirements include a Bachelor's degree or equivalent experience. This job description offers a high-level overview of the work involved, and additional job-related duties may be assigned as needed. Mandatory skills required for this role are Ab Initio, Oracle PL/SQL, and Unix/Linux, with a minimum of 2 years of hands-on Development experience.

Posted 1 week ago

Apply

3.0 - 10.0 years

0 Lacs

noida, uttar pradesh

On-site

As a Specialist, Technical Professional Services (ETL Programming) at Fiserv, you will be responsible for designing large and complex Data Migration, Data Warehousing, and Business Intelligence Solutions under general supervision. Your role will involve analyzing conversion requirements, interpreting clients' existing systems, and taking complete ownership of the technical delivery of assigned conversion/implementation projects. Additionally, you will manage multiple clients, adhere to project timelines, and monitor project progress by tracking activities and resolving issues. You will be expected to assist management in planning and designing improvements to business processes, utilize your problem-solving skills to resolve moderately complex issues, and communicate progress and potential problems to the Project Manager. Your responsibilities will also include maintaining tools for ensuring the efficiency and effectiveness of the conversion process, providing post-implementation support for 2 weeks, and working in US shift timings based on client requirements. To qualify for this role, you should hold a B. Tech/MCA/MSc (CS/IT) degree and have 3 to 10 years of experience in the IT industry. You must possess excellent programming skills in SQL/SSIS, a good understanding of ETL, and knowledge of activities performed in the conversion/implementation of core Banking applications. Additionally, experience supporting Banking Core Conversions and familiarity with Account Processing core systems would be advantageous. Ideal candidates will have exposure to the Banking and Financial Services industry, including a good understanding of Banking Products, Services & Procedures. Proficiency in Mainframe/COBOL/JCL, strong analytical skills, leadership abilities, and proficiency with Excel are also desirable skills for this role.
At Fiserv, we are committed to diversity and inclusion and provide reasonable accommodations for individuals with disabilities during the job application and interview process. We caution applicants against fraudulent job postings not affiliated with Fiserv and recommend reporting any suspicious activity to local law enforcement authorities. If you are interested in joining Fiserv, please apply using your legal name, complete the step-by-step profile, and attach your resume to be considered for this Specialist position in Technical Professional Services (ETL Programming).
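Core-conversion work of the kind described above is largely mapping legacy source fields onto the target core's layout, with per-field transformations. A toy mapping step in Python — every field name, code and transform here is invented for illustration, not any real core system's schema:

```python
def convert_account(legacy: dict, mapping: dict, transforms: dict) -> dict:
    """Map one legacy account record onto a target-core layout.

    mapping:    target_field -> legacy_field name
    transforms: target_field -> function applied to the mapped value
    """
    out = {}
    for target_field, legacy_field in mapping.items():
        value = legacy.get(legacy_field)
        fn = transforms.get(target_field)
        out[target_field] = fn(value) if fn else value
    return out

if __name__ == "__main__":
    legacy = {"ACCT_NO": "0012345", "BAL_CENTS": "150000", "STAT": "A"}
    mapping = {"account_id": "ACCT_NO", "balance": "BAL_CENTS", "status": "STAT"}
    transforms = {
        "balance": lambda v: int(v) / 100,            # cents -> currency units
        "status": {"A": "ACTIVE", "C": "CLOSED"}.get,  # code -> label
    }
    print(convert_account(legacy, mapping, transforms))
```

In practice this step would be an SSIS data flow or SQL over staged extract files, with validation and exception reporting around it, but the field-by-field mapping is the core of the conversion.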

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

hyderabad, telangana

On-site

A career within Functional and Industry Technologies services will provide you with the opportunity to build secure and new digital experiences for customers, employees, and suppliers. We focus on improving apps or developing new apps for traditional and mobile devices as well as conducting usability testing to find ways to improve our clients' user experience. As part of our team, you'll help clients harness technology systems in financial services focusing on areas such as insurance, sales performance management, retirement and pension, asset management, and banking & capital markets. To really stand out and make us fit for the future in a constantly changing world, each and every one of us at PwC needs to be a purpose-led and values-driven leader at every level. To help us achieve this we have the PwC Professional, our global leadership development framework. It gives us a single set of expectations across our lines, geographies, and career paths, and provides transparency on the skills we need as individuals to be successful and progress in our careers, now and in the future. Responsibilities As a Senior Associate, you'll work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. PwC Professional skills and responsibilities for this management level include but are not limited to: - Use feedback and reflection to develop self-awareness, personal strengths and address development areas. - Delegate to others to provide stretch opportunities, coaching them to deliver results. - Demonstrate critical thinking and the ability to bring order to unstructured problems. - Use a broad range of tools and techniques to extract insights from current industry or sector trends. - Review your work and that of others for quality, accuracy, and relevance. - Know how and when to use tools available for a given situation and can explain the reasons for this choice. 
- Seek and embrace opportunities which give exposure to different situations, environments, and perspectives. - Use straightforward communication, in a structured way, when influencing and connecting with others. - Able to read situations and modify behavior to build quality relationships. - Uphold the firm's code of ethics and business conduct. Years of Experience - 2 to 5 years of experience Education Qualification: BTech/BE/MTech/MS/MCA Preferred Skill Set/Roles and Responsibilities: - Hands-on experience in P&C Insurance on the Guidewire DataHub/InfoCenter platform. - Experience in mapping the Guidewire Insurance Suite of products (PC/BC/CC/CM) to DHIC. - Works with the business in identifying detailed analytical and operational reporting/extract requirements. - Able to create complex Microsoft SQL / ETL / SSIS queries. - Participates in Sprint development, test, and integration activities. - Creates detailed source-to-target mappings. - Creates and validates data dictionaries. - Writes and validates data translation and migration scripts. - Communicates with the business to gather requirements. - Performs gap analysis between existing (legacy) and new (GW) data-related solutions. - Works with Informatica ETL developers. - Knowledge of AWS cloud services.
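A source-to-target mapping of the kind this posting asks candidates to document can be sketched in a few lines of Python. Every column name and transform below is a hypothetical placeholder, not an actual DataHub/InfoCenter mapping:

```python
# Illustrative source-to-target mapping: each target column maps to a
# legacy source field plus a transform. In practice this would be
# documented in a mapping spec and implemented in SSIS or Informatica;
# all field names here are invented for illustration.

S2T_MAPPING = {
    "policy_number": ("POL_NO", str.strip),
    "premium_amount": ("PREM_AMT", float),
    "line_of_business": ("LOB_CD", str.upper),
}

def apply_mapping(source_row: dict, mapping: dict) -> dict:
    """Build a target row by applying each column's transform to its source field."""
    return {
        target_col: transform(source_row[source_col])
        for target_col, (source_col, transform) in mapping.items()
    }

legacy_row = {"POL_NO": " P-1001 ", "PREM_AMT": "1250.75", "LOB_CD": "auto"}
print(apply_mapping(legacy_row, S2T_MAPPING))
```

Gap analysis between legacy and new solutions then amounts to diffing which source fields have no entry in the mapping, and vice versa.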

Posted 1 week ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

This is a data engineer position - a programmer responsible for the design, development, implementation, and maintenance of data flow channels and data processing systems that support the collection, storage, batch and real-time processing, and analysis of information in a scalable, repeatable, and secure manner, in coordination with the Data & Analytics team. The overall objective is defining optimal solutions for data collection, processing, and warehousing. The candidate must have Spark Java development expertise in big data processing, Python, and Apache Spark, particularly within the banking & finance domain, and will design, code, and test data systems and work on implementing them into the internal infrastructure. Responsibilities: Ensuring high-quality software development, with complete documentation and traceability Develop and optimize scalable Spark Java-based data pipelines for processing and analyzing large-scale financial data Design and implement distributed computing solutions for risk modeling, pricing, and regulatory compliance Ensure efficient data storage and retrieval using Big Data Implement best practices for Spark performance tuning, including partitioning, caching, and memory management Maintain high code quality through testing, CI/CD pipelines, and version control (Git, Jenkins) Work on batch processing frameworks for market risk analytics Promote unit/functional testing and code inspection processes Work with business stakeholders and Business Analysts to understand the requirements Work with other data scientists to understand and interpret complex datasets Qualifications: 5-8 years of experience working in data ecosystems. 4-5 years of hands-on experience in Hadoop, Scala, Java, Spark, Hive, Kafka, Impala, Unix scripting, and other Big Data frameworks. 
3+ years of experience with relational SQL and NoSQL databases: Oracle, MongoDB, HBase Strong proficiency in Python and Spark Java with knowledge of core Spark concepts (RDDs, DataFrames, Spark Streaming, etc.), Scala, and SQL Data Integration, Migration & Large-Scale ETL experience (common ETL platforms such as PySpark/DataStage/AbInitio etc.) - ETL design & build, handling, reconciliation, and normalization Data Modeling experience (OLAP, OLTP, logical/physical modeling, normalization, knowledge of performance tuning) Experienced in working with large and multiple datasets and data warehouses Experience building and optimizing 'big data' data pipelines, architectures, and datasets Strong analytic skills and experience working with unstructured datasets Ability to effectively use complex analytical, interpretive, and problem-solving techniques Experience with Confluent Kafka, Red Hat jBPM, CI/CD build pipelines and toolchain – Git, BitBucket, Jira Experience with external cloud platforms such as OpenShift, AWS & GCP Experience with container technologies (Docker, Pivotal Cloud Foundry) and supporting frameworks (Kubernetes, OpenShift, Mesos) Experienced in integrating search solutions with middleware & distributed messaging - Kafka Highly effective interpersonal and communication skills with technical and non-technical stakeholders Experienced in the software development life cycle, with good problem-solving skills. 
Excellent problem-solving skills and strong mathematical and analytical mindset Ability to work in a fast-paced financial environment Education: Bachelor’s/University degree or equivalent experience in computer science, engineering, or similar domain ------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Data Architecture ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
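The partition tuning named in the role above rests on how Spark hash-partitions records by key. The mechanism can be sketched in plain Python with no Spark dependency; the trade names and notionals below are invented for illustration:

```python
# Plain-Python sketch of hash partitioning, the mechanism behind
# Spark's repartition(n, col): each record lands in a bucket chosen by
# hashing its key modulo the partition count. Records sharing a key
# always land together, so skewed keys create oversized partitions -
# which is exactly what partition tuning tries to avoid.

def hash_partition(records, key_fn, num_partitions):
    partitions = [[] for _ in range(num_partitions)]
    for record in records:
        bucket = hash(key_fn(record)) % num_partitions
        partitions[bucket].append(record)
    return partitions

trades = [{"desk": d, "notional": n}
          for d, n in [("rates", 10), ("rates", 20), ("fx", 5), ("credit", 7)]]

parts = hash_partition(trades, key_fn=lambda r: r["desk"], num_partitions=4)
print([len(p) for p in parts])  # both "rates" trades share one partition
```

Caching and memory management then govern how long each partition's data is kept resident between stages; this sketch covers only the partitioning half.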

Posted 1 week ago

Apply

5.0 years

0 Lacs

Greater Chennai Area

On-site

Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : PySpark Good to have skills : NA Minimum 5 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the solutions align with business objectives. You will also engage in problem-solving discussions and contribute to the overall success of the projects by leveraging your expertise in application development. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Facilitate knowledge sharing sessions to enhance team capabilities. - Monitor project progress and ensure timely delivery of application features. Professional & Technical Skills: - Must-Have Skills: Proficiency in PySpark. - Strong understanding of data processing frameworks and distributed computing. - Experience with data integration and ETL processes. - Familiarity with cloud platforms and services related to application development. - Ability to write efficient and scalable code. Additional Information: - The candidate should have a minimum of 5 years of experience in PySpark. - This position is based at our Chennai office. - 15 years of full-time education is required.

Posted 1 week ago

Apply

5.0 years

0 Lacs

Greater Chennai Area

On-site

Project Role : Application Lead Project Role Description : Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills : PySpark Good to have skills : NA Minimum 5 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team through the development process. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that the applications developed meet both user needs and technical specifications. Your role will require you to balance technical oversight with team management, fostering an environment of innovation and collaboration. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Facilitate knowledge sharing sessions to enhance team skills and capabilities. - Monitor project progress and implement necessary adjustments to meet deadlines. Professional & Technical Skills: - Must-Have Skills: Proficiency in PySpark. - Strong understanding of data processing frameworks and distributed computing. - Experience with data integration and ETL processes. - Familiarity with cloud platforms and services related to data processing. - Ability to mentor junior team members and provide technical guidance. Additional Information: - The candidate should have a minimum of 7.5 years of experience in PySpark. - This position is based in Pune. - 15 years of full-time education is required.

Posted 1 week ago

Apply

7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

This role has been designed as ‘Hybrid’ with an expectation that you will work on average 2 days per week from an HPE office. Who We Are Hewlett Packard Enterprise is the global edge-to-cloud company advancing the way people live and work. We help companies connect, protect, analyze, and act on their data and applications wherever they live, from edge to cloud, so they can turn insights into outcomes at the speed required to thrive in today’s complex world. Our culture thrives on finding new and better ways to accelerate what’s next. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good. If you are looking to stretch and grow your career, our culture will embrace you. Open up opportunities with HPE. Job Description Aruba is an HPE Company, and a leading provider of next-generation network access solutions for the mobile enterprise. Helping some of the largest companies in the world modernize their networks to meet the demands of a digital future, Aruba is redefining the “Intelligent Edge” – and creating new customer experiences across intelligent spaces and digital workspaces. Join us to redefine what’s next for you. How You Will Make Your Mark… The ideal candidate will have experience with deploying and managing enterprise-scale Data Governance practices, along with Data Engineering experience developing the database layer to support and enable AI initiatives, as well as a streamlined user experience with Data Discovery, Security & Access Control, for meaningful & business-relevant analytics. The candidate will be comfortable with the full-stack analytics ecosystem - database layer, BI dashboards, and AI/Data Science models & solutions - to effectively define and implement a scalable Data Governance practice. 
What You’ll Do Responsibilities: Drive the design and development of Data Dictionary, Lineage, Data Quality, Security & Access Control for business-relevant data subjects & reports across business domains. Engage with the business user community to enable ease of Data Discovery and build trust in the data through Data Quality & Reliability monitoring with key metrics & SLAs defined. Supports the development and sustainment of data subjects in the database layer to enable BI dashboards and AI solutions. Drives the engagement and alignment with the HPE IT/CDO team on Governance initiatives, including partnering with functional teams across the business. Test, validate, and assure the quality of complex AI-powered product features. Partner with a highly motivated and talented set of colleagues. Be a motivated self-starter who can operate with minimal handholding. Collaborate across teams and time zones, demonstrating flexibility and accountability. Education And Experience Required 7+ years of Data Governance and Data Engineering experience, with significant exposure to enabling data availability, data discovery, quality & reliability, with appropriate security & access controls in an enterprise-scale ecosystem. First-level university degree. What You Need To Bring Knowledge and Skills: Experience working with data governance & metadata management tools (Collibra, Databricks Unity Catalog, Atlan, etc.). Subject matter expertise in consent management concepts and tools. Demonstrated knowledge of research methodology and the ability to manage complex data requests. Excellent analytical thinking, technical analysis, and data manipulation skills. Proven track record of developing SQL SSIS packages with ETL flows. Experience with AI application deployment governance a plus. Technologies such as MS SQL Server, Databricks, Hadoop, SAP S4/HANA. Experience with SQL databases and building SSIS packages; knowledge of NoSQL and event streaming (e.g., Kafka) is a bonus. 
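The data-quality monitoring with defined metrics and SLAs described above can be reduced to a small Python sketch. The metric names, thresholds, and sample batch are all assumptions for illustration, not HPE's actual governance configuration:

```python
# Minimal data-quality check: compute row count and null rate for a
# batch and compare them against assumed SLA thresholds. In practice
# these metrics would feed a governance tool such as Collibra or
# Databricks Unity Catalog rather than a print statement.

def quality_report(rows, required_fields, max_null_rate=0.05, min_rows=3):
    total = len(rows)
    nulls = sum(
        1 for row in rows for f in required_fields if row.get(f) is None
    )
    null_rate = nulls / (total * len(required_fields)) if total else 1.0
    return {
        "row_count": total,
        "null_rate": round(null_rate, 3),
        "passes_sla": total >= min_rows and null_rate <= max_null_rate,
    }

batch = [
    {"id": 1, "amount": 9.5},
    {"id": 2, "amount": None},   # missing value counts against the SLA
    {"id": 3, "amount": 4.0},
]
print(quality_report(batch, required_fields=["id", "amount"]))
```

Running such a check on every load, and alerting when `passes_sla` flips to false, is one common way the "build trust in the data" goal gets operationalized.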
Exceptional interpersonal skills and written communication skills. Experience and comfort solving problems in an ambiguous environment where there is constant change. Ability to think logically, communicate clearly, and be well organized. Strong knowledge of Computer Science fundamentals. Experience working with LLMs and generative AI frameworks (e.g., OpenAI, Hugging Face, etc.). Proficiency in MS Power Platform, Java, Scala, Python experience preferred. Strong collaboration and communication skills. Performing deep-dive investigations, including applying advanced techniques, to solve some of the most critical and complex business problems in support of business transformation to enable Product, Support, and Software as a Service offerings. Strong business acumen and technical knowledge within area of responsibility. Strong project management skills. Additional Skills Accountability, Active Learning, Active Listening, Bias, Business Decisions, Business Development, Business Metrics, Business Performance, Business Strategies, Calendar Management, Coaching, Computer Literacy, Creativity, Critical Thinking, Cross-Functional Teamwork, Design Thinking, Empathy, Follow-Through, Growth Mindset, Intellectual Curiosity, Leadership, Long Term Planning, Managing Ambiguity, Personal Initiative {+ 5 more} What We Can Offer You Health & Wellbeing We strive to provide our team members and their loved ones with a comprehensive suite of benefits that supports their physical, financial and emotional wellbeing. Personal & Professional Development We also invest in your career because the better you are, the better we all are. We have specific programs catered to helping you reach any career goals you have — whether you want to become a knowledge expert in your field or apply your skills to another division. Unconditional Inclusion We are unconditionally inclusive in the way we work and celebrate individual uniqueness. 
We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good. Let's Stay Connected Follow @HPECareers on Instagram to see the latest on people, culture and tech at HPE. #india #aruba Job Business Planning Job Level Specialist HPE is an Equal Employment Opportunity/ Veterans/Disabled/LGBT employer. We do not discriminate on the basis of race, gender, or any other protected category, and all decisions we make are made on the basis of qualifications, merit, and business need. Our goal is to be one global team that is representative of our customers, in an inclusive environment where we can continue to innovate and grow together. Please click here: Equal Employment Opportunity. Hewlett Packard Enterprise is EEO Protected Veteran/ Individual with Disabilities. HPE will comply with all applicable laws related to employer use of arrest and conviction records, including laws requiring employers to consider for employment qualified applicants with criminal histories.

Posted 1 week ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Summary: We are seeking a Senior Python Developer with a strong background in backend development and a passion for designing and implementing efficient algorithms. The ideal candidate will be responsible for developing, maintaining, and optimizing our core backend systems and services, with a particular focus on complex algorithms. This role requires a deep understanding of Python, strong problem-solving skills, and the ability to work collaboratively in a fast-paced environment. You will play a key role in designing, developing, and maintaining robust data pipelines, APIs, and data processing workflows. You will work closely with data analysts and business teams to understand data requirements and deliver insightful data-driven solutions. The ideal candidate is passionate about data, enjoys problem-solving, and thrives in a collaborative environment. Experience in the financial or banking domain is a plus. Responsibilities: Design, develop, and maintain robust and scalable data pipelines using Python, SQL, PySpark, and streaming technologies like Kafka. Perform efficient data extraction, transformation, and loading (ETL) for large volumes of data from diverse data providers, ensuring data quality and integrity. Build and maintain RESTful APIs and microservices to support seamless data access and transformation workflows. Develop reusable components, libraries, and frameworks to automate data processing workflows, optimizing for performance and efficiency. Apply statistical analysis techniques to uncover trends, patterns, and actionable business insights from data. Implement comprehensive data quality checks and perform root cause analysis on data anomalies, ensuring data accuracy and reliability. Collaborate effectively with data analysts, business stakeholders, and other engineering teams to understand data requirements and translate them into technical solutions. 
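The extract-transform-load workflow this role centers on can be reduced to a small generator-based Python sketch. The source records, field names, and filter rule below are invented for illustration; a production pipeline would swap these stages for Kafka consumers, PySpark jobs, and database writers:

```python
# Tiny generator-based ETL sketch: extract raw records, transform
# (normalize and filter out invalid rows), and load into an in-memory
# store. Generators keep the stages lazy, so records stream through
# one at a time rather than being materialized all at once.

def extract(raw_lines):
    for line in raw_lines:
        symbol, price = line.split(",")
        yield {"symbol": symbol, "price": float(price)}

def transform(records):
    for rec in records:
        if rec["price"] > 0:                 # drop invalid quotes
            rec["symbol"] = rec["symbol"].upper()
            yield rec

def load(records, store):
    for rec in records:
        store[rec["symbol"]] = rec["price"]
    return store

raw = ["aapl,189.30", "msft,412.10", "bad,-1"]
warehouse = load(transform(extract(raw)), store={})
print(warehouse)  # {'AAPL': 189.3, 'MSFT': 412.1}
```

The data-quality checks the responsibilities mention would typically sit inside (or alongside) the transform stage, counting and logging the rows that get dropped so anomalies can be traced back to their source.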
Qualifications: Bachelor's or Master's degree in Computer Science, Data Science, Information Systems, or a related field. 5+ years of proven experience in Python development, with a strong focus on data handling, processing, and analysis. Extensive experience building and maintaining RESTful APIs and working with microservices architectures. Proficiency in building and managing data pipelines using APIs, ETL tools, and Kafka. Solid understanding and practical application of statistical analysis methods for business decision-making. Hands-on experience with PySpark for large-scale distributed data processing. Strong SQL skills for querying, manipulating, and optimizing relational database operations. Deep understanding of data cleaning, preprocessing, and validation techniques. Knowledge of data governance, security, and compliance standards is highly desirable. Experience in the financial services industry is a plus. Familiarity with basic machine learning (ML) concepts and experience preparing data for ML models is a plus. Strong analytical, debugging, problem-solving, and communication skills. Ability to work both independently and collaboratively within a team environment. Preferred Skills: Experience with CI/CD tools and Git-based version control. Experience in the financial or banking domain. ------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Applications Development ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. 
------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.

Posted 1 week ago

Apply