
443 Etl Tool Jobs - Page 17

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

2 - 7 years

10 - 20 Lacs

Pune, Chennai, Bengaluru

Hybrid


Salary: 10-30 LPA | Experience: 2-7 years | Location: Gurgaon/Pune/Bangalore/Chennai | Notice period: Immediate to 30 days

Key Responsibilities:
- 2+ years of strong hands-on experience in Ab Initio technology.
- Good knowledge of Ab Initio components such as Reformat, Join, Sort, Rollup, Normalize, Scan, Lookup, MFS and Ab Initio parallelism, and of products such as Metadata Hub, Conduct>It, Express>It and Control Center; a clear understanding of concepts like metaprogramming, continuous flows and PDL is good to have.
- Very good knowledge of data warehousing, SQL and Unix shell scripting.
- Knowledge of the ETL side of cloud platforms such as AWS or Azure, and of the Hadoop platform, is an added advantage.
- Experience working with banking domain data is an added advantage.
- Excellent technical knowledge in the design, development and validation of complex ETL features using Ab Initio.
- Excellent knowledge of integration with upstream and downstream processes and systems.
- Ensure compliance with technical standards and processes.
- Ability to engage and collaborate with stakeholders to deliver assigned tasks with defined quality goals.
- Can work independently with minimum supervision and help the development team on technical issues.
- Good communication and analytical skills.

Posted 1 month ago

Apply

5 - 10 years

10 - 14 Lacs

Noida

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Microsoft Power Business Intelligence (BI)
Good-to-have skills: NA
Minimum experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing the application development process and ensuring seamless communication among team members and stakeholders.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for your immediate team and across multiple teams.
- Lead the application development process.
- Ensure effective communication among team members and stakeholders.
- Identify and address any issues or roadblocks in the development process.

Professional & Technical Skills:
- Must-have: proficiency in Microsoft Power Business Intelligence (BI).
- Strong understanding of data visualization tools such as Power BI.
- Experience implementing various BI solutions.
- Hands-on experience designing and configuring BI applications.
- Solid grasp of data analysis and interpretation.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Microsoft Power Business Intelligence (BI).
- This position is based at our Noida office.
- 15 years of full-time education is required.

Posted 1 month ago

Apply

5 - 10 years

4 - 8 Lacs

Bengaluru

Work from Office


Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: Python (Programming Language), Data Build Tool (dbt)
Minimum experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. You will create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems. Your day will involve working on data solutions and collaborating with teams to optimize data processes.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for your immediate team and across multiple teams.
- Develop and maintain data pipelines.
- Ensure data quality and integrity.
- Implement ETL processes.

Professional & Technical Skills:
- Must-have: proficiency in Snowflake Data Warehouse.
- Good to have: experience with Data Build Tool (dbt).
- Strong understanding of data architecture.
- Proficiency in SQL and database management.
- Experience with cloud data platforms.
- Knowledge of data modeling.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 1 month ago

Apply

3 - 8 years

5 - 9 Lacs

Bengaluru

Work from Office


Project Role: Data Modeler
Project Role Description: Work with key business representatives, data owners, end users, application designers and data architects to model current and new data.
Must-have skills: Reltio
Good-to-have skills: NA
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Modeler, you will collaborate with key stakeholders, data owners, and architects to model existing and new data, ensuring data integrity and accuracy.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Develop and maintain data models for current and future data needs.
- Collaborate with business representatives to understand data requirements.
- Implement data modeling best practices to ensure data quality and consistency.
- Provide data modeling expertise and guidance to the team.
- Contribute to data governance initiatives and compliance efforts.

Professional & Technical Skills:
- Must-have: proficiency in Reltio.
- Strong understanding of data modeling concepts and techniques.
- Experience with data modeling tools and techniques.
- Knowledge of data governance principles and practices.
- Experience in data analysis and interpretation.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Reltio.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 1 month ago

Apply

3 - 8 years

5 - 9 Lacs

Bengaluru

Work from Office


Project Role: Data Modeler
Project Role Description: Work with key business representatives, data owners, end users, application designers and data architects to model current and new data.
Must-have skills: Reltio
Good-to-have skills: NA
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Modeler, you will collaborate with key stakeholders, including business representatives, data owners, and architects, to model existing and new data, ensuring data integrity and quality.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with business representatives to understand data requirements.
- Design and implement data models to meet business needs.
- Ensure data integrity and quality in all data modeling activities.
- Provide expertise in data modeling best practices.
- Contribute to data architecture decisions and strategies.

Professional & Technical Skills:
- Must-have: proficiency in Reltio.
- Strong understanding of data modeling concepts.
- Experience with data modeling tools and techniques.
- Knowledge of data governance principles.
- Familiarity with data integration and data warehousing concepts.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Reltio.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 1 month ago

Apply

3 - 6 years

7 - 12 Lacs

Bengaluru

Remote


Role: Power BI Developer (Client-Facing)
Experience Required: 3-5 years
Location: Remote, India
Qualification: B.Tech / M.Tech / BCA / MCA

About Enate
Enate is on a mission to make every service run like clockwork. Our orchestration and AI solution has been purpose-built by experts to tackle the complexities of service delivery. We orchestrate work across teams, systems and workers so that businesses can run smoother operations at scale.

Why join us?
We're a global software brand at the intersection of technology and business services, without the chaos of a typical start-up. You'll join a friendly, no-ego team where people support each other and every voice is heard. You can work remotely from anywhere in India, and we also get together for global retreats and team socials. We think we're a great workplace, but don't just take our word for it: check out our 4.8 rating on Glassdoor and read what Enate employees have to say.

About the Opportunity
Are you seeking a role that accelerates your career while offering exposure to a fast-growing global SaaS company? This is your chance to make a real impact. We seek a dynamic and detail-oriented Power BI Developer with strong technical and analytical skills who thrives in a client-facing environment. The ideal candidate will possess deep experience in Power BI development, business reporting, and data analytics, and will work closely with stakeholders and business teams to translate complex data into clear, actionable insights and visualisations.

Key Responsibilities:
- Collaborate with clients and internal stakeholders to gather reporting requirements and translate them into scalable BI solutions.
- Develop, enhance, and migrate Power BI reports, dashboards, and data models.
- Design and implement custom datasets using advanced SQL queries and optimise them for performance.
- Connect to various data sources, including on-premises servers, cloud platforms (Azure, AWS), Excel, CSV, and other file formats.
- Use Power BI Desktop, Gateway, Service, and Report Server for end-to-end report development and deployment.
- Write advanced DAX expressions and Power Query (M) scripts to transform, clean, and model data.
- Implement static and dynamic Row-Level Security (RLS) to control data access.
- Optimise report performance and handle large datasets efficiently.
- Provide technical leadership and guidance to clients during data migration or upgrade initiatives.
- Troubleshoot data discrepancies, resolve data quality issues, and support ad-hoc analytical requests.
- Create comprehensive documentation and conduct user training/workshops as required.

Required Skills and Qualifications:
- Proven experience as a Power BI Developer (minimum 3 years).
- Proficiency in Power BI Desktop, Power Query (M), data modelling, DAX, and Power BI Service.
- Proficiency in data warehousing and Extract, Transform, Load (ETL) processes.
- Hands-on experience writing complex, optimised SQL queries and creating SQL views.
- In-depth understanding of relational database structures and data warehousing principles.
- Strong analytical and visualisation skills to communicate complex information.
- Demonstrated ability in performance tuning of reports.
- Experience handling Power BI migrations and version upgrades.
- Excellent problem-solving, debugging, and communication skills.

Nice to Have:
- Familiarity with Azure Data Factory, Synapse Analytics, or other cloud data tools.
- Exposure to Agile methodologies and DevOps for BI deployments.
- Experience in customer success or stakeholder engagement roles.
- Understanding of business workflow optimisation and automation.

Posted 1 month ago

Apply

5 - 7 years

0 - 0 Lacs

Kolkata

Work from Office


Role Proficiency: This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming and joining data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks and DataProc, with coding expertise in Python, PySpark and SQL. Works independently and has a deep understanding of data warehousing solutions including Snowflake, BigQuery, Lakehouse and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions.

Outcomes:
- Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, maintenance and performance using design patterns and reusing proven solutions.
- Interpret requirements to create optimal architecture and design, developing solutions in accordance with specifications.
- Document and communicate milestones/stages for end-to-end delivery.
- Code adhering to best coding standards; debug and test solutions to deliver best-in-class quality.
- Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency.
- Validate results with user representatives, integrating the overall solution seamlessly.
- Develop and manage data storage solutions, including relational databases, NoSQL databases and data lakes.
- Stay updated on the latest trends and best practices in data engineering, cloud technologies and big data tools.
- Influence and improve customer satisfaction through effective data solutions.

Measures of Outcomes:
- Adherence to engineering processes and standards
- Adherence to schedule/timelines
- Adherence to SLAs where applicable
- Number of defects post delivery
- Number of non-compliance issues
- Reduction of recurrence of known defects
- Quick turnaround of production bugs
- Completion of applicable technical/domain certifications
- Completion of all mandatory training requirements
- Efficiency improvements in data pipelines (e.g. reduced resource consumption, faster run times)
- Average time to detect, respond to and resolve pipeline failures or data issues
- Number of data security incidents or compliance breaches

Outputs Expected:
- Code Development: Develop data processing code independently, ensuring it meets performance and scalability requirements. Define coding standards, templates and checklists. Review code for team members and peers.
- Documentation: Create and review templates, checklists, guidelines and standards for design processes and development. Create and review deliverable documents, including design documents, architecture documents, infrastructure costing, business requirements, source-target mappings, test cases and results.
- Configuration: Define and govern the configuration management plan. Ensure compliance within the team.
- Testing: Review and create unit test cases, scenarios and execution plans. Review the test plan and test strategy developed by the testing team. Provide clarifications and support to the testing team as needed.
- Domain Relevance: Advise data engineers on the design and development of features and components, demonstrating a deeper understanding of business needs. Learn about customer domains to identify opportunities for value addition. Complete relevant domain certifications to enhance expertise.
- Project Management: Manage the delivery of modules effectively.
- Defect Management: Perform root cause analysis (RCA) and mitigation of defects. Identify defect trends and take proactive measures to improve quality.
- Estimation: Create and provide input for effort and size estimation for projects.
- Knowledge Management: Consume and contribute to project-related documents, SharePoint libraries and client universities. Review reusable documents created by the team.
- Release Management: Execute and monitor the release process to ensure smooth transitions.
- Design Contribution: Contribute to the creation of high-level design (HLD), low-level design (LLD) and system architecture for applications, business components and data models.
- Customer Interface: Clarify requirements and provide guidance to the development team. Present design options to customers and conduct product demonstrations.
- Team Management: Set FAST goals and provide constructive feedback. Understand team members' aspirations and provide guidance and opportunities for growth. Ensure team engagement in projects and initiatives.
- Certifications: Obtain relevant domain and technology certifications to stay competitive and informed.

Skill Examples:
- Proficiency in SQL, Python or other programming languages used for data manipulation.
- Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc and Azure ADF.
- Hands-on experience with cloud platforms like AWS, Azure or Google Cloud, particularly with data-related services (e.g. AWS Glue, BigQuery).
- Conduct tests on data pipelines and evaluate results against data quality and performance specifications.
- Experience in performance tuning of data processes.
- Expertise in designing and optimizing data warehouses for cost efficiency.
- Ability to apply and optimize data models for efficient storage, retrieval and processing of large datasets.
- Capacity to clearly explain and communicate design and development aspects to customers.
- Ability to estimate time and resource requirements for developing and debugging features or components.

Knowledge Examples:
- Knowledge of various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, Azure ADF and ADLF.
- Proficiency in SQL for analytics, including windowing functions.
- Understanding of data schemas and models relevant to various business contexts.
- Familiarity with domain-related data and its implications.
- Expertise in data warehousing optimization techniques.
- Knowledge of data security concepts and best practices.
- Familiarity with design patterns and frameworks in data engineering.

Additional Comments - Required Skills & Qualifications:
- A degree (preferably an advanced degree) in Computer Science, Engineering or a related field.
- Senior developer with 8+ years of hands-on development experience in Azure using ASB and ADF: extensive experience in designing, developing, and maintaining data solutions/pipelines in the Azure ecosystem, including Azure Service Bus and ADF.
- Familiarity with MongoDB and Python is an added advantage.

Required Skills: Azure Data Factory, Azure Service Bus, Azure, MongoDB

Posted 1 month ago

Apply

4 - 5 years

8 - 12 Lacs

Kolkata

Work from Office


Job Description: Experienced Core Banking System (CBS) Data Migration Specialist required (PL/SQL).
1) Data Migration Planning and Execution
2) Data Mapping and Transformation
3) Database Management and Performance
4) Data Quality and Collaboration

Required Candidate Profile - Preferred Qualifications:
- Certification in Oracle PL/SQL or related technologies.
- Experience working with large datasets and performing data quality checks on large-scale migrations.

Posted 1 month ago

Apply

3 - 5 years

5 - 7 Lacs

Hyderabad

Work from Office


• Sound knowledge of and hands-on experience with HLOOKUP, VLOOKUP, pivot tables, conditional formatting, etc.
• Good at preparing MIS reports.
• Perform data analysis for generating reports on a periodic basis.
• Provide strong reporting and analytical information support.
• Knowledge of various MIS reporting tools.

Posted 1 month ago

Apply

3 - 5 years

5 - 7 Lacs

Navi Mumbai

Work from Office


Skill required: Network Billing Operations - Problem Management
Designation: Network & Svcs Operation Analyst
Qualifications: Any Graduation
Years of Experience: 3 to 5 years

What would you do?
We are seeking a detail-oriented and proactive Usage Assurance Analyst to join our team. The primary responsibility of this role is to ensure accuracy in revenue streams by performing thorough reconciliation between mediation systems and billers. The ideal candidate will be vigilant in identifying and resolving issues related to usage and revenue, ensuring a seamless experience for our customers.
1. Reconciliation: Perform reconciliation between mediation systems and billers to ensure accuracy in revenue and usage data.
2. Issue Identification: Highlight and report discrepancies related to usage and revenue to the concerned teams. Ensure that these issues are resolved promptly to avoid any customer impact.
3. Switch-Level Monitoring: Monitor switch levels closely to detect and resolve any anomalies in real time, ensuring smooth operations.
4. Billing Cycle Management: Track and manage all issues related to billing cycles. Ensure timely follow-up and resolution to maintain accurate and timely billing processes.
5. Reporting: Prepare and maintain detailed reports on reconciliations, anomalies, and billing cycle issues for internal review and continuous process improvement.
6. 3-4 years of experience in telecom billing and billing operations.
7. Strong understanding of telecom billing systems, revenue recognition principles, and regulatory requirements.
8. Collaborate with cross-functional teams, including IT and operations, to identify and resolve usage assurance issues.
9. Conduct regular audits and reviews of billing systems, contracts, and processes to ensure compliance with regulatory requirements and industry standards.
10. Develop and maintain key performance indicators (KPIs) to measure the effectiveness of usage assurance activities and report on performance to senior management.
11. Excellent analytical skills with keen attention to detail and the ability to identify patterns and anomalies in large datasets.

What are we looking for?
- Bachelor's degree.
- Strong analytical skills with the ability to identify discrepancies and resolve them efficiently.
- Experience in usage assurance, reconciliation processes, or billing in a telecom or financial services environment.
- Excellent communication skills, with the ability to liaise effectively with different teams.
- High attention to detail and the ability to manage multiple tasks simultaneously.
- Experience with and understanding of mediation systems and billing platforms.
- Knowledge of telecom networks and switch-level monitoring.
- Proficiency in data analysis and reporting tools.

Roles and Responsibilities:
- In this role you are required to analyse and solve lower-complexity problems.
- Your day-to-day interaction is with peers within Accenture before updating supervisors.
- You may have limited exposure to clients and/or Accenture management.
- You will be given moderate-level instruction on daily work tasks and detailed instructions on new assignments.
- The decisions you make impact your own work and may impact the work of others.
- You will be an individual contributor as part of a team, with a focused scope of work.
- Please note that this role may require you to work in rotational shifts.
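Illustrative addition (not part of the posting): a minimal pandas sketch of the mediation-vs-biller reconciliation described above. The file names, column names and tolerance are hypothetical placeholders.

```python
import pandas as pd

# Hypothetical extracts: one from the mediation system, one from the biller.
mediation = pd.read_csv("mediation_usage.csv")  # record_id, msisdn, usage_units, rated_amount
billing = pd.read_csv("billed_usage.csv")       # record_id, msisdn, billed_units, billed_amount

# Outer join on the shared record key so both sources line up row by row.
recon = mediation.merge(billing, on="record_id", how="outer", indicator=True)

# Records captured by mediation but never billed (potential revenue leakage).
unbilled = recon[recon["_merge"] == "left_only"]

# Records present on both sides where the amounts disagree beyond a tolerance.
both = recon[recon["_merge"] == "both"]
mismatched = both[(both["rated_amount"] - both["billed_amount"]).abs() > 0.01]

print(f"Unbilled records: {len(unbilled)}, amount mismatches: {len(mismatched)}")
```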

Posted 1 month ago

Apply

5 - 8 years

7 - 10 Lacs

Bengaluru

Work from Office


Skill required: Data Management - Structured Query Language (SQL)
Designation: Data Eng, Mgmt & Governance Sr Analyst
Qualifications: BE/BTech
Years of Experience: 5 to 8 years

What would you do?
Data & AI: The role may require you to work with SQL (Structured Query Language), a domain-specific language used in programming and designed for querying and modifying data and managing databases.

What are we looking for?
- SQL
- Extract, Transform, Load (ETL)
- Data modeling
- Adaptable and flexible
- Strong analytical skills
- Ability to work well in a team
- Commitment to quality
- Written and verbal communication

Roles and Responsibilities:
- In this role you are required to analyse and solve increasingly complex problems.
- Your day-to-day interactions are with peers within Accenture.
- You are likely to have some interaction with clients and/or Accenture management.
- You will be given minimal instruction on daily work/tasks and a moderate level of instruction on new assignments.
- Decisions you make impact your own work and may impact the work of others.
- In this role you would be an individual contributor and/or oversee a small work effort and/or team.

Posted 1 month ago

Apply

3 - 5 years

8 - 11 Lacs

Pune, Gurugram, Bengaluru

Work from Office


Job Title: Data Engineer - Snowflake & Python

About the Role:
We are seeking a skilled and proactive Data Developer with 3-5 years of hands-on experience in Snowflake, Python, Streamlit, and SQL, along with expertise in consuming REST APIs and working with modern ETL tools like Matillion, Fivetran, etc. The ideal candidate will have a strong foundation in data modeling, data warehousing, and data profiling, and will play a key role in designing and implementing robust data solutions that drive business insights and innovation.

Key Responsibilities:
- Design, develop, and maintain data pipelines and workflows using Snowflake and an ETL tool (e.g., Matillion, dbt, Fivetran, or similar).
- Develop data applications and dashboards using Python and Streamlit.
- Create and optimize complex SQL queries for data extraction, transformation, and loading.
- Integrate REST APIs for data access and process automation.
- Perform data profiling, quality checks, and troubleshooting to ensure data accuracy and integrity.
- Design and implement scalable and efficient data models aligned with business requirements.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data needs and deliver actionable solutions.
- Implement best practices in data governance, security, and compliance.

Required Skills and Qualifications:
- 3-5 years of professional experience in a data engineering or development role.
- Strong expertise in Snowflake, including performance tuning and warehouse optimization.
- Proficiency in Python, including data manipulation with libraries like Pandas.
- Experience building web-based data tools using Streamlit.
- Solid understanding of and experience with RESTful APIs and JSON data structures.
- Strong SQL skills and experience with advanced data transformation logic.
- Experience with an ETL tool commonly used with Snowflake (e.g., dbt, Matillion, Fivetran, Airflow).
- Hands-on experience in data modeling (dimensional and normalized), data warehousing concepts, and data profiling techniques.
- Familiarity with version control (e.g., Git) and CI/CD processes is a plus.

Preferred Qualifications:
- Experience working in cloud environments (AWS, Azure, or GCP).
- Knowledge of data governance and cataloging tools.
- Experience with agile methodologies and working in cross-functional teams.
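Illustrative addition (not part of the posting): a minimal sketch of the REST-API-to-Snowflake loading pattern this role describes, assuming the requests, pandas and snowflake-connector-python libraries; the endpoint, credentials and table name are hypothetical placeholders.

```python
import requests
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# Pull records from a hypothetical REST endpoint and flatten the JSON payload.
resp = requests.get("https://api.example.com/v1/orders", timeout=30)
resp.raise_for_status()
df = pd.json_normalize(resp.json()["orders"])

# Connect to Snowflake; account, user and credentials are placeholders.
conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="LOAD_WH", database="ANALYTICS", schema="RAW",
)

# Bulk-load the DataFrame into a staging table; downstream SQL can transform it.
write_pandas(conn, df, table_name="RAW_ORDERS", auto_create_table=True)
conn.close()
```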

Posted 1 month ago

Apply

10 - 14 years

9 - 14 Lacs

Bengaluru

Work from Office


Skill required: Reporting & Insights - Reporting Analytics
Designation: Analytics and Modeling Associate Manager
Qualifications: MCA / Any Graduation / Master of Business Administration
Years of Experience: 10 to 14 years
Language Ability: English (International) - Expert

What would you do?
Prepare management reports and analysis, both recurring and ad hoc. The role focuses on tracking business performance through trusted data and insights while actively managing employee behaviors.

What are we looking for?
- Microsoft Power BI / Power Automate
- Structured Query Language (SQL)
- Business Intelligence (BI) reporting tools
- Ability to establish strong client relationships
- Agility for quick learning
- Detail orientation
- Problem-solving skills
- Written and verbal communication
- Artificial Intelligence (AI)
- Microsoft Office Suite

Roles and Responsibilities:
- In this role you are required to analyse and solve moderately complex problems.
- Typically creates new solutions, leveraging and, where needed, adapting existing methods and procedures.
- Requires understanding of the strategic direction set by senior management as it relates to team goals.
- Primary upward interaction is with the direct supervisor or team leads.
- Generally interacts with peers and/or management levels at a client and/or within Accenture.
- Should require minimal guidance when determining methods and procedures on new assignments.
- Decisions often impact the team in which they reside and occasionally impact other teams.
- Would manage medium-to-small sized teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.
- Please note that this role may require you to work in rotational shifts.

Posted 1 month ago

Apply

5 - 10 years

7 - 12 Lacs

Bengaluru

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Workday Advanced Reporting
Good-to-have skills: NA
Minimum experience: 5 years
Educational Qualification: Workday-related certification is mandatory; 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will collaborate with teams to ensure successful project delivery and implementation.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for your immediate team and across multiple teams.
- Lead and mentor junior professionals.
- Conduct regular team meetings to discuss progress and challenges.
- Stay updated on industry trends and best practices to enhance team performance.

Professional & Technical Skills:
- Must-have: proficiency in Workday Advanced Reporting; a Workday-related certification is mandatory.
- Strong understanding of data analytics and reporting tools.
- Experience in designing and implementing complex reporting solutions.
- Knowledge of Workday integration tools and processes.
- Ability to troubleshoot and resolve technical issues efficiently.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Workday Advanced Reporting.
- This position is based at our Bengaluru office.
- A Workday-related certification is mandatory and 15 years of full-time education is required.

Posted 1 month ago

Apply

5 - 10 years

7 - 12 Lacs

Bengaluru

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Workday Advanced Reporting
Good-to-have skills: NA
Minimum experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing the application development process and ensuring seamless communication within the team and with stakeholders.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for your immediate team and across multiple teams.
- Lead the application design and development process.
- Ensure timely delivery of projects.
- Provide technical guidance and mentorship to team members.

Professional & Technical Skills:
- Must-have: proficiency in Workday Advanced Reporting.
- Strong understanding of data analytics and reporting tools.
- Experience in leading application development projects.
- Knowledge of the software development lifecycle.
- Excellent communication and leadership skills.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Workday Advanced Reporting.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 1 month ago

Apply

5 - 10 years

7 - 12 Lacs

Pune

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Ab Initio
Good-to-have skills: NA
Minimum experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will collaborate with teams to ensure successful project delivery and implementation.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for your immediate team and across multiple teams.
- Lead and mentor junior professionals.
- Conduct regular team meetings to discuss progress and challenges.
- Stay updated on industry trends and best practices.

Professional & Technical Skills:
- Must-have: proficiency in Ab Initio.
- Strong understanding of ETL processes.
- Experience with data integration and data warehousing.
- Knowledge of data quality and data governance principles.
- Hands-on experience with Ab Initio GDE and EME tools.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Ab Initio.
- This position is based at our Pune office.
- 15 years of full-time education is required.

Posted 1 month ago

Apply

3 - 8 years

3 - 7 Lacs

Pune, Bengaluru

Work from Office


Locations: Pune (West), Bangalore (Fortune Summit)
Time type: Full time
Posted on: Posted 3 Days Ago
Job requisition ID: JR-0009000

Middle Office - Analyst - Business Systems - Permanent
Location: Pune
Experience: 3 - 6 years
Designation: Associate
Industry/Domain: ETL/Mapping Tool, VBA, SQL, capital markets knowledge, bank debts, Solvas

Apex Group Ltd has an immediate requirement for a Middle Office Tech Specialist. As an ETL Techno-Functional Support Specialist at Solvas, you will be the bridge between technical ETL processes and end users, ensuring the effective functioning and support of data integration solutions. Your role involves addressing user queries, providing technical support for ETL-related issues, and collaborating with both technical and non-technical teams to ensure a seamless data integration environment. You will contribute to the development, maintenance, and enhancement of ETL processes for the Solvas application, ensuring they align with business requirements.

Work Environment:
- Highly motivated, collaborative, and results driven.
- Growing business within a dynamic and evolving industry.
- Entrepreneurial approach to everything we do.
- Continual focus on process improvement and automation.

Functional/Business Expertise Required:
- Serve as the primary point of contact for end users seeking technical assistance related to Solvas applications.
- Serve as a point of contact for end users, addressing queries related to ETL processes, data transformations, and data loads.
- Provide clear and concise explanations to non-technical users regarding ETL functionalities and troubleshoot issues.
- Integrate client trade files into the Conversant system: design, develop, implement, and test technical solutions based on client and business requirements.
- Diagnose and troubleshoot ETL-related issues reported by end users or identified through monitoring systems.
- Work closely with business analysts and end users to understand and document ETL requirements.
- Monitor ETL jobs and processes to ensure optimal performance and identify potential issues.
- Create user documentation and guides to facilitate self-service issue resolution.
- Hands-on experience working with any ETL tool is mandatory.
- Strong command of SQL, VBA and advanced Excel.
- Good understanding of Solvas or any other loan operations system.
- Good knowledge of Solvas bank-debt workings is mandatory.
- Intermediate knowledge of financial instruments, both listed and unlisted or OTC, including but not limited to derivatives, illiquid stocks, private equity, bank debts, and swaps.
- Understanding of the loan operations industry is necessary.
- Knowledge of market data provider applications (Bloomberg, Refinitiv, etc.).
- Proficiency in any loan operations system, preferably Solvas.
- An ability to work under pressure with changing priorities.
- Strong analytical and problem-solving skills.

Experience and Knowledge:
- 3+ years of related support/technical experience in a loan operations system and accounting system (Solvas/Geneva).
- Connect with operations to understand and resolve their issues.
- Experience working with data vendors (Bloomberg/Refinitiv/Markit).
- Able to handle reporting issues and new requirements raised by operations.
- Strong analytical, problem-solving, and troubleshooting abilities.
- Strong knowledge of Excel and Excel functions for business support.
- Create and maintain business documentation, including user manuals and guides.
- Worked on system upgrades, migrations and integrations.

Other Skills:
- Good team player; ability to work on a local, regional, and global basis.
- Good communication and management skills.
- Good understanding of Financial Services / Capital Markets / Fund Administration.

Disclaimer: Unsolicited CVs sent to Apex (Talent Acquisition Team or Hiring Managers) by recruitment agencies will not be accepted for this position. Apex operates a direct sourcing model and, where agency assistance is required, the Talent Acquisition team will engage directly with our exclusive recruitment partners.

Posted 1 month ago

Apply

3 - 5 years

2 - 5 Lacs

Bengaluru

Work from Office


Skill required: Retirement Solutions - Retirement Planning Services
Designation: Customer Service Analyst
Qualifications: Any Graduation
Years of Experience: 3 to 5 years

What would you do?
Flawless configuration of plan setup, payroll file review and plan maintenance. Oversees and leads the implementation of all complex, sensitive Third-Party Administrator, Retirement Plan Administration and bundled new business/conversion plans. Excellent communication skills and the ability to interact with all levels of end users, technical resources, advisory, plan sponsors, administrators, prior record keepers and funding providers to coordinate all activities within the new business/conversion experience. Coaching and mentoring others to draw out their skills and expertise. A retirement solution is a comprehensive process to understand how much money you will need when you retire; it also helps you identify the best options. It covers the full range of services needed throughout a plan's life, including plan development and enhancement, sales and marketing, plan sponsor/institutional client onboarding and management, participant enrollment and management, and sponsor and member servicing and reporting. Products consist of individual retirement accounts (Roth IRA), college savings accounts, guaranteed investment contracts, fixed and variable deferred annuities (qualified and non-qualified), as well as corporate retirement funds.

What are we looking for?
- 3+ years of experience in the US Retirement Services domain, managing services for Defined Contribution plans.
- Minimum 3+ years of experience managing all aspects of the Payroll Maintenance deliverables and the new business/conversion process with internal new business/conversion teams, Sales and Administrators to ensure a successful conversion for Defined Contribution plans (preferably 403(b)).
- Strong project management skills.
- Strong knowledge of Defined Contribution plans: 401(k), 403(b), 457, etc.
- Well versed in retirement plan funding platforms and ERISA fundamentals and concepts (eligibility, vesting, enrollment).
- Working knowledge of Microsoft Access, SQL and other ETL tools is required.
- Ability to manage large data sets (census files, financial/payroll files) for Defined Contribution plans: 401(k), 403(b), 457.
- Proficient in MS Office applications: Word, Excel and PowerPoint.
- Excellent written, verbal and presentation skills (internal and external presentations).
- ASPPA certifications: Retirement Plan Fundamentals (RPF), Qualified 401(k) Administrator (QKA), Qualified 401(k) Consultant (QKC).
- Project Management Certification: PMP.

Roles and Responsibilities:
- Flawless configuration of plan setup, payroll file review and plan maintenance.
- Oversee and lead the implementation of all complex, sensitive Third-Party Administrator, Retirement Plan Administration and bundled new business/conversion plans.
- Interact with all levels of end users, technical resources, advisory, plan sponsors, administrators, prior record keepers and funding providers to coordinate all activities within the new business/conversion experience.
- Coach and mentor others to draw out their skills and expertise.
- Be decisive, creative, and adaptable, with experience developing and executing solutions for clients while balancing business goals and priorities.
- Work closely with the assigned Relationship Manager, ongoing Account Manager and various internal business partners to ensure a smooth transition at the end of the project.
- Monitor and report on project status; also identify and lead quality improvement initiatives and other special projects for the team.
- Maintain knowledge of ERISA, IRS and DOL regulations and understand the impact of pending legislation.
- Lead multiple projects and prioritize workload based on urgency, importance, client expectations and business needs, delegating decision-making to team members as appropriate.
- Develop project timelines and ensure that project team members complete requirements on time, while anticipating challenges and formulating solutions before the project is adversely impacted.
- Provide consultative solutions for the client's Retirement Services best practices and ensure details are understood so that our administrative processes and systems are established correctly, including but not limited to plan design details that affect pricing, eligibility, data exchanges, contracts, and transactional processing.
- Support the RFP process for new business to identify potential solutions based on the Implementations service model and scope.
- Perform root cause analysis and resolution of complex problems.
- Conduct lessons learned at the end of each implementation phase for continuous improvement purposes.
- Research, analyze and recommend payroll data conversion strategies for complex retirement plans; analyze client data in the format it is received, identify data deficiencies, define remediation of deficiencies, and construct a statement of work that properly outlines the payroll process to reformat into data requirements.

Posted 1 month ago

Apply

1 - 2 years

3 - 4 Lacs

Bengaluru

Work from Office


Role & Responsibilities:
- Develop, test, and maintain ETL processes to extract, transform, and load data from multiple sources into data warehouses.
- Write efficient and optimized SQL queries to manage and retrieve data.
- Design, implement, and manage database systems using MySQL.
- Use Python to automate workflows, manipulate data, and perform analytical tasks.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions.
- Monitor and ensure the quality, consistency, and reliability of data pipelines.
- Troubleshoot and resolve data-related issues in a timely manner.

Requirements:
- Bachelor's degree in Computer Science, Data Engineering, or a related field.
- 1 year of hands-on experience with: MySQL for database management and querying; ETL processes for data integration; Python for scripting and data processing.
- Strong understanding of relational database systems and data warehousing concepts.
- Familiarity with version control systems like Git.
- Good problem-solving and analytical skills.
- Excellent communication and teamwork abilities.

Preferred Skills:
- Familiarity with cloud platforms such as AWS, GCP, or Azure.
- Knowledge of big data tools like Apache Spark or Hadoop.
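Illustrative addition (not part of the posting): a minimal Python/MySQL ETL sketch of the extract-transform-load cycle described above, assuming pandas, SQLAlchemy and the PyMySQL driver; connection strings, table and column names are placeholders.

```python
import pandas as pd
from sqlalchemy import create_engine

# Source (operational) and target (warehouse) connections; placeholder DSNs.
source = create_engine("mysql+pymysql://etl:***@source-db/sales")
target = create_engine("mysql+pymysql://etl:***@warehouse-db/analytics")

# Extract: pull yesterday's orders from the operational database.
orders = pd.read_sql(
    "SELECT order_id, customer_id, amount, created_at "
    "FROM orders WHERE created_at >= CURDATE() - INTERVAL 1 DAY",
    source,
)

# Transform: basic cleaning plus a derived audit column.
orders = orders.dropna(subset=["customer_id"])
orders["amount"] = orders["amount"].round(2)
orders["load_ts"] = pd.Timestamp.now(tz="UTC")

# Load: append into the warehouse staging table.
orders.to_sql("stg_orders", target, if_exists="append", index=False)
```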

Posted 1 month ago

Apply

5 - 10 years

18 - 33 Lacs

Bhubaneswar, Pune, Bengaluru

Work from Office


About Client: Hiring for one of the most prestigious multinational corporations.

Job Title: Snowflake Developer
Experience: 5 to 10 years

Key Responsibilities:
- Design, develop, and implement scalable Snowflake data warehouse solutions.
- Build efficient data pipelines using Snowflake features such as Snowpipe, Streams, and Tasks.
- Develop ELT/ETL solutions integrating data from various sources (e.g., SQL Server, Oracle, APIs).
- Optimize queries and Snowflake performance, including clustering, caching, and warehouse sizing.
- Work closely with data architects, analysts, and business stakeholders to understand data requirements.
- Implement data security and governance using Snowflake roles and policies.
- Monitor and troubleshoot data pipeline and warehouse performance issues.
- Maintain documentation for data models, data flows, and processes.

Required Skills:
- 5-10 years of experience in Data Engineering/BI with a strong focus on cloud data warehousing.
- 3+ years of hands-on experience in Snowflake development and administration.
- Strong SQL programming skills; experience with complex queries and performance tuning.
- Experience with ETL/ELT tools like Informatica, Talend, dbt, or Matillion.
- Knowledge of cloud platforms (AWS, Azure, or GCP), preferably AWS.
- Familiarity with tools like Airflow, Kafka, or other data orchestration solutions is a plus.
- Good understanding of data warehousing concepts, data modeling, and data governance.

Notice period: 30, 45, 60, or 90 days
Location: Bangalore, Hyderabad, Pune, Bhubaneswar (BBSR)
Mode of Work: WFO (Work From Office)

Thanks & Regards,
SWETHA
Black and White Business Solutions Pvt. Ltd.
Bangalore, Karnataka, INDIA
Contact Number: 8067432433
rathy@blackwhite.in | www.blackwhite.in
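Illustrative addition (not part of the posting): a minimal sketch of the Snowflake Streams and Tasks pattern mentioned in the responsibilities above, issued through snowflake-connector-python; the account details, warehouse, table, stream and task names are placeholders.

```python
import snowflake.connector

# Placeholder connection details.
conn = snowflake.connector.connect(
    account="my_account", user="dev_user", password="***",
    warehouse="XFORM_WH", database="ANALYTICS", schema="RAW",
)
cur = conn.cursor()

# A stream captures new rows landing in the raw table (e.g. loaded via Snowpipe).
cur.execute("CREATE OR REPLACE STREAM RAW_ORDERS_STRM ON TABLE RAW_ORDERS")

# A task wakes up on a schedule and only runs when the stream has data.
cur.execute("""
    CREATE OR REPLACE TASK LOAD_ORDERS_TASK
      WAREHOUSE = XFORM_WH
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('RAW_ORDERS_STRM')
    AS
      INSERT INTO CURATED.ORDERS (ORDER_ID, CUSTOMER_ID, AMOUNT, LOADED_AT)
      SELECT ORDER_ID, CUSTOMER_ID, AMOUNT, CURRENT_TIMESTAMP()
      FROM RAW_ORDERS_STRM
""")

# Tasks are created suspended; resume to start the schedule.
cur.execute("ALTER TASK LOAD_ORDERS_TASK RESUME")
conn.close()
```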

Posted 1 month ago

Apply

5 - 10 years

15 - 30 Lacs

Noida, Gurugram, Delhi / NCR

Hybrid


Skills:
Mandatory: SQL, Python, Databricks, Spark/PySpark.
Good to have: MongoDB, Dataiku DSS, Databricks.
Experience in data processing using Python/Scala. Advanced working SQL knowledge and expertise using relational databases. Need early joiners.

Required Candidate Profile:
Experience with ETL development tools like Databricks, Airflow or Snowflake. Expert in building and optimizing big-data pipelines, architectures, and data sets. Proficient in big data tools and the surrounding ecosystem.
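Illustrative addition (not part of the posting): a minimal PySpark sketch of the kind of data-processing pipeline this role describes, with placeholder storage paths and column names.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

# Read raw JSON landed in cloud storage (path is a placeholder).
raw = spark.read.json("s3://data-lake/raw/orders/")

# Typical transformations: drop bad rows, derive columns, deduplicate.
clean = (
    raw.filter(F.col("order_id").isNotNull())
       .withColumn("order_date", F.to_date("created_at"))
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .dropDuplicates(["order_id"])
)

# Write partitioned Parquet (or a Delta table on Databricks) for downstream BI/SQL.
clean.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://data-lake/curated/orders/"
)
```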

Posted 1 month ago

Apply

9 - 14 years

20 - 35 Lacs

Chennai, Bengaluru, Mumbai (All Areas)

Hybrid


Lead and manage data-driven projects from initiation to completion, ensuring timely delivery and alignment with business objectives. Experienced in at least one of the following engagements: BI/DW, DWH, Data Warehousing, Data Analytics, Business Intelligence, Data Projects, Data Management, or Data Governance. Proven experience in project management, preferably in data analytics, business intelligence, or related domains. Strong analytical skills with experience in working with large datasets and deriving actionable insights. Collaborate with cross-functional teams, including data scientists, analysts, and engineers, to drive data-related initiatives. Develop and maintain project plans, timelines, and resource allocation for data and analytics projects. Oversee data collection, analysis, and visualization to support decision-making. Ensure data quality, integrity, and security compliance throughout project execution. Use data insights to optimize project performance, identify risks, and implement corrective actions. Communicate project progress, challenges, and key findings to stakeholders and senior management. Implement Agile or Scrum methodologies for efficient project execution in a data-focused environment. Stay updated with industry trends and advancements in data analytics and project management best practices. Excellent problem-solving skills and ability to work with complex datasets. Strong communication and stakeholder management skills. Knowledge of cloud platforms (AWS, Google Cloud, Azure) and database management is a plus. Certification in PMP, PRINCE2, or Agile methodologies is an advantage. Locations: Pune/Bangalore/Hyderabad/Chennai/Mumbai

Posted 1 month ago

Apply

3 - 8 years

4 - 9 Lacs

Bangalore Rural, Bengaluru

Work from Office


Skills: Elasticsearch, Talend, Grafana
Responsibilities: Build dashboards, manage clusters, optimize performance
Tech: API, Python, cloud platforms (AWS, Azure, GCP)
Preference: Immediate joiners
Contact: 6383826448 || jensyofficial23@gmail.com

Posted 1 month ago

Apply

2 - 3 years

5 - 15 Lacs

Bengaluru

Work from Office


Role & Responsibilities:
- Design, develop, and maintain scalable ETL/ELT data pipelines.
- Work with structured and unstructured data from various sources (APIs, databases, cloud storage, etc.).
- Optimize data workflows and ensure data quality, consistency, and reliability.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions.
- Maintain and improve our data infrastructure and architecture.
- Monitor pipeline performance and troubleshoot issues in real time.

Preferred Candidate Profile:
- 2-3 years of experience in data engineering or a similar role.
- Proficiency in SQL and Python (or Scala/Java for data processing).
- Experience with ETL tools (e.g., Airflow, dbt, Luigi).
- Familiarity with cloud platforms like AWS, GCP, or Azure.
- Hands-on experience with data warehouses (e.g., Redshift, BigQuery, Snowflake).
- Knowledge of distributed data processing frameworks like Spark or Hadoop.
- Experience with version control systems (e.g., Git).
- Exposure to data modeling and schema design.
- Experience working with CI/CD pipelines for data workflows.
- Understanding of data privacy and security practices.
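Illustrative addition (not part of the posting): a minimal Airflow 2.x DAG skeleton showing how the ETL steps described above could be orchestrated; the DAG name, schedule and task bodies are placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Pull data from a source API or database (placeholder).
    print("extracting...")


def transform():
    # Clean and reshape the extracted data (placeholder).
    print("transforming...")


def load():
    # Write the result into the warehouse (placeholder).
    print("loading...")


with DAG(
    dag_id="daily_orders_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",  # run once per day
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Dependencies define the extract -> transform -> load order.
    t_extract >> t_transform >> t_load
```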

Posted 1 month ago

Apply

10 - 15 years

10 - 14 Lacs

Kolkata, Mumbai, New Delhi

Work from Office


The position is part of the Solutions Integration practice, which focuses on the integration of information, process and people through the application of multiple technologies. The candidate is expected to handle small to medium-scale consulting projects and should possess skills in the design, development, integration, and deployment of data extraction/load programs. Previous experience within Banking and Financial Services is preferred. To be considered, a candidate should be available to travel (5% or more) and possess the required skills mentioned below. The position will be based in the C&R Software office in Bangalore, India. We offer a hybrid model of working.

Position description: Solution Integration - Lead ETL Consultant - Band D

Role/Responsibilities:
- Design, develop, deploy, and support modules of our world-class enterprise-level solution into our international client base.
- Drive the technical architecture and design process in conjunction with client requirements.
- Evaluate new design specifications, raise quality standards, and address architectural concerns.
- Evaluate stability, compatibility, scalability, interoperability, and performance of the solution.
- Own design aspects, performance, re-startability, logging, error handling, and security for both on-premises and cloud customers.
- Continually learn new technologies in related areas.
- Act as the single point of contact (SPOC) for the technical implementation.
- Work with the Project Manager to plan and deliver projects from requirements through to go-live.
- Be responsible for successfully delivering projects with accountability and ownership.
- Understand the broader picture of the project and contribute accordingly, including the business, technical and architectural aspects of project implementations.
- Build reusable artefacts and use them to reduce development, testing, deployment and maintenance effort.
- Work with multiple customers at the same time.
- Adapt to working in SDLC, iterative and Agile methodologies.
- Interact with clients/onsite team members to understand project requirements and goals.
- Lead client workshops (face to face or over the phone) for consulting; drive solutioning and issue resolution with the client.
- Follow up on and escalate gaps, issues and enhancements identified throughout the project and drive them to closure.
- Display a high level of knowledge and consistent service in all interactions with the client.
- Establish positive client relationships to facilitate our implementations.
- Support client activities throughout the implementation project life cycle, including testing phases.
- Support and review test strategy/planning of the end-to-end solution.
- Lead the development of detailed business and technical specifications based on project requirements and turn them into data extraction/load programs.
- Program the ETL tool with business rules to be applied to data from source input to target data repository.
- Develop and assist in automating data extraction/load programs to run on a regular schedule.
- Assist in managing daily, weekly, and monthly data operations and scheduled processes.
- Perform data conversion, quality, and integrity checks for all programs/processes.
- Mentor junior members of the team and be responsible for their deliverables.
- Engage in pre-sales demonstrations, proposing solutions and providing estimates.

In addition to these skills, the individual needs to be skilled in business analysis and knowledge acquisition. An integration consultant interacts with clients (both business and technical personnel) on a constant basis. Hence, it is necessary that an Integration Consultant have extremely good communication skills, be able to listen carefully to clients, and facilitate information-gathering sessions.

Skills/Experience requirements:
- Overall 10+ years of IT industry experience.
- Undergraduate or graduate degree in Computer Science or Computer Applications, such as B.Sc. / B.C.A. / B.Tech. / B.E. / M.Sc. / M.Tech. / M.E. / M.C.A.
- Strong experience in understanding business requirements and converting them into detailed functional and technical specifications.
- 7 years' experience with an ETL tool, preferably Kettle, with knowledge of Metadata Injection, Kettle DB logging, and Carte.
- 7 years' experience writing PL/SQL or T-SQL programs and queries on Oracle / SQL Server.
- Strong knowledge of RDBMS concepts and OLTP system architecture.
- Minimum 5 years' experience writing shell scripts on UNIX (Sun Solaris).
- Competent with SQL/databases, SQL Server / Postgres, SSRS and other analytical programs, with the desire and ability to understand new software applications.
- Experience reviewing query performance and optimizing/developing more efficient code.
- Experience creating table indexes to improve database performance.
- Experience writing complex operations, views, stored procedures, triggers and functions to support business needs in a high-availability environment.
- Strong knowledge of source code control mechanisms in any tool; knowledge of Git / Bitbucket is an added advantage.
- Strong knowledge of XML and JSON structures and Jenkins.
- Experience with job scheduling and working knowledge of at least one third-party scheduler.
- Hands-on experience with AWS services like PostgreSQL, Aurora, and Lambda is preferred.
- Ability to perform data research and root cause analysis on data issues/discrepancies.
- Experience using SOAP and REST to access web services.
- Experience in JavaScript, HTML, CSS.
- Excellent written and verbal communication skills.
- Excellent interpersonal skills; comfortable establishing professional relationships, especially remotely (electronic, phone, written).
- Proven ability to plan and execute effectively to meet critical, time-sensitive objectives.
- Ability to work alone and independently.
- Experience in either the Banking or Financial industry is preferred.
- Experience in SSRS report development.
- Working knowledge of Python scripting is preferred.
- Good mentorship skills.
- Ability to deliver effectively in high-pressure situations.

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
