
340 ETL Development Jobs - Page 5

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 - 7.0 years

9 - 13 Lacs

Hyderabad

Work from Office

Power BI modelling; performance tuning and optimization; Power BI embedded APIs; embedded-token concepts; custom UI integration; row-level security concepts; hands-on knowledge of incremental refresh, Power Query (M) scripting, and enhanced XMLA scripting.
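For illustration, a minimal sketch of the embedded-token concept named above: generating a Power BI embed token with an effective identity for row-level security via the REST GenerateToken endpoint. The group, report, and dataset IDs and the "SalesRep" role are hypothetical placeholders, and acquiring the Azure AD access token (e.g., with MSAL) is assumed to have happened already.

```python
# Hedged sketch: generate a Power BI embed token with row-level security.
# group_id, report_id, dataset_id, and the role name are placeholders.
import requests

def generate_embed_token(aad_token: str, group_id: str, report_id: str,
                         dataset_id: str, username: str) -> str:
    url = (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
           f"/reports/{report_id}/GenerateToken")
    body = {
        "accessLevel": "View",
        # The effective identity enforces RLS for the embedded user.
        "identities": [{
            "username": username,
            "roles": ["SalesRep"],       # RLS role defined in the dataset
            "datasets": [dataset_id],
        }],
    }
    resp = requests.post(url, json=body,
                         headers={"Authorization": f"Bearer {aad_token}"})
    resp.raise_for_status()
    return resp.json()["token"]
```

The returned token is then passed to the Power BI JavaScript client on the custom UI side.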

Posted 1 week ago

Apply

2.0 - 4.0 years

4 - 8 Lacs

Hyderabad

Work from Office

CDP ETL & Database Engineer

The CDP ETL & Database Engineer will specialize in architecting, designing, and implementing solutions that are sustainable and scalable. The ideal candidate will understand CRM methodologies, have an analytical mindset, and bring a background in relational modeling within a hybrid architecture. The candidate will help drive the business towards specific technical initiatives and will work closely with the Solutions Management, Delivery, and Product Engineering teams, joining a team of developers across the US, India, and Costa Rica.

Responsibilities:

ETL Development: Build pipelines to feed downstream data processes; analyze data, interpret business requirements, and establish relationships between data sets. The ideal candidate will be familiar with different encoding formats and file layouts such as JSON and XML.

Implementations & Onboarding: Work with the team to onboard new clients onto the ZMP/CDP+ platform; solidify business requirements, perform ETL file validation, establish users, perform complex aggregations, and syndicate data across platforms. The hands-on engineer will take a test-driven approach towards development and will document processes and workflows.

Incremental Change Requests: Analyze change requests and determine the best approach to implementation and execution, which requires a deep understanding of the platform's overall architecture. Change requests will be implemented and tested in a development environment to ensure their introduction will not negatively impact downstream processes.

Change Data Management: Adhere to change data management procedures and actively participate in CAB meetings where change requests are presented and approved. Prior to introducing change, ensure that processes are running in a development environment; perform peer-to-peer code reviews and solution reviews before production code deployment.

Collaboration & Process Improvement: Participate in knowledge-share sessions with peers to discuss solutions, best practices, overall approach, and process; look for opportunities to streamline processes with an eye towards building a repeatable model that reduces implementation duration.

Job Requirements:

The CDP ETL & Database Engineer will be well versed in the following areas: relational data modeling; ETL and FTP concepts; advanced analytics using SQL functions; cloud technologies (AWS, Snowflake). Able to decipher requirements, provide recommendations, and implement solutions within predefined timeframes. Able to work independently while also contributing in a team setting, and to confidently communicate status, raise exceptions, and voice concerns to their direct manager. Participates in internal client project status meetings with the Solution/Delivery management teams and, when required, collaborates with the Business Solutions Analyst (BSA) to solidify requirements. Able to work in a fast-paced, agile environment with a sense of urgency when escalated issues arise. Strong communication and interpersonal skills, with the ability to multitask and prioritize workload based on client demand. Familiarity with Jira for workflow management and time allocation, and with the Scrum framework: backlog, planning, sprints, story points, retrospectives, etc.

Required Skills:

ETL: tools such as Talend (preferred, not required); DMExpress (nice to have); Informatica (nice to have).
Database: hands-on experience with Snowflake (required); MySQL/PostgreSQL (nice to have); familiarity with NoSQL DB methodologies (nice to have).
Programming languages (knowledge of any of the following): PL/SQL; JavaScript (strong plus); Python (strong plus); Scala (nice to have).
AWS: knowledge of S3, EMR (concepts), EC2 (concepts), Systems Manager / Parameter Store.
Understands JSON data structures and key-value pairs.
Working knowledge of code repositories such as Git, WinCVS, SVN.
Workflow management tools such as Apache Airflow, Kafka, Automic/Appworx.
Jira.

Minimum Qualifications: Bachelor's degree or equivalent; 2-4 years' experience; excellent verbal and written communication skills; self-starter, highly motivated; analytical mindset.
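As a concrete instance of the "complex aggregations" and "advanced analytics using SQL functions" the posting asks for, here is a small windowed-aggregation sketch. sqlite3 stands in for the actual warehouse, and the schema is hypothetical.

```python
# Illustrative windowed aggregation: each client's share of total revenue.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (client TEXT, platform TEXT, revenue REAL);
    INSERT INTO events VALUES
        ('acme', 'email', 120.0), ('acme', 'sms', 30.0),
        ('globex', 'email', 200.0), ('globex', 'sms', 50.0);
""")
rows = conn.execute("""
    SELECT client,
           platform,
           SUM(revenue)                                   AS platform_revenue,
           SUM(revenue) * 1.0 / SUM(SUM(revenue)) OVER () AS share_of_total
    FROM   events
    GROUP  BY client, platform
""").fetchall()
for r in rows:
    print(r)  # e.g. ('acme', 'email', 120.0, 0.3)
```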

Posted 1 week ago

Apply

5.0 - 7.0 years

15 - 30 Lacs

Gurugram

Remote

Design, develop, and maintain robust data pipelines and ETL/ELT processes on AWS. Leverage AWS services such as S3, Glue, Lambda, Redshift, Athena, EMR, and others to build scalable data solutions. Write efficient and reusable code using Python for data ingestion, transformation, and automation tasks. Collaborate with cross-functional teams including data analysts, data scientists, and software engineers to support data needs. Monitor, troubleshoot, and optimize data workflows for performance, reliability, and cost efficiency. Ensure data quality, security, and governance across all systems. Communicate technical solutions clearly and effectively with both technical and non-technical stakeholders.

Required Skills & Qualifications: 5+ years of experience in data engineering roles. Strong hands-on experience with Amazon Web Services (AWS), particularly in data-related services (e.g., S3, Glue, Lambda, Redshift, EMR, Athena). Proficiency in Python for scripting and data processing. Experience with SQL and working with relational databases. Solid understanding of data architecture, data modeling, and data warehousing concepts. Experience with CI/CD pipelines and version control tools (e.g., Git). Excellent verbal and written communication skills. Proven ability to work independently in a fully remote environment.

Preferred Qualifications: Experience with workflow orchestration tools like Apache Airflow or AWS Step Functions. Familiarity with big data technologies such as Apache Spark or Hadoop. Exposure to infrastructure-as-code tools like Terraform or CloudFormation. Knowledge of data privacy and compliance standards.
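A minimal sketch of the kind of S3-based ingestion/transformation step this role describes, using boto3. The bucket and key names are hypothetical; in a real pipeline this logic would typically live in an AWS Glue job or Lambda function.

```python
# Read a raw CSV from S3, keep completed orders, and write the result back.
import csv
import io
import boto3

s3 = boto3.client("s3")

def transform_orders(src_bucket: str, src_key: str,
                     dst_bucket: str, dst_key: str) -> int:
    raw = s3.get_object(Bucket=src_bucket, Key=src_key)["Body"].read()
    reader = csv.DictReader(io.StringIO(raw.decode("utf-8")))
    kept = [row for row in reader if row.get("status") == "COMPLETED"]

    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=reader.fieldnames)
    writer.writeheader()
    writer.writerows(kept)
    s3.put_object(Bucket=dst_bucket, Key=dst_key,
                  Body=out.getvalue().encode("utf-8"))
    return len(kept)  # row count kept, useful for pipeline monitoring
```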

Posted 1 week ago

Apply

2.0 - 7.0 years

7 - 11 Lacs

Noida

Work from Office

Critical Thinking, Testing Concepts, ETL Testing, Python Experience. Nice to have: API understanding and testing (manual and automation), UI automation (able to identify UI elements programmatically, for Selenium).

Detailed Description:

Critical Thinking - 5/5: High in logical reasoning and proactiveness; should come up with diverse test cases against requirements.
Testing Concepts - 5/5: Practice various test design techniques; clarity on priority vs. severity; testing life cycle and defect management; understand regression vs. functional testing.
SQL/ETL/Batch - 4/5: Able to write SQL statements with aggregate functions and joins; understand data transformation; familiar with data loads and related validations.
Automation - 3/5: Should be able to solve a given problem programmatically; familiar with coding standards, version control, and pipelines; able to identify UI elements programmatically.
API - 2/5: Understand how APIs work; various authorization mechanisms; validation of responses.
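An illustrative ETL validation check of the kind described under SQL/ETL/Batch: compare a source aggregate against the loaded target using a join. sqlite3 stands in for the real databases, and the table and column names are hypothetical.

```python
# Return days where the loaded fact table disagrees with the staging data.
import sqlite3

def validate_daily_totals(conn: sqlite3.Connection) -> list[tuple]:
    query = """
        SELECT s.order_date,
               SUM(s.amount) AS staged_total,
               f.loaded_total
        FROM   staging_orders s
        JOIN   fact_daily_sales f ON f.order_date = s.order_date
        GROUP  BY s.order_date, f.loaded_total
        HAVING SUM(s.amount) <> f.loaded_total
    """
    return conn.execute(query).fetchall()  # empty list means the load reconciles
```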

Posted 1 week ago

Apply

3.0 - 5.0 years

6 - 11 Lacs

Mumbai

Work from Office

Role Summary: Development of functions, stored procedures, and packages; development using external tables, bulk statement processing, dynamic statement execution, bind variables, ref cursors, and PL/SQL object types; SQL statement tuning, reviewing explain plans, and utilizing optimizer hints; dealing with large volumes of data (millions/billions of rows) and with partitioned tables; integration with ETL processes and experience in ETL tools; coding applications using best practices and maintaining documentation. Knowledge of Java/J2EE would be an added advantage. Responsible for unit testing; contributes to design improvement and product enhancement; demonstrates the ability to understand unique requirements and implement them. The candidate should be a self-learner, able to work independently and manage the tasks at hand.

Skills: Excellent skills in relational database design. Knowledge of Oracle, MSSQL, MySQL, MariaDB. Extract Transform Load (ETL) concepts and technologies. Data warehousing tools, patterns, and processes. Knowledge of a scripting language (added advantage). Web servers: Apache Tomcat, JBoss, WebLogic, and any additional web server. Knowledge of Java/J2EE frameworks (added advantage).
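A hedged sketch of consuming a PL/SQL ref cursor from Python, one of the techniques this role names. It assumes a package procedure like report_pkg.get_orders(p_cursor OUT SYS_REFCURSOR); the package, procedure, and DSN are hypothetical, and the python-oracledb driver (successor to cx_Oracle) is used.

```python
import oracledb

def fetch_orders(dsn: str, user: str, password: str) -> list[tuple]:
    with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
        with conn.cursor() as cur:
            ref_cursor = conn.cursor()  # receives the OUT ref cursor
            cur.callproc("report_pkg.get_orders", [ref_cursor])
            return ref_cursor.fetchall()
```

Binding a cursor object as the OUT parameter is the standard driver pattern for SYS_REFCURSOR results.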

Posted 1 week ago

Apply

6.0 - 11.0 years

20 - 27 Lacs

Hyderabad, Chennai

Hybrid

Role Summary: As an ETL Developer, you will prepare the requirement and mapping documents by analyzing and understanding source and target tables.

Essential Responsibilities:
• Responsible for the development of ETL scripts.
• Resolving 'manual conversion' PowerCenter objects and creating IDMC-compatible workflows from the existing PowerCenter code as part of a standard Scrum execution team; this effort includes testing and deployment of the resulting workflow.
• Responsible for managing the test data corresponding to the testing approach applied.
• Create and implement upgrade code.
• Assist the Team Lead in implementing plans and strategies for functionality and integrity.
• Troubleshoot defects and manage defect tracking and resolution processes so that a consistent, documented process is used and followed.
• Fully document any unexpected anomalies or defects identified in testing, with instructions on how to recreate each.
• Ensure that defects and requirement variances are appropriately communicated to development for correction.
• Work with team resources to discuss and troubleshoot issues.
• Perform effective version control management in all aspects of documentation/script output.
• Liaise with developers, testers, and business domain experts to better understand requirements and produce effective scripts and software feature concepts.

Educational Qualification:
• Engineering (BE, B.Tech, or MCA) degree from a reputed college/university.

Must Have:
• Healthcare domain experience.
• Strong knowledge and extensive working experience with SQL.
• Strong working knowledge of ETL processes and workflows.
• Relational database/SQL experience is a must (SQL Server and Oracle preferred).
• Informatica PowerCenter is a must; IDMC is a plus.
• A good understanding of the software development process.
• Good communication and consultancy skills.
• Ability to interpret and communicate technical information in business language.
• Experience in both Agile and Waterfall methodologies.
• 6+ years of experience in ETL development.

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

Indore, Madhya Pradesh

On-site

You are a technically strong Data Lead with 5+ years of experience, proficient in managing data projects, designing data architectures, and implementing end-to-end data solutions on the Microsoft platform. Your responsibilities include building and maintaining data pipelines and data warehouse solutions. You should have strong experience with the Microsoft data stack, including SQL Server, Azure Data Factory, etc. Expertise in ETL development and data warehousing concepts is required, as is the ability to design scalable and efficient data models. Excellent communication skills are a must for this role.

Posted 2 weeks ago

Apply

3.0 - 6.0 years

4 - 8 Lacs

Mumbai

Work from Office

The resource shall have at least 4 to 5 years of hands-on development experience using Alteryx, creating workflows and scheduling them. The role is responsible for the design, development, validation, and troubleshooting of ETL workflows that take data from multiple source systems and transform it in Alteryx for consumption by various PwC-developed solutions. Alteryx workflow automation will also be part of the role; one possible automation approach is sketched below. The candidate should have prior experience maintaining documentation such as design documents, mapping logic, and technical specifications.
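A heavily hedged automation sketch: invoking an Alteryx workflow from Python via the command-line engine runner. The engine path and workflow file are hypothetical placeholders, and the assumption is that scheduling normally runs through Alteryx Server, with this pattern as a lightweight fallback.

```python
import subprocess

ENGINE = r"C:\Program Files\Alteryx\bin\AlteryxEngineCmd.exe"  # assumed install path
WORKFLOW = r"C:\workflows\daily_refresh.yxmd"                  # hypothetical workflow

result = subprocess.run([ENGINE, WORKFLOW], capture_output=True, text=True)
if result.returncode != 0:
    # Surface the engine log so failures are visible to the scheduler.
    raise RuntimeError(f"Alteryx workflow failed:\n{result.stdout}\n{result.stderr}")
print("Workflow completed successfully.")
```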

Posted 2 weeks ago

Apply

2.0 - 6.0 years

5 - 9 Lacs

Ahmedabad

Work from Office

Primary Skill: Ab Initio, Oracle PL/SQL, Index Management. Secondary Skill: Unix commands and shell scripting.

Hands-on experience in Ab Initio graph/plan development, parallel processing, debugging, air commands, and the Ab Initio file system. Hands-on experience in Oracle PL/SQL and index management. Unix commands, file management, process monitoring, network interfaces, and shell scripting; should be knowledgeable in Unix commands and shell scripting. Understanding of QA within a software development environment. Logical analysis and problem-solving skills. Proven ability to work to deadlines. Consistently demonstrates clear and concise written and verbal communication skills. Ability to work under own initiative or as part of a team. Experience in designing and executing test cases; Selenium automation is good to have.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

6 - 10 Lacs

Hyderabad

Work from Office

Manage and support the Delivery Operations Team by implementing and supporting ETL and automation procedures. Schedule and perform delivery operations functions to complete tasks and ensure client satisfaction.

ESSENTIAL FUNCTIONS: Process data conversions on multiple platforms. Perform address standardization, merge/purge, database updates, client mailings, and postal presort. Automate scripts to transfer and manipulate internal and external data feeds. Multitask across multiple jobs to ensure timely client deliverability. Work with technical staff to maintain and support an ETL environment. Work in a team environment with database/CRM staff, modelers, analysts, and application programmers to deliver results for clients.

REQUIRED SKILLS: Experience in database marketing with the ability to transform and manipulate data. Experience with Oracle and SQL to automate scripts that process and manipulate marketing data. Experience with tools such as DMExpress, Talend, Snowflake, the SAP DQM suite of tools, and Excel. Experience with SQL Server: data exports and imports, and the ability to run SQL Server Agent jobs and SSIS packages. Experience with editors like Notepad++, UltraEdit, or similar. Experience with SFTP and PGP to ensure data security and protection of client data. Experience working with large-scale customer databases in a relational database environment. Proven ability to work on multiple tasks at a given time. Ability to communicate and work in a team environment to ensure tasks are completed in a timely manner.

MINIMUM QUALIFICATIONS: Bachelor's degree or equivalent. 5+ years of experience in database marketing. Excellent oral and written communication skills required.
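A hedged sketch of one delivery-operations task the posting names: kicking off a SQL Server Agent job (for example, one that runs an SSIS package) from Python. The connection string and job name are hypothetical; sp_start_job is the documented msdb procedure for starting Agent jobs.

```python
import pyodbc

CONN_STR = ("DRIVER={ODBC Driver 17 for SQL Server};"
            "SERVER=sql-prod;DATABASE=msdb;Trusted_Connection=yes;")

with pyodbc.connect(CONN_STR, autocommit=True) as conn:
    # Start the Agent job; Agent executes it asynchronously.
    conn.execute("EXEC msdb.dbo.sp_start_job @job_name = ?", "Nightly Client Export")
```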

Posted 2 weeks ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Pune

Work from Office

Snowflake Data Engineer

Overall experience: 5+ years of experience in Snowflake and Python, including 5+ years in data preparation and BI projects — understanding business requirements in a BI context and understanding the data model in order to transform raw data into meaningful data using Snowflake and Python. Designing and creating data models that define the structure and relationships of various data elements within the organization, including conceptual, logical, and physical data models, which help ensure data accuracy, consistency, and integrity. Designing data integration solutions that allow different systems and applications to share and exchange data seamlessly; this may involve selecting appropriate integration technologies, developing ETL (Extract, Transform, Load) processes, and ensuring data quality during the integration process. Create and maintain optimal data pipeline architecture. Good knowledge of cloud platforms like AWS/Azure/GCP. Good hands-on knowledge of Snowflake is a must. Experience with various data ingestion methods (Snowpipe and others), time travel, data sharing, and other Snowflake capabilities. Good knowledge of Python/PySpark and advanced features of Python. Support business development efforts (proposals and client presentations). Ability to thrive in a fast-paced, dynamic, client-facing role where delivering solid work products to exceed high expectations is a measure of success. Excellent leadership and interpersonal skills. Eager to contribute to a team-oriented environment. Strong prioritization and multitasking skills with a track record of meeting deadlines. Ability to be creative and analytical in a problem-solving environment. Effective verbal and written communication skills. Adaptable to new environments, people, technologies, and processes. Ability to manage ambiguity and solve undefined problems.
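An illustrative sketch of Snowflake access from Python, including one of the time-travel capabilities mentioned above. The account, warehouse, and table names are hypothetical; the snowflake-connector-python package is assumed.

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="TRANSFORM_WH", database="ANALYTICS", schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # Compare the current row count with the state one hour ago via time travel.
    cur.execute("SELECT COUNT(*) FROM orders")
    now_count = cur.fetchone()[0]
    cur.execute("SELECT COUNT(*) FROM orders AT(OFFSET => -3600)")
    hour_ago_count = cur.fetchone()[0]
    print(f"Rows loaded in the last hour: {now_count - hour_ago_count}")
finally:
    conn.close()
```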

Posted 2 weeks ago

Apply

5.0 - 7.0 years

8 - 16 Lacs

Bengaluru

Work from Office

We are looking for a Senior Data Engineer with deep experience in SnapLogic, SQL, ETL pipelines, and data warehousing, along with hands-on experience with Databricks, in designing scalable data solutions and working across cloud and big data environments.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

10 - 16 Lacs

Hyderabad

Remote

Job description: As an ETL Developer for the Data and Analytics team at Guidewire, you will participate and collaborate with our customers and SI partners who are adopting our Guidewire Data Platform as the centerpiece of their data foundation. You will facilitate, and be an active developer when necessary, to operationalize the realization of the agreed-upon ETL architecture goals of our customers, adhering to Guidewire best practices and standards. You will work with our customers, partners, and other Guidewire team members to deliver successful data transformation initiatives. You will utilize best practices for design, development, and delivery of customer projects. You will share knowledge with the wider Guidewire Data and Analytics team to enable predictable project outcomes and emerge as a leader in our thriving data practice. One of our principles is to have fun while we deliver, so this role will need to keep the delivery process fun and engaging for the team in collaboration with the broader organization. Given the dynamic nature of the work in the Data and Analytics team, we are looking for decisive, highly skilled technical problem solvers who are self-motivated, take proactive action for the benefit of our customers, and ensure that they succeed in their journey to Guidewire Cloud Platform. You will collaborate closely with teams located around the world and adhere to our core values: Integrity, Collegiality, and Rationality.

Key Responsibilities: Build out technical processes from specifications provided in High Level Design and data specification documents. Integrate test and validation processes and methods into every step of the development process. Work with Lead Architects and provide input into defining user stories, scope, acceptance criteria, and estimates. Take a systematic problem-solving approach, coupled with a sense of ownership and drive. Work independently in a fast-paced Agile environment. Actively contribute to the knowledge base from every project you are assigned to.

Qualifications: Bachelor's or Master's degree in Computer Science, or an equivalent level of demonstrable professional competency, and 3-5+ years in a technical capacity building out complex ETL data integration frameworks. 3+ years of experience with data processing and ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) concepts. Experience with ADF or AWS Glue, Spark/Scala, GDP, CDC, and ETL data integration. Experience working with relational and/or NoSQL databases. Experience working with different cloud platforms (such as AWS, Azure, Snowflake, Google Cloud, etc.). Ability to work independently and within a team.

Nice to have: Insurance industry experience. Experience with ADF or AWS Glue. Experience with Azure Data Factory, Spark/Scala. Experience with the Guidewire Data Platform.
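A minimal PySpark sketch of the ETL pattern this role centers on: extract from a relational source, transform, and load to cloud storage. The JDBC URL, table, and output path are hypothetical placeholders, not Guidewire specifics.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("policy_etl").getOrCreate()

# Extract: read a source table over JDBC.
policies = (spark.read.format("jdbc")
            .option("url", "jdbc:postgresql://source-db:5432/claims")
            .option("dbtable", "public.policies")
            .option("user", "etl").option("password", "***")
            .load())

# Transform: normalize status values and derive a load timestamp.
cleaned = (policies
           .withColumn("status", F.upper(F.trim(F.col("status"))))
           .withColumn("load_ts", F.current_timestamp()))

# Load: write partitioned Parquet to a data-lake path.
cleaned.write.mode("overwrite").partitionBy("status").parquet("s3a://lake/policies/")
```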

Posted 2 weeks ago

Apply

10.0 - 15.0 years

12 - 17 Lacs

Bengaluru

Work from Office

Experience: Minimum of 10+ years in database development and management roles. SQL Mastery: Advanced expertise in crafting and optimizing complex SQL queries and scripts. AWS Redshift: Proven experience in managing, tuning, and optimizing large-scale Redshift clusters. PostgreSQL: Deep understanding of PostgreSQL, including query planning, indexing strategies, and advanced tuning techniques. Data Pipelines: Extensive experience in ETL development and integrating data from multiple sources into cloud environments. Cloud Proficiency: Strong experience with AWS services like ECS, S3, KMS, Lambda, Glue, and IAM. Data Modeling: Comprehensive knowledge of data modeling techniques for both OLAP and OLTP systems. Scripting: Proficiency in Python, C#, or other scripting languages for automation and data manipulation.

Preferred Qualifications: Leadership: Prior experience in leading database or data engineering teams. Data Visualization: Familiarity with reporting and visualization tools like Tableau, Power BI, or Looker. DevOps: Knowledge of CI/CD pipelines, infrastructure as code (e.g., Terraform), and version control (Git). Certifications: Any relevant certifications (e.g., AWS Certified Solutions Architect, AWS Certified Database - Specialty, PostgreSQL Certified Professional) will be a plus. Azure Databricks: Familiarity with Azure Databricks for data engineering and analytics workflows will be a significant advantage.

Soft Skills: Strong problem-solving and analytical capabilities. Exceptional communication skills for collaboration with technical and non-technical stakeholders. A results-driven mindset with the ability to work independently or lead within a team.

Qualification: Bachelor's or master's degree in Computer Science, Information Systems, Engineering, or equivalent. 10+ years of experience.
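A hedged example of the query-tuning workflow the posting emphasizes: inspecting a plan with EXPLAIN via psycopg2. The connection details and query are hypothetical; the same pattern applies to both PostgreSQL and Redshift.

```python
import psycopg2

conn = psycopg2.connect(host="warehouse", dbname="analytics",
                        user="tuner", password="***")
with conn, conn.cursor() as cur:
    cur.execute("""
        EXPLAIN
        SELECT customer_id, SUM(amount)
        FROM   sales
        WHERE  sale_date >= '2024-01-01'
        GROUP  BY customer_id
    """)
    for (plan_line,) in cur.fetchall():
        print(plan_line)  # review scans, join strategy, and (on Redshift) dist/sort usage
```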

Posted 2 weeks ago

Apply

3.0 - 4.0 years

8 - 9 Lacs

Bengaluru

Work from Office

Total years of experience: 3-4 years. Relevant years of experience: 3-4 years.

Mandatory Skills:
1. Data Management: Strong understanding of data management principles; proficiency in working with different types of databases (SQL, NoSQL) and data formats (CSV, JSON, XML).
2. ETL (Extract, Transform, Load): Experience with ETL processes to extract data from various sources, transform it according to business rules, and load it into target systems. Knowledge of any ETL tool is beneficial.
3. Programming Skills: Proficiency in programming languages commonly used in data engineering such as Python, Java, or Scala with Spark. Knowledge of scripting languages like Bash or PowerShell can also be useful for automation tasks.
4. SQL: Strong SQL skills for querying and manipulating data in relational databases, including writing complex SQL queries, optimizing query performance, and understanding database indexing.

Interview date: 22nd July 2025.
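A simple format-conversion sketch matching the data-management skills listed above: read CSV, coerce types, and emit JSON. The file names and the "amount" column are hypothetical.

```python
import csv
import json

def csv_to_json(src: str, dst: str) -> None:
    with open(src, newline="", encoding="utf-8") as f:
        # Coerce the amount field to a number so the JSON is typed correctly.
        rows = [dict(r, amount=float(r["amount"])) for r in csv.DictReader(f)]
    with open(dst, "w", encoding="utf-8") as f:
        json.dump(rows, f, indent=2)

csv_to_json("orders.csv", "orders.json")
```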

Posted 2 weeks ago

Apply

4.0 - 9.0 years

4 - 9 Lacs

Hyderabad

Remote

As an ETL Developer for the Data and Analytics team at Guidewire, you will participate and collaborate with our customers and SI partners who are adopting our Guidewire Data Platform as the centerpiece of their data foundation. You will facilitate, and be an active developer when necessary, to operationalize the realization of the agreed-upon ETL architecture goals of our customers, adhering to Guidewire best practices and standards. You will work with our customers, partners, and other Guidewire team members to deliver successful data transformation initiatives. You will utilize best practices for design, development, and delivery of customer projects. You will share knowledge with the wider Guidewire Data and Analytics team to enable predictable project outcomes and emerge as a leader in our thriving data practice. One of our principles is to have fun while we deliver, so this role will need to keep the delivery process fun and engaging for the team in collaboration with the broader organization. Given the dynamic nature of the work in the Data and Analytics team, we are looking for decisive, highly skilled technical problem solvers who are self-motivated, take proactive action for the benefit of our customers, and ensure that they succeed in their journey to Guidewire Cloud Platform. You will collaborate closely with teams located around the world and adhere to our core values: Integrity, Collegiality, and Rationality.

Key Responsibilities: Build out technical processes from specifications provided in High Level Design and data specification documents. Integrate test and validation processes and methods into every step of the development process. Work with Lead Architects and provide input into defining user stories, scope, acceptance criteria, and estimates. Take a systematic problem-solving approach, coupled with a sense of ownership and drive. Work independently in a fast-paced Agile environment. Actively contribute to the knowledge base from every project you are assigned to.

Qualifications: Bachelor's or Master's degree in Computer Science, or an equivalent level of demonstrable professional competency, and 3-5+ years in a technical capacity building out complex ETL data integration frameworks. 3+ years of experience with data processing and ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) concepts. Experience with ADF or AWS Glue, Spark/Scala, GDP, CDC, and ETL data integration. Experience working with relational and/or NoSQL databases. Experience working with different cloud platforms (such as AWS, Azure, Snowflake, Google Cloud, etc.). Ability to work independently and within a team.

Nice to have: Insurance industry experience. Experience with ADF or AWS Glue. Experience with Azure Data Factory, Spark/Scala. Experience with the Guidewire Data Platform.
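Complementary to the PySpark sketch shown with the similar posting above, here is a hedged example of orchestrating an AWS Glue job from Python with boto3: start a run, then poll until it reaches a terminal state. The job name is a hypothetical placeholder.

```python
import time
import boto3

glue = boto3.client("glue")

run_id = glue.start_job_run(JobName="policy_etl")["JobRunId"]
while True:
    state = glue.get_job_run(JobName="policy_etl",
                             RunId=run_id)["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        break
    time.sleep(30)  # Glue runs take minutes; avoid hammering the API
print(f"Glue run {run_id} finished with state {state}")
```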

Posted 2 weeks ago

Apply

5.0 - 7.0 years

7 - 15 Lacs

Bengaluru

Work from Office

Immediate joiners only.
• Design, develop, and maintain ETL processes using tools such as Talend, Informatica, SSIS, or similar.
• Extract data from various sources, including databases, APIs, and flat files, transforming it to meet business requirements.
• Load transformed data into target systems while ensuring data integrity and accuracy.
• Collaborate with data analysts and business stakeholders to understand data needs and requirements.
• Optimize ETL processes for enhanced performance and efficiency.
• Debug and troubleshoot ETL jobs, providing effective solutions to data-related issues.
• Document ETL processes, data models, and workflows for future reference and team collaboration.
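A hedged end-to-end miniature of the extract/transform/load cycle described above: pull records from a REST API, apply a business rule, and load them into a relational target. The API URL and schema are hypothetical, and sqlite3 stands in for the real warehouse.

```python
import sqlite3
import requests

# Extract: fetch records from a (hypothetical) REST API.
records = requests.get("https://api.example.com/v1/customers", timeout=30).json()

# Transform: keep active customers and normalize the email field.
rows = [(r["id"], r["email"].strip().lower())
        for r in records if r.get("active")]

# Load: upsert into the target table.
conn = sqlite3.connect("warehouse.db")
with conn:
    conn.execute("CREATE TABLE IF NOT EXISTS customers "
                 "(id INTEGER PRIMARY KEY, email TEXT)")
    conn.executemany("INSERT OR REPLACE INTO customers (id, email) VALUES (?, ?)",
                     rows)
conn.close()
```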

Posted 2 weeks ago

Apply

4.0 - 9.0 years

0 - 2 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

A minimum of 4-10 years of experience in data integration/orchestration services, service architecture, and providing data-driven solutions for client requirements. Experience with the Microsoft Azure cloud and Snowflake SQL, and with database query/performance tuning. Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus. Strong data warehousing concepts are required, and ETL tools such as the Talend Cloud Data Integration tool are a must. Exposure to financial domain knowledge is considered a plus. Cloud managed services such as source control (GitHub) and MS Azure DevOps are considered a plus. Prior experience with State Street and Charles River Development (CRD) is considered a plus. Experience with tools such as Visio, PowerPoint, and Excel. Exposure to third-party data providers such as Bloomberg, Reuters, MSCI, and other rating agencies is a plus. Strong SQL knowledge and debugging skills are a must.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

10 - 20 Lacs

Chennai

Work from Office

Experience: 5-9 years. Notice period: immediate to 15 days. Location: Chennai. Note: Experience with DataStage is mandatory.

Direct Responsibilities: For a predefined applications scope, take care of: design; implementation (coding/parametrization, unit test, assembly test, integration test, system test, support during functional/acceptance test); roll-out support; documentation; continuous improvement. Ensure that SLA targets are met for the above activities. Hand over to Italian teams if knowledge and skills are not available in ISPL. Coordinate closely with Data Platform Teams and all other BNL BNP Paribas IT teams (incident coordination, security, infrastructure, development teams, etc.). Collaborate with and support Data Platform Teams in incident management, request management, and change management.

Technical & Behavioral Competencies: Fundamental skills: IBM DataStage; SQL; experience with data modeling and the ERwin tool. Important skill: knowledge of at least one of the following database technologies is required: Teradata, Oracle, SQL Server.

Posted 2 weeks ago

Apply

7.0 - 10.0 years

12 - 22 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

JD for Informatica Developer:

Key Responsibilities: Design, develop, test, and deploy ETL mappings, workflows, and sessions using Informatica PowerCenter. Analyze source systems, define transformation logic, and create data mappings and process flows. Optimize ETL performance and troubleshoot data quality issues. Collaborate with database administrators, data architects, and QA teams to ensure data integrity and optimal system performance. Develop and maintain technical documentation including data flow diagrams, mapping documents, and deployment instructions. Support data migration, data integration, and business intelligence initiatives. Participate in code reviews, testing, and deployment activities. Ensure adherence to data governance and security standards.

Required Qualifications: 7+ years of hands-on experience with Informatica PowerCenter in a development role. Strong proficiency in SQL and experience with at least one RDBMS (e.g., Oracle, SQL Server, DB2). Solid understanding of data warehousing concepts, dimensional modeling, and ETL best practices. Experience with performance tuning of Informatica mappings and sessions. Knowledge of job scheduling tools (e.g., UC4, Autosys, Control-M) is a plus. Familiarity with version control tools (e.g., Git, SVN).

Preferred Skills: Experience with cloud data platforms (AWS, Snowflake, etc.). Knowledge of other Informatica products such as IDQ, MDM, or Cloud Data Integration. Exposure to Agile/Scrum methodologies. Good communication and interpersonal skills. Ability to work independently and collaboratively in a team environment.

Posted 2 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

Hyderabad, Telangana

On-site

The role of Data Lead at LERA Technologies involves owning the data strategy, architecture, and engineering roadmap for key client engagements. As a Data Lead, you will lead the design and development of scalable, secure, and high-performance data pipelines, marts, and warehouses. Additionally, you will mentor a team of data engineers and collaborate with BI/reporting teams and solution architects. Your responsibilities will include overseeing data ingestion, transformation, consolidation, and validation across cloud and hybrid environments. It is essential to champion best practices for data quality, data lineage, and metadata management. You will also be expected to evaluate emerging tools, technologies, and frameworks to enhance platform capabilities, and to engage with business and technical stakeholders to translate analytics needs into scalable data solutions. Monitoring performance and optimizing storage and processing layers for efficiency and scalability are key aspects of this role.

The ideal candidate should have at least 7 years of experience in Data Engineering, including proficiency in SQL/PL/SQL/T-SQL, ETL development, and data pipeline architecture. A strong command of ETL tools such as SSIS or equivalent and of data warehousing concepts is required. Expertise in data modeling, architecture, and integration frameworks is essential, along with experience leading data teams and managing end-to-end data delivery across projects. Hands-on knowledge of BI tools like Power BI, Tableau, SAP BO, or OBIEE and their backend integration is a must. Proficiency in big data technologies and cloud platforms such as Azure, AWS, or GCP is also necessary. Programming experience in Python, Java, or equivalent languages, as well as proven experience in performance tuning and optimization of large datasets, are important qualifications. A strong understanding of data governance, data security, and compliance best practices is required, along with excellent communication, stakeholder management, and team mentoring abilities.

Desirable skills for this role include leadership experience in building and managing high-performing data teams; exposure to data mesh, data lakehouse architectures, or modern data platforms; experience defining and enforcing data quality and lifecycle management practices; and familiarity with CI/CD for data pipelines and infrastructure-as-code.

At LERA Technologies, you will have the opportunity to embrace innovation, creativity, and experimentation while significantly impacting our clients' success across various industries. You will thrive in a workplace that values diversity and inclusive excellence, benefit from extensive opportunities for career advancement, and lead cutting-edge projects with an agile and visionary team. If you are ready to lead data-driven transformation and shape the future of enterprise data, apply now to join LERA Technologies as a Data Lead.

Posted 2 weeks ago

Apply

6.0 - 10.0 years

8 - 15 Lacs

Coimbatore

Work from Office

Job Title: Senior ETL Developer - SSIS (Contract)
Location: Coimbatore
Type: Contract
Duration: 6 months (extendable)

Job Summary: We are looking for an experienced Senior ETL Developer with strong hands-on expertise in SSIS and SQL Server to join our data engineering team on a contract basis. The selected candidate will be responsible for developing, optimizing, and maintaining robust ETL workflows that support business analytics and reporting needs. This is a high-impact role for someone who thrives in a fast-paced, delivery-oriented environment and can work independently with minimal supervision.

Key Responsibilities:

1. Design and Development of ETL Workflows: Develop, enhance, and maintain robust SSIS packages to extract, transform, and load data from various sources (e.g., flat files, Excel, SQL Server, APIs). Implement complex business logic and transformation rules within SSIS. Design reusable components like configuration-driven packages, parameterized data flows, and modular templates.

2. Performance Optimization: Analyse and tune SSIS packages for performance (e.g., buffer sizes, data flow parallelism, lookup caching). Identify and resolve bottlenecks in ETL workflows. Use best practices for managing large volumes of data, including incremental loads and change data capture (CDC); a sketch of the incremental-load pattern follows after this listing.

3. Data Quality and Validation: Implement data quality checks and validations within ETL pipelines. Create audit and control frameworks for monitoring completeness and accuracy of data movement. Handle data cleansing, deduplication, and standardization as part of transformation logic.

4. Error Handling and Logging: Build comprehensive logging and error-handling mechanisms within SSIS (e.g., event handlers, custom logs, notification alerts). Troubleshoot and debug failed packages or erroneous data movements effectively.

5. Deployment and Automation: Package and deploy SSIS solutions using tools like SSDT, MSBuild, and Azure DevOps or other CI/CD pipelines. Schedule and monitor ETL jobs using SQL Server Agent or enterprise schedulers. Maintain SSIS package versions and manage deployment across dev/test/prod environments.

6. Documentation and Technical Specifications: Document data flow logic, transformation rules, and package dependencies. Create and maintain technical specifications for ETL processes. Support knowledge transfer and onboarding of junior developers by maintaining updated documentation.

7. Collaboration and Stakeholder Communication: Work closely with data architects, DBAs, and business analysts to understand requirements and translate them into ETL solutions. Participate in requirement-gathering sessions and contribute to solution design discussions. Provide estimations, timelines, and status updates for development activities.

8. Standards and Best Practices: Promote SSIS development standards including naming conventions, modularity, parameterization, and logging. Participate in code reviews to ensure maintainability and consistency. Continuously evaluate and apply improvements to coding and performance practices.

Required Skills & Experience: 6+ years of experience in ETL development using SSIS. Strong proficiency in SQL Server and advanced T-SQL scripting. Solid understanding of data warehousing principles, dimensional modeling (Kimball), and data integration patterns. Experience with job orchestration tools and schedulers (e.g., SQL Agent, Control-M, Azure Data Factory). Strong troubleshooting and performance optimization skills. Ability to work independently, meet deadlines, and communicate effectively with technical and business teams.

Preferred Skills: Exposure to cloud platforms like Azure Data Factory, Azure Synapse, or AWS Glue. Experience with reporting tools such as Power BI, SSRS, or Tableau. Previous experience working in healthcare, finance, or other regulated domains is a plus. Familiarity with version control tools (e.g., Git, Azure DevOps). Bachelor's degree in Computer Science, Information Technology, or a related field.
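A hedged sketch of the incremental-load pattern referenced in responsibility 2: persist a high-water mark and pull only rows changed since the previous run. The table, column, and connection names are hypothetical; in this posting's stack the same logic would normally live inside an SSIS data flow rather than Python.

```python
import pyodbc

CONN_STR = ("DRIVER={ODBC Driver 17 for SQL Server};"
            "SERVER=dw-sql;DATABASE=Staging;Trusted_Connection=yes;")

with pyodbc.connect(CONN_STR) as conn:
    cur = conn.cursor()
    # Read the watermark recorded by the previous run.
    cur.execute("SELECT last_modified FROM etl.watermark "
                "WHERE table_name = 'orders'")
    watermark = cur.fetchone()[0]

    # Extract only rows changed since the watermark (incremental load).
    cur.execute("SELECT order_id, amount, modified_at FROM src.orders "
                "WHERE modified_at > ?", watermark)
    changed = cur.fetchall()

    # ... transform and merge `changed` into the target here ...

    # Advance the watermark so the next run starts where this one ended.
    if changed:
        cur.execute("UPDATE etl.watermark SET last_modified = ? "
                    "WHERE table_name = 'orders'",
                    max(row.modified_at for row in changed))
    conn.commit()
```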

Posted 2 weeks ago

Apply

10.0 - 15.0 years

20 - 30 Lacs

Coimbatore

Work from Office

Location: Coimbatore
Type: Contract
Duration: 6-7 months, extendable
Experience Required: 10+ years

Job Summary: We are looking for a Contract Data Architect with strong expertise in SSIS-based ETL development to support our ongoing data integration and warehousing initiatives. The contractor will be responsible for designing and optimizing data pipelines, managing data transformations, and ensuring high-quality data flow between systems. This is a hands-on role requiring close collaboration with internal teams and external stakeholders. We also seek an experienced EHR/EMR systems architect to lead the design and integration of scalable, secure, and interoperable healthcare information systems; this aspect of the role requires deep expertise in Electronic Health Records (EHR) and Electronic Medical Records (EMR) platforms, with a strong focus on system architecture, interoperability standards (HL7 certified), and regulatory compliance, including HIPAA.

Key Responsibilities:

1. Design and Develop Scalable ETL Solutions: Lead the architecture and design of robust ETL pipelines using SQL Server Integration Services (SSIS) to ingest, transform, and load data from various sources into enterprise data warehouses or data marts.

2. Define Data Architecture Standards: Establish best practices, data modeling standards, and ETL frameworks to ensure consistency, performance, and maintainability across data solutions.

3. Analyze Source Systems and Define Data Integration Strategies: Work closely with business analysts, SMEs, and source system owners to understand data structures, transformations, and mapping requirements.

4. Optimize Data Workflows and Performance: Monitor, troubleshoot, and optimize SSIS packages for performance, reliability, and error handling; implement logging and alerting mechanisms.

5. Data Quality and Validation: Implement data profiling, cleansing, validation, and reconciliation rules to ensure high data integrity across ETL processes.

6. Collaborate with Data Engineers and BI Teams: Guide junior SSIS ETL developers and collaborate with reporting/analytics teams to support data availability for dashboards, reports, and analytics platforms.

7. Maintain and Document ETL Architecture: Create and maintain comprehensive architecture diagrams, technical specifications, and operational runbooks for ETL processes and data flows.

8. Support Deployment and Version Control: Use version control tools (e.g., Git, TFS) and CI/CD pipelines for deploying ETL changes across environments (Dev, QA, Prod).

9. Data Security and Compliance: Ensure ETL processes comply with organizational data governance and industry regulations (e.g., HIPAA, GDPR), especially when handling sensitive or regulated data.

10. Support Ad Hoc Data Requests: Assist with one-time or urgent data extraction, transformation, or loading needs from business or technical teams.

Required Skills & Experience: 10+ years of experience in data architecture, data warehousing, and large-scale ETL design and development. Expert-level proficiency in SSIS (SQL Server Integration Services) for building complex, high-volume ETL workflows. Strong experience in Microsoft SQL Server, including writing complex stored procedures, performance tuning, and indexing strategies. Solid understanding of data modeling (conceptual, logical, and physical models) using tools like Erwin, PowerDesigner, or dbt. Experience working with large datasets, handling incremental and full data loads, and implementing CDC (Change Data Capture) and SCD (Slowly Changing Dimensions); a sketch of SCD Type 2 handling follows after this listing. Hands-on experience with data quality frameworks, validation rules, and exception handling mechanisms. Experience with version control (Git, TFS) and CI/CD pipelines for ETL deployments. Familiarity with data governance, metadata management, and master data management (MDM) concepts. Experience with Azure SQL, Azure Data Factory, or cloud data integration is a plus. Familiarity with reporting tools like SSRS, Power BI, or Tableau is a bonus. Comfortable working independently with minimal supervision in a fast-paced environment. Excellent written and verbal communication skills.
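A hedged illustration of the Slowly Changing Dimension (Type 2) handling named in the requirements: expire the current row when a tracked attribute changes, then insert a new version. The dimension/staging names are hypothetical, and in practice this would be implemented inside the SSIS/SQL layer.

```python
import pyodbc

CONN_STR = ("DRIVER={ODBC Driver 17 for SQL Server};"
            "SERVER=dw-sql;DATABASE=EDW;Trusted_Connection=yes;")

EXPIRE_CHANGED = """
UPDATE d SET d.is_current = 0, d.valid_to = SYSUTCDATETIME()
FROM   dim_customer d
JOIN   stg_customer s ON s.customer_id = d.customer_id
WHERE  d.is_current = 1 AND s.address <> d.address
"""

INSERT_NEW_VERSIONS = """
INSERT INTO dim_customer (customer_id, address, valid_from, valid_to, is_current)
SELECT s.customer_id, s.address, SYSUTCDATETIME(), NULL, 1
FROM   stg_customer s
LEFT   JOIN dim_customer d
       ON d.customer_id = s.customer_id AND d.is_current = 1
WHERE  d.customer_id IS NULL  -- covers new customers AND just-expired changed rows
"""

with pyodbc.connect(CONN_STR) as conn:
    conn.execute(EXPIRE_CHANGED)        # step 1: close out changed versions
    conn.execute(INSERT_NEW_VERSIONS)   # step 2: open new current versions
    conn.commit()
```

Running both statements in one transaction keeps the dimension consistent: a changed customer is never left without a current row.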

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 11 Lacs

Telangana

Work from Office

Key Responsibilities:
ETL Development: Design and implement ETL processes using Informatica PowerCenter, Cloud Data Integration, or other Informatica tools.
Data Integration: Integrate data from various sources, ensuring data accuracy, consistency, and high availability.
Performance Optimization: Optimize ETL processes for performance and efficiency, ensuring minimal downtime and maximum throughput.

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies