
175 ETL Development Jobs - Page 3

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

10.0 - 15.0 years

2 - 6 Lacs

Bengaluru

Work from Office


Job Title: Senior SQL Developer
Experience: 10-15 years
Location: Bangalore

Requirements:
- Experience: Minimum of 10+ years in database development and management roles.
- SQL Mastery: Advanced expertise in crafting and optimizing complex SQL queries and scripts.
- AWS Redshift: Proven experience in managing, tuning, and optimizing large-scale Redshift clusters.
- PostgreSQL: Deep understanding of PostgreSQL, including query planning, indexing strategies, and advanced tuning techniques.
- Data Pipelines: Extensive experience in ETL development and integrating data from multiple sources into cloud environments.
- Cloud Proficiency: Strong experience with AWS services such as ECS, S3, KMS, Lambda, Glue, and IAM.
- Data Modeling: Comprehensive knowledge of data modeling techniques for both OLAP and OLTP systems.
- Scripting: Proficiency in Python, C#, or other scripting languages for automation and data manipulation.

Preferred Qualifications:
- Leadership: Prior experience leading database or data engineering teams.
- Data Visualization: Familiarity with reporting and visualization tools such as Tableau, Power BI, or Looker.
- DevOps: Knowledge of CI/CD pipelines, infrastructure as code (e.g., Terraform), and version control (Git).
- Certifications: Relevant certifications (e.g., AWS Certified Solutions Architect, AWS Certified Database - Specialty, PostgreSQL Certified Professional) are a plus.
- Azure Databricks: Familiarity with Azure Databricks for data engineering and analytics workflows is a significant advantage.

Soft Skills:
- Strong problem-solving and analytical capabilities.
- Exceptional communication skills for collaboration with technical and non-technical stakeholders.
- A results-driven mindset with the ability to work independently or lead within a team.

Qualification: Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or equivalent; 10+ years of experience.

Posted 3 weeks ago


6.0 - 10.0 years

10 - 15 Lacs

Chennai

Work from Office


Experience: 6-10 years in ETL development, with 3+ years in a leadership role and extensive hands-on experience in Informatica PowerCenter and Cloud Data Integration.

Job Overview: We are seeking a highly skilled and experienced Informatica Lead to join our IT team. The ideal candidate will lead a team of ETL developers and oversee the design, development, and implementation of ETL solutions using Informatica PowerCenter and Cloud Data Integration. This role requires expertise in data integration, leadership skills, and the ability to work in a dynamic environment to deliver robust data solutions for business needs.

Key Responsibilities:

ETL Development and Maintenance:
- Lead the design, development, and maintenance of ETL workflows and mappings using Informatica PowerCenter and Cloud Data Integration.
- Ensure the reliability, scalability, and performance of ETL solutions to meet business requirements.
- Optimize ETL processes for data integration, transformation, and loading into data warehouses and other target systems.

Solution Architecture and Implementation:
- Collaborate with architects and business stakeholders to define ETL solutions and data integration strategies.
- Develop and implement best practices for ETL design and development.
- Ensure seamless integration with on-premises and cloud-based data platforms.

Data Governance and Quality:
- Establish and enforce data quality standards and validation processes.
- Implement data governance and compliance policies to ensure data integrity and security.
- Perform root cause analysis and resolve data issues proactively.

Team Leadership:
- Manage, mentor, and provide technical guidance to a team of ETL developers.
- Delegate tasks effectively and ensure timely delivery of projects and milestones.
- Conduct regular code reviews and performance evaluations for team members.

Automation and Optimization:
- Develop scripts and frameworks to automate repetitive ETL tasks.
- Implement performance tuning for ETL pipelines and database queries.
- Explore opportunities to improve efficiency and streamline workflows.

Collaboration and Stakeholder Engagement:
- Work closely with business analysts, data scientists, and application developers to understand data requirements and deliver solutions.
- Communicate project updates, challenges, and solutions to stakeholders effectively.
- Act as the primary point of contact for Informatica-related projects and initiatives.

Academic Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or equivalent.
- Relevant certifications (e.g., Informatica Certified Specialist, Informatica Cloud Specialist) are a plus.

Experience:
- 6-10 years of experience in ETL development and data integration, with at least 3 years in a leadership role.
- Proven experience with Informatica PowerCenter, Informatica Cloud Data Integration, and large-scale ETL implementations.
- Experience integrating data from various sources such as databases, flat files, and APIs.

Technical Skills:
- Strong expertise in Informatica PowerCenter, Informatica Cloud, and ETL frameworks.
- Proficiency in SQL, PL/SQL, and performance optimization techniques.
- Knowledge of cloud platforms such as AWS, Azure, or Google Cloud.
- Familiarity with big data tools such as Hive, Spark, or Snowflake is a plus.
- Strong understanding of data modeling concepts and relational database systems.

Soft Skills:
- Excellent leadership and project management skills.
- Strong analytical and problem-solving abilities.
- Effective communication and stakeholder management skills.
- Ability to work under tight deadlines in a fast-paced environment.

Posted 3 weeks ago


4.0 - 6.0 years

18 - 20 Lacs

Pune, Chennai, Bengaluru

Work from Office


We are hiring experienced ETL Developers (Ab Initio) for a leading MNC, with positions open in Pune, Chennai, and Bangalore. The ideal candidate should have 5+ years of hands-on experience in ETL development, with strong proficiency in Ab Initio, Unix, and SQL. Exposure to Hadoop and scripting languages like Shell or Python is a plus. This is a work-from-office role and requires candidates to be available for a face-to-face interview. Applicants should be able to join within 15 days to 1 month. Strong development background and the ability to work in a structured, fast-paced environment are essential.

Posted 3 weeks ago


5.0 - 7.0 years

8 - 16 Lacs

Bengaluru

Work from Office


We are looking for a Senior Data Engineer with deep experience in SnapLogic, SQL, ETL pipelines, and data warehousing, along with hands-on experience with Databricks, to design scalable data solutions and work across cloud and big data platforms.

Posted 3 weeks ago


8.0 - 10.0 years

15 - 20 Lacs

Noida

Work from Office


Job Description: We are looking for a highly skilled Senior System Engineer with expertise in server administration, automation, network security, and ETL development. The ideal candidate should have a strong understanding of system infrastructure design, security best practices, and database management.

Key Responsibilities:
- Administer and maintain Windows and Linux servers, ensuring high availability and performance.
- Develop and maintain automation scripts using Python, PowerShell, and Bash.
- Configure and manage VPNs, firewalls, and network security protocols.
- Implement server security best practices, including patch management, hardening, and access control.
- Design and manage ETL processes for data extraction, transformation, and loading.
- Work with SQL Server and PostgreSQL for database administration, optimization, and troubleshooting.
- Manage and support real-time data acquisition using OPC DA/HDA.
- Collaborate with cross-functional teams to design and implement scalable system infrastructure solutions.

Required Skills & Qualifications:
- 8+ years of experience in server administration (Windows and Linux).
- Strong scripting skills in Python, PowerShell, and Bash for automation.
- Expertise in network security, VPN configurations, and firewall management.
- Hands-on experience in ETL development and database management (SQL Server/PostgreSQL).
- Familiarity with real-time data acquisition using OPC DA/HDA.
- Knowledge of system infrastructure design principles and best practices.
- Experience in server security (patching, hardening, access control, monitoring, and compliance).
- Strong problem-solving and troubleshooting skills.

Preferred Qualifications:
- Experience with cloud platforms (AWS, Azure, or GCP).
- Knowledge of containerization (Docker, Kubernetes).
- Familiarity with CI/CD pipelines and DevOps tools.

Key Skills: ETL development; Windows and Linux server administration; real-time data acquisition using OPC DA/HDA; strong scripting in Python, PowerShell, and Bash for automation; expertise in network security, VPN configurations, and firewall management.

Posted 3 weeks ago


1.0 - 4.0 years

3 - 6 Lacs

Coimbatore

Work from Office


About Responsive
Responsive, formerly RFPIO, is the market leader in an emerging category of SaaS solutions called Strategic Response Management. Responsive customers, including Google, Microsoft, BlackRock, T. Rowe Price, Adobe, Amazon, Visa, and Zoom, use Responsive to manage business-critical responses to RFPs, RFIs, RFQs, security questionnaires, due diligence questionnaires, and other requests for information. Responsive has nearly 2,000 customers of all sizes and has been voted "best in class" by G2 for 13 straight quarters. More than 35% of the cloud SaaS leaders are customers, as well as more than 15 of the Fortune 100. Customers have used Responsive to close more than $300B in transactions to date.

About The Role
We are seeking a highly skilled Product Data Engineer with expertise in building, maintaining, and optimizing data pipelines using Python scripting. The ideal candidate will have experience working in a Linux environment, managing large-scale data ingestion, processing files in S3, and balancing disk space and warehouse storage efficiently. This role will be responsible for ensuring seamless data movement across systems while maintaining performance, scalability, and reliability.

Essential Functions
- ETL Pipeline Development: Design, develop, and maintain efficient ETL workflows using Python to extract, transform, and load data into structured data warehouses.
- Data Pipeline Optimization: Monitor and optimize data pipeline performance, ensuring scalability and reliability in handling large data volumes.
- Linux Server Management: Work in a Linux-based environment, executing command-line operations, managing processes, and troubleshooting system performance issues.
- File Handling & Storage Management: Efficiently manage data files in Amazon S3, ensuring proper storage organization, retrieval, and archiving of data.
- Disk Space & Warehouse Balancing: Proactively monitor and manage disk space usage, preventing storage bottlenecks and ensuring warehouse efficiency.
- Error Handling & Logging: Implement robust error-handling mechanisms and logging systems to monitor data pipeline health.
- Automation & Scheduling: Automate ETL processes using cron jobs, Airflow, or other workflow orchestration tools.
- Data Quality & Validation: Ensure data integrity and consistency by implementing validation checks and reconciliation processes.
- Security & Compliance: Follow best practices in data security, access control, and compliance while handling sensitive data.
- Collaboration with Teams: Work closely with data engineers, analysts, and product teams to align data processing with business needs.

Education
Bachelor's degree in Computer Science, Data Engineering, or a related field.

Qualifications
- 2+ years of experience in ETL development, data pipeline management, or backend data engineering.
- Proficiency in Python: Strong hands-on experience writing Python scripts for ETL processes.
- Linux Expertise: Experience working with Linux servers, command-line operations, and system performance tuning.
- Cloud Storage Management: Hands-on experience with Amazon S3, including file storage, retrieval, and lifecycle policies.
- Data Pipeline Management: Experience with ETL frameworks, data pipeline automation, and workflow scheduling (e.g., Apache Airflow, Luigi, or Prefect).
- SQL & Database Handling: Strong SQL skills for data extraction, transformation, and loading into relational databases and data warehouses.
- Disk Space & Storage Optimization: Ability to manage disk space efficiently, balancing usage across different systems.
- Error Handling & Debugging: Strong problem-solving skills to troubleshoot ETL failures, debug logs, and resolve data inconsistencies.
- Experience with cloud data warehouses (e.g., Snowflake, Redshift, BigQuery).
- Knowledge of message queues (Kafka, RabbitMQ) for data streaming.
- Familiarity with containerization tools (Docker, Kubernetes) for deployment.
- Exposure to infrastructure automation tools (Terraform, Ansible).

Knowledge, Ability & Skills
- Strong analytical mindset and ability to handle large-scale data processing efficiently.
- Ability to work independently in a fast-paced, product-driven environment.

Posted 3 weeks ago


6.0 - 8.0 years

6 - 10 Lacs

Hyderabad

Work from Office


Diverse Lynx is looking for a Datastage Developer to join our dynamic team and embark on a rewarding career journey.
- Analyzing business requirements and translating them into technical specifications
- Designing and implementing data integration solutions using Datastage
- Extracting, transforming, and loading data from various sources into target systems
- Developing and testing complex data integration workflows, including the use of parallel processing and data quality checks
- Collaborating with database administrators, data architects, and stakeholders to ensure the accuracy and consistency of data
- Monitoring performance and optimizing Datastage jobs to ensure they run efficiently and meet SLAs
- Troubleshooting issues and resolving problems related to data integration
- Knowledge of data warehousing, data integration, and data processing concepts
- Strong problem-solving skills and the ability to think creatively and critically
- Excellent communication and collaboration skills, with the ability to work effectively with technical and non-technical stakeholders

Posted 3 weeks ago


4.0 - 5.0 years

6 - 10 Lacs

Kochi, Bengaluru

Work from Office


- 4+ years of experience
- Work from office (1st preference: Kochi; 2nd preference: Bangalore)
- Good experience in any ETL tool
- Good knowledge of Python
- Integration experience
- Good attitude and cross-skilling ability

Posted 3 weeks ago


0.0 - 2.0 years

4 - 7 Lacs

Navi Mumbai

Work from Office


Our corporate activities are growing rapidly, and we are currently seeking a full-time, office-based Data Engineer to join our Information Technology team. This position will work on a team to accomplish tasks and projects that are instrumental to the company’s success. If you want an exciting career where you use your previous expertise and can develop and grow your career even further, then this is the opportunity for you.

Overview
Medpace is a full-service clinical contract research organization (CRO). We provide Phase I-IV clinical development services to the biotechnology, pharmaceutical, and medical device industries. Our mission is to accelerate the global development of safe and effective medical therapeutics through a scientific and disciplined approach. We leverage local regulatory and therapeutic expertise across all major areas, including oncology, cardiology, metabolic disease, endocrinology, central nervous system, anti-viral, and anti-infective. Medpace is headquartered in Cincinnati, Ohio, and employs more than 5,000 people across 40+ countries.
Responsibilities
- Utilize skills in development areas including data warehousing, business intelligence, and databases (Snowflake, ANSI SQL, SQL Server, T-SQL);
- Support programming/software development using Extract, Transform, and Load (ETL) and Extract, Load, and Transform (ELT) tools (dbt, Azure Data Factory, SSIS);
- Design, develop, enhance, and support business intelligence systems, primarily using Microsoft Power BI;
- Collect, analyze, and document user requirements;
- Participate in the software validation process through development, review, and/or execution of test plans/cases/scripts;
- Create software applications by following the software development lifecycle process, which includes requirements gathering, design, development, testing, release, and maintenance;
- Communicate with team members regarding projects, development, tools, and procedures; and
- Provide end-user support, including setup, installation, and maintenance for applications.

Qualifications
- Bachelor's Degree in Computer Science, Data Science, or a related field;
- 5+ years of experience in Data Engineering;
- Knowledge of developing dimensional data models and awareness of the advantages and limitations of star schema and snowflake schema designs;
- Solid ETL development and reporting knowledge based on an intricate understanding of business processes and measures;
- Knowledge of the Snowflake cloud data warehouse, Fivetran data integration, and dbt transformations is preferred;
- Knowledge of Python is preferred;
- Knowledge of REST APIs;
- Basic knowledge of SQL Server databases is required;
- Knowledge of C# and Azure development is a bonus; and
- Excellent analytical, written, and oral communication skills.

People. Purpose. Passion. Make a Difference Tomorrow. Join Us Today.
The work we’ve done over the past 30+ years has positively impacted the lives of countless patients and families who face hundreds of diseases across all key therapeutic areas.
The work we do today will improve the lives of people living with illness and disease in the future.

Medpace Perks
- Flexible work environment
- Competitive compensation and benefits package
- Competitive PTO packages
- Structured career paths with opportunities for professional growth
- Company-sponsored employee appreciation events
- Employee health and wellness initiatives

Awards
- Recognized by Forbes as one of America's Most Successful Midsize Companies in 2021, 2022, 2023, and 2024
- Continually recognized with CRO Leadership Awards from Life Science Leader magazine, based on expertise, quality, capabilities, reliability, and compatibility

What to Expect Next
A Medpace team member will review your qualifications and, if interested, you will be contacted with details for next steps.

EO/AA Employer M/F/Disability/Vets

Posted 3 weeks ago


0.0 - 1.0 years

3 - 6 Lacs

Navi Mumbai

Work from Office


Our corporate activities are growing rapidly, and we are currently seeking a full-time, office-based Data Engineer to join our Information Technology team. This position will work on a team to accomplish tasks and projects that are instrumental to the company’s success. If you want an exciting career where you use your previous expertise and can develop and grow your career even further, then this is the opportunity for you.

Overview
Medpace is a full-service clinical contract research organization (CRO). We provide Phase I-IV clinical development services to the biotechnology, pharmaceutical, and medical device industries. Our mission is to accelerate the global development of safe and effective medical therapeutics through a scientific and disciplined approach. We leverage local regulatory and therapeutic expertise across all major areas, including oncology, cardiology, metabolic disease, endocrinology, central nervous system, anti-viral, and anti-infective. Medpace is headquartered in Cincinnati, Ohio, and employs more than 5,000 people across 40+ countries.
Responsibilities
- Utilize skills in development areas including data warehousing, business intelligence, and databases (Snowflake, ANSI SQL, SQL Server, T-SQL);
- Support programming/software development using Extract, Transform, and Load (ETL) and Extract, Load, and Transform (ELT) tools (dbt, Azure Data Factory, SSIS);
- Design, develop, enhance, and support business intelligence systems, primarily using Microsoft Power BI;
- Collect, analyze, and document user requirements;
- Participate in the software validation process through development, review, and/or execution of test plans/cases/scripts;
- Create software applications by following the software development lifecycle process, which includes requirements gathering, design, development, testing, release, and maintenance;
- Communicate with team members regarding projects, development, tools, and procedures; and
- Provide end-user support, including setup, installation, and maintenance for applications.

Qualifications
- Bachelor's Degree in Computer Science, Data Science, or a related field;
- 3+ years of experience in Data Engineering;
- Knowledge of developing dimensional data models and awareness of the advantages and limitations of star schema and snowflake schema designs;
- Solid ETL development and reporting knowledge based on an intricate understanding of business processes and measures;
- Knowledge of the Snowflake cloud data warehouse, Fivetran data integration, and dbt transformations is preferred;
- Knowledge of Python is preferred;
- Knowledge of REST APIs;
- Basic knowledge of SQL Server databases is required;
- Knowledge of C# and Azure development is a bonus; and
- Excellent analytical, written, and oral communication skills.

People. Purpose. Passion. Make a Difference Tomorrow. Join Us Today.
The work we’ve done over the past 30+ years has positively impacted the lives of countless patients and families who face hundreds of diseases across all key therapeutic areas.
The work we do today will improve the lives of people living with illness and disease in the future.

Medpace Perks
- Flexible work environment
- Competitive compensation and benefits package
- Competitive PTO packages
- Structured career paths with opportunities for professional growth
- Company-sponsored employee appreciation events
- Employee health and wellness initiatives

Awards
- Recognized by Forbes as one of America's Most Successful Midsize Companies in 2021, 2022, 2023, and 2024
- Continually recognized with CRO Leadership Awards from Life Science Leader magazine, based on expertise, quality, capabilities, reliability, and compatibility

What to Expect Next
A Medpace team member will review your qualifications and, if interested, you will be contacted with details for next steps.

EO/AA Employer M/F/Disability/Vets

Posted 3 weeks ago


10.0 - 20.0 years

10 - 19 Lacs

Karur

Remote


Greetings from MindPro Technologies Pvt Ltd (www.mindprotech.com)

Job Description: Informatica Lead
Experience Required: 10+ years
Mode: Remote

Key Responsibilities

Design & Development
- Lead the design, development, and implementation of ETL processes using Informatica products.
- Create, optimize, and maintain mappings, sessions, and workflows to ensure high performance and reliability.

API Data Integration
- Coordinate with development teams to design and manage data ingestion from APIs into the Informatica environment.
- Develop strategies for real-time or near-real-time data processing, ensuring secure and efficient data flow.
- Collaborate on API specifications, error handling, and data validation requirements.

Data Integration & Warehousing
- Integrate data from diverse sources (e.g., relational databases, flat files, cloud-based systems, APIs) into target data warehouses or data lakes.
- Ensure data quality by implementing best practices, validation checks, and error handling.

Project Leadership
- Provide technical oversight and guidance to the ETL development team.
- Maintain and expand development standards, processes, and coding practices to keep the codebase consistent and high quality.
- Coordinate with product owners, project managers, and onshore teams to track progress and meet milestones.

Solution Architecture
- Work with business analysts and stakeholders to gather requirements and translate them into technical solutions.
- Propose improvements, optimizations, and best practices to enhance existing data integration solutions.

Performance Tuning & Troubleshooting
- Identify bottlenecks in mappings or workflows; recommend and implement performance tuning strategies.
- Troubleshoot ETL and other data ingestion issues, perform root cause analysis, and provide solutions in a timely manner.

Collaboration & Communication
- Collaborate with onshore business and technical teams to ensure smooth project execution and knowledge sharing.
- Communicate project status, potential risks, and technical details to stakeholders and leadership.
- Participate in regular meetings with onshore teams, aligning on priorities and resolving issues.

Documentation & Reporting
- Maintain comprehensive technical documentation, including design specifications, test cases, and operational procedures.
- Generate reports or dashboards as needed to keep stakeholders informed of project progress and data pipeline performance.

Posted 3 weeks ago


5.0 - 8.0 years

10 - 16 Lacs

Noida

Work from Office


Design, build, and manage PostgreSQL databases. Perform tuning, develop ETL, automate processes, ensure security, handle migrations, backups, and recovery. Strong SQL skills, Oracle to PostgreSQL migration a plus.

Posted 3 weeks ago


8 - 10 years

12 - 17 Lacs

Pune

Work from Office


About The Role

Role Purpose: Create exceptional architectural solution designs, provide thought leadership, and enable delivery teams to provide exceptional client engagement and satisfaction.

1. Develop architectural solutions for new deals and major change requests in existing deals
- Create an enterprise-wide architecture that ensures systems are scalable, reliable, and manageable.
- Provide solutioning for RFPs received from clients and ensure overall design assurance.
- Develop a direction to manage the portfolio of to-be solutions, including systems, shared infrastructure services, and applications, to better match business outcome objectives.
- Analyse the technology environment, enterprise specifics, and client requirements to set a collaboration solution design framework/architecture.
- Provide technical leadership for the design, development, and implementation of custom solutions through thoughtful use of modern technology.
- Define and understand current-state solutions, and identify improvements, options, and trade-offs to define target-state solutions.
- Clearly articulate, document, and sell architectural targets, recommendations, and reusable patterns, and propose investment roadmaps accordingly.
- Evaluate and recommend solutions that integrate with the overall technology ecosystem.
- Work closely with various IT groups to transition tasks, ensure performance, and manage issues through to resolution.
- Produce detailed documentation (application view, multiple sections and views) of the architectural design and solution, describing all artefacts in detail.
- Validate the solution/prototype from technology, cost-structure, and customer-differentiation points of view.
- Identify problem areas, perform root cause analysis of architectural designs and solutions, and provide relevant solutions.
- Collaborate with sales, program/project, and consulting teams to reconcile solutions to architecture.
- Track industry and application trends and relate these to planning current and future IT needs.
- Provide technical and strategic input during the project planning phase in the form of technical architectural designs and recommendations.
- Collaborate with all relevant parties to review the objectives and constraints of solutions and determine conformance with the Enterprise Architecture.
- Identify implementation risks and potential impacts.

2. Enable delivery teams by providing optimal delivery solutions/frameworks
- Build and maintain relationships with executives, technical leaders, product owners, peer architects, and other stakeholders to become a trusted advisor.
- Develop and establish relevant technical, business process, and overall support metrics (KPI/SLA) to drive results.
- Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards.
- Identify technical, process, and structural risks, and prepare a risk mitigation plan for all projects.
- Ensure quality assurance of all architecture and design decisions, and provide technical mitigation support to delivery teams.
- Recommend tools for reuse and automation for improved productivity and reduced cycle times.
- Lead the development and maintenance of the enterprise framework and related artefacts.
- Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams.
- Ensure architecture principles and standards are consistently applied to all projects.
- Ensure optimal client engagement: support the pre-sales team in presenting the solution design and its principles to the client; negotiate, manage, and coordinate with client teams to ensure all requirements are met and the proposed solution creates an impact; and demonstrate thought leadership with strong technical capability to win the client's confidence and act as a trusted advisor.

3. Competency building and branding
- Ensure completion of necessary trainings and certifications.
- Develop Proofs of Concept (POCs), case studies, demos, etc. for new growth areas based on market and customer research.
- Develop and present Wipro's point of view on solution design and architecture by writing white papers, blogs, etc.
- Attain market referenceability and recognition through top analyst rankings, client testimonials, and partner credits.
- Be the voice of Wipro's thought leadership by speaking in internal and external forums.
- Mentor developers, designers, and junior architects in the project for their further career development and enhancement.
- Contribute to the architecture practice, e.g. by conducting selection interviews.

4. Team management
- Resourcing: anticipate new talent requirements per market/industry trends and client requirements; hire adequate and right resources for the team.
- Talent management: ensure adequate onboarding and training for team members to enhance capability and effectiveness; build an internal talent pool and ensure career progression within the organization; manage team attrition; drive diversity in leadership positions.
- Performance management: set goals for the team, conduct timely performance reviews, and provide constructive feedback to direct reports; ensure that Performance Nxt is followed for the entire team.
- Employee satisfaction and engagement: lead and drive engagement initiatives for the team; track team satisfaction scores and identify initiatives to build engagement within the team.

Mandatory Skills: Tableau
Experience: 8-10 years

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention: of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA: as the world around us changes, so do we.
Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 1 month ago

Apply

7 - 12 years

9 - 19 Lacs

Bengaluru

Remote

Payroll Company: Anlage Infotech
Client: NTT Data
Job Posting Title: Sr. ETL Developer
Experience: 7+ years of relevant experience in ETL
Location: Any NTT Data location (temporarily remote)
Shift Timing: 01:00 PM IST to 11:00 PM IST

Must-Have (Mandatory Skills):
- Minimum 7+ years of hands-on experience with ETL development
- Hands-on experience with ETL flows and jobs, specifically ADF pipelines and SSIS
- Hands-on experience with data warehouses and data marts
- Very strong SQL skills on MS SQL Server

Required:
- Strong hands-on experience with SQL and PL/SQL (procedures, functions)
- Expert-level knowledge of ETL flows and jobs (ADF pipeline experience is a must)
- Experience with MS SQL Server (must), Oracle DB, PostgreSQL, MySQL
- Good knowledge of data warehouse/data mart concepts
- Good knowledge of data structures/models, integrity constraints, performance tuning, etc.
- Good knowledge of the insurance domain (preferred)

Note: Immediate joiners only (currently serving notice period, 15 days or less)
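The posting stresses strong SQL on MS SQL Server in an insurance context. As a minimal illustration of the kind of join-plus-aggregate query such ETL work routinely needs (SQLite is used here only so the snippet is self-contained; the table and column names are assumptions, not from the posting):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE policy (policy_id INTEGER PRIMARY KEY, holder TEXT);
CREATE TABLE claim  (claim_id INTEGER, policy_id INTEGER, amount REAL);
INSERT INTO policy VALUES (1, 'A. Rao'), (2, 'S. Iyer');
INSERT INTO claim  VALUES (10, 1, 500.0), (11, 1, 250.0), (12, 2, 90.0);
""")

# Total claim amount per policy holder, highest first.
rows = conn.execute("""
    SELECT p.holder, SUM(c.amount) AS total
    FROM policy p JOIN claim c ON c.policy_id = p.policy_id
    GROUP BY p.holder
    ORDER BY total DESC
""").fetchall()
```

The same query shape carries over to T-SQL on MS SQL Server with only minor dialect changes.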

Posted 1 month ago

Apply

7 - 12 years

20 - 35 Lacs

Bengaluru

Hybrid

- A bachelor's degree in Computer Science or a related field
- 5-7 years of experience as a hands-on developer in Sybase, DB2 and ETL technologies
- Extensive work on data integration, and designing and developing reusable interfaces
- Advanced experience in Python, DB2, Sybase, shell scripting, Unix, Perl scripting, DB platforms, and database design and modeling
- Expert-level understanding of data warehouses, core database concepts and relational database design
- Experience in writing stored procedures, optimization, and performance tuning
- Strong technology acumen and a deep strategic mindset; proven track record of delivering results
- Proven analytical skills and experience making decisions based on hard and soft data
- A desire and openness to learning and continuous improvement, both of yourself and your team members
- Hands-on experience developing APIs is a plus
- Good to have: experience with Business Intelligence tools, Source-to-Pay applications such as SAP Ariba, and Accounts Payable systems

Skills Required: Familiarity with Postgres and Python is a plus

Posted 1 month ago

Apply

2 - 5 years

3 - 7 Lacs

Gurugram

Work from Office

Role: Data Engineer
Skills:
- Data Modeling: Design and implement efficient data models, ensuring data accuracy and optimal performance.
- ETL Development: Develop, maintain, and optimize ETL processes to extract, transform, and load data from various sources into our data warehouse.
- SQL Expertise: Write complex SQL queries to extract, manipulate, and analyze data as needed.
- Python Development: Develop and maintain Python scripts and applications to support data processing and automation.
- AWS Expertise: Leverage deep knowledge of AWS services, such as S3, Redshift, Glue, EMR, and Athena, to build and maintain data pipelines and infrastructure.
- Infrastructure as Code (IaC): Experience with tools like Terraform or CloudFormation to automate the provisioning and management of AWS resources is a plus.
- Big Data Processing: Knowledge of PySpark for big data processing and analysis is desirable.
- Source Code Management: Use Git and GitHub for version control and collaboration on data engineering projects.
- Performance Optimization: Identify and implement optimizations for data processing pipelines to enhance efficiency and reduce costs.
- Data Quality: Implement data quality checks and validation procedures to maintain data integrity.
- Collaboration: Work closely with data scientists, analysts, and other teams to understand data requirements and deliver high-quality data solutions.
- Documentation: Maintain comprehensive documentation for all data engineering processes and projects.
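As a minimal, illustrative sketch of the data-quality checks and validation procedures this role describes (the column names and rules below are assumptions, not from the posting): validate extracted rows before loading them, routing failures to a reject pile.

```python
def validate_row(row, required=("id", "amount"), non_negative=("amount",)):
    """Return a list of data-quality violations for one record."""
    errors = []
    for col in required:
        if row.get(col) in (None, ""):
            errors.append(f"missing {col}")
    for col in non_negative:
        value = row.get(col)
        if isinstance(value, (int, float)) and value < 0:
            errors.append(f"negative {col}")
    return errors

def split_valid_invalid(rows):
    """Partition extracted rows into loadable rows and rejects with reasons."""
    valid, rejects = [], []
    for row in rows:
        errors = validate_row(row)
        if errors:
            rejects.append((row, errors))
        else:
            valid.append(row)
    return valid, rejects

rows = [
    {"id": 1, "amount": 100.0},
    {"id": 2, "amount": -5.0},    # fails the non-negative rule
    {"id": None, "amount": 7.5},  # fails the required-column rule
]
good, bad = split_valid_invalid(rows)
```

In a production pipeline the same pattern usually runs as a transform stage, with rejects written to a quarantine table for investigation rather than silently dropped.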

Posted 1 month ago

Apply

5 - 10 years

9 - 19 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Job Title: Senior Software Engineer - ETL Developer
Main location: Hyderabad / Bangalore / Chennai
Employment Type: Full Time
Experience: 5 to 10 years

Role & responsibilities:
Looking for a Senior ETL Developer who has:
- ETL Development & Implementation: Strong experience in designing, developing, and deploying ETL solutions using Informatica Cloud Services (ICS), Informatica PowerCenter, and other data integration tools.
- Data Integration & Optimization: Proficient in extracting, transforming, and loading (ETL) data from multiple sources, optimizing performance, and ensuring data quality.
- Stakeholder Collaboration: Skilled at working with cross-functional teams, including data engineers, analysts, and business stakeholders, to align data solutions with business needs.
- Scripting & Data Handling: Experience with SQL, PL/SQL, and scripting languages (e.g., Python, Shell) for data manipulation, transformation, and automation.
- Tool Proficiency: Familiarity with Informatica Cloud, version control systems (e.g., Git), JIRA, Confluence, and the Microsoft Office Suite.
- Agile Methodologies: Knowledge of Agile frameworks (Scrum, Kanban), with experience managing backlogs, writing user stories, and participating in sprint planning.
- Testing & Validation: Involvement in ETL testing, data validation, unit testing, and integration testing to ensure accuracy, consistency, and completeness of data.
- Problem-Solving Skills: Strong analytical mindset to troubleshoot, debug, and optimize ETL workflows, data pipelines, and integration solutions effectively.
- Communication & Documentation: Excellent written and verbal communication skills to document ETL processes, create technical design documents, and present data integration strategies to stakeholders.

Together, as owners, let's turn meaningful insights into action.
Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you'll reach your full potential because you are invited to be an owner from day 1 as we work together to bring our dream to life. That's why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company's strategy and direction. Your work creates value. You'll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You'll shape your career by joining a company built to grow and last. You'll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team, one of the largest IT and business consulting services firms in the world.

Posted 1 month ago

Apply

5 - 10 years

10 - 15 Lacs

Pune, Chennai, Bengaluru

Work from Office

- At least 5 years of experience in Information Technology
- Experience with SQL Server, PL/SQL, and data load/extract
- Experience in Unix shell scripting, Python and automation
- Experience with Bitbucket, GitLab and GitHub
- Experience with Jira and Chalk pages
Nice to have: Automation (Shell, Python), Hadoop

Posted 1 month ago

Apply

3 - 8 years

18 - 22 Lacs

Hyderabad, Bengaluru

Work from Office

ETL Development:
- Design, develop, and implement Ab Initio components and graphs for ETL processes.
- Develop complex data pipelines for large-scale data processing.
- Create and maintain data integration solutions.
Data Analysis and Requirements:
- Analyze data requirements and collaborate with stakeholders to understand business needs.
- Understand and translate business requirements into technical solutions.
Performance Tuning and Optimization:
- Optimize Ab Initio processes for performance and efficiency.
- Troubleshoot and debug issues related to application performance and deployment.
Code and Documentation:

Posted 1 month ago

Apply

8 - 12 years

25 - 30 Lacs

Hyderabad, Bengaluru

Work from Office

Key Responsibilities:
ETL Development:
- Design, develop, and implement Ab Initio components and graphs for ETL processes.
- Develop complex data pipelines for large-scale data processing.
- Create and maintain data integration solutions.
Data Analysis and Requirements:
- Analyze data requirements and collaborate with stakeholders to understand business needs.
- Understand and translate business requirements into technical solutions.
Performance Tuning and Optimization:
- Optimize Ab Initio processes for performance and efficiency.
- Troubleshoot and debug issues related to application performance and deployment.
Code and Documentation:

Posted 1 month ago

Apply

11 - 17 years

25 - 30 Lacs

Hyderabad

Work from Office

ABOUT AMGEN
Amgen harnesses the best of biology and technology to fight the world's toughest diseases, and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.

ABOUT THE ROLE
Role Description
We are seeking a highly skilled and experienced Principal AI Solution Architect to join our dynamic team. The candidate will lead AI solutioning and design across enterprise and cross-functional teams. They will primarily work with the MDM CoE to lead and drive AI solutions and optimizations, and also provide thought leadership. The role involves developing and implementing AI strategies, collaborating with cross-functional teams, and ensuring the scalability, reliability, and performance of AI solutions. To succeed in this role, the candidate must have strong AI/ML, Data Science and GenAI experience along with MDM knowledge. The candidate must have AI/ML, data science and GenAI experience with technologies such as PySpark, PyTorch, TensorFlow, LLMs, Autogen, Hugging Face, VectorDBs, embeddings, RAG, etc., along with knowledge of MDM (Master Data Management).

Roles & Responsibilities
- Lead the design, solutioning and development of enterprise-level GenAI applications using LLM frameworks such as Langchain, Autogen, and Hugging Face.
- Architect intelligent pipelines using PySpark, TensorFlow, and PyTorch within Databricks and AWS environments.
- Implement embedding models and manage VectorStores for retrieval-augmented generation (RAG) solutions.
- Integrate and leverage MDM platforms like Informatica and Reltio to supply high-quality structured data to ML systems.
- Utilize SQL and Python for data engineering, data wrangling, and pipeline automation.
- Build scalable APIs and services to serve GenAI models in production.
- Lead cross-functional collaboration with data scientists, engineers, and product teams to scope, design, and deploy AI-powered systems.
- Ensure model governance, version control, and auditability aligned with regulatory and compliance expectations.

Basic Qualifications and Experience
- Master's degree with 11-14 years of experience in Data Science, Artificial Intelligence, Computer Science, or related fields, OR
- Bachelor's degree with 15-16 years of experience in Data Science, Artificial Intelligence, Computer Science, or related fields, OR
- Diploma with 17-18 years of hands-on experience in Data Science, AI/ML technologies, or related technical domains

Functional Skills
Must-Have Skills:
- 14+ years of experience working in AI/ML or Data Science roles, including designing and implementing GenAI solutions
- Extensive hands-on experience with LLM frameworks and tools such as Langchain, Autogen, Hugging Face, OpenAI APIs, and embedding models
- Expertise in AI/ML solution architecture and design, and knowledge of industry best practices
- Experience designing GenAI-based solutions on the Databricks platform
- Hands-on experience with Python, PySpark, PyTorch, LLMs, Vector DBs, embeddings, scikit-learn, Langchain, TensorFlow, APIs, Autogen, VectorStores, MongoDB, Databricks, Django
- Strong knowledge of AWS and cloud-based AI infrastructure
- Excellent problem-solving skills
- Strong communication and leadership skills
- Ability to collaborate effectively with cross-functional teams and stakeholders
- Experience in managing and mentoring junior team members, and the ability to provide them with thought leadership

Good-to-Have Skills:
- Prior experience in data modeling, ETL development, and data profiling to support AI/ML workflows
- Working knowledge of Life Sciences or Pharma industry standards and regulatory considerations
- Proficiency in tools like JIRA and Confluence for Agile delivery and project collaboration
- Familiarity with MongoDB, VectorStores, and modern architecture principles for scalable GenAI applications

Professional Certifications
- Any data analysis certification (SQL, Python, other DBs or programming languages)
- Any cloud certification (AWS or Azure)
- Data Science and ML certifications

Soft Skills:
- Strong analytical abilities to assess and improve master data processes and solutions
- Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders
- Effective problem-solving skills to address data-related issues and implement scalable solutions
- Ability to work effectively with global, virtual teams

EQUAL OPPORTUNITY STATEMENT
Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
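As a toy illustration of the retrieval step in the RAG solutions this role describes (a production system would use a real embedding model and a vector store such as those named in the posting; the hand-picked vectors and document titles below are assumptions):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def retrieve(query_vec, store, k=1):
    """Return the k documents whose embeddings are most similar to the query."""
    ranked = sorted(store, key=lambda item: cosine(query_vec, item[1]), reverse=True)
    return [doc for doc, _ in ranked[:k]]

# Tiny in-memory "vector store": (document, embedding) pairs.
store = [
    ("MDM golden-record policy", [0.9, 0.1, 0.0]),
    ("Holiday calendar",         [0.0, 0.2, 0.9]),
]
top = retrieve([1.0, 0.0, 0.0], store, k=1)
```

The retrieved documents would then be injected into the LLM prompt as grounding context, which is the "augmented generation" half of RAG.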

Posted 1 month ago

Apply

5 - 10 years

5 - 9 Lacs

Chennai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Ab Initio
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will collaborate with teams to ensure successful project delivery and application functionality.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead and mentor junior professionals
- Conduct regular team meetings to discuss progress and challenges
- Stay updated on industry trends and best practices

Professional & Technical Skills:
- Must-have skills: Proficiency in Ab Initio
- Strong understanding of ETL processes
- Experience with data integration and data warehousing
- Knowledge of data quality and data governance principles
- Hands-on experience with Ab Initio GDE and EME tools

Additional Information:
- The candidate should have a minimum of 5 years of experience in Ab Initio
- This position is based at our Chennai office
- 15 years of full-time education is required

Qualification: 15 years of full-time education

Posted 1 month ago

Apply

2 - 7 years

10 - 20 Lacs

Pune, Chennai, Bengaluru

Hybrid

Salary: 10-30 LPA
Experience: 2-7 years
Location: Gurgaon/Pune/Bangalore/Chennai
Notice period: Immediate to 30 days

Key Responsibilities:
- 2+ years of strong hands-on experience with Ab Initio technology.
- Good knowledge of Ab Initio components such as Reformat, Join, Sort, Rollup, Normalize, Scan, Lookup, MFS and Ab Initio parallelism, and of products like Metadata Hub, Conduct>It, Express>It and Control Center; a clear understanding of concepts like metaprogramming, continuous flows and PDL is good to have.
- Very good knowledge of data warehousing, SQL and Unix shell scripting.
- Knowledge of the ETL side of cloud platforms like AWS or Azure, and of the Hadoop platform, is an added advantage.
- Experience working with banking domain data is an added advantage.
- Excellent technical knowledge in the design, development and validation of complex ETL features using Ab Initio.
- Excellent knowledge of integration with upstream and downstream processes and systems.
- Ensure compliance with technical standards and processes.
- Ability to engage and collaborate with stakeholders to deliver assigned tasks with defined quality goals.
- Can work independently with minimum supervision and help the development team with technical issues.
- Good communication and analytical skills.

Posted 1 month ago

Apply

12 - 16 years

35 - 40 Lacs

Bengaluru

Work from Office

As an AWS Data Engineer, you will play a crucial role in the design, development, and maintenance of our data infrastructure. Your work will empower data-driven decision-making and contribute to the success of our data-driven initiatives. You will design and maintain scalable data pipelines using AWS data analytics services, enabling efficient data processing and analytics.

Key Responsibilities:
- Highly experienced in developing ETL pipelines using AWS Glue and EMR with PySpark/Scala.
- Utilize AWS services (S3, Glue, Lambda, EMR, Step Functions) for data solutions.
- Design scalable data models for analytics and reporting.
- Implement data validation, quality, and governance practices.
- Optimize Spark jobs for cost and performance efficiency.
- Automate ETL workflows with AWS Step Functions and Lambda.
- Collaborate with data scientists and analysts on data needs.
- Maintain documentation for data architecture and pipelines.
- Experience with open-source big data table formats such as Iceberg, Delta Lake or Hudi.
- Experience provisioning AWS data analytics resources with Terraform is desirable.

Must-Have Skills: AWS (S3, Glue, Lambda, EMR), PySpark or Scala, SQL, ETL development.
Good-to-Have Skills: Snowflake, Cloudera Hadoop (HDFS, Hive, Impala), Iceberg.
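As a minimal sketch of the extract-transform-load shape this role describes (real pipelines here would run on AWS Glue/EMR with PySpark reading from S3; this self-contained plain-Python/SQLite version, with assumed table and column names, only illustrates the stages):

```python
import sqlite3

def extract():
    # In a real Glue job this would read raw files from S3; inline data here.
    return [("2024-01-01", "books", 120.0), ("2024-01-01", "books", 80.0),
            ("2024-01-02", "games", 50.0)]

def transform(rows):
    # Aggregate revenue per (day, category), as a reporting model might need.
    totals = {}
    for day, category, amount in rows:
        totals[(day, category)] = totals.get((day, category), 0.0) + amount
    return [(day, cat, total) for (day, cat), total in sorted(totals.items())]

def load(rows, conn):
    # Load the transformed rows into a queryable store (the warehouse stand-in).
    conn.execute("CREATE TABLE daily_sales (day TEXT, category TEXT, revenue REAL)")
    conn.executemany("INSERT INTO daily_sales VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
result = conn.execute(
    "SELECT revenue FROM daily_sales WHERE day = '2024-01-01'").fetchone()[0]
```

In PySpark the transform step would typically be a `groupBy(...).agg(...)` over a DataFrame, with the same extract/transform/load separation.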

Posted 1 month ago

Apply

12 - 16 years

35 - 40 Lacs

Chennai

Work from Office

As an AWS Data Engineer, you will play a crucial role in the design, development, and maintenance of our data infrastructure. Your work will empower data-driven decision-making and contribute to the success of our data-driven initiatives. You will design and maintain scalable data pipelines using AWS data analytics services, enabling efficient data processing and analytics.

Key Responsibilities:
- Highly experienced in developing ETL pipelines using AWS Glue and EMR with PySpark/Scala.
- Utilize AWS services (S3, Glue, Lambda, EMR, Step Functions) for data solutions.
- Design scalable data models for analytics and reporting.
- Implement data validation, quality, and governance practices.
- Optimize Spark jobs for cost and performance efficiency.
- Automate ETL workflows with AWS Step Functions and Lambda.
- Collaborate with data scientists and analysts on data needs.
- Maintain documentation for data architecture and pipelines.
- Experience with open-source big data table formats such as Iceberg, Delta Lake or Hudi.
- Experience provisioning AWS data analytics resources with Terraform is desirable.

Must-Have Skills: AWS (S3, Glue, Lambda, EMR), PySpark or Scala, SQL, ETL development.
Good-to-Have Skills: Snowflake, Cloudera Hadoop (HDFS, Hive, Impala), Iceberg.

Posted 1 month ago

Apply