
175 ETL Development Jobs - Page 4

JobPe aggregates results for easy access, but you apply directly on the original job portal.

12 - 16 years

35 - 40 Lacs

Mumbai

Work from Office


As an AWS Data Engineer at our organization, you will play a crucial role in the design, development, and maintenance of our data infrastructure. Your work will empower data-driven decision-making and contribute to the success of our data initiatives. You will design and maintain scalable data pipelines using AWS analytical services, enabling efficient data processing and analytics.

Key Responsibilities:
- Develop ETL pipelines using AWS Glue and EMR with PySpark/Scala (extensive hands-on experience required).
- Utilize AWS services (S3, Glue, Lambda, EMR, Step Functions) for data solutions.
- Design scalable data models for analytics and reporting.
- Implement data validation, quality, and governance practices.
- Optimize Spark jobs for cost and performance efficiency.
- Automate ETL workflows with AWS Step Functions and Lambda.
- Collaborate with data scientists and analysts on data needs.
- Maintain documentation for data architecture and pipelines.
- Experience with open-source big data table formats such as Apache Iceberg, Delta Lake, or Apache Hudi.
- Experience provisioning AWS data analytical resources with Terraform is desirable.

Must-Have Skills: AWS (S3, Glue, Lambda, EMR), PySpark or Scala, SQL, ETL development.
Good-to-Have Skills: Snowflake, Cloudera Hadoop (HDFS, Hive, Impala), Iceberg.
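The pipeline work described in this posting is, roughly, reading raw data from S3, transforming it with Spark, and writing curated output back to S3. Below is a minimal PySpark sketch of that pattern; the bucket paths, table, and column names are hypothetical, and a real AWS Glue job would typically also use the GlueContext and job-bookmark APIs rather than a bare SparkSession.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical S3 locations; replace with real buckets/prefixes.
RAW_PATH = "s3://example-raw-bucket/orders/"
CURATED_PATH = "s3://example-curated-bucket/orders_daily/"

spark = SparkSession.builder.appName("orders-etl-sketch").getOrCreate()

# Extract: read raw JSON events from S3.
raw = spark.read.json(RAW_PATH)

# Transform: basic validation plus a daily aggregate.
valid = raw.filter(F.col("order_id").isNotNull() & (F.col("amount") > 0))
daily = (
    valid.withColumn("order_date", F.to_date("order_ts"))
         .groupBy("order_date", "customer_id")
         .agg(F.sum("amount").alias("total_amount"),
              F.count("*").alias("order_count"))
)

# Load: write partitioned Parquet for downstream analytics.
daily.write.mode("overwrite").partitionBy("order_date").parquet(CURATED_PATH)

spark.stop()
```

The same extract-transform-load shape applies whether the job runs on Glue or EMR; only the entry point and resource configuration differ.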

Posted 1 month ago

Apply

12 - 16 years

35 - 40 Lacs

Kolkata

Work from Office


As an AWS Data Engineer at our organization, you will play a crucial role in the design, development, and maintenance of our data infrastructure. Your work will empower data-driven decision-making and contribute to the success of our data initiatives. You will design and maintain scalable data pipelines using AWS analytical services, enabling efficient data processing and analytics.

Key Responsibilities:
- Develop ETL pipelines using AWS Glue and EMR with PySpark/Scala (extensive hands-on experience required).
- Utilize AWS services (S3, Glue, Lambda, EMR, Step Functions) for data solutions.
- Design scalable data models for analytics and reporting.
- Implement data validation, quality, and governance practices.
- Optimize Spark jobs for cost and performance efficiency.
- Automate ETL workflows with AWS Step Functions and Lambda.
- Collaborate with data scientists and analysts on data needs.
- Maintain documentation for data architecture and pipelines.
- Experience with open-source big data table formats such as Apache Iceberg, Delta Lake, or Apache Hudi.
- Experience provisioning AWS data analytical resources with Terraform is desirable.

Must-Have Skills: AWS (S3, Glue, Lambda, EMR), PySpark or Scala, SQL, ETL development.
Good-to-Have Skills: Snowflake, Cloudera Hadoop (HDFS, Hive, Impala), Iceberg.

Posted 1 month ago

Apply

5 - 7 years

0 - 0 Lacs

Thiruvananthapuram

Work from Office


Job Summary: We are seeking a highly skilled SAP BODS Data Engineer with strong expertise in ETL development and Enterprise Data Warehousing (EDW). The ideal candidate will have a deep understanding of SAP BusinessObjects Data Services (BODS) and will be responsible for designing, developing, and maintaining robust data integration solutions.

Key Responsibilities:
- Design, develop, and implement efficient ETL solutions using SAP BODS.
- Build and optimize SAP BODS jobs, including job design, data flows, scripting, and debugging.
- Develop and maintain scalable data extraction, transformation, and loading (ETL) processes from diverse data sources.
- Create and manage data integration workflows to ensure high performance and scalability.
- Collaborate closely with data architects, analysts, and business stakeholders to deliver accurate and timely data solutions.
- Ensure data quality and consistency across different systems and platforms.
- Troubleshoot and resolve data-related issues in a timely manner.
- Document all ETL processes and maintain technical documentation.

Required Skills & Qualifications:
- 3+ years of hands-on experience with ETL development using SAP BODS.
- Strong proficiency in SAP BODS job design, data flow creation, scripting, and debugging.
- Solid understanding of data integration, ETL concepts, and data warehousing principles.
- Proficiency in SQL for data querying, data manipulation, and performance tuning.
- Familiarity with data modeling concepts and major database systems (e.g., Oracle, SQL Server, SAP HANA).
- Excellent problem-solving skills and keen attention to detail.
- Strong communication and interpersonal skills to facilitate effective collaboration.
- Ability to work independently, prioritize tasks, and manage multiple tasks in a dynamic environment.

Required Skills: SAP, EDW, ETL

Posted 1 month ago

Apply

8 - 13 years

4 - 8 Lacs

Gurugram

Work from Office


Remote type: On-site | Location: Gurugram, HR | Time type: Full time | Posted: 5 Days Ago | Job requisition ID: REQ401285

Senior ETL Developer

What this job involves: Are you comfortable working independently without close supervision? We offer an exciting role where you can enhance your skills and play a crucial part in delivering consistent, high-quality administrative and support tasks for the EPM team.

The Senior ETL Developer/SSIS Administrator will lead the design of logical data models for JLL's EPM Landscape system. This role is responsible for implementing physical database structures and constructs, as well as developing operational data stores and data marts. The role entails developing and fine-tuning SQL procedures to enhance system performance. The individual will support functional tasks of medium-to-high technological complexity and build SSIS packages and transformations to meet business needs. This position contributes to maximizing the value of SSIS within the organization and collaborates with cross-functional teams to align data integration solutions with business objectives.

Responsibilities - The Senior ETL Developer will be responsible for:
- Gathering requirements and processing information to design data transformations that effectively meet end-user needs.
- Designing, developing, and testing ETL processes for large-scale data extraction, transformation, and loading from source systems to the Data Warehouse and Data Marts.
- Creating SSIS packages to clean, prepare, and load data into the data warehouse and transfer data to EPM, ensuring data integrity and consistency throughout the ETL process.
- Monitoring and optimizing ETL performance and data quality.
- Creating routines for importing data using CSV files.
- Mapping disparate data sources (relational DBs, text files, Excel files) onto the target schema.
- Scheduling packages to extract data at specific time intervals.
- Planning, coordinating, and supporting ETL processes, including architecting table structures, building ETL processes, documentation, and long-term preparedness.
- Extracting complex data from multiple data sources into usable and meaningful reports and analyses by implementing PL/SQL queries.
- Ensuring that the data architecture is scalable and maintainable.
- Troubleshooting data integration and data quality issues and bugs, analyzing reasons for failure, implementing optimal solutions, and revising procedures and documentation as needed.
- Utilizing hands-on SQL features: stored procedures, indexes, partitioning, bulk loads, DB configuration, security/roles, and maintenance.
- Developing queries and procedures, creating custom reports/views, and assisting in debugging.
The developer will also be responsible for designing SSIS packages and ensuring their stability, reliability, and performance.

Sounds like you? To apply, you need to have:
- 8+ years of experience in Microsoft SQL Server Management Studio administration and development
- Bachelor's degree or equivalent
- Competency in Microsoft Office and Smart View
- Experience with Microsoft SQL databases and SSIS/SSAS development
- Experience working with Microsoft SSIS to create and deploy packages for ETL processes
- Experience in writing and troubleshooting SQL statements, creating stored procedures, views, and SQL functions
- Experience with data analytics and development
- Strong SQL coding experience with performance optimization for data queries
- Experience creating and supporting SSAS cubes
- Knowledge of Microsoft PowerShell and batch scripting
- Good to have: Power BI development experience
- Strong critical and analytical thinking and problem-solving skills
- Ability to multi-task and thrive in a fast-paced, rapidly changing, and complex environment
- Good written and verbal communication skills
- Ability to learn new skills quickly to make a measurable difference
- Strong team player with proven success in contributing to a team-oriented environment
- Excellent communication (written and oral) and interpersonal skills
- Excellent troubleshooting and problem resolution skills

What we can do for you: At JLL, we make sure that you become the best version of yourself by helping you realize your full potential in an entrepreneurial and inclusive work environment. We will empower your ambitions through our dedicated Total Rewards Program, competitive pay, and benefits package. Apply today!

Location: On-site, Gurugram, HR. Scheduled Weekly Hours: 40.

Jones Lang LaSalle (JLL) is an Equal Opportunity Employer and is committed to working with and providing reasonable accommodations to individuals with disabilities. If you need a reasonable accommodation because of a disability for any part of the employment process, including the online application and/or overall selection process, you may contact us at . This email is only to request an accommodation. Please direct any other general recruiting inquiries to our page > I want to work for JLL.
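Outside of the SSIS designer itself, much of the day-to-day work listed here (stored procedures, bulk loads, scheduled extracts) is often scripted. The sketch below is not SSIS, just a generic Python illustration of invoking a SQL Server stored procedure via pyodbc; the connection string, database, and procedure name are hypothetical.

```python
import pyodbc

# Hypothetical connection details; adjust driver/server/database as needed.
CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=etl-sql01;DATABASE=DataWarehouse;Trusted_Connection=yes;"
)

def run_nightly_load(batch_date: str) -> int:
    """Execute a (hypothetical) warehouse load procedure and return rows loaded."""
    with pyodbc.connect(CONN_STR) as conn:
        cur = conn.cursor()
        # EXEC with a parameter; the procedure name is illustrative only.
        cur.execute("EXEC dbo.usp_LoadDailyFacts @BatchDate = ?", batch_date)
        row = cur.fetchone()  # assumes the procedure SELECTs a row count back
        conn.commit()
        return row[0] if row else 0

if __name__ == "__main__":
    print("Rows loaded:", run_nightly_load("2024-01-31"))
```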

Posted 1 month ago

Apply

5 - 10 years

4 - 7 Lacs

Lucknow

Work from Office


Work alongside our multidisciplinary team of developers and designers to create the next generation of enterprise software. Support the entire application lifecycle (concept, design, development, testing, release, and support), with a special interest in SQL/query and/or MDX/XMLA/OLAP data sources. You will be responsible for end-to-end product development of Java-based services, work with other developers to implement best practices, introduce new tools, improve processes, and stay up to date with new technology trends.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 15+ years of software development in a professional environment
- Proficiency in Java
- Experience integrating services with relational databases and/or OLAP data sources
- Knowledge and experience in relational databases, OLAP, and/or query planning
- Strong working knowledge of SQL and/or MDX/XMLA
- Knowledge and experience creating applications on cloud platforms (Kubernetes, Red Hat OCP)
- Exposure to agile development and continuous integration/continuous delivery (CI/CD) environments with tools such as GitHub, JIRA, Jenkins, etc.
- Other tools: SSH clients, Docker
- Excellent interpersonal and communication skills, with the ability to effectively articulate technical challenges and devise solutions
- Ability to work independently in a large matrix environment

Preferred technical and professional experience:
- The position requires a back-end developer with strong Java skills
- Experience integrating Business Intelligence tools with relational data sources
- Experience integrating Business Intelligence tools with OLAP technologies such as SAP/BW and SAP/BW4HANA
- Experience defining relational or OLAP test assets (test suites, automated tests) to ensure high code coverage and tight integration with Business Intelligence tools
- Full lifecycle of SAP/BW and BW4HANA assets: cube upgrades and server support, including administering, maintaining, and upgrading using current SAP tooling

Posted 1 month ago

Apply

12 - 15 years

15 - 17 Lacs

Bengaluru

Work from Office


About The Role

Overview - Technology for today and tomorrow: The Boeing India Engineering & Technology Center (BIETC) is a 5500+ engineering workforce that contributes to global aerospace growth. Our engineers deliver cutting-edge R&D, innovation, and high-quality engineering work in global markets, and leverage new-age technologies such as AI/ML, IIoT, Cloud, Model-Based Engineering, and Additive Manufacturing, shaping the future of aerospace.

People-driven culture: At Boeing, we believe creativity and innovation thrive when every employee is trusted, empowered, and has the flexibility to choose, grow, learn, and explore. We offer variable arrangements depending upon business and customer needs, and professional pursuits that offer greater flexibility in the way our people work. We also believe that collaboration, frequent team engagements, and face-to-face meetings bring together different perspectives and thoughts, enabling every voice to be heard and every perspective to be respected. No matter where or how our teammates work, we are committed to positively shaping people's careers and being thoughtful about employee wellbeing.

The Boeing India Software Engineering team is currently looking for a Lead Software Engineer (Developer) to join the team in Bengaluru, KA. As an ETL Developer, you will be part of the Application Solutions team, which develops software applications and digital products that create direct value for its customers. We provide revamped work environments focused on delivering data-driven solutions at a rapidly increased pace over traditional development. Be a part of our passionate and motivated team, excited to use the latest software technologies for modern web and mobile application development. Through our products we deliver innovative solutions to our global customer base at an accelerated pace.

Position Responsibilities:
- Perform data mining and collection procedures.
- Ensure data quality and integrity; interpret and analyze data problems.
- Visualize data and create reports.
- Experiment with new models and techniques.
- Determine how data can be used to achieve customer/user goals.
- Design data modeling processes.
- Create algorithms and predictive models for analysis; enable development of prediction engines, pattern detection analysis, optimization algorithms, etc.
- Develop guidance for analytics-based wireframes.
- Organize and conduct data assessments.
- Discover insights from structured and unstructured data.
- Estimate user stories/features (story point estimation) and tasks in hours with the required level of accuracy and commit them as part of Sprint Planning.
- Contribute to backlog grooming meetings by promptly asking relevant questions to ensure requirements achieve the right level of DOR.
- Raise any impediments/risks (technical/operational/personal) and approach the Scrum Master/Technical Architect/PO accordingly to arrive at a solution.
- Update the status and the remaining effort for tasks on a daily basis.
- Ensure change requests are treated correctly and tracked in the system, impact analysis is done, and risks/timelines are appropriately communicated.
- Hands-on experience in understanding aerospace domain-specific data.
- Coordinate with data scientists in data preparation and exploration, making data ready.
- Clear understanding of defining data products and monetizing them.
- Experience in building self-service capabilities for users.
- Build quality checks across the data lineage and be responsible for designing and implementing different data patterns.
- Influence different stakeholders for funding and build the vision of the product in terms of usage, productivity, and scalability of the solutions.
- Build impactful, outcome-based solutions/products.

Basic Qualifications (Required Skills/Experience):
- Bachelor's or Master's degree.
- 12-15 years of experience as a data engineer.
- Expertise in SQL and Python; knowledge of Java, Oracle, R, data modeling, and Power BI.
- Experience in understanding and interacting with multiple data formats.
- Ability to rapidly learn and understand software from source code.
- Expertise in understanding, analyzing, and optimizing large, complicated SQL statements.
- Strong knowledge and experience in SQL Server, database design, and ETL queries.
- Develop software models to simulate real-world problems to help operational leaders understand which variables to focus on.
- Proficiency in streamlining and optimizing databases for efficient and consistent data consumption.
- Strong understanding of data warehouse concepts, data lakes, and data mesh.
- Familiarity with ETL tools and data ingestion patterns.
- Hands-on experience in building data pipelines using GCP.
- Hands-on experience in writing complex SQL (NoSQL is a big plus).
- Hands-on experience with data pipeline orchestration tools such as Airflow/GCP Composer.
- Hands-on experience with data modelling.
- Experience in leading teams with diversity.
- Experience in performance tuning of large data warehouses/data lakes.
- Exposure to prompt engineering, LLMs, and vector DBs.
- Python, SQL, and PySpark; Spark ecosystem (Spark Core, Spark Streaming, Spark SQL) / Databricks.
- Azure (ADF, ADB, Logic Apps, Azure SQL Database, Azure Key Vault, ADLS, Synapse).

Preferred Qualifications (Required Skills/Experience):
- Pub/Sub, Terraform
- Deep Learning (TensorFlow), Machine Learning, NLP
- Time series; BI/visualization tools: Power BI and Tableau; languages: R/Python

Typical Education & Experience: Education/experience typically acquired through advanced education (e.g., Bachelor's) and typically 12 to 15 years' related work experience, or an equivalent combination of education and experience (e.g., Master's + 11 years of related work experience).

Relocation: This position offers relocation within India, based on candidate eligibility.
Export Control Requirements: This is not an Export Control position.
Education: Bachelor's Degree or equivalent required.
Visa Sponsorship: Employer will not sponsor applicants for employment visa status.
Shift: Not a Shift Worker (India).

Posted 1 month ago

Apply

7 - 12 years

15 - 30 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office


Warm greetings from SP Staffing Services Pvt Ltd!

Experience: 4-13 yrs
Work Location: Bangalore/Hyderabad/Chennai/Pune/Kolkata

Responsibilities:
- Participates in ETL design of new or changing mappings and workflows with the team and prepares technical specifications.
- Creates ETL mappings, mapplets, workflows, and worklets using Informatica PowerCenter 10.x and prepares corresponding documentation.
- Designs and builds integrations supporting standard data warehousing objects (type-2 dimensions, aggregations, star schema, etc.).
- Performs source system analysis as required.
- Works with DBAs and Data Architects to plan and implement an appropriate data partitioning strategy in the Enterprise Data Warehouse.
- Implements versioning of the ETL repository and supporting code as necessary.
- Develops stored procedures, database triggers, and SQL queries where needed.
- Implements best practices and tunes SQL code for optimization.
- Loads data from SF PowerExchange to a relational database using Informatica.
- Works with XML, the XML parser, Java, and HTTP transformations within Informatica.
- Works with Informatica Data Quality (Analyst and Developer).
- Primary skill: Informatica PowerCenter.

Interested candidates, kindly share your updated resume to ramya.r@spstaffing.in or contact 8667784354 (WhatsApp: 9597467601) to proceed further.

Posted 1 month ago

Apply

5 - 8 years

8 - 18 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid


Dear Candidate,

Greetings from LTIMindtree! This is a wonderful job opportunity for an ETL Developer. Location: Hyderabad.

I saw your profile on Naukri and was really impressed by your experience with advanced SQL. As you know, LTIMindtree is growing rapidly, and we would like you to be a part of this journey. We are currently looking for someone like you to join our team.

Posted 1 month ago

Apply

5 - 10 years

5 - 15 Lacs

Mumbai Suburban, Navi Mumbai, Mumbai (All Areas)

Work from Office


Solid SQL skills, experience writing scripts for automation using Python or shell script, and hands-on experience with an ETL tool such as SSIS or Informatica.

Role & responsibilities:
• Develop, design, and maintain interactive and visually compelling reports, dashboards, and analytics solutions using tools such as Power BI and AWS QuickSight.
• Collaborate with business stakeholders, data engineers, and analysts to gather requirements, understand data sources, and ensure alignment of reporting solutions with business needs.
• Write simple to complex SQL queries to extract, transform, and manipulate data from various sources, ensuring accuracy, efficiency, and performance.
• Identify and troubleshoot data quality and integration issues, proposing effective solutions to enhance data accuracy and reliability within reporting solutions.
• Stay up to date with industry trends and emerging technologies related to reporting, analytics, and data visualization to continually improve and innovate reporting practices.
• Work closely with the development team to integrate reporting solutions into existing applications or systems.
• Perform data analysis to identify trends, patterns, and insights, providing valuable information to business stakeholders.
• Collaborate with the team to document data models, report specifications, and technical processes.
• Participate actively in team meetings, discussions, and knowledge-sharing sessions, contributing to the growth of the team's capabilities.

Preferred candidate profile:
• Strong proficiency in SQL, with the ability to write complex queries and optimize query performance.
• Extensive experience with data visualization tools such as Tableau and Power BI; familiarity with AWS QuickSight is a plus.
• Solid understanding of data warehousing concepts, data modeling, and ETL processes.
• Exceptional problem-solving skills and the ability to find innovative solutions to technical challenges.
• English B2 level or higher.
• Basic to intermediate Python and machine learning knowledge is a plus.
• Knowledge of AI is a plus.
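A typical automation script for this kind of reporting role runs a SQL extract and hands the result to a BI tool or downstream process. The sketch below is a generic illustration using Python's standard library, with SQLite as a stand-in database; the same pattern applies to SQL Server, Redshift, or any other source given the appropriate driver. The table, query, and file names are hypothetical.

```python
import csv
import sqlite3

DB_PATH = "reporting.db"          # stand-in for the real warehouse connection
OUTPUT_CSV = "sales_summary.csv"  # hypothetical extract consumed by Power BI

EXTRACT_SQL = """
SELECT region,
       strftime('%Y-%m', order_date) AS month,
       SUM(amount) AS revenue
FROM sales
GROUP BY region, month
ORDER BY region, month;
"""

def export_extract() -> int:
    """Run the extract query and write the result to CSV; return the row count."""
    with sqlite3.connect(DB_PATH) as conn:
        cur = conn.execute(EXTRACT_SQL)
        headers = [col[0] for col in cur.description]
        rows = cur.fetchall()

    with open(OUTPUT_CSV, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(headers)
        writer.writerows(rows)
    return len(rows)

if __name__ == "__main__":
    print(f"Wrote {export_extract()} rows to {OUTPUT_CSV}")
```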

Posted 1 month ago

Apply

6 - 11 years

30 - 35 Lacs

Indore, Hyderabad, Delhi / NCR

Work from Office


Support enhancements to the MDM platform; track system performance; troubleshoot issues; resolve production issues.

Required candidate profile: 5+ years in Python and advanced SQL, including profiling and refactoring; experience with REST APIs; hands-on Azure Databricks and ADF; experience with Markit EDM or Semarchy.

Posted 1 month ago

Apply

2 - 5 years

4 - 8 Lacs

Pune

Work from Office


About The Role

Process Manager - AWS Data Engineer | Mumbai/Pune | Full-time (FT) | Technology Services
Shift Timings: EMEA (1pm-9pm) | Management Level: PM | Travel Requirements: NA

The ideal candidate must possess in-depth functional knowledge of the process area and apply it to operational scenarios to provide effective solutions. The role requires identifying discrepancies and proposing optimal solutions using a logical, systematic, and sequential methodology. It is vital to be open-minded towards inputs and views from team members and to effectively lead, control, and motivate groups towards company objectives. Additionally, the candidate must be self-directed and proactive, seizing every opportunity to meet internal and external customer needs and achieve customer satisfaction by effectively auditing processes, implementing best practices and process improvements, and utilizing the frameworks and tools available. Goals and thoughts must be clearly and concisely articulated and conveyed, verbally and in writing, to clients, colleagues, subordinates, and supervisors.

Process Manager roles and responsibilities:
- Understand client requirements and provide effective and efficient solutions in AWS using Snowflake.
- Assemble large, complex sets of data that meet non-functional and functional business requirements.
- Architect and design with Snowflake/Redshift to create data pipelines and consolidate data in the data lake and data warehouse.
- Demonstrated strength and experience in data modeling, ETL development, and data warehousing concepts.
- Understand data pipelines and modern ways of automating them using cloud-based tooling.
- Test and clearly document implementations so others can easily understand the requirements, implementation, and test conditions.
- Perform data quality testing and assurance as part of designing, building, and implementing scalable data solutions in SQL.

Technical and Functional Skills:
- AWS Services: Strong experience with AWS data services such as S3, EC2, Redshift, Glue, Athena, EMR, etc.
- Programming Languages: Proficiency in languages commonly used in data engineering such as Python, SQL, Scala, or Java.
- Data Warehousing: Experience in designing, implementing, and optimizing data warehouse solutions on Snowflake/Amazon Redshift.
- ETL Tools: Familiarity with ETL tools and frameworks (e.g., Apache Airflow, AWS Glue) for building and managing data pipelines.
- Database Management: Knowledge of database management systems (e.g., PostgreSQL, MySQL, Amazon Redshift) and data lake concepts.
- Big Data Technologies: Understanding of big data technologies such as Hadoop, Spark, Kafka, etc., and their integration with AWS.
- Version Control: Proficiency in version control tools like Git for managing code and infrastructure as code (e.g., CloudFormation, Terraform).
- Problem-solving Skills: Ability to analyze complex technical problems and propose effective solutions.
- Communication Skills: Strong verbal and written communication skills for documenting processes and collaborating with team members and stakeholders.
- Education and Experience: Typically, a bachelor's degree in Computer Science, Engineering, or a related field is required, along with 5+ years of experience in data engineering and AWS cloud environments.

About eClerx: eClerx is a global leader in productized services, bringing together people, technology, and domain expertise to amplify business results. Our mission is to set the benchmark for client service and success in our industry. Our vision is to be the innovation partner of choice for technology, data analytics, and process management services. Since our inception in 2000, we've partnered with top companies across various industries, including financial services, telecommunications, retail, and high-tech. Our innovative solutions and domain expertise help businesses optimize operations, improve efficiency, and drive growth. With over 18,000 employees worldwide, eClerx is dedicated to delivering excellence through smart automation and data-driven insights. At eClerx, we believe in nurturing talent and providing hands-on experience.

About eClerx Technology: eClerx's Technology Group collaboratively delivers Analytics, RPA, AI, and Machine Learning digital technologies that enable our consultants to help businesses thrive in a connected world. Our consultants and specialists partner with our global clients and colleagues to build and implement digital solutions through a broad spectrum of activities. To know more about us, visit https://eclerx.com

eClerx is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability or protected veteran status, or any other legally protected basis, per applicable law.

Posted 1 month ago

Apply

7 - 12 years

30 - 40 Lacs

Pune, Bengaluru, Delhi / NCR

Hybrid


Support enhancements to the MDM platform; develop pipelines using Snowflake, Python, SQL, and Airflow; track system performance; troubleshoot issues; resolve production issues.

Required candidate profile: 5+ years of hands-on, expert-level experience with Snowflake, Python, and orchestration tools like Airflow; good understanding of the investment domain; experience with dbt; cloud experience (AWS, Azure); DevOps.
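For the Snowflake/Python/Airflow pipeline work mentioned above, a minimal Airflow (2.4+ style) DAG illustrating extract-load-transform chaining might look like the sketch below. The task bodies, DAG name, and schedule are placeholders rather than a working production pipeline; a real implementation would use Snowflake/dbt operators or hooks instead of print statements.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_to_stage(**context):
    # Placeholder: pull data from the source system into a staging area.
    print("extracting batch for", context["ds"])

def load_to_snowflake(**context):
    # Placeholder: COPY the staged files into Snowflake (e.g. via a connector/hook).
    print("loading batch for", context["ds"])

def run_transformations(**context):
    # Placeholder: trigger downstream transformations (e.g. dbt run).
    print("transforming batch for", context["ds"])

with DAG(
    dag_id="mdm_daily_pipeline_sketch",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_to_stage)
    load = PythonOperator(task_id="load", python_callable=load_to_snowflake)
    transform = PythonOperator(task_id="transform", python_callable=run_transformations)

    # Linear dependency: extract, then load, then transform.
    extract >> load >> transform
```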

Posted 1 month ago

Apply

1 - 3 years

1 - 5 Lacs

Pune

Work from Office


Job Title: Data Analyst (SQL & ETL Process)
Experience: 1 to 3 Years
Location: Pune
Industry: Investment Banking / Capital Markets
Employment Type: Full Time

Role & responsibilities:
- Collaborate with business and technical teams to gather and analyze requirements for models and data jobs.
- Develop detailed technical specifications and data mappings for data models and system interfaces.
- Design and develop efficient SQL queries; expertise in complex SQL, PL/SQL, or T-SQL is essential.
- Optimize and tune SQL for high performance and large datasets.
- Build and maintain dashboards and reports using Power BI and/or Qlik.
- Work with modern cloud databases like Snowflake and handle semi-structured data formats including JSON and XML.
- Support and implement API-based data exchange; familiarity with XML, XSD, and REST APIs is a plus.
- Develop conceptual, logical, and physical data models; strong understanding of ER modeling, dimensional modeling, and granularity.
- Collaborate in cloud-based environments, primarily using Azure and Snowflake.

Key Skills: SQL, PL/SQL, T-SQL; SQL performance tuning; Power BI, Qlik; data modeling (ER, dimensional); JSON, XML, XSD; API integration (preferred); Azure, Snowflake (cloud platforms).

Preferred Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field. Experience working in investment banking or capital markets domains is highly preferred. Excellent problem-solving skills and ability to work independently in a fast-paced environment.
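Since the role above calls for handling semi-structured JSON alongside relational data, the short sketch below shows one common approach: flattening nested JSON into a tabular frame with pandas before loading it into a warehouse table. The sample payload and field names are invented purely for illustration.

```python
import pandas as pd

# Hypothetical trade payloads as they might arrive from an API feed.
payloads = [
    {"trade_id": "T-1001",
     "instrument": {"isin": "US0378331005", "type": "EQ"},
     "legs": [{"side": "BUY", "qty": 100}, {"side": "SELL", "qty": 40}]},
    {"trade_id": "T-1002",
     "instrument": {"isin": "DE0005557508", "type": "EQ"},
     "legs": [{"side": "BUY", "qty": 250}]},
]

# Flatten: one row per leg, with trade- and instrument-level fields repeated.
flat = pd.json_normalize(
    payloads,
    record_path="legs",
    meta=["trade_id", ["instrument", "isin"], ["instrument", "type"]],
)

print(flat)
# Columns: side, qty, trade_id, instrument.isin, instrument.type
# From here the frame can be written to Snowflake/Azure SQL with the usual loaders.
```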

Posted 1 month ago

Apply

15 - 20 years

3 - 7 Lacs

Bengaluru

Work from Office


Project Role: Data Management Practitioner
Project Role Description: Maintain the quality and compliance of an organization's data assets. Design and implement data strategies, ensuring data integrity and enforcing governance policies. Establish protocols to handle data, safeguard sensitive information, and optimize data usage within the organization. Design and advise on data quality rules and set up effective data compliance policies.
Must-have skills: Teradata Vantage
Good-to-have skills: Data Architecture Principles, Teradata BI, Amazon Web Services (AWS)
Minimum 15 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Data Management Practitioner, you will be responsible for maintaining the quality and compliance of an organization's data assets. You will design and implement data strategies, ensure data integrity, enforce governance policies, and optimize data usage within the organization.

Roles & Responsibilities:
- Expected to be an SME with deep knowledge and experience.
- Should have influencing and advisory skills.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Design and implement data quality rules.
- Advise on data compliance policies.
- Develop protocols to handle and safeguard sensitive data.

Professional & Technical Skills:
- Must-have skills: Proficiency in Teradata Vantage.
- Good-to-have skills: Experience with Data Architecture Principles, Amazon Web Services (AWS), Teradata BI.
- Strong understanding of data management principles.
- Experience in implementing data governance policies.
- Knowledge of data integrity and compliance standards.

Additional Information: The candidate should have a minimum of 15 years of experience in Teradata Vantage. This position is based at our Bengaluru office. 15 years of full-time education is required.

Posted 1 month ago

Apply

10 - 17 years

25 - 35 Lacs

Mumbai, Bengaluru

Hybrid


Roles and Responsibilities:
1. Hands-on experience with Azure Data Services, Azure Data Factory (ADF), and Azure Synapse.
2. Expertise in Python.
3. Experience with Azure DevOps pipelines.
4. ETL development.

Posted 2 months ago

Apply

5 - 10 years

6 - 16 Lacs

Bengaluru

Hybrid


Job description: Hiring an ETL Developer with an experience range of 5 years and above.
Mandatory Skills: ETL development, IBM InfoSphere DataStage
Education: BE/B.Tech/MCA/M.Tech/MSc./MS
Location: Bangalore

Responsibilities - A day in the life of an Infoscion:
- As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation, design, and deployment.
- You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs.
- You will create requirement specifications from the business needs, define the to-be processes, and produce detailed functional designs based on requirements.
- You will support configuring solution requirements on the products; understand any issues, diagnose their root cause, seek clarifications, and then identify and shortlist solution alternatives.
- You will also contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers.

If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Posted 2 months ago

Apply

5 - 10 years

1 - 6 Lacs

Ahmedabad, Bengaluru, Kolkata

Hybrid


Job description: Hiring an ETL Developer with an experience range of 5 years and above.
Mandatory Skills: ETL development, IBM InfoSphere DataStage
Education: BE/B.Tech/MCA/M.Tech/MSc./MS
Location: Bangalore

Responsibilities - A day in the life of an Infoscion:
- As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation, design, and deployment.
- You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs.
- You will create requirement specifications from the business needs, define the to-be processes, and produce detailed functional designs based on requirements.
- You will support configuring solution requirements on the products; understand any issues, diagnose their root cause, seek clarifications, and then identify and shortlist solution alternatives.
- You will also contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers.

If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Posted 2 months ago

Apply

3 - 6 years

4 - 8 Lacs

Chennai

Work from Office


- Design, implement, and continuously expand data pipelines by performing extraction, transformation, and loading activities.
- Gather requirements and business process knowledge in order to transform the data in a way that is geared towards the needs of end users.
- Maintain and improve already existing processes.
- Ensure that the data architecture is scalable and maintainable.
- Work with the business in designing and delivering correct, high-quality data.
- Investigate data to identify potential issues within ETL pipelines, notify end users, and propose adequate solutions.
- Prepare documentation for further reference.

Posted 2 months ago

Apply

4 - 8 years

15 - 25 Lacs

Bengaluru

Remote


Mandatory Skills: on-premises migration, Microsoft Fabric, Azure Data Factory, creating pipelines in ADS, DBA (optional), Python. Expertise in building and managing ETL processes across Azure and on-prem environments; strong in data integration.

Required candidate profile: Design, develop, and maintain ETL processes to extract, transform, and load data between on-premises SQL Server databases and Azure-based solutions.

Posted 2 months ago

Apply

6 - 9 years

9 - 18 Lacs

Bengaluru

Work from Office


Expertise in SQL, performance tuning, database development, ETL development, and Unix/shell scripting. Design, develop, and optimize database systems. Develop and optimize ETL pipelines using Python, Unix/shell scripting, and other ETL tools. Write SQL queries.

Posted 2 months ago

Apply

2 - 5 years

3 - 8 Lacs

Pune, Sangli

Work from Office


We are seeking a skilled ETL SSIS Developer to design, develop, and maintain robust ETL solutions using Microsoft SQL Server Integration Services (SSIS). The ideal candidate will be responsible for data extraction, transformation, and loading (ETL) processes to ensure seamless data integration and reporting across multiple platforms.

Key Responsibilities:
- Develop, implement, and maintain ETL processes using SSIS and SQL Server.
- Design, build, and optimize data warehouse solutions to support business intelligence and reporting needs.
- Collaborate with business analysts, data architects, and stakeholders to understand data requirements and develop efficient ETL solutions.
- Optimize and tune ETL workflows for performance, scalability, and reliability.
- Ensure data integrity, quality, and security across all ETL processes.
- Debug, troubleshoot, and resolve data inconsistencies, performance issues, and failures in ETL processes.
- Implement logging, monitoring, and error handling for ETL jobs to ensure stability and reliability.
- Document technical solutions, processes, and best practices for ETL development.

Required Skills & Qualifications:
- 2+ years of hands-on experience in ETL development using SSIS.
- Proficiency in SQL Server (T-SQL, stored procedures, functions, triggers, performance tuning, indexing, and query optimization).
- Experience with data warehousing concepts, star schema, and snowflake schema.
- Strong understanding of data modeling, database design, and ETL architecture.
- Experience with job scheduling and automation tools (e.g., SQL Agent, PowerShell, or third-party tools).
- Familiarity with cloud-based ETL solutions (Azure Data Factory, AWS Glue, etc.) is a plus.
- Knowledge of SSRS, Power BI, or other BI tools is an advantage.
- Ability to troubleshoot, debug, and optimize ETL workflows effectively.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration abilities.

Posted 2 months ago

Apply

2 - 4 years

4 - 6 Lacs

Hyderabad

Work from Office


The CDP ETL & Database Engineer will specialize in architecting, designing, and implementing solutions that are sustainable and scalable. The ideal candidate will understand CRM methodologies, have an analytical mindset, and have a background in relational modeling in a hybrid architecture. The candidate will help drive the business towards specific technical initiatives and will work closely with the Solutions Management, Delivery, and Product Engineering teams. The candidate will join a team of developers across the US, India, and Costa Rica.

Responsibilities:
- ETL Development: The CDP ETL & Database Engineer will be responsible for building pipelines to feed downstream data processes. They will be able to analyze data, interpret business requirements, and establish relationships between data sets. The ideal candidate will be familiar with different encoding formats and file layouts such as JSON and XML.
- Implementations & Onboarding: Will work with the team to onboard new clients onto the ZMP/CDP+ platform. The candidate will solidify business requirements, perform ETL file validation, establish users, perform complex aggregations, and syndicate data across platforms. The hands-on engineer will take a test-driven approach towards development and will be able to document processes and workflows.
- Incremental Change Requests: The CDP ETL & Database Engineer will be responsible for analyzing change requests and determining the best approach towards implementation and execution of the request. This requires the engineer to have a deep understanding of the platform's overall architecture. Change requests will be implemented and tested in a development environment to ensure their introduction will not negatively impact downstream processes.
- Change Data Management: The candidate will adhere to change data management procedures and actively participate in CAB meetings where change requests will be presented and approved. Prior to introducing change, the engineer will ensure that processes are running in a development environment. The engineer will be asked to do peer-to-peer code reviews and solution reviews before production code deployment.
- Collaboration & Process Improvement: The engineer will be asked to participate in knowledge-share sessions where they will engage with peers, discuss solutions, best practices, overall approach, and process. The candidate will be able to look for opportunities to streamline processes with an eye towards building a repeatable model to reduce implementation duration.

Job Requirements - The CDP ETL & Database Engineer will be well versed in the following areas:
- Relational data modeling
- ETL and FTP concepts
- Advanced analytics using SQL functions
- Cloud technologies: AWS, Snowflake
- Able to decipher requirements, provide recommendations, and implement solutions within predefined timeframes.
- The ability to work independently, while also contributing in a team setting.
- Able to confidently communicate status, raise exceptions, and voice concerns to their direct manager.
- Participate in internal client project status meetings with the Solution/Delivery management teams.
- When required, collaborate with the Business Solutions Analyst (BSA) to solidify requirements.
- Ability to work in a fast-paced, agile environment, with a sense of urgency when escalated issues arise.
- Strong communication and interpersonal skills; ability to multitask and prioritize workload based on client demand.
- Familiarity with Jira for workflow management and time allocation.
- Familiarity with the Scrum framework: backlog, planning, sprints, story points, retrospectives, etc.

Required Skills:
- ETL: ETL tools such as Talend (preferred, not required); DMExpress (nice to have); Informatica (nice to have)
- Database: hands-on experience with the following database technologies: Snowflake (required); MySQL/PostgreSQL (nice to have); familiarity with NoSQL DB methodologies (nice to have)
- Programming languages: can demonstrate knowledge of any of the following: PL/SQL; JavaScript (strong plus); Python (strong plus); Scala (nice to have)
- AWS: knowledge of the following AWS services: S3, EMR (concepts), EC2 (concepts), Systems Manager / Parameter Store
- Understands JSON data structures and key-value pairs
- Working knowledge of code repositories such as Git, WinCVS, SVN
- Workflow management tools such as Apache Airflow, Kafka, Automic/Appworx; Jira

Minimum Qualifications:
- Bachelor's degree or equivalent
- 2-4 years' experience
- Excellent verbal & written communication skills
- Self-starter, highly motivated
- Analytical mindset
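Given the emphasis above on Snowflake (required) plus SQL-based aggregation, the sketch below shows the general shape of running an aggregation through the Snowflake Python connector. All account, warehouse, schema, and table names are hypothetical, and credentials would normally come from a secrets store (e.g. AWS Systems Manager Parameter Store, as mentioned in the posting) rather than plain environment variables.

```python
import os
import snowflake.connector

# Hypothetical identifiers; in practice these come from a parameter store / secrets manager.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ETL_WH",
    database="CDP",
    schema="ANALYTICS",
)

AGGREGATION_SQL = """
SELECT customer_id,
       COUNT(*)      AS event_count,
       MAX(event_ts) AS last_event_ts
FROM   raw_events
GROUP  BY customer_id
"""

try:
    cur = conn.cursor()
    cur.execute(AGGREGATION_SQL)
    # Preview the first few aggregated rows.
    for customer_id, event_count, last_event_ts in cur.fetchmany(10):
        print(customer_id, event_count, last_event_ts)
finally:
    conn.close()
```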

Posted 2 months ago

Apply

2 - 7 years

4 - 9 Lacs

Coimbatore

Work from Office


Project Role: AI/ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools and Cloud AI services, with a proper cloud or on-prem application pipeline of production-ready quality. Must be able to apply GenAI models as part of the solution. Could also include, but is not limited to, deep learning, neural networks, chatbots, and image processing.
Must-have skills: Google Cloud Machine Learning Services
Good-to-have skills: GCP Dataflow, Google Pub/Sub, Google Dataproc
Minimum 2 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: We are seeking a skilled GCP Data Engineer to join our dynamic team. The ideal candidate will design, build, and maintain scalable data pipelines and solutions on Google Cloud Platform (GCP). This role requires expertise in cloud-based data engineering and hands-on experience with GCP tools and services, ensuring efficient data integration, transformation, and storage for various business use cases.

Roles & Responsibilities:
- Design, develop, and deploy data pipelines using GCP services such as Dataflow, BigQuery, Pub/Sub, and Cloud Storage.
- Optimize and monitor data workflows for performance, scalability, and reliability.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and implement solutions.
- Implement data security and governance measures, ensuring compliance with industry standards.
- Automate data workflows and processes for operational efficiency.
- Troubleshoot and resolve technical issues related to data pipelines and platforms.
- Document technical designs, processes, and best practices to ensure maintainability and knowledge sharing.

Professional & Technical Skills:
a) Must Have:
- Proficiency in GCP tools such as BigQuery, Dataflow, Pub/Sub, Cloud Composer, and Cloud Storage.
- Expertise in SQL and experience with data modeling and query optimization.
- Solid programming skills in Python for data processing and ETL development.
- Experience with CI/CD pipelines and version control systems (e.g., Git).
- Knowledge of data warehousing concepts, ELT/ETL processes, and real-time streaming.
- Strong understanding of data security, encryption, and IAM policies on GCP.
b) Good to Have:
- Experience with Dialogflow or CCAI tools.
- Knowledge of machine learning pipelines and integration with AI/ML services on GCP.
- Certifications such as Google Professional Data Engineer or Google Cloud Architect.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Google Cloud Machine Learning Services, with 3-5 years of overall experience.
- The ideal candidate will possess a strong educational background in computer science, mathematics, or a related field, along with a proven track record of delivering impactful data-driven solutions.

Qualifications: 15 years full-time education
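For the GCP pipeline work described above, a common building block is loading files from Cloud Storage into BigQuery. The sketch below uses the google-cloud-bigquery client library; the project, dataset, table, and bucket names are hypothetical, and a production pipeline would typically define an explicit schema rather than relying on autodetection.

```python
from google.cloud import bigquery

# Hypothetical identifiers; substitute your own project, dataset, and bucket.
TABLE_ID = "example-project.analytics.daily_events"
SOURCE_URI = "gs://example-bucket/events/2024-01-31/*.csv"

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,  # infer the schema for this sketch only
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

# Kick off the load job from Cloud Storage and wait for completion.
load_job = client.load_table_from_uri(SOURCE_URI, TABLE_ID, job_config=job_config)
load_job.result()

table = client.get_table(TABLE_ID)
print(f"Loaded into {TABLE_ID}; table now has {table.num_rows} rows.")
```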

Posted 2 months ago

Apply

5 - 7 years

12 - 18 Lacs

Mumbai

Work from Office


Workday Integration Analyst with experience in ETL, APIs, and data warehousing; managing HR system integrations, troubleshooting issues, and ensuring data accuracy. Skills: EIB, Workday Studio, XML, Java, and Web Services (SOAP, REST, WSDL). Share your CV @ HR@Akriyagroup.com

Posted 2 months ago

Apply

3 - 7 years

15 - 30 Lacs

Pune, Bengaluru, Gurgaon

Hybrid


Salary: 10-30 LPA | Experience: 2-7 years | Location: Gurgaon/Pune/Bangalore/Chennai | Notice period: Immediate to 30 days

Key Responsibilities:
- 2+ years of strong hands-on experience with Ab Initio technology.
- Good knowledge of Ab Initio components such as Reformat, Join, Sort, Rollup, Normalize, Scan, Lookup, MFS, and Ab Initio parallelism, and of products such as Metadata Hub, Conduct>It, Express>It, and Control Center; a clear understanding of concepts like metaprogramming, continuous flows, and PDL is good to have.
- Very good knowledge of data warehousing, SQL, and Unix shell scripting.
- Knowledge of the ETL side of cloud platforms such as AWS or Azure, and of the Hadoop platform, is an added advantage.
- Experience working with banking domain data is an added advantage.
- Excellent technical knowledge in the design, development, and validation of complex ETL features using Ab Initio.
- Excellent knowledge of integration with upstream and downstream processes and systems.
- Ensure compliance with technical standards and processes.
- Ability to engage and collaborate with stakeholders to deliver assigned tasks with defined quality goals.
- Can work independently with minimum supervision and help the development team with technical issues.
- Good communication and analytical skills.

Posted 2 months ago

Apply