
369 DataStage Jobs - Page 12

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4 - 9 years

6 - 14 Lacs

Bengaluru

Work from Office

Naukri logo

About The Role:
- 6+ years of hands-on project experience as a DataStage Developer.
- Strong knowledge of DataStage job analysis and of extracting business logic and rules.
- Strong analytical and problem-solving skills with DataStage and shell scripting.
- Coordinate with ETL developers to help them understand parallel and sequence job logic.
- Experience with CI/CD and DevOps.
- Experience with the software development life cycle (Agile/Scrum).
- Experience with project life cycle activities, including release/deployment on development projects.
- Experience in, and desire to work in, a global delivery environment.
- Strong communication and analytical skills.
- Ability to work on a team in a diverse, multi-stakeholder environment.
- Installation, upgrade, configuration, and troubleshooting/problem resolution of DataStage jobs.
- Support DataStage developers/programmers in implementing systems and resolving issues.
- Sound knowledge of DataStage job analysis, including installation, upgrade, configuration, backup/restore, and recovery processes.
- Demonstrated knowledge of and experience with programming languages, specifically shell scripting.
- Solid interpersonal and business-relationship skills, including the ability to work independently or on a team.
- Sound verbal and written communication skills.

Primary Skills: Good knowledge of monitoring and debugging DataStage jobs; good knowledge of SQL and at least one database; basic knowledge of Unix shell scripting.
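Much of the monitoring and shell-script work described in roles like this comes down to scanning job logs for warnings and aborts. Below is a minimal sketch in Python (the same logic ports directly to grep/awk in a shell script); the log layout shown is hypothetical, since real DataStage log formats depend on how `dsjob`/Director exports are configured at a given site:

```python
# Hypothetical tab-separated excerpt of a DataStage job log export.
LOG = """\
0\tINFO\tStarting Job load_customers
1\tWARNING\tseq_load_customers..xfm: Numeric conversion lost precision
2\tWARNING\tseq_load_customers..lkp: Lookup returned no match for key 1042
3\tFATAL\tseq_load_customers..db: ORA-00942: table or view does not exist
4\tINFO\tJob load_customers aborted
"""

def summarize(log_text):
    """Count log events by severity and collect fatal messages."""
    counts, fatals = {}, []
    for line in log_text.splitlines():
        _, severity, message = line.split("\t", 2)
        counts[severity] = counts.get(severity, 0) + 1
        if severity == "FATAL":
            fatals.append(message)
    return counts, fatals

counts, fatals = summarize(LOG)
print(counts)      # → {'INFO': 2, 'WARNING': 2, 'FATAL': 1}
print(fatals[0])   # the message a support engineer would triage first
```

In practice the same triage is often wired into a cron-driven shell wrapper around the `dsjob` CLI, alerting only when the FATAL count is non-zero.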

Posted 3 months ago

Apply

6 - 10 years

8 - 13 Lacs

Chennai, Hyderabad

Work from Office

What you'll be doing...
You will work in a data product ownership model with the business, solving real-world problems by compiling and analyzing data and helping tell the story behind the numbers. This position offers opportunities to drive better business partnering and insights while developing your data intelligence skill set and leadership as we continue to grow as a world-class organization. You'll be involved in (but not limited to) discovery, planning, integration, modeling, analysis, and reporting that will impact important decisions around the growth and development of the Verizon business.
- Providing subject-matter expertise and ad-hoc analysis, including identifying new revenue streams, improving operational efficiencies, reducing man-hours, and surfacing new metrics, insights, and drivers with the help of supply chain, logistics, transportation, and network end-to-end data operations and data products.
- Ensuring timely and accurate delivery of data intelligence applications for planning, reporting, and analysis for the business.
- Liaising with cross-functional teams and business partners to build a network and acquire advanced business and technical acumen.
- Identifying improvement opportunities and executing projects, which may include leveraging digital tools for cloud technologies, data workflow creation, system integration, automation tools, and dashboards.
- Playing a crucial role in defining the data architecture framework, standards, and principles, including modeling, metadata, security, and reference data.

What we're looking for...
You'll need to have:
- Bachelor's degree or four or more years of work experience.
- Six or more years of relevant work experience.
- Experience in data lifecycle management.
- Proven track record in designing and building infrastructure for data extraction, preparation, and loading from a variety of sources using technologies such as SQL, NoSQL, and big data.
- Experience identifying ways to improve data reliability, efficiency, and quality through various data solution techniques.
- Experience with Google Cloud Platform technologies such as BigQuery, Composer, and Dataflow.
- Experience or transferable skills leveraging digital tools such as Tableau, Qlik, Looker, ThoughtSpot, Alteryx, SQL, Python, or R.
- Expert knowledge of ETL processes and reporting tools.
- Experience in dashboard development using Looker, Tableau, or ThoughtSpot.
- Experience analyzing large amounts of information to discover trends and patterns.
- Experience with Microsoft Office Suite and Google Suite.

Even better if you have one or more of the following:
- Master's degree or direct work experience in data analytics, supply chain, or the telecom industry.
- Expertise in writing complex SQL queries and scripts using databases/tools such as Oracle, SQL Server, Google BigQuery, DataStage, Python, or Snowflake, and pulling data from SQL/EDW data warehouses.
- Master knowledge of common business and cost drivers, operational statement analysis, and storytelling.
- Industry-standard data automation and proactive alerting skills.
- Excellent communication skills and attention to detail.
- Proficiency with Google Suite.

Posted 3 months ago

Apply

8 - 13 years

15 - 25 Lacs

Pune, Bengaluru, Hyderabad

Hybrid

Role & responsibilities:
1. Data analysis and data transformation on the back end of an application.
2. Tuning dataset performance according to consumption methods.
3. Data modeling exposure with strong SQL knowledge.
4. Strong hands-on development, testing, and support of IBM DataStage ETL pipelines.
5. Good programming/scripting skills in UNIX (bash) for development.
6. Good understanding of data warehouse and data integration techniques.
7. Application performance tuning to optimize resource and time utilization.
8. Creating data integration and technical solutions for customer applications.
9. Detail-oriented with good problem-solving, organizational, and analysis skills; highly motivated and adaptive, with the ability to grasp things quickly.
10. Ability to work effectively and efficiently both in a team and individually, with excellent interpersonal, technical, and communication skills.

Preferred candidate profile:
1. Azure cloud knowledge or experience.
2. Python scripting and data pipeline processing.
3. Experience in banking/financial systems (Risk & Finance).
4. Knowledge of or experience in Hadoop and Hive data processing.

Posted 3 months ago

Apply

6 - 10 years

8 - 13 Lacs

Mumbai

Work from Office

Role & responsibilities: 6+ years of experience with a proven track record of:
- Strong SQL background
- Strong DataStage tool experience
- Airflow, GitHub, GitHub Actions, or Jenkins
- Teradata or similar database experience
- Python and Spark background is a plus

Posted 3 months ago

Apply

5 - 10 years

7 - 13 Lacs

Pune

Hybrid

This is Abirami from SRS Infoway. We are recruiting for our client company Infosys on a contract-to-hire basis for the position of API Developer. If you are interested, please email your updated resume to abirami.b@srsinfoway.com. Thanks & Regards, Abirami (abirami.b@srsinfoway.com), SRS Infoway.

Posted 3 months ago

Apply

7 - 10 years

7 - 17 Lacs

Chennai

Work from Office

Experience: 7+ years. Job location: Chennai/Hyderabad/Bangalore/Pune. Skills required: development experience in DataStage; experience leading a team.

Posted 3 months ago

Apply

7 - 10 years

7 - 17 Lacs

Bengaluru

Work from Office

Experience: 7+ years. Job location: Chennai/Hyderabad/Bangalore/Pune. Skills required: development experience in DataStage; experience leading a team.

Posted 3 months ago

Apply

7 - 10 years

7 - 17 Lacs

Hyderabad

Work from Office

Experience: 7+ years. Job location: Chennai/Hyderabad/Bangalore/Pune. Skills required: development experience in DataStage; experience leading a team.

Posted 3 months ago

Apply

6 - 10 years

8 - 13 Lacs

Bengaluru

Work from Office

Role & responsibilities: 6+ years of experience with a proven track record of:
- Strong SQL background
- Strong DataStage tool experience
- Airflow, GitHub, GitHub Actions, or Jenkins
- Teradata or similar database experience
- Python and Spark background is a plus

Posted 3 months ago

Apply

6 - 10 years

8 - 13 Lacs

Hyderabad

Work from Office

Role & responsibilities: 6+ years of experience with a proven track record of:
- Strong SQL background
- Strong DataStage tool experience
- Airflow, GitHub, GitHub Actions, or Jenkins
- Teradata or similar database experience
- Python and Spark background is a plus

Posted 3 months ago

Apply

4 - 7 years

7 - 17 Lacs

Bengaluru

Work from Office

1. Hands-on experience in IBM DataStage as a developer.
2. Experience designing and developing DataStage jobs to solve complex business requirements.
3. Strong knowledge of SQL and UNIX.
4. Managing and coordinating a team across locations.

Posted 3 months ago

Apply

4 - 7 years

7 - 17 Lacs

Hyderabad

Work from Office

1. Hands-on experience in IBM DataStage as a developer.
2. Experience designing and developing DataStage jobs to solve complex business requirements.
3. Strong knowledge of SQL and UNIX.
4. Managing and coordinating a team across locations.

Posted 3 months ago

Apply

5 - 10 years

6 - 12 Lacs

Bengaluru

Work from Office

#Hiring #Top MNC #Hiring Alert

Posted 3 months ago

Apply

5 - 10 years

6 - 12 Lacs

Hyderabad

Work from Office

#Hiring #Top MNC #Hiring Alert

Posted 3 months ago

Apply

4 - 7 years

7 - 10 Lacs

Bangalore Rural

Work from Office

Hiring for a DataStage Developer contract role. Experience: 4+ years. Location: PAN India. Contact: Sivasakthi, sivashakthi@srsinfoway.com

Posted 3 months ago

Apply

4 - 7 years

7 - 10 Lacs

Pune

Work from Office

Hiring for a DataStage Developer contract role. Experience: 4+ years. Location: PAN India. Contact: Sivasakthi, sivashakthi@srsinfoway.com

Posted 3 months ago

Apply

3 - 8 years

10 - 20 Lacs

Pune

Hybrid

- Understands requirements to build, enhance, or integrate programs and processes for one or more Acxiom client solutions and/or applications.
- Responsible for new development, ongoing maintenance, support, and optimization.

Posted 3 months ago

Apply

3 - 6 years

5 - 8 Lacs

Bengaluru

Work from Office

We're looking for a Build and Release Engineer Lead to join our ever-evolving Information Technology team to help us unleash the potential of every business.

About the team: We are Worldpay's Data Solution team, building high-value assets to support Finance in informed decision-making. The candidate will primarily work with the offshore team to support data assets built using Snowflake, DataStage, and AWS, and must possess these skill sets, including a strong SQL background.

What you'll own:
1. Providing high-level consulting services to clients on data engineering assignments.
2. Design, development, and implementation of complex, large-scale data and ETL pipelines and related system projects.
3. Reviewing, analyzing, and modifying programming systems, including encoding, testing, debugging, and installation, for complex, large-scale computer systems.

Where you'll own it: You'll own it in our modern Bangalore hub. With hubs in the heart of city centers and tech capitals, things move fast in APAC. We pride ourselves on being an agile and dynamic collective, collaborating with different teams and offices across the globe.

What you bring:
1. Create and maintain data models, databases, and data pipelines using Snowflake.
2. Data integration: implement ETL (Extract, Transform, Load) processes to integrate data from various sources into Snowflake.
3. Performance optimization: optimize queries and data structures for performance and scalability.
4. Collaboration: work closely with data engineers, analysts, and other stakeholders to support data-driven decision-making.
5. Security: ensure data security and compliance with relevant regulations.

Posted 3 months ago

Apply

2 - 7 years

3 - 6 Lacs

Mumbai

Work from Office

1. Good knowledge of S3.
2. Data lake concepts and performance optimizations in data lakes.
3. Data warehouse concepts and Amazon Redshift.
4. Athena and Redshift Spectrum.
5. Strong understanding of Glue concepts and the Glue Data Catalog; experienced in implementing end-to-end ETL solutions using AWS Glue with a variety of source and target systems.
6. Must be very strong in PySpark. Must be able to implement all standard and complex ETL transformations using PySpark, and apply various performance optimization techniques using Spark and Spark SQL.
7. Good knowledge of SQL is a must. Should be able to implement all standard data transformations using SQL, and to analyze data stored in the Redshift data warehouse and data lakes.
8. Good understanding of Athena and Redshift Spectrum.
9. Understanding of RDS.
10. Understanding of Database Migration Service and experience migrating from diverse databases.
11. Understanding of writing Lambda functions and layers for connecting to various services.
12. Understanding of CloudWatch, CloudWatch Events, EventBridge, and orchestration tools in AWS such as Step Functions and Apache Airflow.

Posted 3 months ago

Apply

7 - 8 years

9 - 10 Lacs

Pune

Work from Office

Job Purpose
"This position is open with Bajaj Finance Ltd." Manage and enrich franchise customer data of 45 Cr to enable the creation and generation of pre-approved offers, improve approval rates by enriching data through verified sources, and enable instant disbursals by continuously improving fill rates of verified information to help scale volumes in the 3in1 App. Responsible for identifying data requirements and evaluating the quality of data sourced through external vendors. This capability will help build scale on the Company's 3in1 App and targets a 10% improvement in the Company's market share from current levels at the same portfolio risk levels.

Duties and Responsibilities

Data Quality, Data Standardization & New Data Enrichment Management
- Data standardization and data quality:
  - Ensure all existing and prospect records are maintained in a single repository.
  - Maintain all other data points related to these records, and any data enrichment, in a single repository.
  - Define SOPs and rules for duplicate identification and resolution.
  - Define validation rules for data fields to ensure junk data does not enter the repository.
  - Prepare the BRD and user stories for development of an internal data storage design that maintains every data point received, tagged with source, time, verified/non-verified status, and a confidence level.
  - Define field quality validations and create a data quality index.
  - Define logic for flagging confidence levels of data fields so end users can use the data according to its confidence flagging.
  - Govern new data elements and new data sources.
  - Work with the IT BIU team on creating real-time data quality dashboards.
  - Ensure error rates in data quality are maintained within threshold levels.
  - Work with the IT BIU team on creating data confidence and fill-rate dashboards.
  - Define the SOP and scope of activities for data quality management through manual intervention, executed through the outsourced credit operations shop in Bareilly.
  - Run control checks on data quality management done through the Bareilly shop.
  - Define validation rules and confidence levels for any variable brought into the single repository.
  - Ensure the data tables have no data quality issues.
- Data enrichment:
  - Engage with the Partnership team to validate that information meets the defined quality standards.
  - Qualify whether data points from a particular partner or source can be used for enrichment.
  - Identify and maintain the list of sources and websites to be used for data enrichment.

Stakeholder Management
- Set up and convene Change Control Board meetings with all stakeholders across businesses and functions.
- Engage with the Data Partnership team on data quality, and with the Data Offer team on usage of data for offer generation.
- Engage with the EDW team on creating and managing tables.

Team Development
- Identify the right talent for positions within the function.
- Establish individual performance expectations and regularly review individual performance of the team.
- Identify development needs of teams and provide appropriate opportunities (training for professional development, etc.) to ensure teams are motivated and equipped to drive business.

Required Qualifications and Experience
a) Educational qualifications: post-graduation/MBA or an educational background in mathematical, statistical, and data science orientations.
b) Work experience:
- Minimum 7+ years of experience in EDW management.
- Experience with large data handling and quantitative analysis.
- Exposure to data analytics tools (e.g., SAS, SQL, decision trees) and automated BI reports.
- Strong problem-solving ability and analytical skills.

Posted 3 months ago

Apply

3 - 8 years

5 - 10 Lacs

Pune

Work from Office

Provide expertise in analysis, requirements gathering, design, coordination, customization, testing, and support of reports in the client's environment. Develop and maintain a strong working relationship with business and technical members of the team. Maintain a relentless focus on quality and continuous improvement. Perform root cause analysis of report issues. Handle development and evolutionary maintenance of the environment, its performance, capability, and availability. Assist in defining technical requirements and developing solutions. Ensure effective content and source-code management, troubleshooting, and debugging.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise: Cognos Developer & Admin required.
- Education: The resource should be a full-time MCA/M.Tech/B.Tech/B.E. and should preferably have relevant certifications.
- Experience: The resource should have a minimum of 3 years of experience working on BI/DW projects in areas pertaining to reporting and visualization using Cognos, and should have worked on at least two projects involving developing reporting/visualization. They should have a good understanding of UNIX, be well conversant in English, and have excellent writing, MIS, communication, time management, and multitasking skills.

Preferred technical and professional experience:
- Experience with various cloud and integration platforms (e.g., AWS, Google, Azure).
- Agile mindset: ability to process changes of priorities and requests, ownership, critical thinking.
- Experience with an ETL/data integration tool (e.g., IBM InfoSphere DataStage, Azure Data Factory, Informatica PowerCenter).

Posted 3 months ago

Apply

3 - 8 years

5 - 10 Lacs

Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: IBM InfoSphere DataStage. Good-to-have skills: NA.
Minimum 3 year(s) of experience is required. Educational Qualification: 15 years of full-time education.

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will work closely with the team to ensure the successful delivery of high-quality software solutions.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather and analyze requirements.
- Design, develop, and test software applications using IBM InfoSphere DataStage.
- Troubleshoot and debug applications to identify and fix defects.
- Ensure the scalability, performance, and security of the applications.
- Document technical specifications and user manuals for reference purposes.

Professional & Technical Skills:
- Must-have: proficiency in IBM InfoSphere DataStage.
- Strong understanding of ETL concepts and data integration techniques.
- Experience in designing and implementing data integration solutions.
- Knowledge of SQL and database concepts.
- Familiarity with data warehousing and data modeling principles.

Additional Information: The candidate should have a minimum of 3 years of experience in IBM InfoSphere DataStage. This position is based at our Hyderabad office. 15 years of full-time education is required.

Posted 3 months ago

Apply

3 - 8 years

5 - 10 Lacs

Pune

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Ab Initio. Good-to-have skills: NA.
Minimum 3 year(s) of experience is required. Educational Qualification: 15 years of full-time education.

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for ensuring the smooth functioning of applications and addressing any issues that may arise. Your typical day will involve collaborating with the team to understand requirements, designing and developing applications, and testing and debugging code to ensure optimal performance and functionality.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with the team to understand application requirements.
- Design and develop applications based on business process requirements.
- Test and debug code to ensure optimal performance and functionality.
- Address any issues or bugs that arise in the applications.
- Provide technical support and guidance to end users.
- Stay updated with the latest industry trends and technologies.
- Assist in the deployment and maintenance of applications.

Professional & Technical Skills:
- Must-have: proficiency in Ab Initio.
- Strong understanding of data integration and ETL concepts.
- Experience in designing and developing ETL workflows using Ab Initio.
- Knowledge of database concepts and SQL.
- Familiarity with data warehousing and data modeling.
- Good-to-have: experience with other ETL tools such as Informatica or DataStage.

Additional Information: The candidate should have a minimum of 3 years of experience in Ab Initio. This position is based at our Pune office. 15 years of full-time education is required.

Posted 3 months ago

Apply

8 - 13 years

15 - 25 Lacs

Chennai, Bengaluru

Work from Office

Who are we looking for? We are looking for an experienced ETL developer (8+ years) with good experience in DataStage who can work with the team to identify the client's needs and develop pragmatic solutions to business problems. The individual should be passionate about technology and experienced in developing and managing cutting-edge technology applications.

Technical skills:
- 8+ years of experience in DataStage 9.x or above.
- Conversant with the various stages, sequence activities, and dynamic loading of data from different file formats as well as from Oracle and SQL Server.
- Familiar with modifying stage properties for a given requirement.
- Good to have: proficiency with Oracle PL/SQL features such as built-in functions, analytical functions, cursors, and cursor variables.
- Hands-on Unix shell scripting.
- Strong experience in a BI/data warehouse environment, with involvement in design, development, implementation, unit testing, troubleshooting, and support of ETL processes.
- Experience defining DataStage ETL development standards and best practices.

Posted 3 months ago

Apply

12 - 14 years

15 - 30 Lacs

Pune

Work from Office

Mandatory skills:
• On-premise ETL tools: Informatica, DataStage, Talend, SSIS
• Cloud ETL tools: Azure Data Factory (ADF) or Databricks
• Data architecture & Azure architecture
• Snowflake

Contact: Dipika, 8409250974

Posted 3 months ago

Apply

Exploring Datastage Jobs in India

DataStage is a popular ETL (Extract, Transform, Load) tool used by organizations to extract data from different sources, transform it, and load it into a target data warehouse. Demand for DataStage professionals in India has been rising as companies across industries rely increasingly on data-driven decision-making.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

These cities are known for their vibrant tech industries and have a high demand for DataStage professionals.

Average Salary Range

The average salary range for DataStage professionals in India varies by experience level. Entry-level positions can expect around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 12-15 lakhs per annum.

Career Path

In the DataStage field, a typical career progression may look like:

  1. Junior Developer
  2. ETL Developer
  3. Senior Developer
  4. Tech Lead
  5. Architect

As professionals gain experience and expertise in DataStage, they can move up the ladder to more senior and leadership roles.

Related Skills

In addition to proficiency in DataStage, employers often look for candidates with the following skills:

  • SQL
  • Data warehousing concepts
  • ETL tools like Informatica, Talend
  • Data modeling
  • Scripting languages like Python or Shell scripting

Having a diverse skill set can make a candidate more competitive in the job market.
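Several of these skills (SQL, scripting, ETL thinking) come together in even a tiny pipeline exercise. Here is a self-contained sketch using Python's built-in sqlite3 module; the table and field names are purely illustrative:

```python
import sqlite3

# Toy end-to-end ETL run: extract raw records, transform them (trim
# names, convert salary from INR to lakhs), and load into a SQL table.
raw_rows = [
    ("  Asha ", "Bengaluru", 1200000),
    ("Ravi", "Pune", 800000),
    ("  Meena", "Chennai", 1500000),
]

def transform(row):
    name, city, salary_inr = row
    return (name.strip(), city, salary_inr / 100000)  # INR -> lakhs

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE candidates (name TEXT, city TEXT, salary_lakhs REAL)")
conn.executemany("INSERT INTO candidates VALUES (?, ?, ?)",
                 [transform(r) for r in raw_rows])

avg = conn.execute("SELECT AVG(salary_lakhs) FROM candidates").fetchone()[0]
print(round(avg, 2))  # → 11.67
```

The same extract-transform-load shape scales up to DataStage jobs, where the transform step becomes Transformer-stage logic and the load step a database connector.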

Interview Questions

  • What is DataStage and how does it differ from other ETL tools? (basic)
  • Explain the difference between a server job and a parallel job in DataStage. (medium)
  • How do you handle errors in DataStage? (medium)
  • What is a surrogate key and how is it generated in DataStage? (advanced)
  • How would you optimize performance in a DataStage job? (medium)
  • Explain the concept of partitioning in DataStage. (medium)
  • What is a DataStage Transformer stage and how is it used? (medium)
  • How do you handle incremental loads in DataStage? (advanced)
  • What is a Lookup stage in DataStage and when would you use it? (medium)
  • Describe the difference between Sequential File and Data Set stages in DataStage. (basic)
  • What is a configuration file in DataStage and how is it used? (medium)
  • How do you troubleshoot DataStage job failures? (medium)
  • Explain the purpose of the DataStage Director. (basic)
  • How do you handle data quality issues in DataStage? (advanced)
  • What is a shared container in DataStage and how is it beneficial? (medium)
  • Describe the difference between persistent data sets and Hashed File stages in DataStage. (medium)
  • How do you schedule DataStage jobs for execution? (basic)
  • Explain the use of parameter sets in DataStage. (medium)
  • What is a Transformer stage variable and how is it defined? (medium)
  • How do you handle complex transformations in DataStage? (advanced)
  • How do you handle rejected data in DataStage? (medium)
  • Describe the purpose of a DataStage job sequencer. (medium)
  • How do you handle metadata in DataStage? (medium)
  • Explain the concept of parallel processing in DataStage. (medium)

Closing Remark

As you explore DataStage job opportunities in India, remember to showcase your skills and knowledge confidently during interviews. By preparing well and demonstrating your expertise, you can land a rewarding career in this growing field. Good luck with your job search!

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
