
592 Datastage Jobs

Set up a job alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

2.0 - 6.0 years

0 Lacs

Kochi, Kerala

On-site

You have more than 4 years of experience in implementing solutions for the integration of applications. In this role, you will be responsible for performing development and testing activities following the SDLC framework. It is crucial to always think about scalability, automation, and proactive optimization. You must measure everything and identify technical risks to sprint commitments early on, escalating them as needed. You will need to learn the Model N application and adapt to it quickly. As part of your responsibilities, you will be customer-facing, gathering project requirements, creating technical designs, and preparing the necessary documentation. You should be prepared to take on different roles as required by project situations to ensure project success.

To excel in this role, you should have a minimum of 2 years of experience in building and scaling APIs. Proficiency in Python is a must, along with knowledge of other integration technologies such as Informatica, IICS, DataStage, or Mulesoft. Strong experience working with SQL and related technologies is essential. You should also have experience in building pipelines from scratch for data migration or conversion projects. Experience with basic database administrative activities like creating tenants, clusters, score, key vaults, etc., is required. Familiarity with Git and CI/CD processes is preferred. You should be adept at performance tuning and query tuning by generating and interpreting explain plans for SQL queries. Knowledge of reporting tools like Tableau or Power BI would be beneficial. Your eagerness to learn new technologies and your problem-solving skills will be valuable assets in this role.

Preferred certifications include any one of Informatica, Mulesoft, or Databricks certifications, as well as cloud certifications (AWS or Azure). Python certifications would be an added advantage.
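For context on the query-tuning requirement above, here is a minimal, hedged sketch of inspecting an execution plan before rewriting a query. It uses SQLite from Python's standard library purely so the example is self-contained; the table, index, and column names are illustrative and not taken from the posting.

```python
import sqlite3

# Hypothetical schema purely for illustration; not from the job posting.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    CREATE INDEX idx_orders_customer ON orders (customer_id);
""")

query = (
    "SELECT customer_id, SUM(amount) FROM orders "
    "WHERE customer_id = ? GROUP BY customer_id"
)

# EXPLAIN QUERY PLAN shows whether the index is used before any rewrite is attempted.
for row in conn.execute("EXPLAIN QUERY PLAN " + query, (42,)):
    print(row)

conn.close()
```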

Posted 1 day ago

Apply

1.0 - 6.0 years

8 - 9 Lacs

Bengaluru

Work from Office

The IN Data Engineering & Analytics (IDEA) team is looking to hire a rock star Data Engineer to build and manage the largest petabyte-scale data infrastructure in India for Amazon India businesses. IDEA is the central data engineering and analytics team for all A.in businesses. The team's charter includes: 1) providing Unified Data and Analytics Infrastructure (UDAI) for all A.in teams, which includes a central petabyte-scale Redshift data warehouse, analytics infrastructure and frameworks for visualizing and automating the generation of reports and insights, and self-service data applications for ingesting, storing, discovering, processing and querying data; 2) providing business-specific data solutions for various business streams like Payments, Finance, Consumer and Delivery Experience.

The Data Engineer will play a key role as a strong owner of our Data Platform. He/she will own and build data pipelines, automations and solutions to ensure the availability, system efficiency, IMR efficiency, scaling, expansion, operations and compliance of the data platform that serves 200+ IN businesses. The role sits at the heart of the technology and business worlds and provides opportunity for growth, high business impact and working with seasoned business leaders. An ideal candidate will have a sound technical background in managing large data infrastructures, working with petabyte-scale data, building scalable data solutions/automations and driving operational excellence. An ideal candidate is a self-starter who can start from a platform requirement and work backwards to conceive and devise the best possible solution, a good communicator while driving customer interactions, a passionate learner of new technology when the need arises, a strong owner of every deliverable in the team, and obsessed with customer delight and business impact, getting work done in business time.

Responsibilities:
1. Design/implement automation and manage our massive data infrastructure to scale for the analytics needs of Amazon IN.
2. Build solutions to achieve BAA (Best At Amazon) standards for system efficiency, IMR efficiency, data availability, consistency and compliance.
3. Enable efficient data exploration and experimentation on large datasets on our data platform, and implement data access control mechanisms for stand-alone datasets.
4. Design and implement scalable and cost-effective data infrastructure to enable non-IN (Emerging Marketplaces and WW) use cases on our data platform.
5. Interface with other technology teams to extract, transform, and load data from a wide variety of data sources using SQL, Amazon and AWS big data technologies.
6. Possess strong verbal and written communication skills, be self-driven, and deliver high-quality results in a fast-paced environment.
7. Drive operational excellence strongly within the team and build automation and mechanisms to reduce operations.
8. Enjoy working closely with your peers in a group of very smart and talented engineers.

A day in the life: The India Data Engineering and Analytics (IDEA) team is the central data engineering team for Amazon India. Our vision is to simplify and accelerate data-driven decision making for Amazon India by providing cost-effective, easy and timely access to high-quality data. We achieve this by providing UDAI (Unified Data & Analytics Infrastructure for Amazon India), which serves as a central data platform and provides data engineering infrastructure, ready-to-use datasets and self-service reporting capabilities. Our core responsibilities towards the India marketplace include a) providing systems (infrastructure) and workflows that allow ingestion, storage, processing and querying of data; b) building ready-to-use datasets for easy and faster access to the data; c) automating standard business analysis, reporting and dash-boarding; d) empowering business with self-service tools to manage data and generate insights.

Basic qualifications:
- 1+ years of data engineering experience
- Experience with SQL
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)
- Experience with big data technologies such as Hadoop, Hive, Spark, EMR
- Experience with any ETL tool, such as Informatica, ODI, SSIS, BODI or Datastage

Posted 3 days ago

Apply

2.0 - 6.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Diverse Lynx is looking for an ETL Test Engineer to join our dynamic team and embark on a rewarding career journey. You will be responsible for ensuring the accuracy, completeness, and efficiency of the ETL process used to transfer data from one system to another. The primary duties of an ETL Test Engineer may include:
- Developing ETL test cases and test plans that ensure data quality, accuracy, and completeness
- Conducting functional and non-functional testing of ETL processes to validate the integrity of the data being transferred
- Identifying and documenting defects, issues, and potential improvements in the ETL process and sharing them with the development team
- Creating and maintaining ETL test environments that simulate production environments for testing purposes
- Conducting load testing to measure the scalability and performance of ETL processes under different workloads
- Conducting regression testing to ensure that changes made to ETL processes do not introduce new defects or issues
- Developing and maintaining test automation scripts to improve the efficiency of ETL testing

To perform the role of an ETL Test Engineer effectively, candidates should possess strong analytical, problem-solving, and communication skills, as well as experience with ETL testing tools and technologies such as SQL, ETL testing frameworks, and test automation tools.
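As an illustration of the test-automation duty described above, the following is a minimal sketch of an automated source-to-target reconciliation check. It uses in-memory SQLite databases so the example is self-contained; in a real ETL test the connections, table names, and data would come from the actual source and target systems.

```python
import sqlite3

# Both "systems" are in-memory SQLite databases with hypothetical data.
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")

source.executescript("""
    CREATE TABLE customers (id INTEGER, name TEXT);
    INSERT INTO customers VALUES (1, 'Asha'), (2, 'Ravi');
""")
target.executescript("""
    CREATE TABLE dim_customer (id INTEGER, name TEXT);
    INSERT INTO dim_customer VALUES (1, 'Asha'), (2, 'Ravi');
""")

def row_count(conn, table):
    """Return the row count of a table through any DB-API connection."""
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

# A regression suite would typically assert counts, checksums, and sampled rows.
assert row_count(source, "customers") == row_count(target, "dim_customer"), "row count mismatch"
print("row counts match")
```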

Posted 3 days ago

Apply

7.0 - 9.0 years

5 - 5 Lacs

Thiruvananthapuram

Work from Office

1. Production monitoring and troubleshooting in on-prem ETL and AWS environments
2. Working experience using ETL DataStage along with DB2
3. Awareness of tools such as Dynatrace, AppDynamics, Postman, AWS CI/CD
4. Software code development experience in ETL batch processing and AWS cloud
5. Software code management, repository updates and reuse
6. Implementation and/or configuration, management, and maintenance of software
7. Implementation and configuration of SaaS and public, private and hybrid cloud-based PaaS solutions
8. Integration of SaaS and PaaS solutions with Data Warehouse Application Systems, including SaaS and PaaS upgrade management
9. Configuration, maintenance and support for the entire DWA Application Systems landscape, including but not limited to supporting DWA Application Systems components and tasks required to deliver business processes and functionality (e.g., logical layers of databases, data marts, logical and physical data warehouses, middleware, interfaces, shell scripts, massive data transfers and uploads, web development, mobile app development, web services and APIs)
10. DWA Application Systems support for day-to-day changes and business continuity and for addressing key business, regulatory, legal or fiscal requirements
11. Support for all third-party specialized DWA Application Systems
12. DWA Application Systems configuration and collaboration with the infrastructure service supplier required to provide application access to external/third parties
13. Integration with internal and external systems (e.g., direct application interfaces, logical middleware configuration and application program interface (API) use and development)
14. Collaboration with third-party suppliers such as the infrastructure service supplier and enterprise public cloud providers
15. Documentation and end-user training of new functionality
16. All activities required to support business process application functionality and to deliver the required application and business functions to end users in an integrated service delivery model across the DWA Application Development lifecycle (e.g., plan, deliver, run); maintain data quality and run batch schedules; operations and maintenance
17. Deploy code to all environments (Prod, UAT, Performance, SIT, etc.)
18. Address all open tickets within the SLA

Other skills: CDK (TypeScript), CFT (YAML)
Nice to have: GitHub; scripting (Bash/sh); security-minded with knowledge of best practices; Python; Databricks and Snowflake
Required skills: Databricks, DataStage, CloudOps, production support

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

The Data Engineer plays a critical role in the organization by designing, building, and maintaining scalable data pipelines and infrastructure. Collaborating closely with cross-functional teams, you ensure the smooth flow of data and enhance data-driven decision-making.

Your key responsibilities include designing, developing, and maintaining data pipelines and ETL processes using tools such as Snowflake, Azure, AWS, Databricks, Informatica, and DataStage. You will work with data scientists and stakeholders to understand data requirements, ensuring data availability and integrity. Additionally, optimizing and tuning the performance of data infrastructure and processing systems, implementing data security and privacy measures, troubleshooting and performance tuning of ETL processes, and developing documentation for data infrastructure and processes are crucial aspects of your role. You will also participate in the evaluation and selection of new technologies and tools to enhance data engineering capabilities, provide support and mentorship to junior data engineers, adhere to best practices in data engineering, and maintain high standards of quality. Collaboration with cross-functional teams to support data-related initiatives and projects is essential for success in this role.

To qualify for this position, you should possess a Bachelor's or Master's degree in Computer Science, Engineering, or a related field, along with proven experience in data engineering, ETL development, and data warehousing. Proficiency in Snowflake, AWS, Azure, Databricks, Informatica, and DataStage, strong programming skills in languages like Python, SQL, or Java, experience with big data technologies and distributed computing, and knowledge of data modeling and database design principles are required. Your ability to work with stakeholders, understand data requirements, and translate them into technical solutions, together with knowledge of data governance, data quality, and data integration best practices, is critical. Experience with cloud data platforms and services, excellent problem-solving and analytical abilities, strong communication and collaboration skills, and the ability to thrive in a fast-paced and dynamic environment are essential for success. Relevant certifications in cloud platforms and data engineering, such as AWS Certified Big Data - Specialty, Microsoft Certified: Azure Data Engineer, and SnowPro Core Certification, will be advantageous.

In summary, as a Data Engineer, you will play a vital role in designing, building, and maintaining data pipelines and infrastructure, collaborating with cross-functional teams, optimizing data performance, and ensuring data security and privacy to support data-driven decision-making and initiatives effectively.

Posted 3 days ago

Apply

4.0 - 7.0 years

10 - 20 Lacs

Hyderabad

Work from Office

Key Responsibilities:
- Design, develop, and execute test cases for ETL workflows and data pipelines.
- Validate data transformations and ensure data integrity across source and target systems.
- Perform unit, system, and integration testing for data migration projects.
- Collaborate with data engineers and developers to troubleshoot and resolve data issues.
- Automate test processes to improve efficiency and coverage.
- Document test results and defects, and provide detailed reports to stakeholders.
- Ensure compliance with data governance and quality standards.

Required Skills:
- 3-6 years of experience in ETL testing, data validation, or related roles.
- Strong proficiency in SQL for data querying and validation.
- Hands-on experience with ETL tools such as Talend (preferred), Informatica, or DataStage.
- Experience with test automation tools and scripting.
- Understanding of data migration processes and challenges.
- Familiarity with cloud platforms like AWS, Azure, or GCP.
- Knowledge of data warehousing concepts and platforms (e.g., Snowflake).

Preferred Skills:
- Experience with SAS programming and SAS Viya.
- Exposure to tools like SonarQube, Qlik Replicate, or IBM Data Replicator.
- Familiarity with CI/CD pipelines and version control systems (e.g., Git).
- Knowledge of data governance and compliance frameworks.

Posted 3 days ago

Apply

4.0 - 7.0 years

10 - 20 Lacs

Hyderabad

Work from Office

Job Description: We are seeking a talented ETL Developer with hands-on experience in Talend Management Console on Cloud and Snowflake to join our growing data engineering team. The ideal candidate will play a critical role in designing, developing, and optimizing scalable data pipelines to support enterprise-level analytics and reporting across cloud-based environments.

Key Responsibilities:
- Design and develop robust ETL/ELT pipelines using Talend Management Console on Cloud to integrate data from on-premise and cloud-based sources.
- Implement and optimize data ingestion, transformation, and loading processes in Snowflake to support business intelligence and reporting needs.
- Manage, monitor, and deploy data jobs through Talend Cloud, ensuring high performance and automation.
- Collaborate with data architects, business stakeholders, and analytics teams to ensure integration solutions align with business goals and best practices.
- Troubleshoot and resolve issues related to ETL workflows, data latency, and system performance.
- Ensure data quality, consistency, and security across all data pipelines.
- Create and maintain comprehensive documentation for ETL processes, data mappings, workflows, and technical specifications.
- Participate in code reviews, sprint planning, and other Agile ceremonies as part of a collaborative development team.
- Work closely with Snowflake specialists to implement best practices in data warehousing.
- Implement robust error handling and data validation mechanisms for seamless data movement.

Required Qualifications:
- Minimum 4+ years of hands-on experience with Talend, particularly using Talend Management Console on Cloud
- Strong expertise in the Snowflake data warehouse and cloud data integration
- Solid understanding of ETL/ELT concepts, data modeling, and data architecture principles
- Proficiency in SQL and experience working with relational databases
- Experience with cloud platforms such as AWS, Azure, or GCP
- Knowledge of data quality, performance tuning, and job optimization techniques
- Ability to troubleshoot complex data integration workflows and deliver solutions under tight deadlines

Preferred Qualifications:
- Exposure to Informatica (optional but a plus)
- Experience with Python or shell scripting for automation tasks
- Familiarity with CI/CD practices, DevOps tools, and version control (e.g., Git, Jenkins)
- Understanding of data governance and security practices in cloud-based environments

Soft Skills:
- Strong analytical and problem-solving skills
- Excellent verbal and written communication
- Ability to work independently and as part of a cross-functional team
- Detail-oriented with strong documentation and collaboration capabilities
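As background for the ELT loading described above, here is a minimal, hedged sketch of the kind of Snowflake upsert (MERGE) statement an orchestrated ELT job might run. The database, schema, and column names are hypothetical and not taken from the posting; connection handling and credentials are deliberately omitted.

```python
# Illustrative Snowflake MERGE (upsert) statement; object names are hypothetical.
MERGE_DIM_CUSTOMER = """
MERGE INTO analytics.dim_customer AS tgt
USING staging.customer_updates AS src
    ON tgt.customer_id = src.customer_id
WHEN MATCHED THEN
    UPDATE SET name = src.name, updated_at = src.updated_at
WHEN NOT MATCHED THEN
    INSERT (customer_id, name, updated_at)
    VALUES (src.customer_id, src.name, src.updated_at)
"""

def run_merge(cursor):
    """Execute the upsert through any DB-API cursor (for example, one from the Snowflake connector)."""
    cursor.execute(MERGE_DIM_CUSTOMER)
```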

Posted 3 days ago

Apply

7.0 - 12.0 years

15 - 30 Lacs

Hyderabad

Work from Office

Role: ETL Development
Location: Hyderabad (Kokapet)
Required Experience: 7+ years
Interview Mode: First-level technical round (virtual), followed by an HR round (face to face)
Mode of Hire: Permanent; 4 days work from office (regular day shifts)
Must Have: ETL Development, DataStage, and ADF
Share CVs to: aravind.kuppili@otsi.co.in

Role & responsibilities:
- Education: Bachelor's degree in computer and information technology or a related field, such as engineering.
- Minimum 7 years of development experience in DataStage (version 8.5 or higher), with experience in processing high-volume jobs.
- 7+ years of experience in advanced InfoSphere DataStage design and ADF development.
- 3+ years in DB2 UDB administration and support.
- 2+ years of report solution/design experience.
- Experience with DataStage 11.3 and 8.7 is a must.
- Strong experience with UNIX and shell scripting.
- Mastery-level knowledge of DataStage 8.7 server and parallel versions.
- Write ETL technical specifications.
- 3+ years with Azure Data Factory (ADF) / Microsoft SQL Server jobs:
  - Using ADF connectors to connect to various data sources and destinations.
  - Implementing data flows within ADF for complex transformations.
  - Scheduling and monitoring ADF pipelines.
  - Creating and managing pipelines in ADF to orchestrate data movement and transformation.
  - Creating and managing SQL Server Agent jobs for automated tasks, including ETL processes.
  - Writing and optimizing SQL queries for data extraction and transformation.
  - Using SSIS packages for complex ETL operations within SQL Server.
  - Troubleshooting and resolving issues related to SQL Server jobs and SSIS packages.

Posted 4 days ago

Apply

6.0 - 11.0 years

6 - 10 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth: bringing real positive changes in an increasingly virtual world, and it drives us beyond generational gaps and disruptions of the future. We are looking to hire ETL (Extract, Transform, Load) professionals in the following areas:

Skill set: SQL, Snowflake, SnapLogic (ETL tool)

Job description:
- 6+ years of IT experience in analysis, design, development and unit testing of data warehousing applications using industry-accepted methodologies and procedures.
- Write complex SQL queries to implement ETL (Extract, Transform, Load) processes and for Business Intelligence reporting.
- Strong problem-solving and technical skills coupled with confident decision making, enabling effective solutions that lead to high customer satisfaction.
- Deliver robust solutions through query optimization, ensuring data quality.
- Should have experience in writing functions and stored procedures.
- Strong understanding of the principles of data warehousing using fact tables, dimension tables, and star and snowflake schema modelling.
- Analyse and translate functional specifications/user stories into technical specifications.
- Good to have: design/development experience in any ETL tool like DataStage or SnapLogic.
- Good interpersonal skills and experience in handling communication and interactions between different teams.

Our Hyperlearning workplace is grounded upon four principles: flexible work arrangements, free spirit, and emotional positivity; agile self-determination, trust, transparency, and open collaboration; all support needed for the realization of business goals; stable employment with a great atmosphere and ethical corporate culture.
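Since the listing emphasizes star and snowflake schema modelling with fact and dimension tables, here is a minimal, hedged sketch of a star schema and a typical reporting join. It uses SQLite from Python's standard library; all table and column names are hypothetical.

```python
import sqlite3

# A minimal star schema: one fact table keyed to two dimensions (illustrative names only).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, calendar_date TEXT);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT);
    CREATE TABLE fact_sales (
        date_key    INTEGER REFERENCES dim_date (date_key),
        product_key INTEGER REFERENCES dim_product (product_key),
        quantity    INTEGER,
        revenue     REAL
    );
""")

# A typical BI query joins the fact table to its dimensions and aggregates measures.
report_sql = """
    SELECT d.calendar_date, p.product_name, SUM(f.revenue) AS total_revenue
    FROM fact_sales f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.calendar_date, p.product_name
"""
print(conn.execute(report_sql).fetchall())  # empty list until data is loaded
```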

Posted 4 days ago

Apply

2.0 - 5.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Project Role: Application Automation Engineer
Project Role Description: Deliver predictive and intelligent delivery approaches based on automation and analytics. Drive the automation of delivery analytics to gather insights from data.
Must-have skills: SAP BW/4HANA Data Modeling & Development
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Automation Engineer, you will apply innovative ideas to drive the automation of Delivery Analytics at the client level. A typical day involves collaborating with various teams to identify automation opportunities, developing solutions to enhance efficiency, and ensuring that the analytics processes are streamlined and effective. You will engage in problem-solving discussions, contribute to key decisions, and support your team in achieving their objectives while fostering a culture of collaboration and innovation.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with strategic goals.

Professional & Technical Skills:
- Must-have skills: Proficiency in SAP BW/4HANA Data Modeling & Development.
- Strong understanding of data warehousing concepts and best practices.
- Experience with ETL processes and data integration techniques.
- Familiarity with reporting tools and data visualization techniques.
- Ability to troubleshoot and optimize data models for performance.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP BW/4HANA Data Modeling & Development.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Posted 4 days ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Informatica PowerCenter
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to ensure the highest quality of deliverables, while continuously seeking opportunities for improvement in application functionality and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application specifications and user guides.
- Collaborate with cross-functional teams to gather requirements and provide technical insights.

Professional & Technical Skills:
- Must-have skills: Proficiency in Informatica PowerCenter.
- Strong understanding of ETL processes and data integration techniques.
- Experience with data warehousing concepts and methodologies.
- Familiarity with SQL and database management systems.
- Ability to troubleshoot and resolve application issues efficiently.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Informatica PowerCenter.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Posted 4 days ago

Apply

2.0 - 5.0 years

5 - 9 Lacs

Pune

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: SAP BW/4HANA Data Modeling & Development
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and enhancements to existing applications while keeping abreast of the latest technologies and methodologies in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-have skills: Proficiency in SAP BW/4HANA Data Modeling & Development.
- Strong understanding of data warehousing concepts and best practices.
- Experience with ETL processes and data integration techniques.
- Familiarity with reporting tools and data visualization techniques.
- Ability to troubleshoot and optimize data models for performance.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP BW/4HANA Data Modeling & Development.
- This position is based at our Pune office.
- A 15 years full time education is required.

Posted 4 days ago

Apply

2.0 - 5.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: SAP BW/4HANA Data Modeling & Development
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and enhancements to existing applications while ensuring that all development aligns with best practices and organizational standards.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of milestones.

Professional & Technical Skills:
- Must-have skills: Proficiency in SAP BW/4HANA Data Modeling & Development.
- Strong understanding of data warehousing concepts and methodologies.
- Experience with ETL processes and data integration techniques.
- Familiarity with reporting tools and data visualization techniques.
- Ability to troubleshoot and optimize data models for performance.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in SAP BW/4HANA Data Modeling & Development.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Posted 4 days ago

Apply

2.0 - 5.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Informatica PowerCenter
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions, and ensuring that applications function seamlessly within the existing infrastructure. You will also engage in problem-solving activities, providing support and enhancements to existing applications, while continuously seeking ways to improve processes and user experiences.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-have skills: Proficiency in Informatica PowerCenter.
- Strong understanding of ETL processes and data integration techniques.
- Experience with data warehousing concepts and methodologies.
- Familiarity with SQL and database management systems.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Informatica PowerCenter.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Posted 4 days ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: SAP BW/4HANA Data Modeling & Development
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications function seamlessly to support organizational goals. You will also participate in testing and refining applications to enhance user experience and efficiency, while staying updated on industry trends and best practices to continuously improve your contributions.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application specifications and user guides.
- Collaborate with cross-functional teams to gather requirements and provide technical insights.

Professional & Technical Skills:
- Must-have skills: Proficiency in SAP BW/4HANA Data Modeling & Development.
- Strong understanding of data warehousing concepts and methodologies.
- Experience with ETL processes and tools.
- Familiarity with reporting tools and data visualization techniques.
- Ability to troubleshoot and resolve application issues efficiently.

Additional Information:
- The candidate should have a minimum of 3 years of experience in SAP BW/4HANA Data Modeling & Development.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Posted 4 days ago

Apply

2.0 - 5.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and enhancements to existing applications while maintaining a focus on quality and efficiency.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-have skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling and ETL processes.
- Experience with SQL and database management.
- Familiarity with cloud computing concepts and services.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Posted 4 days ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: Shell Scripting
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. A typical day involves collaborating with cross-functional teams to gather insights, analyzing user needs, and translating them into functional specifications. You will engage in discussions to refine application designs and ensure alignment with business objectives, while also participating in testing and validation processes to guarantee that the applications meet the established requirements. Your role will be pivotal in driving the development of innovative solutions that enhance operational efficiency and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Collaborate with stakeholders to gather and analyze requirements for application design.
- Participate in the testing and validation of applications to ensure they meet business needs.

Professional & Technical Skills:
- Must-have skills: Proficiency in Snowflake Data Warehouse.
- Good-to-have skills: Experience with Shell Scripting.
- Strong understanding of data modeling and ETL processes.
- Experience with SQL and database management.
- Familiarity with cloud-based data solutions and architecture.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Posted 4 days ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Informatica PowerCenter
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing application features, and ensuring that the solutions align with organizational goals. You will also participate in testing and troubleshooting to enhance application performance and user experience, contributing to the overall success of the projects you are involved in.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in continuous learning to stay updated with industry trends and technologies.

Professional & Technical Skills:
- Must-have skills: Proficiency in Informatica PowerCenter.
- Strong understanding of ETL processes and data integration techniques.
- Experience with data warehousing concepts and methodologies.
- Familiarity with SQL and database management systems.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Informatica PowerCenter.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Posted 4 days ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must-have skills: Snowflake Data Warehouse
Good-to-have skills: Shell Scripting
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. A typical day involves collaborating with cross-functional teams to gather insights, analyzing user needs, and translating them into functional specifications. You will engage in discussions to refine application designs and ensure alignment with business objectives, while also participating in testing and validation processes to guarantee that the applications meet the established requirements. Your role will be pivotal in driving the development of innovative solutions that enhance operational efficiency and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Collaborate with stakeholders to gather and analyze requirements for application design.
- Participate in the testing and validation of applications to ensure they meet business needs.

Professional & Technical Skills:
- Must-have skills: Proficiency in Snowflake Data Warehouse.
- Good-to-have skills: Experience with Shell Scripting.
- Strong understanding of data modeling and ETL processes.
- Experience with SQL and database management.
- Familiarity with cloud-based data solutions and architecture.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Posted 4 days ago

Apply

3.0 - 8.0 years

10 - 14 Lacs

Hyderabad

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also engage in problem-solving discussions with your team, providing guidance and support to ensure successful project outcomes. Additionally, you will monitor project progress and make necessary adjustments to keep everything on track, fostering a collaborative environment that encourages innovation and efficiency.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Mentor junior team members to support their professional growth.

Professional & Technical Skills:
- Must-have skills: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration and ETL processes.
- Experience with cloud computing platforms and services.
- Familiarity with data governance and compliance standards.
- Ability to work with large datasets and perform data analysis.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Posted 4 days ago

Apply

5.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: SAP BW/4HANA Data Modeling & Development
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that the applications are aligned with business objectives and user needs, while maintaining a focus on quality and efficiency throughout the project lifecycle.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing and mentoring among team members.
- Monitor project progress and ensure timely delivery of milestones.

Professional & Technical Skills:
- Must-have skills: Proficiency in SAP BW/4HANA Data Modeling & Development.
- Strong understanding of data warehousing concepts and best practices.
- Experience with ETL processes and data integration techniques.
- Familiarity with reporting tools and data visualization techniques.
- Ability to troubleshoot and optimize data models for performance.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in SAP BW/4HANA Data Modeling & Development.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Posted 4 days ago

Apply

3.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Informatica PowerCenter
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also engage in problem-solving discussions with your team, providing guidance and support to ensure successful project outcomes. Your role will require you to stay updated on industry trends and best practices to enhance application performance and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Facilitate communication between technical teams and stakeholders to ensure alignment on project goals.
- Mentor junior team members, providing guidance and support in their professional development.

Professional & Technical Skills:
- Must-have skills: Proficiency in Informatica PowerCenter.
- Strong understanding of data integration processes and ETL methodologies.
- Experience with database management systems and SQL.
- Familiarity with data warehousing concepts and best practices.
- Ability to troubleshoot and resolve application issues efficiently.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Informatica PowerCenter.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Posted 4 days ago

Apply

5.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: SAP BW/4HANA Data Modeling & Development
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that the applications are aligned with business objectives and user needs, while maintaining a focus on quality and efficiency throughout the project lifecycle.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of milestones.

Professional & Technical Skills:
- Must-have skills: Proficiency in SAP BW/4HANA Data Modeling & Development.
- Strong understanding of data warehousing concepts and best practices.
- Experience with ETL processes and data integration techniques.
- Familiarity with reporting tools and data visualization techniques.
- Ability to troubleshoot and optimize data models for performance.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP BW/4HANA Data Modeling & Development.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Posted 4 days ago

Apply

5.0 - 10.0 years

0 - 2 Lacs

Hyderabad

Hybrid

Role & responsibilities:
- 5+ years of extensive experience working with multiple databases, ETL and BI testing.
- Working experience in the Investment Management and Capital Markets domain is preferred.
- Experience working in applications like Eagle, Calypso, or Murex will be an added advantage.
- Experience in delivering large releases to the customer through direct and partner teams.
- Experience in testing data validation scenarios and data ingestion, pipelines, and transformation processes.
- Experience in Vertica, DataStage, Teradata, and Big Data environments for both data ingestion and consumption.
- Extensive knowledge of any Business Intelligence tool, preferably MicroStrategy and Tableau.
- Extensive experience in writing and troubleshooting complex SQL queries.
- Expert in providing QA solutions based on Data Warehousing and Dimensional Modelling design.
- Expert in drafting ETL source-to-target mapping document design.
- Identify data validation tools that will suit the ETL project conditions.
- Ensure all sign-offs on deliverables (overall test strategy, test plan, test cases and test results) and that testing meets governance requirements.
- Establish and drive automation capabilities.
- Collaborate with dev and architect teams to identify and prioritize opportunities for automation.
- Experience in ETL automation with open-source tools, service virtualization, and CI/CD.

Posted 4 days ago

Apply

12.0 - 17.0 years

13 - 18 Lacs

Mumbai

Work from Office

Overview: We are seeking an experienced Data Architect with over 12 years of expertise in data engineering, big data, and cloud data solutions, particularly on Microsoft Azure. The ideal candidate will have a proven track record of delivering scalable data architectures, building enterprise data lakes, leading complex migrations, and architecting real-time and batch data pipelines. You'll be responsible for end-to-end architecture, from data ingestion and transformation to governance, analytics, and performance optimization.

Key Responsibilities:

Architecture & Design
- Design scalable, high-performance, cloud-native data architectures using Azure Data Lake, Azure Synapse, and Databricks.
- Develop high-level and low-level architecture documents (HLD/LLD) for modern data platforms.
- Define data models using star and snowflake schemas, optimizing for analytics and query performance.

Data Engineering & ETL
- Lead the development of ETL/ELT pipelines using Azure Data Factory, PySpark, Spark SQL, and Databricks.
- Manage ingestion of structured and semi-structured data from diverse sources into Azure-based data lakes and warehouses.
- Implement real-time data pipelines using Azure Event Hubs and Structured Streaming.

Governance & Security
- Define and implement data governance frameworks including lineage, cataloging, access controls, and compliance (e.g., GDPR).
- Collaborate with MDM and governance teams using tools like Informatica AXON and EDC.

Performance Tuning & Optimization
- Drive cost-efficient architecture design with partitioning, caching, indexing, and cluster optimization.
- Monitor and troubleshoot data pipelines using Azure Monitor, Log Analytics, and Databricks tools.

Stakeholder Engagement
- Collaborate with data scientists, analysts, business stakeholders, and DevOps teams to deliver robust, scalable data platforms.
- Conduct design reviews and training sessions to support platform adoption and knowledge sharing.

Key Skills & Technologies:
- Cloud Platforms: Azure (ADF, ADLS, Azure SQL, Synapse, Databricks), AWS (S3, RDS, EC2)
- Big Data: Spark, Delta Lake, PySpark, Hadoop
- ETL Tools: Azure Data Factory, Informatica, IBM DataStage
- Data Modeling: Star, Snowflake, SCD, Fact & Dimension Tables
- Programming: Python, PySpark, SQL, Shell Scripting, R
- Visualization Tools: Power BI, Tableau, Cognos
- Data Governance: Informatica MDM, AXON, EDC

Certifications Preferred:
- Microsoft Certified: Azure Data Engineer Associate
- Databricks Data Engineer Associate / Professional
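To illustrate the kind of batch pipeline step described above, here is a minimal, hedged PySpark sketch of a transformation followed by a partitioned write. The input and output paths, column names, and aggregation are hypothetical; the snippet assumes a local PySpark installation rather than any specific Azure setup.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-batch").getOrCreate()

# Hypothetical landing zone path; in practice this would point at the data lake.
orders = spark.read.parquet("/data/raw/orders")

# Aggregate raw order events into a daily revenue table.
daily = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "region")
    .agg(F.sum("amount").alias("daily_revenue"))
)

# Partitioning the output by date keeps downstream queries pruned and cheap.
daily.write.mode("overwrite").partitionBy("order_date").parquet("/data/curated/daily_revenue")

spark.stop()
```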

Posted 6 days ago

Apply

Exploring Datastage Jobs in India

Datastage is a popular ETL (Extract, Transform, Load) tool used by organizations to extract data from different sources, transform it, and load it into a target data warehouse. The demand for Datastage professionals in India has been on the rise due to the increasing reliance on data-driven decision-making by companies across various industries.
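To make the extract-transform-load idea concrete, here is a toy Python sketch of the pattern that tools like Datastage implement at enterprise scale. The inline CSV "source" and the SQLite "warehouse" are purely illustrative so the example is self-contained.

```python
import csv
import io
import sqlite3

# A tiny inline CSV stands in for a real source system.
raw_csv = "customer_id,amount\n1,100.50\n2,75.25\n1,20.00\n"

# Extract: read rows from the source.
rows = list(csv.DictReader(io.StringIO(raw_csv)))

# Transform: aggregate the amount per customer.
totals = {}
for row in rows:
    totals[row["customer_id"]] = totals.get(row["customer_id"], 0.0) + float(row["amount"])

# Load: write the transformed result into the target warehouse table.
warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE customer_totals (customer_id TEXT, total REAL)")
warehouse.executemany("INSERT INTO customer_totals VALUES (?, ?)", totals.items())
print(warehouse.execute("SELECT * FROM customer_totals").fetchall())
```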

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

These cities are known for their vibrant tech industries and have a high demand for Datastage professionals.

Average Salary Range

The average salary range for Datastage professionals in India varies by experience level. Entry-level positions can expect to earn around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 12-15 lakhs per annum.

Career Path

In the Datastage field, a typical career progression may look like:

  1. Junior Developer
  2. ETL Developer
  3. Senior Developer
  4. Tech Lead
  5. Architect

As professionals gain experience and expertise in Datastage, they can move up the ladder to more senior and leadership roles.

Related Skills

In addition to proficiency in Datastage, employers often look for candidates with the following skills:

  • SQL
  • Data warehousing concepts
  • ETL tools like Informatica, Talend
  • Data modeling
  • Scripting languages like Python or Shell scripting

Having a diverse skill set can make a candidate more competitive in the job market.

Interview Questions

  • What is Datastage and how does it differ from other ETL tools? (basic)
  • Explain the difference between a server job and a parallel job in Datastage. (medium)
  • How do you handle errors in Datastage? (medium)
  • What is a surrogate key and how is it generated in Datastage? (advanced)
  • How would you optimize performance in a Datastage job? (medium)
  • Explain the concept of partitioning in Datastage. (medium)
  • What is a Datastage transformer stage and how is it used? (medium)
  • How do you handle incremental loads in Datastage? (advanced) (see the sketch after this list)
  • What is a lookup stage in Datastage and when would you use it? (medium)
  • Describe the difference between sequential file and dataset stages in Datastage. (basic)
  • What is a configuration file in Datastage and how is it used? (medium)
  • How do you troubleshoot Datastage job failures? (medium)
  • Explain the purpose of the Datastage director. (basic)
  • How do you handle data quality issues in Datastage? (advanced)
  • What is a shared container in Datastage and how is it beneficial? (medium)
  • Describe the difference between persistent data and hashed file stages in Datastage. (medium)
  • How do you schedule Datastage jobs for execution? (basic)
  • Explain the use of parameter sets in Datastage. (medium)
  • What is a Datastage transformer variable and how is it defined? (medium)
  • How do you handle complex transformations in Datastage? (advanced)
  • How do you handle rejected data in Datastage? (medium)
  • Describe the purpose of a Datastage job sequencer. (medium)
  • How do you handle metadata in Datastage? (medium)
  • Explain the concept of parallel processing in Datastage. (medium)
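
For the incremental-load question above, here is a minimal, hedged sketch of the high-watermark pattern that such loads typically rely on. A Datastage job would usually express this with job parameters and stages rather than hand-written SQL; the SQLite tables and column names below are purely illustrative.

```python
import sqlite3

# Hypothetical source, target, and watermark control tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, updated_at TEXT);
    CREATE TABLE tgt_orders (order_id INTEGER, updated_at TEXT);
    CREATE TABLE etl_watermark (table_name TEXT PRIMARY KEY, last_loaded TEXT);
    INSERT INTO etl_watermark VALUES ('tgt_orders', '2024-01-01T00:00:00');
    INSERT INTO src_orders VALUES (1, '2024-01-02T10:00:00'), (2, '2023-12-31T09:00:00');
""")

# Only rows newer than the stored watermark are pulled on each run.
watermark = conn.execute(
    "SELECT last_loaded FROM etl_watermark WHERE table_name = 'tgt_orders'"
).fetchone()[0]
delta = conn.execute(
    "SELECT order_id, updated_at FROM src_orders WHERE updated_at > ?", (watermark,)
).fetchall()
conn.executemany("INSERT INTO tgt_orders VALUES (?, ?)", delta)

# Advance the watermark so the next run starts where this one stopped.
conn.execute(
    "UPDATE etl_watermark SET last_loaded = (SELECT MAX(updated_at) FROM tgt_orders) "
    "WHERE table_name = 'tgt_orders'"
)
print(conn.execute("SELECT * FROM tgt_orders").fetchall())
```

In practice the watermark is often persisted in a parameter set or control table and passed into the extraction query as a job parameter.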

Closing Remark

As you explore job opportunities in Datastage in India, remember to showcase your skills and knowledge confidently during interviews. By preparing well and demonstrating your expertise, you can land a rewarding career in this growing field. Good luck with your job search!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.


Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
