
369 Datastage Jobs - Page 11

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

6 - 7 years

13 - 15 Lacs

Bengaluru

Work from Office

Source: Naukri

Level: Senior Data Engineer
Experience: 7-10 years
Location: Any (Bangalore preferred; Chennai, Hyderabad, or Noida)
Website: https://www.qualitestgroup.com/

About Us: Qualitest is the world's leading managed services provider of AI-led quality engineering solutions. It helps brands through the digital assurance journey, moving from conventional functional testing to innovations such as automation, AI, blockchain, and XR. Qualitest's core mission is to mitigate the business risks associated with digital adoption. It fulfills this through customized quality engineering solutions that leverage Qualitest's deep, industry-specific knowledge across sectors including technology, telecommunications, finance, healthcare, media, utilities, retail, manufacturing, and defense. These scalable solutions protect brands through end-to-end value demonstration, with a focus on customer experience and release velocity. Qualitest has offices in the United States, United Kingdom, Germany, Israel, Romania, India, Mexico, Portugal, Switzerland, and Argentina. It employs more than 7,000 engineers who serve over 400 customers worldwide. A pioneer and innovator in its industry, Qualitest is the only services provider positioned by Everest Group as a Leader in both the Next-generation Quality Engineering (QE) Services PEAK Matrix Assessment 2023 and the Quality Engineering (QE) Specialist Services PEAK Matrix Assessment 2023.

Responsibilities:
- Design and execute scalable data migration pipelines from relational databases (such as Teradata) to AWS Databricks.
- Utilize tools like IBM DataStage, FiveTran, Airflow, and Databricks Workflows to support data migration tasks.
- Optimize ETL processes to enhance performance and streamline data transformations with Spark and SQL.
- Develop and automate data ingestion, validation, and quality checks within a cloud environment to ensure reliable data flows.
- Work closely with data architects and business stakeholders to ensure a seamless, efficient, and secure migration process.
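The automated validation step described above can be sketched as a post-migration reconciliation check. This is a minimal, hypothetical illustration using in-memory SQLite databases as stand-ins for the Teradata source and Databricks target; table and column names are invented for the example.

```python
import sqlite3

def validate_migration(source_conn, target_conn, table):
    """Compare row counts between a source and a target table.

    A minimal post-migration quality check; real pipelines would also
    compare checksums, null rates, and column-level aggregates.
    """
    src = source_conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    tgt = target_conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    return {"table": table, "source_rows": src, "target_rows": tgt, "match": src == tgt}

# Demo with in-memory databases standing in for the real source and target.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for conn in (src, tgt):
    conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
src.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 20.0)])
tgt.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 20.0)])
result = validate_migration(src, tgt, "orders")
```

In practice such checks would run as a task in an orchestrator (Airflow or Databricks Workflows) after each load, failing the pipeline when counts diverge.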

Posted 2 months ago

Apply

8 - 10 years

13 - 18 Lacs

Bengaluru

Work from Office


Level: Architect
Experience: 10+ years
Location: Any (Bangalore preferred; Chennai, Hyderabad, or Noida)
Website: https://www.qualitestgroup.com/

Requirements and Responsibilities:
- 8 to 10 years of experience, including leading large-scale migrations from Teradata to AWS Databricks.
- Expertise in the Databricks tech stack, including Delta Lake, Spark, Unity Catalog, Databricks Workflows, and Lakehouse architecture.
- Skilled in stored procedures, PL/SQL, and SQL across relational databases such as Oracle, SQL Server, and DB2.
- Hands-on experience with IBM DataStage and Airflow for building ETL/ELT jobs.
- Proven leadership: managing technical teams, collaborating with stakeholders, and driving automation for smooth migration processes.
- Focus on performance, security, scalability, and cost efficiency at every stage of the migration.

Key Skills: ETL, SQL, AWS

Posted 2 months ago

Apply

2 - 5 years

4 - 8 Lacs

Pune

Work from Office


As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in choosing the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Good hands-on experience in DBT is required; ETL (DataStage) and Snowflake are preferred.
- Ability to use programming languages like Java, Python, or Scala to build pipelines that extract and transform data from a repository to a data consumer.
- Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed.
- Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java.

Preferred technical and professional experience:
- You thrive on teamwork and have excellent verbal and written communication skills.
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions.
- Ability to communicate results to technical and non-technical audiences.
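The extract-transform-load pipeline pattern named above can be sketched in a few lines of stdlib Python. This is a hypothetical, self-contained example: the inline CSV feed, the `payments` table, and the cleansing rules are invented for illustration; a real pipeline would read from a repository and load into a warehouse.

```python
import csv
import io
import sqlite3

# Hypothetical raw feed; a real pipeline would read from a database
# or object store rather than an inline string.
RAW = "id,name,amount\n1,alice,10.5\n2,bob,-3.0\n3,carol,7.25\n"

def extract(text):
    """Parse the CSV feed into a list of dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Cleansing step: drop negative amounts, normalise names."""
    return [(int(r["id"]), r["name"].title(), float(r["amount"]))
            for r in rows if float(r["amount"]) >= 0]

def load(rows, conn):
    """Load cleansed rows into the target table; return the row count."""
    conn.execute("CREATE TABLE IF NOT EXISTS payments (id INTEGER, name TEXT, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (?, ?, ?)", rows)
    return conn.execute("SELECT COUNT(*) FROM payments").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(RAW)), conn)
```

Keeping extract, transform, and load as separate functions makes each stage independently testable, which is the property ETL tools like DataStage and DBT formalize.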

Posted 2 months ago

Apply

2 - 9 years

20 - 23 Lacs

Bengaluru

Work from Office


P-1364. As a Spark Staff Technical Solutions Engineer, you will provide deep-dive technical and consulting solutions for challenging Apache Spark / ML / AI / Delta / Streaming / Lakehouse issues reported by our customers, and resolve challenges involving the Databricks unified analytics platform with your comprehensive technical and customer communication skills. You will assist customers in their Databricks journey and provide the guidance, knowledge, and expertise they need to realize value and achieve their strategic objectives using our products.

Outcomes:
- Perform initial analysis and troubleshoot issues in Apache Spark using Spark UI metrics, DAGs, and event logs for customer-reported job slowness.
- Troubleshoot, resolve, and provide deep code-level analysis of Spark to address customer issues related to Spark core internals, Spark SQL, Structured Streaming, Delta, Lakehouse, and other Databricks runtime features.
- Assist customers in setting up reproducible Spark problems and solutions in Spark SQL, Delta, memory management, performance tuning, streaming, data science, and data integration.
- Contribute to the development of tools and automation initiatives.
- Participate in the Designated Solutions Engineer program and drive one or two strategic customers' day-to-day Spark and cloud issues.
- Provide best-practices guidance on Spark runtime performance and the usage of Spark core libraries and APIs for custom-built solutions developed by Databricks customers.
- Provide front-line support for third-party integrations with the Databricks environment.
- Plan and coordinate with Account Executives, Customer Success Engineers, and Resident Solution Architects on customer issues and best-practice guidelines.
- Participate in screen-sharing meetings and Slack conversations with internal stakeholders and customers, driving major Spark issues as an individual contributor.
- Review Engineering JIRA tickets and proactively notify the support leadership team to follow up on action items.
- Manage assigned Spark cases daily and adhere to committed SLAs.
- Build an internal wiki and knowledge base with technical documentation and manuals for the support team and customers; participate in the creation and maintenance of company documentation and knowledge-base articles.
- Achieve and exceed the support organization's KPIs.
- Coordinate with Engineering and Backline Support teams to identify and report product defects.
- Be a true proponent of customer advocacy.
- Participate in weekend and weekday on-call rotations, run escalations during Databricks runtime outages and incidents, multitask and plan day-to-day activities, and provide escalated support for critical customer operational issues.
- Strengthen your AWS/Azure and Databricks platform expertise through continuous learning and internal training programs.

Competencies:
- 8+ years of experience designing, building, testing, and maintaining Python/Java/Scala applications in project delivery and consulting environments.
- 2+ years of hands-on experience developing two or more of Big Data, Hadoop, Apache Spark, Machine Learning, Artificial Intelligence, Streaming, Kafka, Data Science, or ElasticSearch industry use cases at production scale. Spark experience is mandatory.
- Hands-on experience in performance tuning and troubleshooting of Hive and Spark applications at production scale.
- Real-world experience with JVM and memory-management techniques such as garbage collection and heap/thread dump analysis is preferred.
- Working knowledge of data lakes, preferably including SCD-type use cases at production scale.
- Working, hands-on experience with SQL-based databases and data warehousing/ETL technologies such as Informatica, DataStage, Oracle, Teradata, SQL Server, or MySQL is preferred.
- Linux/Unix administration skills are a plus.
- Hands-on experience with AWS, Azure, or GCP is preferred.
- Excellent written and oral communication skills.
- Demonstrated analytical and problem-solving skills, particularly as they apply to distributed big-data computing environments.

About Databricks: Databricks is the data and AI company. More than 10,000 organizations worldwide, including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500, rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics, and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of the Lakehouse, Apache Spark, Delta Lake, and MLflow. To learn more, follow Databricks on Twitter, LinkedIn, and Facebook.

Benefits: At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region, please visit https://www.mybenefitsnow.com/databricks.

Compliance: If access to export-controlled technology or source code is required for performance of job duties, it is within the Employer's discretion whether to apply for a U.S. government license for such positions, and the Employer may decline to proceed with an applicant on this basis alone.

Posted 3 months ago

Apply

2 - 6 years

7 - 11 Lacs

Bengaluru

Work from Office


Core Expertise:
- At least 2 years of production support experience (experience with telecom systems is an added advantage).
- L2 application support, installation, and configuration experience for IBM DataStage.
- Ability to troubleshoot production and product-related issues.
- Must have knowledge of IBM DataStage administration and data integration.
- Lead the end-to-end design, implementation, and maintenance of the EDH, ensuring scalability, security, and high availability.
- Good knowledge of Python/shell scripting.
- Monitor data ingestion, processing, and quality-assurance processes to ensure a reliable data pipeline.

Technical Skills:
- Experience in Linux/Unix environments for system monitoring and script execution.
- Hands-on experience with Oracle/Teradata.

Operational Skills:
- Monitor and manage daily DataStage jobs to meet operational KPIs.
- Address alerts and incidents related to the data platform.
- Manage application administration and operation to meet operational KPIs.
- Collaborate with IT teams and other stakeholders to address issues.
- Perform root-cause analysis and implement permanent fixes for recurring issues.
- Generate regular reports on data platform performance, processed data, and error trends.
- Maintain up-to-date documentation of processes, configurations, and troubleshooting procedures.
- Use monitoring tools to track system health and data integrity.
- Strong troubleshooting skills with a focus on minimizing downtime and operational disruptions.
- Ability to analyze data and identify discrepancies or anomalies.
- Awareness of ITSM processes.

Soft Skills:
- Willingness to work in a 24x7 shift-based support environment.
- Good verbal and written communication skills, with the ability to communicate effectively with cross-functional teams, stakeholders, and vendors.
- Strong analytical and problem-solving skills.
- High attention to detail and the ability to work under pressure in a fast-paced environment.

Good-to-have Skills: Any relevant certification in the administration of data integration products.

Minimum Work Experience: 3-6 years of experience in IBM DataStage.
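The monitoring and alert-handling duties above often come down to scanning job logs for error events. A minimal sketch follows; the log format, job names, and severity keywords are hypothetical stand-ins for whatever the actual DataStage environment emits.

```python
import re
from collections import Counter

# Hypothetical excerpt of an ETL job log; support teams scan similar
# per-job logs for WARN/FATAL events before they become incidents.
LOG = """\
2024-05-01 02:00:01 INFO  job=LOAD_CUST started
2024-05-01 02:03:12 WARN  job=LOAD_CUST rejected 14 rows
2024-05-01 02:05:40 FATAL job=LOAD_ORDERS ORA-00942 table not found
2024-05-01 02:06:00 INFO  job=LOAD_CUST finished
"""

def summarize(log_text):
    """Count log events by severity and collect jobs with FATAL entries."""
    levels = Counter()
    fatal_jobs = set()
    for line in log_text.splitlines():
        m = re.search(r"\b(INFO|WARN|FATAL)\b\s+job=(\S+)", line)
        if m:
            levels[m.group(1)] += 1
            if m.group(1) == "FATAL":
                fatal_jobs.add(m.group(2))
    return levels, fatal_jobs

levels, fatal_jobs = summarize(LOG)
```

A script like this, scheduled from cron or invoked by a monitoring tool, is the kind of shell/Python automation the posting expects for meeting operational KPIs.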

Posted 3 months ago

Apply

3 - 8 years

13 - 15 Lacs

Noida

Work from Office


- Develop analytics-based decision-making frameworks for clients across the banking and insurance sectors.
- IBM DataStage development.
- Client management.
- Support business development and new analytics solution development activities.

Skills and attributes for success:
- Domain expertise in one of the banking or insurance industries (not mandatory).
- Statistical modelling (logistic/linear regression, GLM modelling, time-series forecasting, scorecard development, etc.).
- Deep understanding of ETL processes and data warehousing concepts, and strong analytical skills.
- Design, develop, and implement ETL processes using IBM DataStage.
- Create and manage DataStage jobs and sequences.
- Develop and optimize SQL queries for data extraction, transformation, and loading.
- Integrate data from various sources, including databases and flat files.
- Ensure data quality and consistency throughout the ETL process.
- Optimize DataStage jobs for performance and efficiency; troubleshoot and resolve performance issues related to ETL processes.
- Conduct unit, integration, and system testing for ETL processes.
- Validate data accuracy and integrity post-migration.
- Work closely with QA teams to ensure high-quality deliverables.
- Create and maintain comprehensive documentation for ETL processes, data flows, and system configurations; update documentation based on changes and enhancements.
- Collaborate with business analysts, data architects, and other stakeholders to understand data requirements and business needs.
- Participate in project meetings and provide regular updates on progress.
- Stay updated with the latest trends and best practices in ETL and data-integration technologies.
- Suggest and implement improvements to existing processes and systems.

To qualify for the role you must have:
- B.Tech from a top-tier engineering school, or a Master's in Statistics/Economics from a top university.
- Minimum 3 years of relevant experience.

Ideally, you'll also have:
- Strong communication, facilitation, relationship-building, presentation, and negotiation skills.
- High flexibility, adaptability, and creativity.
- Comfort interacting with senior executives (within the firm and at the client).
- Strong leadership skills and supervisory responsibility.

Posted 3 months ago

Apply

5 - 10 years

8 - 14 Lacs

Mumbai

Work from Office


- Hands-on, deep experience working as a lead in ETL development with Informatica or DataStage; strong in SQL queries.
- Secondary skills: data modeling and shell scripting.
- Pipeline design and code optimization.
- Good to have: cloud experience (GCP, Azure, AWS, Snowflake).
- Good verbal and written communication skills.
- Ability to communicate with customers, developers, and other stakeholders.
- Mentor and guide team members.
- Good presentation skills.
- Strong team player.

Required Past Experience:
- 5+ years of Informatica or DataStage ETL experience.
- Handled at least 2 development projects from an ETL perspective.
- Designed and developed ETL pipelines for heterogeneous sources.
- File (CSV, XML, JSON, etc.) processing.
- Expertise in shell/Python scripting.
- Hands-on experience writing ETL logic in SQL or PL/SQL.
- ETL testing and troubleshooting.
- Performed performance optimization of ETL pipelines.
- Provided technical guidance to team members.
- Performed code reviews.
- Experience in data modeling.
- Involvement in data pipeline design with an architect.
- Hands-on experience with code-versioning tools like Git and SVN.
- Good knowledge of the code deployment process and documentation.
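The CSV/XML/JSON file processing called out above usually means normalizing heterogeneous feeds into one record shape. A hypothetical stdlib sketch; the three sample feeds and the `id`/`city` schema are invented for the example.

```python
import csv
import io
import json
import xml.etree.ElementTree as ET

# Hypothetical samples of the three feed formats mentioned above.
CSV_FEED = "id,city\n1,Mumbai\n"
JSON_FEED = '[{"id": 2, "city": "Pune"}]'
XML_FEED = '<rows><row id="3" city="Delhi"/></rows>'

def from_csv(text):
    """Normalize a CSV feed into {id, city} records."""
    return [{"id": int(r["id"]), "city": r["city"]}
            for r in csv.DictReader(io.StringIO(text))]

def from_json(text):
    """Normalize a JSON feed into the same record shape."""
    return [{"id": int(r["id"]), "city": r["city"]} for r in json.loads(text)]

def from_xml(text):
    """Normalize an XML feed, reading attributes off each <row> element."""
    return [{"id": int(e.get("id")), "city": e.get("city")}
            for e in ET.fromstring(text)]

records = from_csv(CSV_FEED) + from_json(JSON_FEED) + from_xml(XML_FEED)
```

One adapter per source format feeding a common record shape is the same design ETL tools implement with per-source connector stages ahead of a shared transformation stream.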

Posted 3 months ago

Apply

3 - 5 years

5 - 9 Lacs

Pune

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: .Net Full Stack Development
Good-to-have skills: DataStage
Minimum experience required: 3 years
Educational Qualification: BE

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using .Net Full Stack Development. Your typical day will involve working with cross-functional teams, analyzing business requirements, and developing scalable and maintainable applications.

Roles & Responsibilities:
- Design, develop, and maintain scalable and maintainable applications using .Net Full Stack Development.
- Collaborate with cross-functional teams to analyze business requirements and develop technical solutions.
- Ensure application quality by conducting unit testing and code reviews.
- Troubleshoot and debug issues in the application and provide timely resolutions.
- Stay updated with the latest technologies and trends in .Net Full Stack Development and apply them to improve the application development process.

Professional & Technical Skills:
- Must have: proficiency in .Net Full Stack Development.
- Good to have: experience with AngularJS, ReactJS, or VueJS.
- Strong understanding of software development principles and design patterns.
- Experience with database technologies such as SQL Server or Oracle.
- Experience with version control systems such as Git or SVN.
- Solid grasp of web development technologies such as HTML, CSS, and JavaScript.

Additional Information:
- The candidate should have a minimum of 3 years of experience in .Net Full Stack Development.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering scalable and maintainable applications.
- This position is based at our Pune office.

Posted 3 months ago

Apply

2 - 6 years

5 - 9 Lacs

Bengaluru

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: IBM InfoSphere DataStage
Good-to-have skills: NA
Minimum experience required: 2 years
Educational Qualification: 15 years of full-time education with Engineering or equivalent

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will work closely with the team to ensure the successful delivery of high-quality software solutions.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather and analyze requirements.
- Design, develop, and test applications using IBM InfoSphere DataStage.
- Troubleshoot and debug issues in existing applications.
- Ensure the scalability and performance of applications.
- Document technical specifications and user guides for reference.

Professional & Technical Skills:
- Must have: proficiency in IBM InfoSphere DataStage.
- Strong understanding of ETL concepts and data-integration techniques.
- Experience in designing and implementing data-integration solutions.
- Knowledge of SQL and database concepts.
- Familiarity with data warehousing and data modeling.
- Good to have: experience with IBM InfoSphere Information Server and with other ETL tools such as Informatica or Talend.

Additional Information:
- The candidate should have a minimum of 2 years of experience in IBM InfoSphere DataStage.
- This position is based at our Bengaluru office.

Posted 3 months ago

Apply

3 - 5 years

5 - 7 Lacs

Bengaluru

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: IBM InfoSphere DataStage
Good-to-have skills: NA
Minimum experience required: 3 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will work closely with the team to ensure the successful delivery of high-quality solutions.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather and analyze requirements.
- Design, develop, and test software applications using IBM InfoSphere DataStage.
- Troubleshoot and debug applications to identify and fix issues.
- Optimize application performance and ensure scalability.
- Document technical specifications and user guides for reference.

Professional & Technical Skills:
- Must have: proficiency in IBM InfoSphere DataStage.
- Strong understanding of ETL concepts and data integration.
- Experience with data modeling and database design.
- Knowledge of SQL and database querying languages.
- Familiarity with data warehousing and business-intelligence concepts.

Additional Information:
- The candidate should have a minimum of 3 years of experience in IBM InfoSphere DataStage.
- This position is based at our Bengaluru office.

Posted 3 months ago

Apply

3 - 7 years

5 - 9 Lacs

Bengaluru

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: IBM InfoSphere DataStage
Good-to-have skills: NA
Minimum experience required: 3 years
Educational Qualification: UG degree required

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using IBM InfoSphere DataStage. Your typical day will involve working with the development team, analyzing business requirements, and developing solutions to meet those requirements.

Roles & Responsibilities:
- Design, develop, and maintain ETL processes using IBM InfoSphere DataStage.
- Collaborate with cross-functional teams to analyze business requirements and develop solutions to meet them.
- Develop and maintain technical documentation related to ETL processes.
- Perform unit testing and support system testing and user acceptance testing.
- Troubleshoot and resolve issues related to ETL processes.

Professional & Technical Skills:
- Must have: strong experience in IBM InfoSphere DataStage.
- Good to have: experience in SQL, Unix, and shell scripting.
- Experience in designing, developing, and maintaining ETL processes.
- Experience in analyzing business requirements and developing solutions to meet them.
- Experience in performing unit testing and supporting system testing and user acceptance testing.

Additional Information:
- The candidate should have a minimum of 3 years of experience in IBM InfoSphere DataStage.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Bengaluru office.

Posted 3 months ago

Apply

5 - 10 years

10 - 14 Lacs

Bengaluru

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Informatica PowerCenter
Good-to-have skills: NA
Minimum experience required: 5 years
Educational Qualification: 15 years of regular education

Summary: As an Application Lead for Custom Software Engineering, you will be responsible for leading the effort to design, build, and configure applications using Informatica PowerCenter. Your typical day will involve collaborating with cross-functional teams, ensuring timely delivery of projects, and acting as the primary point of contact for the project.

Roles & Responsibilities:
- Lead the design, development, and implementation of Informatica PowerCenter-based ETL solutions.
- Collaborate with cross-functional teams to ensure timely delivery of projects.
- Act as the primary point of contact for the project, providing guidance and support to team members.
- Ensure adherence to best practices and standards for software development, testing, and deployment.
- Provide technical leadership and mentorship to team members, ensuring their professional growth and development.

Professional & Technical Skills:
- Must have: strong experience in Informatica PowerCenter.
- Good to have: experience with other ETL tools like DataStage, Talend, or SSIS.
- Experience in designing and implementing ETL solutions for complex data-integration scenarios.
- Strong understanding of data warehousing concepts and best practices.
- Experience with SQL and database technologies like Oracle, SQL Server, or Teradata.
- Experience in Unix/Linux environments and shell scripting.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Informatica PowerCenter.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering high-quality ETL solutions.
- This position is based at our Bengaluru office.

Posted 3 months ago

Apply

3 - 5 years

5 - 9 Lacs

Mumbai

Work from Office


Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must-have skills: Data Warehouse ETL Testing
Good-to-have skills: NA
Minimum experience required: 3 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. You will play a crucial role in ensuring the smooth functioning of applications and their alignment with business needs.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather and analyze requirements.
- Design and develop application solutions based on business process requirements.
- Create technical specifications and design documents.
- Perform unit testing and debugging of applications.
- Conduct code reviews and provide feedback to team members.
- Stay updated with emerging technologies and industry trends.
- Assist in troubleshooting and resolving application issues.

Professional & Technical Skills:
- Must have: proficiency in Data Warehouse ETL Testing.
- Experience with SQL and database concepts.
- Strong understanding of data warehousing principles and methodologies.
- Knowledge of ETL tools such as Informatica or DataStage.
- Familiarity with data modeling and schema design.
- Good to have: experience with Agile development methodologies.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Data Warehouse ETL Testing.
- This position is based at our Mumbai office.

Posted 3 months ago

Apply

7 - 12 years

9 - 14 Lacs

Coimbatore

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Data Warehouse ETL Testing
Good-to-have skills: NA
Minimum experience required: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for overseeing the entire application development process and ensuring its successful implementation. Your role will involve collaborating with cross-functional teams, managing the team's performance, and making key decisions. You will contribute to problem-solving and provide solutions for your immediate team and across multiple teams. This is an exciting opportunity to showcase your leadership skills and drive the success of application development projects.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team's performance.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the effort to design, build, and configure applications.
- Act as the primary point of contact for application-related matters.
- Oversee the entire application development process.

Professional & Technical Skills:
- Must have: proficiency in Data Warehouse ETL Testing.
- Strong understanding of data warehousing concepts and ETL processes.
- Experience in designing and implementing ETL test strategies and test plans.
- Hands-on experience with ETL testing tools such as Informatica or DataStage.
- Solid knowledge of SQL and database concepts.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Data Warehouse ETL Testing.
- This position is based at our Bengaluru office.

Posted 3 months ago


3 - 8 years

5 - 10 Lacs

Coimbatore

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Data Warehouse ETL Testing
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will play a crucial role in ensuring the smooth functioning of applications and their alignment with business needs.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather and analyze business requirements.
- Design, develop, and test software applications using Data Warehouse ETL Testing techniques.
- Create and maintain technical documentation for applications.
- Troubleshoot and debug application issues to ensure optimal performance.
- Participate in code reviews to ensure adherence to coding standards and best practices.
- Stay updated with emerging technologies and industry trends to continuously improve application development processes.

Professional & Technical Skills:
- Must Have: Proficiency in Data Warehouse ETL Testing.
- Experience with SQL and database concepts.
- Strong understanding of data warehousing principles and methodologies.
- Knowledge of ETL tools such as Informatica or DataStage.
- Familiarity with data modeling and data integration techniques.
- Good To Have: Experience with Agile development methodologies, automated testing frameworks, and cloud-based data warehousing solutions.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Data Warehouse ETL Testing.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualifications: 15 years full time education

Posted 3 months ago


3 - 6 years

10 - 19 Lacs

Hyderabad

Hybrid


Primary Responsibilities:
- Design, develop, and maintain advanced data pipelines and ETL processes using niche technologies
- Collaborate with cross-functional teams to understand complex data requirements and deliver tailored solutions
- Ensure data quality and integrity by implementing robust data validation and monitoring processes
- Optimize data systems for performance, scalability, and reliability
- Develop comprehensive documentation for data engineering processes and systems
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications:
- Undergraduate degree or equivalent experience
- 3+ years of experience in building data pipelines using DataStage, Spark, Python, etc.
- Good experience in any RDBMS
- Exposure to any cloud platform
- Knowledge of the healthcare domain

#NJP

Posted 3 months ago


3 - 6 years

10 - 19 Lacs

Gurgaon

Hybrid


Qualifications - External
- 3+ years of relevant DataStage development experience.
- 2-3 years of experience in development/coding in Spark/Scala, Python, or PySpark.
- 1-2 years of experience working on Microsoft Azure Databricks.
- Relevant experience with databases like Teradata and Snowflake.
- Sound knowledge of SQL programming and SQL query skills.
- Hands-on development experience in UNIX scripting.
- Exposure to job schedulers like Airflow and the ability to create and modify DAGs.
- Strong communication skills (written and verbal).
- Ability to understand the existing application codebase, perform impact analysis, and update the code when required based on the business logic or for optimization.
- Exposure to DevOps methodology and creating CI/CD deployment pipelines.
- Excellent analytical and communication skills (both verbal and written).
- Experience working on data warehousing projects.
- Experience with Test Driven Development and Agile methodologies.
- Proficient in learning and adopting new technologies and using them to execute use cases for business problem solving.
- Ability to apply knowledge of principles and techniques to solve technical problems and write code based on technical design.

#NJP

Posted 3 months ago


2 - 7 years

4 - 9 Lacs

Bengaluru

Work from Office


Job Title: Datastage Developer

Responsibilities: A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes. You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements:
Primary skills: Technology -> Data Management - Data Integration -> DataStage
Preferred skills: Technology -> ETL & Data Quality -> IBM InfoSphere DataStage

Additional Responsibilities:
- Knowledge of more than one technology
- Basics of architecture and design fundamentals
- Knowledge of testing tools
- Knowledge of agile methodologies
- Understanding of project life cycle activities on development and maintenance projects
- Understanding of one or more estimation methodologies; knowledge of quality processes
- Basics of the business domain to understand the business requirements
- Analytical abilities, strong technical skills, good communication skills
- Good understanding of the technology and domain
- Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods
- Awareness of latest technologies and trends
- Excellent problem solving, analytical, and debugging skills

Educational Requirements: Bachelor of Engineering
Service Line: Data & Analytics Unit
* Location of posting is subject to business requirements

Posted 3 months ago


12 - 14 years

20 - 35 Lacs

Pune

Work from Office


Role: Datastage Tech Lead
Location: Pune (Hybrid) / Remote
Experience: 12 to 14 Years
Primary Skills: Datastage, ADF, Azure, SQL, ETL

Skills & Qualifications:
- 10+ years of experience as a Data Engineer or in a similar role.
- Proven expertise in an ETL tool for data pipeline development and data integration.
- Strong understanding of data warehousing principles and experience with cloud-based data warehouse solutions.
- Experience with data quality tools and techniques for data cleansing and validation.
- Proficiency in SQL scripting for data manipulation and querying; experience with cloud-based SQL solutions like Azure SQL Database or Oracle (a plus).
- Familiarity with cloud platforms like Azure for data storage and processing (a plus).
- Excellent analytical and problem-solving skills.
- Effective communication and collaboration skills.
- Ability to work independently and manage multiple tasks effectively.

Posted 3 months ago


1 - 4 years

3 - 7 Lacs

Gurgaon

Work from Office


Mission: As a Spark Technical Solutions Engineer, you will provide deep-dive technical and consulting solutions for challenging Spark / ML / AI / Delta / Streaming / Lakehouse issues reported by our customers, and resolve any challenges involving the Databricks unified analytics platform with your highly comprehensive technical and customer communication skills. You will assist our customers in their Databricks journey and provide them with the guidance, knowledge, and expertise that they need to realize value and achieve their strategic objectives using our products.

Responsibilities:
- Perform initial-level analysis and troubleshoot issues in Spark using Spark UI metrics, DAGs, and event logs for various customer-reported job slowness issues.
- Troubleshoot, resolve, and suggest deep code-level analysis of Spark to address customer issues related to Spark core internals, Spark SQL, Structured Streaming, Delta, Lakehouse, and other Databricks runtime features.
- Assist customers in setting up reproducible Spark problems, with solutions, in the areas of Spark SQL, Delta, memory management, performance tuning, streaming, data science, and data integration.
- Participate in the Designated Solutions Engineer program and drive day-to-day Spark and cloud issues for one or two strategic customers.
- Plan and coordinate with Account Executives, Customer Success Engineers, and Resident Solution Architects on customer issues and best-practices guidelines.
- Participate in screen-sharing meetings, answer Slack channel conversations with our internal stakeholders and customers, and help drive major Spark issues as an individual contributor.
- Build an internal wiki and knowledge base with technical documentation and manuals for the support team and for customers.
- Participate in the creation and maintenance of company documentation and knowledge base articles.
- Coordinate with Engineering and Backline Support teams to assist in identifying and reporting product defects.
- Participate in weekend and weekday on-call rotations, run escalations during Databricks runtime outages and incident situations, multitask and plan day-to-day activities, and provide an escalated level of support for critical customer operational issues.
- Provide best-practices guidance around Spark runtime performance and usage of Spark core libraries and APIs for custom-built solutions developed by Databricks customers.
- Be a true proponent of customer advocacy.
- Contribute to the development of tools/automation initiatives.
- Provide front-line support for third-party integrations with the Databricks environment.
- Review Engineering JIRA tickets and proactively notify the support leadership team to follow up on action items.
- Manage assigned Spark cases on a daily basis and adhere to committed SLAs.
- Achieve above and beyond the expectations of the support organization's KPIs.
- Strengthen your AWS/Azure and Databricks platform expertise through continuous learning and internal training programs.

Competencies:
- Minimum 6 years of experience in designing, building, testing, and maintaining Python/Java/Scala-based applications in typical project delivery and consulting environments.
- 3 years of hands-on experience developing two or more of Big Data, Hadoop, Spark, Machine Learning, Artificial Intelligence, Streaming, Kafka, Data Science, or Elasticsearch industry use cases at production scale. Spark experience is mandatory.
- Hands-on experience in performance tuning/troubleshooting of Hive- and Spark-based applications at production scale.
- Proven real-world experience with JVM and memory management techniques such as garbage collection and heap/thread dump analysis is preferred.
- Working, hands-on experience with any SQL-based databases and data warehousing/ETL technologies like Informatica, DataStage, Oracle, Teradata, SQL Server, MySQL, and SCD-type use cases is preferred.
- Hands-on experience with AWS, Azure, or GCP is preferred.
- Excellent written and oral communication skills.
- Linux/Unix administration skills are a plus.
- Working knowledge of data lakes, preferably with SCD-type use cases at production scale.
- Demonstrated analytical and problem-solving skills, particularly those that apply to a distributed Big Data computing environment.

Posted 3 months ago


5 - 9 years

10 - 14 Lacs

Mumbai

Work from Office


The purpose of this role is to oversee the development of our database marketing solutions, using database technologies such as Microsoft SQL Server/Azure, Amazon Redshift, and Google BigQuery. The role is involved in design, development, troubleshooting, and issue resolution, and in upgrading, enhancing, and optimizing the technical solution. It involves continuous integration and continuous deployment of various requirements changes in the business logic implementation, and interactions with internal stakeholders and/or clients to explain technology solutions, with a clear understanding of the client's business requirements through which to guide the optimal design/solution to meet their needs. The ability to communicate to both technical and non-technical audiences is key.

Job Description:

Must Have Skills:
- Database (SQL Server / Snowflake / Teradata / Redshift / Vertica / Oracle / BigQuery / Azure DW, etc.)
- ETL (Extract, Transform, Load) tool (Talend, Informatica, SSIS, DataStage, Matillion)
- Python, UNIX shell scripting, project resource management
- Workflow orchestration (Tivoli, Tidal, Stonebranch)
- Client-facing skills

Good to Have Skills:
- Experience in cloud computing (one or more of AWS, Azure, GCP); AWS preferred.

Key Responsibilities:
- Understanding and practical knowledge of data warehouses, data marts, data modelling, data structures, databases, and data ingestion and transformation
- Strong understanding of ETL processes as well as database skills and common IT offerings, i.e. storage, backups, and operating systems
- Strong understanding of SQL and database programming languages
- Strong knowledge of development methodologies and tools
- Contribute to design and oversee code reviews for compliance with development standards
- Design and implement the technical vision for existing clients
- Convert documented requirements into technical solutions and implement them within the given timeline and with quality
- Quickly identify solutions for production failures and fix them
- Document project architecture, explain detailed design to the team, and create low-level to high-level designs
- Perform mid- to complex-level tasks independently
- Support clients, data scientists, and analytical consultants working on marketing solutions
- Work with cross-functional internal teams and external clients
- Strong project management and organization skills
- Ability to lead/work on 1-2 projects with a team size of 2-3 members
- Code management, including code reviews and deployments

Location: DGS India - Pune - Baner M- Agile
Brand: Merkle
Time Type: Full time
Contract Type: Permanent

Posted 3 months ago


2 - 5 years

4 - 8 Lacs

Pune

Work from Office


As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing various enterprise search applications, such as Elasticsearch and Splunk, for client requirements
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Good hands-on experience in DBT is required; ETL DataStage and Snowflake preferred.
- Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer
- Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed
- Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java

Preferred technical and professional experience:
- You thrive on teamwork and have excellent verbal and written communication skills.
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions
- Ability to communicate results to technical and non-technical audiences

Posted 3 months ago


2 - 7 years

4 - 9 Lacs

Noida

Work from Office


Project Role: Technology Products & Offering Developer
Project Role Description: Use in-depth industry knowledge and a solid foundation of software engineering to deliver market-leading software, platforms, products, offerings, and/or assets for Accenture or its clients.
Must have skills: Microsoft SQL Server Integration Services (SSIS)
Good to have skills: NA
Minimum 2 year(s) of experience is required
Educational Qualification: BTech Computer Science and Engineering

Summary: As a Technology Products & Offering Developer, you will be responsible for utilizing your expertise in Microsoft SQL Server Integration Services (SSIS) to deliver market-leading software, platforms, products, offerings, and/or assets for Accenture or its clients. Your typical day will involve working with SSIS, developing and deploying advanced solutions to meet client needs.

Roles & Responsibilities:
- Lead the development and deployment of advanced solutions using SSIS.
- Collaborate with cross-functional teams to understand client needs and develop solutions that meet those needs.
- Conduct detailed analysis of complex data sets, employing statistical methodologies and data munging techniques for actionable insights.
- Communicate technical findings effectively to stakeholders, utilizing data visualization tools for clarity.
- Stay updated with the latest advancements in SSIS and related technologies, integrating innovative approaches for sustained competitive advantage.

Professional & Technical Skills:
- Must Have: Proficiency in Microsoft SQL Server Integration Services (SSIS).
- Good To Have: Experience with other ETL tools such as Informatica or DataStage.
- Strong understanding of database concepts and SQL.
- Experience with data visualization tools such as Tableau or Power BI.
- Experience in implementing various data integration and transformation solutions.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 2 years of experience in Microsoft SQL Server Integration Services (SSIS).
- The ideal candidate will possess a strong educational background in computer science, information technology, or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Noida office.

Qualifications: BTech Computer Science and Engineering

Posted 3 months ago


5 - 10 years

7 - 12 Lacs

Mumbai, Hyderabad

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: MicroStrategy Business Intelligence
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: Graduate; minimum 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using MicroStrategy Business Intelligence. Your typical day will involve collaborating with cross-functional teams, analyzing business requirements, and developing solutions to meet those requirements.

Roles & Responsibilities:
- Design, develop, and maintain MicroStrategy Business Intelligence applications to meet business requirements.
- Collaborate with cross-functional teams to analyze business requirements and develop solutions to meet those requirements.
- Develop and maintain technical documentation related to MicroStrategy Business Intelligence applications.
- Provide technical support and troubleshooting for MicroStrategy Business Intelligence applications.

Professional & Technical Skills:
- Must Have: Strong experience in MicroStrategy Business Intelligence.
- Good To Have: Experience in other Business Intelligence tools such as Tableau, Power BI, or QlikView.
- Experience in designing, developing, and maintaining MicroStrategy Business Intelligence applications.
- Strong understanding of data warehousing concepts and database design principles.
- Experience in SQL and database programming.
- Experience in ETL tools such as Informatica or DataStage.

Additional Information:
- The candidate should have a minimum of 12 years of experience in MicroStrategy Business Intelligence.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Chennai office.

Qualifications: Graduate; minimum 15 years full time education

Posted 3 months ago


3 - 8 years

5 - 10 Lacs

Pune

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Ab Initio
Good to have skills: No Function Specialty
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for ensuring the smooth functioning of applications and addressing any issues that may arise. Your typical day will involve collaborating with the team to understand requirements, designing and developing applications, and testing and debugging code to ensure its functionality and performance.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with the team to understand application requirements.
- Design and develop applications based on business process requirements.
- Test and debug code to ensure its functionality and performance.
- Identify and address any issues or bugs in the applications.
- Provide technical support and guidance to end-users.
- Stay updated with the latest industry trends and technologies.
- Assist in the deployment and maintenance of applications.

Professional & Technical Skills:
- Must Have: Proficiency in Ab Initio.
- Strong understanding of data integration and ETL concepts.
- Experience in designing and developing ETL workflows using Ab Initio.
- Knowledge of database concepts and SQL.
- Familiarity with data warehousing and data modeling.
- Good To Have: Experience with data visualization tools such as Tableau or Power BI; experience with other ETL tools like Informatica or DataStage; knowledge of scripting languages like Python or shell scripting.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Ab Initio.
- This position is based at our Pune office.
- A 15 years full-time education is required.

Qualifications: 15 years full time education

Posted 3 months ago


Exploring Datastage Jobs in India

DataStage is a popular ETL (Extract, Transform, Load) tool used by organizations to extract data from different sources, transform it, and load it into a target data warehouse. Demand for DataStage professionals in India has been rising as companies across industries rely increasingly on data-driven decision-making.
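The extract-transform-load flow described above can be sketched in plain Python. This is a minimal illustration of the pattern only, not DataStage itself (DataStage jobs are built in a graphical designer); the data, table, and column names are hypothetical, and SQLite stands in for the target warehouse.

```python
import csv
import io
import sqlite3

# Extract: read raw records from a source system (a CSV snippet here).
raw = io.StringIO("order_id,amount\n1, 250.00 \n2,99.50\n")
rows = list(csv.DictReader(raw))

# Transform: clean and convert types (strip stray whitespace, parse numbers).
cleaned = [(int(r["order_id"]), float(r["amount"].strip())) for r in rows]

# Load: write the cleaned rows into a target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", cleaned)

total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)  # 349.5
```

In DataStage the same three steps would be drawn as stages on a job canvas (for example, a source stage, a Transformer, and a target database stage) rather than written as code.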

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

These cities are known for their vibrant tech industries and have a high demand for DataStage professionals.

Average Salary Range

The average salary range for DataStage professionals in India varies by experience level. Entry-level positions can expect to earn around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 12-15 lakhs per annum.

Career Path

In the DataStage field, a typical career progression may look like:

  1. Junior Developer
  2. ETL Developer
  3. Senior Developer
  4. Tech Lead
  5. Architect

As professionals gain experience and expertise in DataStage, they can move up the ladder to more senior and leadership roles.

Related Skills

In addition to proficiency in DataStage, employers often look for candidates with the following skills:

  • SQL
  • Data warehousing concepts
  • ETL tools like Informatica, Talend
  • Data modeling
  • Scripting languages like Python or Shell scripting

Having a diverse skill set can make a candidate more competitive in the job market.
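As a small illustration of the SQL and data warehousing skills listed above, the sketch below runs a typical star-schema query: a fact table joined to a dimension and aggregated. The table names and data are hypothetical, and SQLite stands in for a real warehouse database.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales (sale_id INTEGER, product_key INTEGER, qty INTEGER);
INSERT INTO dim_product VALUES (1, 'Widget'), (2, 'Gadget');
INSERT INTO fact_sales VALUES (10, 1, 3), (11, 1, 2), (12, 2, 5);
""")

# Join fact rows to the dimension and aggregate -- a typical warehouse query.
result = conn.execute("""
    SELECT p.name, SUM(f.qty)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY p.name
    ORDER BY p.name
""").fetchall()
print(result)  # [('Gadget', 5), ('Widget', 5)]
```

The same fact-to-dimension join is what a Lookup or Join stage performs inside a DataStage job, so comfort with writing it by hand in SQL transfers directly.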

Interview Questions

  • What is DataStage and how does it differ from other ETL tools? (basic)
  • Explain the difference between a server job and a parallel job in DataStage. (medium)
  • How do you handle errors in DataStage? (medium)
  • What is a surrogate key and how is it generated in DataStage? (advanced)
  • How would you optimize performance in a DataStage job? (medium)
  • Explain the concept of partitioning in DataStage. (medium)
  • What is a DataStage Transformer stage and how is it used? (medium)
  • How do you handle incremental loads in DataStage? (advanced)
  • What is a Lookup stage in DataStage and when would you use it? (medium)
  • Describe the difference between Sequential File and Data Set stages in DataStage. (basic)
  • What is a configuration file in DataStage and how is it used? (medium)
  • How do you troubleshoot DataStage job failures? (medium)
  • Explain the purpose of the DataStage Director. (basic)
  • How do you handle data quality issues in DataStage? (advanced)
  • What is a shared container in DataStage and how is it beneficial? (medium)
  • Describe the difference between persistent data sets and Hashed File stages in DataStage. (medium)
  • How do you schedule DataStage jobs for execution? (basic)
  • Explain the use of parameter sets in DataStage. (medium)
  • What is a DataStage Transformer stage variable and how is it defined? (medium)
  • How do you handle complex transformations in DataStage? (advanced)
  • How do you handle rejected data in DataStage? (medium)
  • Describe the purpose of a DataStage job sequencer. (medium)
  • How do you handle metadata in DataStage? (medium)
  • Explain the concept of parallel processing in DataStage. (medium)
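A couple of the questions above (surrogate keys, incremental loads) can be illustrated outside DataStage. The sketch below uses SQLite to show a surrogate key issued by the target database rather than taken from the source system, and an incremental-load-style upsert that never reassigns a key once issued; in a DataStage job, the Surrogate Key Generator stage plays the key-issuing role. All names and data here are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# natural_key comes from the source system; customer_sk is the surrogate key,
# generated by the target database and carrying no business meaning.
conn.execute("""
CREATE TABLE dim_customer (
    customer_sk INTEGER PRIMARY KEY AUTOINCREMENT,
    natural_key TEXT UNIQUE,
    name        TEXT
)
""")

def upsert(natural_key, name):
    # Incremental-load style: insert new keys, update changed rows,
    # and keep the surrogate key stable across loads.
    conn.execute("""
        INSERT INTO dim_customer (natural_key, name) VALUES (?, ?)
        ON CONFLICT(natural_key) DO UPDATE SET name = excluded.name
    """, (natural_key, name))

upsert("C-100", "Acme Ltd")
upsert("C-200", "Globex")
upsert("C-100", "Acme Limited")   # changed record: same surrogate key kept

rows = conn.execute(
    "SELECT customer_sk, natural_key, name FROM dim_customer ORDER BY customer_sk"
).fetchall()
print(rows)  # [(1, 'C-100', 'Acme Limited'), (2, 'C-200', 'Globex')]
```

Overwriting the name in place is Type 1 slowly-changing-dimension behavior; a Type 2 design would instead close the old row and insert a new one with a fresh surrogate key.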

Closing Remark

As you explore job opportunities in DataStage in India, remember to showcase your skills and knowledge confidently during interviews. By preparing well and demonstrating your expertise, you can land a rewarding career in this growing field. Good luck with your job search!
