Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
15 - 20 years
11 - 15 Lacs
Mumbai
Work from Office
- More than 15 years of experience in Technical, Solutioning, and Analytical roles.
- 5+ years of experience building and managing Data Lakes, Data Warehouses, Data Integration, Data Migration, and Business Intelligence/Artificial Intelligence solutions in the Cloud (GCP/AWS/Azure).
- Ability to understand business requirements, translate them into functional and non-functional areas, and define non-functional boundaries in terms of Availability, Scalability, Performance, Security, Resilience, etc.
- Experience architecting, designing, and implementing end-to-end data pipelines and data integration solutions for varied structured and unstructured data sources and targets.
- Experience working in distributed computing and enterprise environments such as Hadoop and GCP/AWS/Azure Cloud.
- Well versed in Cloud Data Integration and ETL technologies such as Spark, PySpark/Scala, Dataflow, Dataproc, EMR, etc.
- Experience with traditional ETL tools such as Informatica, DataStage, OWB, Talend, etc.
- Deep knowledge of one or more Cloud and On-Premise databases such as Cloud SQL, Cloud Spanner, Bigtable, RDS, Aurora, DynamoDB, Oracle, Teradata, MySQL, DB2, SQL Server, etc.
- Exposure to NoSQL databases such as MongoDB, CouchDB, Cassandra, graph databases, etc.
- Experience architecting and designing scalable data warehouse solutions in the cloud on BigQuery or Redshift.
- Experience with one or more data integration, storage, and data pipeline toolsets such as S3, Cloud Storage, Athena, Glue, Sqoop, Flume, Hive, Kafka, Pub/Sub, Kinesis, Dataflow, Dataproc, Airflow, Composer, Spark SQL, Presto, EMRFS, etc.
- Preferred: experience with Machine Learning frameworks such as TensorFlow, PyTorch, etc.
- Good understanding of Cloud solutions for IaaS, PaaS, SaaS, Containers, and Microservices Architecture and Design.
- Ability to compare products and tools across technology stacks on Google, AWS, and Azure Cloud.
- Good understanding of BI Reporting and Dashboarding and one or more associated toolsets such as Looker, Tableau, Power BI, SAP BO, Cognos, Superset, etc.
- Understanding of security features and policies in one or more Cloud environments (GCP/AWS/Azure).
- Experience in business transformation projects moving On-Premise data solutions to Clouds such as GCP/AWS/Azure.
Role:
- Lead multiple data engagements on GCP Cloud for data lakes, data engineering, data migration, data warehouse, and business intelligence.
- Interface with multiple stakeholders within IT and business to understand the data requirements.
- Take complete responsibility for the successful delivery of all allocated projects on the parameters of Schedule, Quality, and Customer Satisfaction.
- Responsible for the design and development of distributed, high-volume, multi-threaded batch, real-time, and event processing systems.
- Implement processes and systems to validate data and monitor data quality, ensuring production data is always accurate and available for key stakeholders and the business processes that depend on it.
- Work with the Pre-Sales team on RFPs and RFIs and help them by creating data solutions.
- Mentor young talent within the team; define and track their growth parameters.
- Contribute to building assets and accelerators.
Other Skills:
- Strong communication and articulation skills.
- Good leadership skills.
- Should be a good team player.
- Good analytical and problem-solving skills.
Posted 2 months ago
5 - 9 years
10 - 14 Lacs
Mumbai
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must Have Skills: IBM InfoSphere DataStage
Good To Have Skills: Java
Minimum 5 year(s) of experience is required.
Educational Qualification: Must have minimum 15 years of full time education.
Summary: As an Application Lead for Software Development, you will be responsible for leading the effort to design, build, and configure applications using IBM InfoSphere DataStage. Your typical day will involve collaborating with cross-functional teams, managing project timelines, and ensuring the successful delivery of high-quality software solutions.
Roles & Responsibilities: Lead the design, development, and implementation of software applications using IBM InfoSphere DataStage. Act as the primary point of contact for the project, collaborating with cross-functional teams to ensure project timelines are met. Manage project resources, including developers, testers, and other team members, to ensure successful delivery of high-quality software solutions. Provide technical guidance and mentorship to team members, ensuring adherence to best practices and standards for software development. Identify and mitigate project risks, communicating effectively with stakeholders to ensure project success.
Professional & Technical Skills: Must Have Skills: Strong experience with IBM InfoSphere DataStage. Good To Have Skills: Experience with Java. Solid understanding of software development best practices and methodologies. Experience with project management tools and techniques. Strong analytical and problem-solving skills. Experience as a DataStage developer. Proficiency in SQL or another relevant language.
Test Driven Development experience is an advantage. An Agile mindset and Scrum values are a must; SAFe experience is preferred.
Additional Information: At least 6 years of proven experience as an ETL developer, including experience working with IBM InfoSphere DataStage. The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering high-quality software solutions. This position is based at our Bengaluru office. Must have minimum 15 years of full time education. Must have good communication skills.
Qualification: Must have minimum 15 years of full time education.
Posted 2 months ago
3 - 7 years
5 - 9 Lacs
Mumbai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must Have Skills: Informatica PowerCenter
Good To Have Skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: Graduate
Summary: As a Data Platform Engineer, you will be responsible for assisting with the blueprint and design of the data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models using Informatica PowerCenter.
Roles & Responsibilities: Collaborate with Integration Architects and Data Architects to design and implement data platform components using Informatica PowerCenter. Assist with the development and deployment of data integration solutions, including data mapping, data transformation, and data quality. Ensure data platform components are scalable, reliable, and maintainable. Collaborate with cross-functional teams to troubleshoot and resolve data integration issues.
Professional & Technical Skills: Must Have Skills: Proficiency in Informatica PowerCenter. Good To Have Skills: Experience with other ETL tools such as Talend or DataStage. Strong understanding of data integration concepts and techniques. Experience with data modeling and database design. Experience with SQL and relational databases. Solid grasp of data quality and data governance best practices.
Additional Information: The candidate should have a minimum of 3 years of experience in Informatica PowerCenter. The ideal candidate will possess a strong educational background in computer science, information technology, or a related field, along with a proven track record of delivering impactful data-driven solutions. This position is based at our Mumbai office.
Qualification: Graduate
Posted 2 months ago
3 - 7 years
10 - 14 Lacs
Mumbai
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must Have Skills: Data Warehouse ETL Testing
Good To Have Skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: Graduation
Summary: As an Application Lead for Data Warehouse ETL Testing, you will be responsible for leading the effort to design, build, and configure applications. You will act as the primary point of contact and work with a team to ensure the successful delivery of projects. Your typical day will involve testing and validating data warehouse ETL processes, identifying and resolving issues, and collaborating with cross-functional teams to ensure project success.
Roles & Responsibilities: Lead the effort to design, build, and configure applications, acting as the primary point of contact. Test and validate data warehouse ETL processes, identifying and resolving issues to ensure data accuracy and integrity. Collaborate with cross-functional teams to ensure project success, including developers, business analysts, and project managers. Develop and maintain test plans, test cases, and test scripts to ensure comprehensive testing coverage. Provide guidance and mentorship to junior team members, ensuring their growth and development within the organization.
Professional & Technical Skills: Must Have Skills: Strong experience in Data Warehouse ETL Testing. Good To Have Skills: Experience with SQL, Unix, and scripting languages. Experience with ETL tools such as Informatica, DataStage, or Talend. Strong understanding of data warehousing concepts and methodologies. Experience with Agile development methodologies. Excellent analytical and problem-solving skills.
Additional Information: The candidate should have a minimum of 3 years of experience in Data Warehouse ETL Testing.
The ideal candidate will possess a strong educational background in computer science, information technology, or a related field. This position is based at our Mumbai office.
Qualification: Graduation
Posted 2 months ago
2 - 4 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must Have Skills: IBM InfoSphere DataStage
Good To Have Skills: NA
Minimum 2 year(s) of experience is required.
Educational Qualification: 15 years of full time education
Summary: As an Application Developer for Packaged Application Development, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using IBM InfoSphere DataStage. Your typical day will involve collaborating with cross-functional teams, analyzing business requirements, and developing solutions to meet those requirements.
Roles & Responsibilities: Design, build, and configure applications using IBM InfoSphere DataStage to meet business process and application requirements. Collaborate with cross-functional teams to analyze business requirements and develop solutions to meet those requirements. Develop and maintain technical documentation related to application development. Provide technical support and troubleshooting for applications developed using IBM InfoSphere DataStage.
Professional & Technical Skills: Must Have Skills: 2+ years of experience in IBM InfoSphere DataStage. Good To Have Skills: Experience with other ETL tools such as Informatica or Talend. Strong understanding of data warehousing concepts and ETL processes. Experience with SQL and database technologies such as Oracle or SQL Server. Experience with Unix/Linux environments and shell scripting. Experience with Agile development methodologies.
Additional Information: The candidate should have a minimum of 2 years of experience in IBM InfoSphere DataStage. The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions. This position is based at our Bengaluru office.
Qualification: 15 years of full time education
Posted 2 months ago
2 - 4 years
5 - 9 Lacs
Chennai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must Have Skills: Ab Initio
Good To Have Skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years of full time education
Summary: As an Application Developer with expertise in Ab Initio, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve working with Ab Initio, collaborating with cross-functional teams, and delivering high-quality solutions.
Roles & Responsibilities: Design, build, and configure applications to meet business process and application requirements using Ab Initio. Collaborate with cross-functional teams to identify and prioritize requirements, ensuring that solutions are delivered on time and within budget. Develop and maintain technical documentation, including design documents, test plans, and user manuals. Troubleshoot and resolve technical issues, ensuring that applications are running smoothly and efficiently. Stay up to date with the latest trends and technologies in Ab Initio and related fields, integrating innovative approaches for sustained competitive advantage.
Professional & Technical Skills: Must Have Skills: Expertise in Ab Initio. Good To Have Skills: Experience with ETL tools such as Informatica or DataStage. Strong understanding of data warehousing concepts and principles. Experience with SQL and relational databases such as Oracle or SQL Server. Experience with Unix/Linux operating systems and shell scripting. Solid grasp of software development life cycle (SDLC) methodologies and best practices.
Additional Information: The candidate should have a minimum of 2 years of experience in Ab Initio.
The ideal candidate will possess a strong educational background in computer science, software engineering, or a related field, along with a proven track record of delivering high-quality solutions. This position is based at our Chennai office.
Qualification: 15 years of full time education
Posted 2 months ago
5 - 10 years
5 - 9 Lacs
Chennai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must Have Skills: Microsoft SQL Server Integration Services (SSIS)
Good To Have Skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years of full time education
Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Microsoft SQL Server Integration Services (SSIS). Your typical day will involve working with SSIS, developing and testing applications, and collaborating with cross-functional teams to ensure successful project delivery.
Roles & Responsibilities: Design, develop, and test applications using Microsoft SQL Server Integration Services (SSIS) to meet business process and application requirements. Collaborate with cross-functional teams to ensure successful project delivery, including working with business analysts, project managers, and quality assurance teams. Troubleshoot and debug applications to ensure optimal performance and functionality. Develop and maintain technical documentation, including design documents, user manuals, and test plans.
Professional & Technical Skills: Must Have Skills: Proficiency in Microsoft SQL Server Integration Services (SSIS). Good To Have Skills: Experience with other ETL tools such as Informatica or DataStage. Strong understanding of database concepts and SQL programming. Experience with data warehousing and data modeling. Experience with version control systems such as Git or SVN.
Additional Information: The candidate should have a minimum of 5 years of experience in Microsoft SQL Server Integration Services (SSIS).
The ideal candidate will possess a strong educational background in computer science, information technology, or a related field, along with a proven track record of delivering impactful data-driven solutions. This position is based at our Chennai office.
Qualification: 15 years of full time education
Posted 2 months ago
5 - 7 years
5 - 9 Lacs
Pune
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must Have Skills: Data Warehouse ETL Testing
Good To Have Skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years of full time education
Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve working with Data Warehouse ETL Testing and collaborating with cross-functional teams to ensure the successful delivery of data-driven solutions.
Roles & Responsibilities: Lead the design, development, and implementation of Data Warehouse ETL Testing solutions. Collaborate with cross-functional teams to ensure the successful delivery of data-driven solutions. Develop and maintain ETL processes, including data mapping, data transformation, and data loading. Perform data analysis and validation to ensure data quality and integrity. Create and maintain technical documentation, including design documents, test plans, and user manuals.
Professional & Technical Skills: Must Have Skills: Strong experience in Data Warehouse ETL Testing. Good To Have Skills: Experience with SQL, Unix, and scripting languages. Solid understanding of data modeling and database design principles. Experience with ETL tools such as Informatica, DataStage, or Talend. Experience with data analysis and validation to ensure data quality and integrity.
Additional Information: The candidate should have a minimum of 5 years of experience in Data Warehouse ETL Testing. The ideal candidate will possess a strong educational background in computer science, information technology, or a related field, along with a proven track record of delivering impactful data-driven solutions. This position is based at our Pune office.
Qualification: 15 years of full time education
Posted 2 months ago
3 - 5 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must Have Skills: Data Warehouse ETL Testing
Good To Have Skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: Full-time Education
Summary: As an Application Lead for Data Warehouse ETL Testing, you will be responsible for leading the effort to design, build, and configure applications. You will act as the primary point of contact and work with a team to ensure the successful delivery of projects. Your typical day will involve testing and validating ETL processes, identifying and resolving issues, and collaborating with cross-functional teams to ensure project success.
Roles & Responsibilities: Lead the effort to design, build, and configure applications for Data Warehouse ETL Testing. Act as the primary point of contact for the project and collaborate with cross-functional teams to ensure project success. Test and validate ETL processes, identify and resolve issues, and ensure data quality and integrity. Develop and maintain test plans, test cases, and test scripts, and ensure compliance with project requirements and standards. Provide technical guidance and mentorship to team members, and ensure adherence to best practices and industry standards.
Professional & Technical Skills: Must Have Skills: Strong experience in Data Warehouse ETL Testing. Good To Have Skills: Experience with ETL tools such as Informatica, Talend, or DataStage. Experience with SQL and database technologies such as Oracle, SQL Server, or MySQL. Experience with testing tools such as HP ALM, JIRA, or Quality Center. Strong understanding of data warehousing concepts and methodologies. Experience with Agile development methodologies and tools such as Scrum or Kanban.
Additional Information: The candidate should have a minimum of 3 years of experience in Data Warehouse ETL Testing.
The ideal candidate will possess a strong educational background in computer science, information technology, or a related field, along with a proven track record of delivering impactful data-driven solutions. This position is based at our Bengaluru office.
Qualification: Full-time Education
Posted 2 months ago
4 - 7 years
5 - 13 Lacs
Bengaluru
Work from Office
Data Engineer
Please find below the JD for the ETL Snowflake developer role.
- 4-5 years of strong experience in advanced SQL on any database (preferably Snowflake, Oracle, or Teradata).
- Extensive experience in data integration and hands-on experience with any of the ETL tools such as DataStage, Informatica, SnapLogic, etc.
- Able to transform technical requirements into data collection queries.
- Capable of working with business and other IT teams and converting requirements into queries.
- Good understanding of ETL architecture and design.
- Good knowledge of UNIX commands, databases, SQL, and PL/SQL.
- Good to have: experience with AWS Glue.
- Good to have: knowledge of the Qlik replication tool.
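As a purely illustrative aside (not part of the posting), the "transform technical requirements into data collection queries" skill amounts to turning a business question into SQL. The sketch below uses Python's built-in sqlite3 with an invented table and an invented question ("total order value per customer"); on Snowflake, Oracle, or Teradata only the connection layer would differ.

```python
import sqlite3

# Hypothetical requirement: "total order value per customer, largest first".
# The table and data are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer_id TEXT, amount REAL);
    INSERT INTO orders VALUES ('C1', 100.0), ('C1', 50.0), ('C2', 75.0);
""")

# The requirement, expressed as a data collection query.
query = """
    SELECT customer_id, SUM(amount) AS total_amount
    FROM orders
    GROUP BY customer_id
    ORDER BY total_amount DESC
"""
rows = conn.execute(query).fetchall()
print(rows)  # [('C1', 150.0), ('C2', 75.0)]
```

The same GROUP BY/aggregate pattern carries over to any of the databases named in the JD; only the driver and dialect details change.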
Posted 2 months ago
5 - 7 years
5 - 9 Lacs
Hyderabad
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must Have Skills: SnapLogic
Good To Have Skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years of full time education
Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using SnapLogic. Your typical day will involve working with the development team, analyzing business requirements, and developing solutions to meet those requirements.
Roles & Responsibilities: Design, develop, and maintain SnapLogic integrations and workflows to meet business requirements. Collaborate with cross-functional teams to analyze business requirements and develop solutions to meet those requirements. Develop and maintain technical documentation for SnapLogic integrations and workflows. Troubleshoot and resolve issues with SnapLogic integrations and workflows.
Professional & Technical Skills: Must Have Skills: Strong experience in SnapLogic. Good To Have Skills: Experience with other ETL tools like Informatica, Talend, or DataStage. Experience in designing, developing, and maintaining integrations and workflows using SnapLogic. Experience in analyzing business requirements and developing solutions to meet those requirements. Experience in troubleshooting and resolving issues with SnapLogic integrations and workflows.
Additional Information: The candidate should have a minimum of 5 years of experience in SnapLogic. The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful solutions using SnapLogic. This position is based at our Pune office.
Qualifications: 15 years of full time education
Posted 2 months ago
5 - 9 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must Have Skills: IBM InfoSphere DataStage
Good To Have Skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with the team to develop and implement solutions, ensuring they align with business needs and standards. You will also engage with multiple teams, contribute to key decisions, and provide problem-solving solutions for your team and across multiple teams. With your creativity and expertise in IBM InfoSphere DataStage, you will play a crucial role in developing efficient and effective applications.
Roles & Responsibilities: Expected to be an SME; collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Provide solutions to problems for the immediate team and across multiple teams. Design, develop, and test applications using IBM InfoSphere DataStage. Collaborate with business analysts and stakeholders to gather requirements. Ensure applications meet business process and application requirements. Troubleshoot and debug applications to resolve issues. Create technical documentation for reference and reporting purposes.
Professional & Technical Skills: Must Have Skills: Proficiency in IBM InfoSphere DataStage. Strong understanding of ETL concepts and data integration. Experience in designing and implementing data integration solutions. Knowledge of SQL and database concepts. Experience with data warehousing and data modeling. Good To Have Skills: Experience with IBM InfoSphere Information Server. Familiarity with other ETL tools such as Informatica or Talend.
Additional Information: The candidate should have a minimum of 5 years of experience in IBM InfoSphere DataStage. This position is based at our Bengaluru office. A minimum of 15 years of full time education is required.
Qualifications: 15 years full time education
Posted 2 months ago
5 - 7 years
6 - 10 Lacs
Mumbai
Work from Office
Job Title: Senior DataStage Developer
Location: Mumbai, India
Experience: 5-7 Years
Mandatory Skills: DataStage, Python, SQL
Job Description: We are seeking a highly skilled and experienced Senior DataStage Developer to join our team in Mumbai. The ideal candidate will possess strong expertise in IBM DataStage, Python, and SQL, with a proven track record of building and maintaining efficient ETL pipelines and data solutions.
Key Responsibilities: Design, develop, and optimize ETL processes using IBM DataStage to integrate data from multiple sources into target systems. Develop robust and reusable Python scripts to support data transformations, automation, and integration tasks. Write and optimize complex SQL queries for data extraction, transformation, and loading processes. Collaborate with data architects, business analysts, and other stakeholders to understand data requirements and deliver effective solutions. Perform system tuning, troubleshooting, and debugging to ensure the accuracy and efficiency of ETL processes. Ensure data integrity and quality through validation, testing, and performance monitoring. Implement best practices in data governance, security, and compliance to meet organizational standards. Provide technical leadership and mentorship to junior developers, fostering skill development and knowledge sharing.
Required Skills: ETL Development: Strong hands-on experience with IBM DataStage, including job design, development, and debugging. Python: Proficiency in writing scripts for data processing, automation, and integration. SQL: Advanced knowledge of SQL for data extraction, query optimization, and database operations. Experience with database technologies such as Oracle, SQL Server, or DB2. Strong understanding of data warehousing concepts, including star schemas, snowflake schemas, and dimensional modeling. Familiarity with Agile development methodologies and tools like JIRA or Confluence.
Qualifications: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. 5-7 years of experience in ETL development and data engineering, with a focus on IBM DataStage, Python, and SQL. Strong problem-solving and analytical skills with attention to detail. Excellent communication and interpersonal skills, with the ability to work effectively in a team-oriented environment.
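The "robust and reusable Python scripts to support data transformations" this posting asks for typically take the shape of small, testable helper functions. The following is a minimal sketch under invented assumptions (field names, date format, and rounding rule are all illustrative, not from the employer):

```python
from datetime import datetime

def normalize_record(raw: dict) -> dict:
    """Normalise one raw source record before an ETL load.

    Hypothetical example: trim and upper-case the key, convert a
    DD/MM/YYYY date to ISO format, and coerce the amount to a rounded float.
    """
    return {
        "customer_id": raw["customer_id"].strip().upper(),
        "txn_date": datetime.strptime(raw["txn_date"], "%d/%m/%Y").date().isoformat(),
        "amount": round(float(raw["amount"]), 2),
    }

record = {"customer_id": " c42 ", "txn_date": "05/03/2024", "amount": "199.999"}
clean = normalize_record(record)
print(clean)  # {'customer_id': 'C42', 'txn_date': '2024-03-05', 'amount': 200.0}
```

Keeping each transformation in a pure function like this makes it reusable across jobs and easy to unit test, which is the usual reason such roles pair Python with DataStage.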
Posted 2 months ago
7 - 10 years
9 - 13 Lacs
Pune
Work from Office
Job Title: Lead DataStage Developer
Location: Mumbai, India
Experience: 7-10 Years
Mandatory Skills: DataStage, Python, SQL
Job Description: We are seeking a highly skilled and experienced Lead DataStage Developer to join our team in Mumbai. The ideal candidate will possess strong expertise in IBM DataStage, Python, and SQL, with a proven track record of building and maintaining efficient ETL pipelines and data solutions.
Key Responsibilities: Design, develop, and optimize ETL processes using IBM DataStage to integrate data from multiple sources into target systems. Develop robust and reusable Python scripts to support data transformations, automation, and integration tasks. Write and optimize complex SQL queries for data extraction, transformation, and loading processes. Collaborate with data architects, business analysts, and other stakeholders to understand data requirements and deliver effective solutions. Perform system tuning, troubleshooting, and debugging to ensure the accuracy and efficiency of ETL processes. Ensure data integrity and quality through validation, testing, and performance monitoring. Implement best practices in data governance, security, and compliance to meet organizational standards. Provide technical leadership and mentorship to junior developers, fostering skill development and knowledge sharing.
Required Skills: ETL Development: Strong hands-on experience with IBM DataStage, including job design, development, and debugging. Python: Proficiency in writing scripts for data processing, automation, and integration. SQL: Advanced knowledge of SQL for data extraction, query optimization, and database operations. Experience with database technologies such as Oracle, SQL Server, or DB2. Strong understanding of data warehousing concepts, including star schemas, snowflake schemas, and dimensional modeling. Familiarity with Agile development methodologies and tools like JIRA or Confluence.
Qualifications: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. 7-10 years of experience in ETL development and data engineering, with a focus on IBM DataStage, Python, and SQL. Strong problem-solving and analytical skills with attention to detail. Excellent communication and interpersonal skills, with the ability to work effectively in a team-oriented environment.
Posted 2 months ago
4 - 9 years
10 - 14 Lacs
Indore, Bhilai/Bhillai, Raipur
Work from Office
Lead and manage a small team responsible for the development of NuoData TransformX's migration engine to transition DataStage workloads to modern data platforms. Design, develop, and implement a scalable migration framework to automate data movement from IBM DataStage to Databricks, Snowflake, or other target platforms. Collaborate with product managers, data architects, and engineers to shape the product roadmap and deliver high-impact solutions. Optimize data integration workflows, ensuring high performance, scalability, and cost efficiency. Troubleshoot and resolve issues related to data transformation, ingestion, and pipeline orchestration. Provide technical guidance, mentoring, and leadership to team members. Ensure compliance with data governance, security, and quality standards throughout the migration process. Keep up to date with the latest advancements in data integration, ETL modernization, and cloud-based data processing.
Required Qualifications & Skills: 4+ years of experience in data engineering and integration, with at least 1+ years working with IBM DataStage. Experience leading data migration projects from legacy ETL tools to modern cloud-based solutions. Strong hands-on expertise in Databricks, Snowflake, Apache Spark, or other modern data platforms. Deep understanding of ETL/ELT processes, data integration, and data transformation strategies. Proficiency in SQL, Python, Scala, or Java for data processing and transformation. Experience working with cloud ecosystems (AWS, Azure, GCP) for data warehousing and pipeline development. Familiarity with orchestration tools like Apache Airflow, Prefect, or similar. Strong problem-solving skills and ability to work in a fast-paced, collaborative environment. Excellent communication skills to coordinate with stakeholders and present technical solutions effectively.
Preferred Qualifications: Certification in Databricks, Snowflake, or IBM DataStage.
Experience with real-time data streaming technologies (Kafka, Spark Streaming, etc.) . Prior experience in leading migration automation initiatives
Posted 2 months ago
7 - 12 years
35 - 45 Lacs
Pune
Remote
Design, develop, and maintain ETL processes using DataStage for data integration and transformation. Develop and optimize complex SQL queries. Utilize Azure Data Factory (ADF) for data integration and orchestration. Work with DataStage for data warehousing. Required candidate profile: experience with DataStage for ETL development; experience with Azure Data Factory (ADF) for data integration; experience with data warehousing and relational database design.
Posted 2 months ago
5 - 7 years
11 - 12 Lacs
Chennai
Work from Office
Job Title: DataStage ETL Developer
Experience Range: 8+ years (including 3+ years in DataStage development and administration)
Hiring Locations: Bangalore, Chennai, Hyderabad, Pune, Noida, Kochi, Trivandrum

Must-Have Skills:
- 3+ years of experience in DataStage development and administration
- Strong expertise in designing, developing, and maintaining DataStage ETL jobs
- Experience in troubleshooting DataStage job failures and providing root cause analysis
- Strong SQL skills and proficiency in relevant coding languages
- Ability to optimize DataStage jobs for performance
- Expertise in DataStage solution conception, design, and deployment
- Excellent analytical and problem-solving abilities
- Strong communication and interpersonal skills
- Ability to work in a multidisciplinary team

Good-to-Have Skills:
- Experience with other ETL tools such as Informatica, Ab Initio, or Oracle ETL
- Knowledge of data warehousing concepts and best practices
- Experience in supporting production DataStage jobs and ongoing data loads
- Ability to review job designs and code created by other developers
- Familiarity with deployment and monitoring of DataStage jobs

Job Responsibilities:
- Design, develop, and maintain DataStage ETL jobs
- Optimize existing DataStage jobs for better performance
- Review job designs and code from other developers
- Troubleshoot job failures and conduct root cause analysis
- Ensure adherence to company standards and best practices
- Support production DataStage jobs and ongoing data loads
- Develop new DataStage jobs to load data into the data warehouse
- Test DataStage jobs for accuracy and performance
- Deploy DataStage jobs to production environments
- Monitor execution and troubleshoot issues as needed
- Maintain proper documentation for all developed jobs
- Assist project leaders with timeframes and objectives
Posted 2 months ago
2 - 7 years
4 - 9 Lacs
Karnataka
Work from Office
Experienced data modelers with SQL, ETL, and some development background, to define new data schemas and data ingestion for Adobe Experience Platform customers. Interface directly with enterprise customers and collaborate with internal teams. 10+ years of strong experience with data transformation and ETL on large data sets. 5+ years of data modelling experience (relational, dimensional, columnar, big data). 5+ years of complex SQL or NoSQL experience. Experience in advanced data warehouse concepts.
Posted 2 months ago
2 - 7 years
4 - 9 Lacs
Dadra and Nagar Haveli, Chandigarh, Daman
Work from Office
Client: Persistent. Role: C2H. Location: Hybrid. Experience: 5-8 years. Budget: 20 LPA. POC: Bhajan.
About The Role: Must-have skills: DataStage, SnowSQL. Skill description: ETL, data warehousing, IBM DataStage, testing, SnowSQL.
Location: Chandigarh, Dadra & Nagar Haveli, Daman, Diu, Goa, Jammu, Lakshadweep, New Delhi, Puducherry, Sikkim
Posted 2 months ago
2 - 7 years
4 - 9 Lacs
Pune
Work from Office
Description
Skills required: Primary: Informatica Administrator. Secondary: SAP BODS and IBM DataStage & Cognos Administrator.
About The Role: 4 to 6 years of experience as an Informatica Administrator or in a similar role, with a proven track record of managing Informatica PowerCenter environments. Design, deploy, and manage Informatica PowerCenter environments, including installation, configuration, and optimization of Informatica servers, services, and repositories. In-depth knowledge of Informatica PowerCenter architecture, components, and services, including the PowerCenter Repository Service, Integration Service, and Workflow Manager. Experience with performance tuning and optimization of Informatica workflows, mappings, and sessions. Proficiency in scripting languages such as Unix shell scripting or PowerShell. Other ETL experience (SAP BODS, IBM DataStage/Cognos) is an added advantage. Monitor Informatica workflows, sessions, and tasks to ensure optimal performance, reliability, and availability. Troubleshoot and resolve issues related to Informatica workflows, mappings, transformations, and data connectivity. Implement and manage security policies and access controls in Informatica to ensure data privacy and compliance with regulatory requirements. Collaborate with data engineers, developers, and business users to design and implement data integration solutions using Informatica PowerCenter. Develop and maintain Informatica workflows, mappings, and transformations to support data extraction, transformation, and loading (ETL) processes. Configure and manage connectivity to various data sources and targets, including databases, data warehouses, cloud platforms, and external systems. Implement backup, recovery, and disaster recovery strategies for Informatica environments to ensure data integrity and availability. Stay up to date with the latest Informatica features, best practices, and industry trends, and provide recommendations for optimizing the Informatica environment.
Excellent problem-solving and troubleshooting skills, with the ability to diagnose and resolve complex technical issues. Informatica certifications such as Informatica Administrator Specialist or Informatica Developer Specialist preferred.
Additional Details: Named job posting: No. Global grade: B. Remote work possibility: No. Global role family: 60237 (P) Business Support Operations. Local role name: 9609 Administrator. Local skills: 2938 Informatica PowerCenter Administration. Languages required: English.
Posted 2 months ago
2 - 7 years
4 - 9 Lacs
Chennai
Work from Office
Ab Initio Developer. Experience: 4+ years. CTC: up to 15 LPA. Locations: Trivandrum, Chennai. Skills: Ab Initio, Java, GDE, Unix. Notice period: immediate to 30 days.
Posted 2 months ago
3 - 5 years
18 - 20 Lacs
Delhi NCR, Mumbai, Bengaluru
Work from Office
Build and maintain ETL pipelines using Azure Data Factory, Dataflows, etc. Work with teams to understand and solve data needs (ingestion, transformation, integration). Manage data lakes and data warehouses in Azure. Optimize ADF/Spark jobs for speed and cost. Proficient in Microsoft Azure services. Automate data workflows using Azure Data Factory (ADF) pipelines. Troubleshoot and fix ADF job issues. Stay updated on new features in Azure. Good to have: knowledge of an ETL tool such as SSIS, Informatica, or DataStage; ETL migration experience. Strong analytical and problem-solving skills. Excellent writing skills, with the ability to create clear requirements, specifications, and documentation. Proficient in SQL. Willing to work on R&D projects and various other technologies.
Locations: Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad, Remote
Posted 2 months ago
4 - 7 years
0 - 2 Lacs
Chennai
Hybrid
Position Purpose
The requested position is a developer-analyst in an open environment, which requires knowledge of the mainframe, TSO, JCL, and OPC environments.
Responsibilities
Direct Responsibilities: For a predefined application scope, take care of: design; implementation (coding/parametrization, unit test, assembly test, integration test, system test, and support during functional/acceptance test); roll-out support; documentation; and continuous improvement. Ensure that SLA targets are met for the above activities. Hand over to Italian teams if the knowledge and skills are not available in ISPL. Coordinate closely with the Data Platform Teams and all other BNL BNP Paribas IT teams (incident coordination, security, infrastructure, development teams, etc.). Collaborate with and support the Data Platform Teams on incident management, request management, and change management.
Contributing Responsibilities: Contribute to knowledge transfer with the BNL Data Platform team. Help build team spirit and integrate into the BNL BNP Paribas culture. Contribute to incident analysis and associated problem management. Contribute to the ISPL team's acquisition of new skills and knowledge to expand its scope.
Technical & Behavioral Competencies
Fundamental skills: IBM DataStage; SQL; experience with data modeling and the ERwin tool. Knowledge of at least one of the following database technologies is required: Teradata, Oracle, or SQL Server. Basic knowledge of mainframe usage: TSO, ISPF/S, the IWS scheduler, JCL.
Nice to have: knowledge of MS SSIS; experience with the ServiceNow ticketing system; knowledge of requirements collection, analysis, design, development, and test activities; continuous-improvement approaches; knowledge of Python; knowledge of and experience with RedHat Linux, Windows, AIX, WAS, and CFT.
Posted 2 months ago
5 - 10 years
18 - 24 Lacs
Chennai
Work from Office
Are you ready to make an impact at DTCC? Do you want to work on innovative projects, collaborate with a dynamic and supportive team, and receive investment in your professional development? At DTCC, we are at the forefront of innovation in the financial markets. We're committed to helping our employees grow and succeed. We believe that you have the skills and drive to make a real impact. We foster a thriving internal community and are committed to creating a workplace that looks like the world that we serve.
Pay and Benefits: Competitive compensation, including base pay and annual incentive. Comprehensive health and life insurance and well-being benefits, based on location. Pension/retirement benefits. Paid time off, personal/family care, and other leaves of absence when needed to support your physical, financial, and emotional well-being. DTCC offers a flexible/hybrid model of 3 days onsite and 2 days remote (onsite Tuesdays, Wednesdays, and a third day unique to each team or employee).
The impact you will have in this role: The Data Architect (DA) at DTCC will consult with information technology, data analytics, and business/product creation and development staff to craft and implement the blueprint of a database to store information for a specific system that will support the work of the enterprise. The DA will develop a detailed knowledge of the underlying data as well as business events within the enterprise, and become the subject matter expert (SME) on content, current and potential future uses of data, and the quality and interrelationship between core elements of the data repository and data products. The DA should understand how the information will be gathered, retained, and exploited. The role involves reviewing business requirements and/or conducting in-person interviews, profiling source-system data, documentation, analysis, and design of proposed systems. This role encompasses all aspects of data management: conceptual, logical, and physical data model creation. The individual will be involved in end-to-end data lifecycle management activities, evaluating and recommending new and emerging data management and storage technologies and standards.
Your Primary Responsibilities: Reviewing Business Requirements Documents (BRDs) and/or user stories (Agile methodology) and translating them into viable conceptual and, subsequently, logical data models. Translating the business requirements into design artifacts, employing the core design principles. Profiling the data in the source system(s) by writing SQL statements. Designing logical data models that conform to existing standards and conventions. Developing mapping documents that demonstrate data lineage. Revising data dictionaries, governance practices, standards, guidance, and architecture patterns. Converting logical data models into working physical data models, and documenting and maintaining lineage between the two artifacts. Developing and maintaining reference architecture artifacts. Conducting weekly walkthroughs of the data models with different collaborators. NOTE: The primary responsibilities of this role are not limited to the details above.
Talents needed for success: 7-15 years of experience with information technology programs and services, with demonstrated expertise in enterprise data management and related technologies. 5-15 years of data architecture experience. Thorough knowledge of Erwin Data Modeler (V9 or higher) or a comparable CASE tool. Working knowledge of data warehousing concepts and the ability to convert a physical design into a dimensional design. Knowledge of Agile development methodology and concepts. Experience working as a member of distributed data architecture teams. Understanding of information systems and data lifecycle management best practices and methodologies, the formal systems engineering life cycle (SELC), and the systems development life cycle (SDLC). Hands-on SQL experience. Good understanding of cloud technology and some of the tools used in that space. Experience with two or more of the following DBMS technologies: Oracle, PostgreSQL, Snowflake, Microsoft SQL Server. Effective communication skills and customer relationship management skills. Thorough knowledge of the Microsoft Office suite of products. Experience with ETL tools (Informatica, DataStage, Talend, etc.). Experience with Business Intelligence (BI) tools (e.g., OBIEE, MicroStrategy, Tableau, Power BI, QuickSight). Competency in analytical and problem-solving skills. Good communication skills.
Actual salary is determined based on the role, location, individual experience, skills, and other considerations. We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
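Source-system profiling of the kind mentioned in the responsibilities above usually starts with simple SQL aggregates: row counts, null counts, and distinct counts per column. Below is a minimal, hypothetical sketch of that idea using Python's built-in sqlite3; a real profiling pass would run the same style of query against the actual source DBMS (Oracle, PostgreSQL, Snowflake, SQL Server, etc.), and the table and column names here are invented for illustration.

```python
import sqlite3

# Hypothetical source table; real profiling targets the actual source system.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE customers (id INTEGER, email TEXT, country TEXT)")
con.executemany("INSERT INTO customers VALUES (?, ?, ?)", [
    (1, "a@x.com", "IN"),
    (2, None,      "IN"),
    (3, "c@x.com", "US"),
])

# Basic profile: total rows, null e-mails, and distinct countries.
profile = con.execute("""
    SELECT COUNT(*)                AS row_count,
           SUM(email IS NULL)      AS null_emails,
           COUNT(DISTINCT country) AS distinct_countries
    FROM customers
""").fetchone()

print(profile)  # (row_count, null_emails, distinct_countries)
```

The same three aggregates, run per table and per column, give a quick picture of data quality before any model design begins.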
Posted 2 months ago
6 - 10 years
11 - 15 Lacs
Jaipur
Work from Office
As a Senior Consultant, Data Engineer at Hakkoda, you are more than just a builder: you are a trusted advisor, leading technical teams in designing and developing cutting-edge cloud data solutions, including Snowflake. You will partner with clients to architect and optimize data pipelines, ensuring scalable, secure, and high-performing data environments. Your expertise in data migration, governance, and architecture will drive meaningful transformation for data-driven organizations. In this role, you'll thrive in a collaborative and fast-paced environment, where curiosity, innovation, and leadership are valued.
What We Are Looking For: We are currently hiring for the position of Sr. Consultant, Data Engineer to join our expanding team of experts. In this role, you will be instrumental in designing and developing solutions within the Snowflake Data Cloud environment. Responsibilities encompass data ingestion pipelines, data architecture, data governance, and security. The ideal candidate thrives on optimizing data systems and enjoys the challenge of building them from the ground up.
Qualifications: Location: Jaipur, Rajasthan (work from office). Looking for candidates who can join within a month. Bachelor's degree in engineering, computer science, or an equivalent field. 6-10 years of experience in related technical roles, encompassing data management, database development, ETL, data warehouses, and pipelines. At least 3 years of experience within the Snowflake Data Cloud environment. Proven experience in designing and developing data warehouses using platforms such as Teradata, Oracle Exadata, Netezza, SQL Server, and Spark. Proficiency in building ETL/ELT ingestion pipelines with tools like DataStage, Informatica, and Matillion. Strong SQL scripting skills. Cloud experience, particularly on AWS (experience with Azure and GCP is advantageous). Proficient in Python scripting; Scala expertise is required.
Ability to prepare comprehensive reports and present them to internal and customer stakeholders. Demonstrated problem-solving skills and an action-oriented mindset. Strong interpersonal skills, including assertiveness and the ability to build robust client relationships. Comfortable working in Agile teams. Previous experience in hiring, developing, and managing a technical team. Advanced proficiency in English.
Posted 2 months ago
DataStage is a popular ETL (Extract, Transform, Load) tool used by organizations to extract data from different sources, transform it, and load it into a target data warehouse. The demand for DataStage professionals in India has been rising due to companies' increasing reliance on data-driven decision-making across industries.
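The extract-transform-load flow described above can be sketched in a few lines of code. The example below is a minimal, hypothetical illustration using Python's built-in sqlite3 as both source and target; a real DataStage job would run against enterprise sources and apply its transforms through graphical stages rather than Python.

```python
import sqlite3

# Hypothetical source and target; illustrative only.
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")

source.execute("CREATE TABLE orders (id INTEGER, amount REAL, region TEXT)")
source.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                   [(1, 120.0, "south"), (2, 80.0, "north"), (3, 200.0, "south")])

# Extract: pull raw rows from the source system.
rows = source.execute("SELECT id, amount, region FROM orders").fetchall()

# Transform: normalize region names and filter out small orders.
transformed = [(oid, amt, region.upper()) for oid, amt, region in rows if amt >= 100]

# Load: write the cleaned rows into the warehouse table.
target.execute("CREATE TABLE fact_orders (id INTEGER, amount REAL, region TEXT)")
target.executemany("INSERT INTO fact_orders VALUES (?, ?, ?)", transformed)

loaded = target.execute("SELECT COUNT(*) FROM fact_orders").fetchone()[0]
print(loaded)  # number of rows that survived the transform
```

ETL tools like DataStage package exactly these three phases (plus scheduling, parallelism, and error handling) into reusable, visually designed jobs.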
Major Indian tech hubs are known for their vibrant tech industries and have a high demand for DataStage professionals.
The average salary range for DataStage professionals in India varies by experience level. Entry-level positions can expect to earn around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 12-15 lakhs per annum.
In the DataStage field, a typical career progression may look like:
- Junior Developer
- ETL Developer
- Senior Developer
- Tech Lead
- Architect
As professionals gain experience and expertise in DataStage, they can move up the ladder to more senior and leadership roles.
In addition to proficiency in DataStage, employers often look for candidates with the following skills:
- SQL
- Data warehousing concepts
- ETL tools like Informatica and Talend
- Data modeling
- Scripting languages like Python or shell scripting
Having a diverse skill set can make a candidate more competitive in the job market.
As you explore job opportunities in Datastage in India, remember to showcase your skills and knowledge confidently during interviews. By preparing well and demonstrating your expertise, you can land a rewarding career in this growing field. Good luck with your job search!