4 - 6 years
6 - 11 Lacs
Pune
Work from Office
About The Role: Primary Skills - Teradata DBA (70%); SQL (20%); Scripting/Python (10%). Secondary Skills - DataStage; GitLab/GitHub; Agile Methodologies. Responsibilities: DBA tasks (configuration and DDL operations); incident handling that requires DBA follow-up; non-automated access requests; monitoring; workload tuning (TASM), SQL optimization and system improvement plans; follow-up of DNB-specific workload/reporting events; modernization from VantageCloud Enterprise to VantageCloud Lake; leading 24/7 on-call. Mandatory to work 5 days from the ODC. Primary Skills: Teradata DBA
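To make the workload-tuning and monitoring duties above concrete, the sketch below pulls recent high-CPU queries out of Teradata's DBQL query log from Python. It is illustrative only: it assumes the open-source teradatasql driver, that DBQL logging to the DBC.QryLogV view is enabled on the system, and it uses a hypothetical host and account rather than anything specific to this engagement.

```python
# Illustrative sketch only: list yesterday's top CPU-consuming queries from
# Teradata's DBQL log. Assumes the open-source `teradatasql` driver, DBQL
# logging enabled to DBC.QryLogV, and a hypothetical host/account.
import teradatasql

SQL = """
SELECT TOP 10 UserName, AMPCPUTime, TotalIOCount, QueryText
FROM DBC.QryLogV
WHERE StartTime > CURRENT_TIMESTAMP - INTERVAL '1' DAY
ORDER BY AMPCPUTime DESC
"""

with teradatasql.connect(host="td.example.com", user="dba_user",
                         password="***") as con:
    with con.cursor() as cur:
        cur.execute(SQL)
        for user, cpu, io, text in cur.fetchall():
            print(user, cpu, io, str(text)[:80])
```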
Posted 2 months ago
3 - 8 years
5 - 10 Lacs
Gurgaon
Work from Office
Project Role: Technology Architect. Project Role Description: Design and deliver technology architecture for a platform, product, or engagement. Define solutions to meet performance, capability, and scalability needs. Must have skills: IBM Db2. Good to have skills: Database Architecture. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years full time education. Application Tech Support Practitioner Project Role Description: Act as the ongoing interface between the client and the system or application. Dedicated to quality, using exceptional communication skills to keep our world-class systems running. Can accurately define a client issue and can interpret and design a resolution based on deep product knowledge. Must have skills: IBM Db2. Job Requirements - Job Title: DB2 for Mainframe Database Administrator. Job Summary: We are seeking a skilled DB2 for Mainframe Database Administrator to manage and maintain our enterprise-level DB2 databases on z/OS. The ideal candidate will be responsible for ensuring the performance, availability, and security of DB2 databases while also planning and implementing database solutions for the future. Key Responsibilities: Database Management: Administer and maintain DB2 databases on IBM z/OS mainframe environments, including installation, configuration, upgrades, and patching. Performance Tuning: Monitor and optimize the performance of DB2 databases, identifying bottlenecks and implementing performance tuning strategies to ensure high availability and responsiveness. Backup & Recovery: Develop, implement, and manage backup and recovery procedures to safeguard critical data. Perform regular backups and ensure that recovery procedures are robust and well-documented. Security Management: Implement and manage database security, including user access controls, encryption, and auditing, ensuring compliance with industry standards and company policies. Problem Resolution: Troubleshoot and resolve issues related to DB2 database operations, including providing support for complex technical problems and developing root cause analysis. Data Integrity & Availability: Ensure data integrity and availability by managing replication, data migration, and data archiving processes. Documentation: Maintain comprehensive documentation of the database environment, including configurations, processes, and procedures. Collaboration: Work closely with application developers, system administrators, and other stakeholders to design and implement database solutions that meet business requirements. Disaster Recovery: Participate in the development and testing of disaster recovery plans to ensure data continuity in the event of a disaster. Compliance & Auditing: Ensure that database systems comply with regulatory requirements and participate in regular audits.
Qualifications: Experience: Minimum of 5 years of experience as a DB2 Database Administrator in a mainframe environment. Technical Proficiency: In-depth knowledge of DB2 for z/OS, SQL, SPUFI, IBM Data Studio, CA Tools for DB2 (RC Query, Migrator, Log Analyzer, etc.), BMC Utilities, CA Utilities, IBM Utilities, File Manager for DB2, InfoSphere CDC. Strong knowledge of mainframe technologies and the z/OS operating system. Mainframe Technologies: z/OS, JCL, REXX, CLIST, ESP scheduler. Performance Monitoring Tools: IBM OMEGAMON, CA Thread Terminator, IBM Query Monitor, IBM Tivoli Enterprise Portal, Dynatrace (other monitoring software is owned but not implemented at the moment, e.g. Detector/Subsystem Analyzer). Version Control Systems: ChangeMan. Security Tools: RACF. Operating Systems: z/OS. DB2 System Version: V12M510 & V13M100. Scripting Languages: REXX, JCL, CLIST. Experience with database performance tuning and optimization. Familiarity with backup and recovery tools and processes. Understanding of database security best practices. Experience with disaster recovery planning and implementation. Soft Skills: Strong analytical and problem-solving abilities. Excellent communication and collaboration skills. Ability to work independently and as part of a team. Attention to detail and strong organizational skills. Preferred Qualifications: IBM Certified Database Administrator - DB2 for z/OS. Experience with data replication tools (e.g., Q Replication, InfoSphere). Knowledge of automation tools and scripting languages (e.g., REXX, JCL). Additional Information: The candidate should have a minimum of 3 years of experience as a DB2 for Mainframe Database Administrator. This position is based at our Pune office. 15 years of full-time education is required.
Posted 2 months ago
7 - 12 years
35 - 90 Lacs
Pune, Bengaluru, Hyderabad
Work from Office
Data analysts collect and store data related to sales, market research, logistics, linguistics, or other behaviors. They bring their technical expertise to ensure the quality and accuracy of that data, then process and shape it for analysis.
Posted 2 months ago
6 - 11 years
18 - 33 Lacs
Chennai, Pune, Bengaluru
Hybrid
Data analysts collect and store data related to sales, market research, logistics, linguistics, or other behaviors. They bring their technical expertise to ensure the quality and accuracy of that data, then process and shape it for analysis.
Posted 2 months ago
5 - 10 years
50 - 75 Lacs
Bengaluru
Work from Office
What You'll Do: Our ideal candidate is an energetic, self-motivated individual focused on solving customer problems, and a responsive team player who can proactively contribute to building technical strategies for applications and systems by promoting an understanding of the technology and business roadmap. When you add the numbers, you will realize that this role will probably help influence the largest pool of customers anywhere in the world. If that gives you goosebumps, then you are the kind of candidate we want. You will be responsible for development of merchant- and associate-facing applications that are used to provide the right promotion experience for members. You will work in our own internal OpenStack cloud, Azure, and GCP, using several technologies including but not limited to Java, Spring, Kafka, Cosmos DB, REST APIs, GraphQL, etc. Help define the technical roadmap and work very closely with the Engineering Manager to own the entire product delivery end to end. Work very closely with different product and business stakeholders at various locations in the US and India to drive the execution of multiple business plans and technologies. Support business objectives by collaborating with business partners to define priorities, identify opportunities and drive resolutions. Improve, optimize and identify opportunities for efficient software development processes. Help to hire, develop and retain a strong team of software engineers. Exhibit strong leadership and communication skills to collaborate with product, engineering and management teams across different geographic locations. Promote and support company policies, procedures, mission, values, and standards of ethics and integrity. Lead and direct large-scale, complex, cross-functional projects by reviewing project requirements, translating requirements into technical solutions, and directing and reviewing the solutions. Design artifacts (for example, proof of concepts, prototypes); write and develop code; oversee software design; review unit test cases; communicate status and issues to the engineering manager, team members and stakeholders; enhance designs to prevent recurrence of defects; ensure on-time delivery and hand-offs; interact with the engineering manager to provide input on the project plan. What You'll Bring: Deep expertise in architecture and design patterns, cloud-native microservices development, and developing resilient, reliable, quality software. Problem solving, building and leading high-performing teams, and thought leadership. Technical mindset and product-development-driven experience, with the ability to deep dive on technical issues and provide guidance to the team. Passion for quick learning and adapting in a fast-paced global environment. Strong sense of ownership and focus on quality, responsiveness, efficiency and innovation. Great communication skills and the ability to work with other teams and people. Good stakeholder management skills. A proven track record of working with distributed teams in a collaborative and productive manner. Retail / e-commerce background is an advantage. Mandatory Skills: An engineering degree - B.E/B.Tech/MS in any stream - Computer Science preferred. 10+ years of relevant experience, with a minimum of 5 years of experience with agile principles and practices. Experience of working in Scrum teams. Hands-on experience in building scalable cloud-native software platforms/applications. Very strong understanding and experience of the software development lifecycle.
Strong understanding of CS fundamentals, data structures, algorithms and problem solving. Azure or Google Cloud certification (developer/architect certification) preferred. Mandatory technology skills: Proficiency in Java 17, Spring Boot, RxJava, Python. Experience in working on NoSQL databases like Cassandra, Azure Cosmos DB or MongoDB. Experience in working with SQL databases like Azure SQL, PostgreSQL, MariaDB, MySQL. Experience in working with Microsoft Azure and Google Cloud Platform. Caching frameworks - Memcached, Redis. Data pipeline orchestrators - Azure Data Factory, Apache Airflow. Data warehouse tools - Google BigQuery, Azure Synapse Analytics (formerly Azure SQL Data Warehouse). Streaming frameworks - Apache Spark, Kafka. Application performance monitoring - Splunk/AppDynamics/New Relic. Observability - Grafana, Dynatrace. Good to have: Experience in GraphQL, DB2, IBM DataStage, SAP Finance System. Minimum Qualifications: Option 1: Bachelor's degree in computer science, computer engineering, computer information systems, software engineering, or related area and 4 years' experience in software engineering or related area. Option 2: 6 years' experience in software engineering or related area. Preferred Qualifications: Master's degree in Computer Science, Computer Engineering, Computer Information Systems, Software Engineering, or related area and 2 years' experience in software engineering or related area.
Posted 2 months ago
8 - 10 years
12 - 20 Lacs
Chennai, Bengaluru
Work from Office
7+ years of DataStage and Oracle PL/SQL troubleshooting experience. 8+ years of experience in DataStage 9.x or above. Oracle PL/SQL features such as built-in functions, analytical functions, cursors and variables; Unix shell scripting.
Posted 2 months ago
8 - 13 years
30 - 35 Lacs
Bengaluru
Work from Office
Work closely with the Technology Product Owners, Lead Engineers, and Engineer Chapter Lead to understand business requirements and develop technology solutions. Actively engage in the whole delivery lifecycle, from inception, design, development, testing and deployment through to operations, monitoring and continuous improvement of systems and services. Consistently take responsibility for high-quality, high-output work with strong attention to detail. Provide ongoing support for platforms as required as part of continuous improvement initiatives, e.g. problem and incident management. Contribute to engineering communities and provide ongoing support of platforms as required. Collaborate with other technology and business stakeholders and teams to ensure successful work request/project delivery. Take ownership of the creation, revision, and adoption of code management practices, including source code management, peer reviews and CI/CD automation, to ensure the quality, security and consistency of delivered solutions. What will you bring? To grow and be successful in this role, you will ideally bring the following: At least 8 years of relevant work experience in application delivery on data warehouses in the banking services domain. Deep experience with at least one data warehousing platform such as Teradata or the Microsoft technology stack (SQL Server, SQL Server Integration Services and SQL Server Reporting Services). Deep knowledge of data warehousing concepts and their use in Teradata/SQL Server. Strong experience in ETL practices (DataStage/Informatica/Ab Initio). Ability to develop and maintain complex SQL queries for data analysis and reporting. Extensive hands-on experience in data analysis, profiling and technical solution design in line with data warehousing principles. Good understanding of SDLC and Agile methodologies. Solid understanding of software engineering principles, patterns, and practices. Good experience with DevOps practices, including CI/CD pipelines and observability tools.
Posted 2 months ago
1 - 3 years
9 - 13 Lacs
Bengaluru
Work from Office
Career Area: Technology, Digital and Data. Job Description: Your Work Shapes the World at Caterpillar Inc. When you join Caterpillar, you're joining a global team who cares not just about the work we do - but also about each other. We are the makers, problem solvers, and future world builders who are creating stronger, more sustainable communities. We don't just talk about progress and innovation here - we make it happen, with our customers, where we work and live. Together, we are building a better world, so we can all enjoy living in it. Job Summary: We are seeking a skilled IT Analyst - Applications (Junior Data Engineer) to join our Financial Systems Tech Support - CAT IT Team. The incumbent will have a good foundation in data engineering principles, a keen analytical mindset, and the ability to work collaboratively in a fast-paced environment. As a Junior Data Engineer, you will play a critical role in designing, developing, and maintaining data pipelines and infrastructure to support our organization's data needs. The preference for this role is to be based out of Bangalore, PSN Office. What you will do: Job responsibilities may include, but are not limited to: Data Pipeline Development: Develop and maintain scalable and reliable data pipelines to ingest, process, and store large volumes of data from various sources. Implement data workflows using ETL (Extract, Transform, Load) processes to ensure data quality and integrity. Data Integration and Management: Collaborate with data architects, product owners, business analysts, and other stakeholders to understand data requirements and deliver data solutions that meet their needs. Integrate data from multiple sources, including databases, APIs, and external data providers, into a unified data platform. Manage and optimize data storage solutions, ensuring efficient data retrieval and processing. Data Quality and Governance: Implement data quality checks and validation processes to ensure the accuracy and consistency of data. Adhere to data governance policies and best practices to maintain data security and compliance. Performance Optimization: Monitor and optimize data pipeline performance to ensure timely and efficient data processing. Identify and resolve performance bottlenecks and system issues to maintain data pipeline reliability. Documentation and Collaboration: Document data engineering processes, workflows, and solutions for future reference and knowledge sharing. Collaborate with cross-functional teams to support data-driven decision-making and project initiatives. What you will have: Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field. 1-3 years of experience in data engineering, data analysis, or software development. Strong experience with advanced SQL on any database (Oracle, Teradata, Snowflake). Experience in the data integration area and hands-on experience with any of the ETL tools like DataStage, Informatica, SnapLogic, etc. Able to transform technical requirements into data collection queries.
Should be capable of working with business and other IT teams and converting the requirements into queries. Good understanding of ETL architecture and design. Good knowledge of UNIX commands, databases, SQL, PL/SQL. Experience with AWS Glue is good to have. Knowledge of the Qlik replication tool is good to have. Soft Skills: Good analytical and problem-solving skills with attention to detail. Excellent communication and interpersonal skills to collaborate effectively with cross-functional teams. Ability to manage multiple tasks and prioritize workload in a fast-paced environment. Eagerness to learn new technologies and stay updated with industry trends. Skills desired: Technical Analysis: Knowledge of the concepts, methodologies, and activities in technical analysis; ability to forecast the direction of the technical landscape using technical indicators and take collective actions. Level - Basic Understanding: Lists commonly used technical analysis tools and technologies. Documents key technical analysis procedures and policies. Describes basic concepts and major activities associated with technical analysis. Identifies the sources of information to extract the required data and technical indicators. IT Environment: Knowledge of an organization's IT purposes and activities; ability to create an effective IT environment for business operations. Level - Basic Understanding: Names key IT departments, functions, and players. Gathers information on key IT initiatives and projects. Describes key activities performed by IT professionals and managers. Identifies major roles and responsibilities of the IT function. Requirements Analysis: Knowledge of tools, methods, and techniques of requirements analysis; ability to elicit, analyze and record required business functional and non-functional requirements to ensure the success of a system or software development project. Level - Basic Understanding: Cites examples of functional and non-functional requirements. Describes basic concepts and major activities associated with requirements analysis. Explains the life cycle context and scope of requirements analysis. Explains the structure and components of effective requirements analysis documents. What you will get: Work-Life Harmony: Earned and medical leave. Flexible work arrangements. Relocation assistance. Holistic Development: Personal and professional development through Caterpillar's employee resource groups across the globe. Career development opportunities with global prospects. Health and Wellness: Medical coverage - medical, life and personal accident coverage. Employee mental wellness assistance program. Financial Wellness: Employee investment plan. Pay for performance - annual incentive bonus plan. Additional Information: Caterpillar is not currently hiring individuals for this position who now or in the future require sponsorship for employment visa status; however, as a global company, Caterpillar offers many job opportunities outside of the U.S., which can be found through our employment website at www.caterpillar.com/careers. Caterpillar is an Equal Opportunity Employer (EEO/AA Employer). All qualified individuals, including minorities, females, veterans and individuals with disabilities, are encouraged to apply. Not ready to apply? Join our Talent Community.
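As an illustration of the data quality checks mentioned above, the sketch below validates a freshly ingested extract before it is published downstream. It is a generic example, not Caterpillar's actual pipeline; the file name, columns and rules are hypothetical.

```python
# A minimal sketch (not the employer's actual pipeline) of data quality checks
# run against a freshly ingested extract. File and column names are hypothetical.
import pandas as pd

def validate_extract(df: pd.DataFrame) -> list:
    """Return a list of data quality problems found in the extract."""
    problems = []
    if df.empty:
        problems.append("extract is empty")
    if df["invoice_id"].isna().any():
        problems.append("null invoice_id values")
    if df["invoice_id"].duplicated().any():
        problems.append("duplicate invoice_id values")
    if (df["amount"] < 0).any():
        problems.append("negative amounts")
    return problems

if __name__ == "__main__":
    extract = pd.read_csv("daily_invoices.csv")   # hypothetical source file
    issues = validate_extract(extract)
    if issues:
        raise SystemExit("Data quality check failed: " + "; ".join(issues))
    print(f"{len(extract)} rows passed validation")
```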
Posted 2 months ago
1 - 3 years
3 - 5 Lacs
Bengaluru
Work from Office
Career Area: Technology, Digital and Data. Job Description: Your Work Shapes the World at Caterpillar Inc. When you join Caterpillar, you're joining a global team who cares not just about the work we do - but also about each other. We are the makers, problem solvers, and future world builders who are creating stronger, more sustainable communities. We don't just talk about progress and innovation here - we make it happen, with our customers, where we work and live. Together, we are building a better world, so we can all enjoy living in it. Job Summary: We are seeking a skilled IT Analyst - Applications (Junior Data Engineer) to join our Financial Systems Tech Support - CAT IT Team. The incumbent will have a good foundation in data engineering principles, a keen analytical mindset, and the ability to work collaboratively in a fast-paced environment. As a Junior Data Engineer, you will play a critical role in designing, developing, and maintaining data pipelines and infrastructure to support our organization's data needs. The preference for this role is to be based out of Bangalore, PSN Office. What you will do: Job responsibilities may include, but are not limited to: Data Pipeline Development: Develop and maintain scalable and reliable data pipelines to ingest, process, and store large volumes of data from various sources. Implement data workflows using ETL (Extract, Transform, Load) processes to ensure data quality and integrity. Data Integration and Management: Collaborate with data architects, product owners, business analysts, and other stakeholders to understand data requirements and deliver data solutions that meet their needs. Integrate data from multiple sources, including databases, APIs, and external data providers, into a unified data platform. Manage and optimize data storage solutions, ensuring efficient data retrieval and processing. Data Quality and Governance: Implement data quality checks and validation processes to ensure the accuracy and consistency of data. Adhere to data governance policies and best practices to maintain data security and compliance. Performance Optimization: Monitor and optimize data pipeline performance to ensure timely and efficient data processing. Identify and resolve performance bottlenecks and system issues to maintain data pipeline reliability. Documentation and Collaboration: Document data engineering processes, workflows, and solutions for future reference and knowledge sharing. Collaborate with cross-functional teams to support data-driven decision-making and project initiatives. What you will have: Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field. 1-3 years of experience in data engineering, data analysis, or software development. Strong experience with advanced SQL on any database (Oracle, Teradata, Snowflake). Experience in the data integration area and hands-on experience with any of the ETL tools like DataStage, Informatica, SnapLogic, etc. Able to transform technical requirements into data collection queries.
Should be capable of working with business and other IT teams and converting the requirements into queries. Good understanding of ETL architecture and design. Good knowledge of UNIX commands, databases, SQL, PL/SQL. Experience with AWS Glue is good to have. Knowledge of the Qlik replication tool is good to have. Soft Skills: Good analytical and problem-solving skills with attention to detail. Excellent communication and interpersonal skills to collaborate effectively with cross-functional teams. Ability to manage multiple tasks and prioritize workload in a fast-paced environment. Eagerness to learn new technologies and stay updated with industry trends. Skills desired: Technical Analysis: Knowledge of the concepts, methodologies, and activities in technical analysis; ability to forecast the direction of the technical landscape using technical indicators and take collective actions. Level - Basic Understanding: Lists commonly used technical analysis tools and technologies. Documents key technical analysis procedures and policies. Describes basic concepts and major activities associated with technical analysis. Identifies the sources of information to extract the required data and technical indicators. IT Environment: Knowledge of an organization's IT purposes and activities; ability to create an effective IT environment for business operations. Level - Basic Understanding: Names key IT departments, functions, and players. Gathers information on key IT initiatives and projects. Describes key activities performed by IT professionals and managers. Identifies major roles and responsibilities of the IT function. Requirements Analysis: Knowledge of tools, methods, and techniques of requirements analysis; ability to elicit, analyze and record required business functional and non-functional requirements to ensure the success of a system or software development project. Level - Basic Understanding: Cites examples of functional and non-functional requirements. Describes basic concepts and major activities associated with requirements analysis. Explains the life cycle context and scope of requirements analysis. Explains the structure and components of effective requirements analysis documents. What you will get: Work-Life Harmony: Earned and medical leave. Flexible work arrangements. Relocation assistance. Holistic Development: Personal and professional development through Caterpillar's employee resource groups across the globe. Career development opportunities with global prospects. Health and Wellness: Medical coverage - medical, life and personal accident coverage. Employee mental wellness assistance program. Financial Wellness: Employee investment plan. Pay for performance - annual incentive bonus plan. Additional Information: Caterpillar is not currently hiring individuals for this position who now or in the future require sponsorship for employment visa status; however, as a global company, Caterpillar offers many job opportunities outside of the U.S., which can be found through our employment website at www.caterpillar.com/careers. Caterpillar is an Equal Opportunity Employer (EEO/AA Employer). All qualified individuals, including minorities, females, veterans and individuals with disabilities, are encouraged to apply. Not ready to apply? Join our Talent Community.
Posted 2 months ago
6 - 12 years
17 - 19 Lacs
Bengaluru
Work from Office
At least 3 years of experience in ETL development in a data warehouse. Understanding of enterprise data warehousing best practices and standards. Solid experience with Python/PySpark, DataStage ETL and SQL development. Proven experience in cloud infrastructure projects with hands-on migration expertise on public clouds such as AWS and Azure, preferably Snowflake. Knowledge of cybersecurity organization practices, operations, risk management processes, principles, architectural requirements, engineering, and threats and vulnerabilities, including incident response methodologies. Good communication and interpersonal skills. Good organization skills and the ability to work independently as well as with a team. Preferred Qualifications: AWS Certified Solutions Architect - Associate, AWS Certified DevOps Engineer - Professional and/or AWS Certified Solutions Architect - Professional. Experience in the financial services banking industry.
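For context on the Python/PySpark ETL work listed above, here is a minimal PySpark sketch of a typical curation step: read a raw extract, filter and derive columns, and write a curated table. The paths, column names and storage layout are hypothetical and not tied to this engagement.

```python
# Minimal PySpark sketch of a curation step: read a raw extract, apply a
# filter and derivations, and write a curated table. Paths and column names
# are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("curate_transactions").getOrCreate()

raw = spark.read.option("header", True).csv("s3://raw-zone/transactions/")

curated = (
    raw.filter(F.col("status") == "POSTED")
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .withColumn("load_date", F.current_date())
       .dropDuplicates(["transaction_id"])
)

curated.write.mode("overwrite").parquet("s3://curated-zone/transactions/")
spark.stop()
```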
Posted 2 months ago
4 - 7 years
6 - 10 Lacs
Pune
Work from Office
Lead production support for a Finance DataStage ETL platform in a 24x7, SLA-based engagement. Review job designs and code created by other DataStage developers. Provide support for ongoing inbound/outbound data loads and issue resolution. Monitor DataStage job execution and troubleshoot issues as needed. Troubleshoot DataStage job failures and provide root cause analysis. Optimize existing DataStage jobs for performance (as necessary).
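A common way to support the monitoring duties above is to script around the DataStage engine's dsjob command-line client. The sketch below is a minimal example, assuming dsjob is on the PATH of the engine host; the project and job names are hypothetical, and the exact status strings returned depend on the engine version.

```python
# A minimal monitoring sketch: check the status of a set of DataStage jobs via
# the engine's `dsjob` command-line client. Assumes dsjob is on the PATH;
# project and job names are hypothetical, and the status strings vary by
# engine version, so the failure check below is deliberately crude.
import subprocess

def job_info(project: str, job: str) -> str:
    """Return the raw output of `dsjob -jobinfo <project> <job>`."""
    result = subprocess.run(["dsjob", "-jobinfo", project, job],
                            capture_output=True, text=True, check=False)
    return result.stdout + result.stderr

if __name__ == "__main__":
    for job in ["LoadGLBalances", "LoadAPInvoices"]:   # hypothetical job names
        info = job_info("FIN_DW", job)
        failed = any(k in info.upper() for k in ("ABORT", "FAILED", "CRASHED"))
        print(f"{job}: {'NEEDS ATTENTION' if failed else 'OK'}")
```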
Posted 2 months ago
3 - 8 years
8 - 14 Lacs
Pune
Work from Office
Must have 3-8 years of experience in Power BI. Should have experience with Azure Databricks or with any ETL tool like DataStage or Informatica. Should have business intelligence experience in a data warehouse environment. Strong development skills in Azure Databricks are a must. Proficient in writing DAX queries in Power BI Desktop. Should be strong in Databricks. Should have experience in Agile methodology (good to have). Should be strong in Agile development experience. Customer-facing experience is a must, with strong communication and good interpersonal skills.
Posted 2 months ago
2 - 7 years
6 - 10 Lacs
Pune
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: 3-5 years of experience in IICS. 2+ years of hands-on experience with Informatica Intelligent Cloud Services (IICS). Strong knowledge of SQL and experience with data modelling. Experience with the AWS cloud platform, Snowflake, DB2. Experience with the DataStage ETL tool. Preferred technical and professional experience: Excellent problem-solving skills and attention to detail. Ability to communicate results to technical and non-technical audiences.
Posted 2 months ago
3 - 8 years
7 - 11 Lacs
Pune
Work from Office
Provide expertise in analysis, requirements gathering, design, coordination, customization, testing and support of reports in the client's environment. Develop and maintain a strong working relationship with business and technical members of the team. Relentless focus on quality and continuous improvement. Perform root cause analysis of report issues. Development and evolutionary maintenance of the environment, performance, capability and availability. Assist in defining technical requirements and developing solutions. Effective content and source-code management, troubleshooting and debugging. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Cognos Developer & Admin required. Education: The resource should be full-time MCA/M.Tech/B.Tech/B.E. and should preferably have relevant certifications. Experience: The resource should have a minimum of 3 years of experience of working in BI/DW projects in areas pertaining to reporting and visualization using Cognos. The resource shall have worked in at least two projects where they were involved in developing reporting/visualization. The resource shall have a good understanding of UNIX. Should be well conversant in English and should have excellent writing, MIS, communication, time management and multi-tasking skills. Preferred technical and professional experience: Experience with various cloud and integration platforms (e.g. AWS, Google, Azure). Agile mindset - ability to process changes of priorities and requests, ownership, critical thinking. Experience with an ETL/data integration tool (e.g. IBM InfoSphere DataStage, Azure Data Factory, Informatica PowerCenter).
Posted 2 months ago
2 - 5 years
4 - 7 Lacs
Hyderabad
Work from Office
Fusion Plus Solutions Inc is looking for an ETL Testing Professional to join our dynamic team and embark on a rewarding career journey. The role is responsible for ensuring the quality and accuracy of data being extracted, transformed, and loaded into databases and data warehouses. Responsibilities include: developing and executing test cases to validate the accuracy of data being extracted, transformed, and loaded; writing SQL scripts and using ETL testing tools to validate the data; collaborating with ETL developers to understand the data flow and identify testing requirements; performing data validation, data quality checks, and data reconciliation; identifying and reporting any errors or data discrepancies to the ETL development team; developing and maintaining documentation of testing procedures and results; and participating in the development of test plans and test schedules. Excellent analytical and problem-solving skills are required.
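The reconciliation work described above often boils down to comparing aggregates between source and target. The sketch below shows one generic way to do that with plain DB-API connections; the table and column names are hypothetical, and in practice the checks would be driven by the project's source-to-target mapping document.

```python
# A minimal sketch of source-to-target reconciliation: compare row counts and
# an amount total between a source table and its warehouse copy. `src_conn`
# and `tgt_conn` can be any DB-API connections; names are hypothetical.
def scalar(conn, sql):
    cur = conn.cursor()
    cur.execute(sql)
    return cur.fetchone()[0]

def reconcile(src_conn, tgt_conn, src_table, tgt_table, amount_col):
    checks = {
        "row_count":  "SELECT COUNT(*) FROM {t}",
        "amount_sum": "SELECT SUM(" + amount_col + ") FROM {t}",
    }
    failures = []
    for name, template in checks.items():
        src_val = scalar(src_conn, template.format(t=src_table))
        tgt_val = scalar(tgt_conn, template.format(t=tgt_table))
        if src_val != tgt_val:
            failures.append(f"{name}: source={src_val} target={tgt_val}")
    return failures

# Example (hypothetical connections and names):
# failures = reconcile(oracle_conn, dw_conn, "SRC.ORDERS", "DW.ORDERS", "ORDER_AMOUNT")
```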
Posted 2 months ago
8 - 13 years
30 - 35 Lacs
Chennai
Work from Office
We are seeking a Data Engineer to join our Parts Revenue Management product line. The ideal candidate will have strong expertise in SQL and a deep understanding of GCP-native services. This role offers the opportunity to work in a fast-paced, dynamic environment, collaborating with multiple technology and business stakeholders. You will gain hands-on experience with cutting-edge technologies, including BigQuery, GCP-native ETL services, CI/CD pipelines and Cloud Scheduler tools. Requirements: Minimum 5+ years of experience in complex SQL development. At least 2+ years of cloud experience (GCP preferred) with solutions designed and implemented at production scale. Strong understanding of key GCP services, especially those related to batch and real-time data processing, leveraging Terraform, BigQuery, BQ SQL, Dataflow, Data Fusion, Dataproc, Cloud Build, Airflow and Pub/Sub. Experience in analysis, design, development and implementation using data warehousing applications, for example IBM DataStage, Dataflow or Dataproc. Experience in integrating various data sources like Oracle, Teradata, DB2, BigQuery and flat files. Experience in designing jobs using stages such as Lookup, Filter, Sort, Copy, Remove Duplicates, Join, Funnel and Aggregator. Experience in debugging ETL jobs to check for the errors and warnings associated with each job run. Hands-on experience in tuning jobs, identifying and resolving performance issues in parallel jobs. Involvement in review meetings and coordination with the team on job design and fine-tuning job performance. Good experience in AutoSys or Astronomer, AccuRev and GitHub. A mainframe background would be preferred and an added advantage for this role.
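As a small illustration of the BigQuery side of this role, the sketch below runs a batch aggregation with the google-cloud-bigquery client and materializes the result into a reporting table. The project, dataset and table names are hypothetical, and application default credentials are assumed.

```python
# Minimal sketch: run a batch aggregation in BigQuery and materialize the
# result into a reporting table. Project, dataset and table names are
# hypothetical; application default credentials are assumed.
from google.cloud import bigquery

client = bigquery.Client(project="parts-revenue-dev")

sql = """
CREATE OR REPLACE TABLE reporting.daily_parts_revenue AS
SELECT order_date, part_number, SUM(net_amount) AS revenue
FROM   `parts-revenue-dev.staging.part_orders`
GROUP  BY order_date, part_number
"""

job = client.query(sql)   # starts the query job
job.result()              # wait for completion
print(f"Job {job.job_id} finished; {job.total_bytes_processed} bytes processed")
```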
Posted 2 months ago
4 - 8 years
4 - 8 Lacs
Karnataka
Work from Office
Description: ETL POD Lead with ETL testing experience, 5 to 7 years. Additional Details: Global Grade: C. Named Job Posting: No. Remote work possibility: No. Local Skills: Test Manual Lead (Functional). Languages Required: English.
Posted 2 months ago
4 - 8 years
7 - 11 Lacs
Uttar Pradesh
Work from Office
Experience with Power BI, data visualization tools and human-centred design. Dashboard creation in Power BI. 5+ years of experience in reporting and analytics in a corporate environment; preference for People Analytics, but not essential. Experience with some of the following tools: Workday, SAP HR, SuccessFactors, Excel. Exposure to data transformation tools such as Alteryx or Infogix Data3Sixty preferred, but not essential. Knowledge of data principles and methods to manage data sets, including SQL skills. Working as part of multidisciplinary squads to scope, design, develop and deliver a range of people reporting products. Creating and maintaining people-related management information (reports and dashboards) for the P&C Domain and leaders across the enterprise. Providing data and analysis support for key people initiatives and activities. Sourcing and transforming people data and creating integrated employee datasets. Communicating insights in simple and engaging ways. Responding to ad hoc questions and queries as required. Advising on and enabling the ability to self-serve data sourcing and analysis. Taking accountability for ensuring data integrity and accuracy.
Posted 2 months ago
3 - 6 years
4 - 8 Lacs
Chennai
Work from Office
Ab Initio Developer. Experience: 4+ years. Compensation: up to 15 LPA. Locations: Trivandrum, Chennai. Skills: Ab Initio, Java, GDE, Unix. Notice period: immediate to 30 days.
Posted 2 months ago
5 - 8 years
4 - 8 Lacs
Bengaluru
Work from Office
Job Title: DataStage Admin. Responsibilities: A day in the life of an Infoscion - As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs in solution design based on areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots and assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates which suit the customer's budgetary requirements and are in line with the organization's financial guidelines. Actively lead small projects and contribute to unit-level and organizational initiatives with an objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! Technical and Professional Requirements: Primary skills: Technology -> Data Management - Data Integration Administration -> DataStage Administration. Preferred Skills: Technology -> Data Management - Data Integration Administration -> DataStage Administration. Additional Responsibilities: Ability to develop value-creating strategies and models that enable clients to innovate, drive growth and increase their business profitability. Good knowledge of software configuration management systems. Awareness of latest technologies and industry trends. Logical thinking and problem-solving skills along with an ability to collaborate. Understanding of the financial processes for various types of projects and the various pricing models available. Ability to assess current processes, identify improvement areas and suggest technology solutions. One or two industry domains' knowledge. Client interfacing skills. Project and team management. Educational Requirements: Bachelor of Engineering. Service Line: Data & Analytics Unit. *Location of posting is subject to business requirements.
Posted 2 months ago
1 - 5 years
3 - 7 Lacs
Bengaluru
Work from Office
taff is looking for a Data Engineer III (ETL DataStage) in Flushing to join our dynamic team and embark on a rewarding career journey. We are seeking a skilled ETL DataStage developer to join our dynamic team. The ideal candidate should have a strong background in designing, developing, and implementing ETL processes using IBM DataStage. The candidate will be responsible for extracting data from various sources, transforming it into a usable format, and loading it into a target data warehouse. Attention to detail, strong analytical skills, and the ability to collaborate with cross-functional teams are essential for success in this role. Responsibilities: Requirements Analysis: Collaborate with business analysts and stakeholders to understand data integration requirements. Analyze source system data and design ETL processes to meet business needs. Data Extraction: Extract data from heterogeneous sources, including databases, flat files, APIs, and other structured/unstructured formats. Develop and maintain source-to-target mapping documentation. Transformation: Design and implement data transformations using IBM DataStage to cleanse, filter, aggregate, and enrich data as required. Ensure data quality and consistency throughout the ETL process. Loading: Load transformed data into target data warehouses or data marts. Optimize loading processes for performance and efficiency. Data Integration: Collaborate with database administrators, system architects, and other stakeholders to ensure seamless data integration across systems. Implement and maintain data integration best practices. Performance Tuning: Identify and resolve performance bottlenecks in ETL processes. Optimize SQL queries and ETL jobs for improved efficiency. Documentation: Document ETL processes, data flows, and data models. Maintain documentation for version control and knowledge sharing. Testing: Develop and execute test plans for ETL processes to ensure data accuracy and reliability. Debug and troubleshoot issues in collaboration with other teams. Monitoring and Maintenance: Monitor ETL jobs for failures and proactively address issues. Perform routine maintenance tasks, including job scheduling and data purging.
Posted 2 months ago
6 - 10 years
30 - 35 Lacs
Pune
Remote
Design, develop and maintain ETL pipelines. Work with Azure Data Factory (ADF) for data integration. Utilize DataStage for data warehousing and relational database design. Required candidate profile: Strong data warehouse knowledge and experience, including data modeling, data ingestion, transformation, and data consumption patterns.
Posted 2 months ago
6 - 8 years
18 - 20 Lacs
Chennai, Pune, Mumbai
Work from Office
Experience: More than 6 years. Job Location: Mumbai, Pune, Chennai. Job Description: Translate business requirements into technical specifications. Customise and deliver the data model based on customer requirements. Map existing web services and build new web services based on customer requirements. Integrate web services with other business applications. Core skills and experience: More than 6 years of experience in the Information Management space, ideally from a consulting/delivery background. Hands-on technical skills in IBM MDM Advanced Edition / Collaborative Edition v11 or higher. Must have experience of several full life cycle implementations. Must have data modelling experience. Good SQL knowledge in DB2/Oracle. Java, J2EE, SOAP/REST web services, XML/XSD and other relevant tools and constructs. Other skills: Experience in QualityStage and Information Analyzer is an advantage. Prior knowledge of ETL tools such as DataStage and Informatica is an advantage. Good client-facing skills. Excellent written and oral communication skills. Should be results-oriented, self-motivated and able to work under minimal supervision. A wide degree of creativity and lateral thinking is expected.
Posted 2 months ago
15 - 17 years
11 - 15 Lacs
Pune, Mumbai
Work from Office
EXPERIENCE: 14+ years. JOB DESCRIPTION: A minimum of 15 years' experience designing and contributing to the delivery of innovative data-driven solutions. At least 2-3 end-to-end DWH / Data Lake life cycle implementations in Banking / Insurance. Exceptional communication skills to convey complex information in simple terms that can be understood by non-technical audiences. Proven experience and technical understanding of end-to-end information/data-related technologies and best practices, including data sourcing, Extract/Transform/Load (ETL), Third Normal Form (3NF) modelling, dimensional modelling and reporting, as well as a sound understanding of established data management policies and principles. Ability to interact with the technical team and guide and mentor them in the process of POCs and pilots. Experience in working across a range of cloud and on-premise technologies. Experience in working with IBM Netezza, Snowflake, Exadata, Redshift or a related DWH database. Exposure to ETL tools like IBM DataStage, Informatica or Talend is a must. Ability to provide solutions on either AWS or Azure for services related to data management and data governance. Exposure to data governance products like Collibra, Informatica, IBM Knowledge Catalog and Precisely is an added advantage. Should be familiar with data lakehouse architecture using platforms like Databricks. Strong analytical ability with a capability to pitch technical information to staff at multiple levels within the organisation (including non-technical audiences). Ability to rapidly acquire an understanding of complex business problems/requirements to develop solutions and designs, regardless of existing areas of expertise or specialisation. Ability to work at varying levels of detail, from high-level architecture to low-level aspects of component design. Experience and knowledge in BI analytics and reporting tools - Cognos, Tableau, Power BI.
Posted 2 months ago
8 - 12 years
22 - 25 Lacs
Chennai, Pune, Mumbai
Work from Office
Role: MDM Architect/Technical Lead. Experience: More than 8 years. Job Location: Mumbai, Pune, Chennai. Job Description: Scope client requirements. Deliver technical solutions architecture. Specify solutions and articulate value to customers. Design, deliver and customise the data model based on customer requirements. Design and map web services. Help identify dependencies, create the project plan and schedule project activities. Provide support and guidance to sales and technical resources. Core skills and experience: More than 8 years of experience in the Information Management space, ideally from a consulting/delivery background. Hands-on technical skills in IBM MDM Advanced Edition / Collaborative Edition. Must have experience of several full life cycle implementations. Experienced in various implementation styles (virtual, physical, hybrid). Must have data modelling experience. Must have experience in designing and architecting solutions for greenfield projects. Experience in implementing data governance best practices. Other skills: Experience in data quality implementation methodologies. Prior knowledge of ETL tools such as DataStage and Informatica is an advantage. Prior knowledge of other MDM and data quality tools such as Informatica MDM and SAS is an advantage. Good client-facing skills. Excellent written and oral communication skills. Excellent presentation skills. Should be results-oriented, self-motivated and able to work under minimal supervision. A wide degree of creativity and lateral thinking is expected.
Posted 2 months ago
DataStage is a popular ETL (Extract, Transform, Load) tool used by organizations to extract data from different sources, transform it, and load it into a target data warehouse. The demand for DataStage professionals in India has been on the rise due to the increasing reliance on data-driven decision-making by companies across various industries.
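As a toy illustration of that extract-transform-load pattern, the sketch below reads a raw operational export, cleanses and aggregates it, and loads the result into a local SQLite database standing in for a real warehouse. The file and table names are hypothetical; a DataStage job would express the same steps as stages in a parallel job rather than Python code.

```python
# A toy end-to-end illustration of the extract/transform/load pattern, using
# pandas and an in-process SQLite database as a stand-in for a warehouse.
# File and table names are hypothetical.
import sqlite3
import pandas as pd

# Extract: read a raw operational export.
orders = pd.read_csv("raw_orders.csv", parse_dates=["order_date"])

# Transform: cleanse and reshape for analysis.
orders = orders.dropna(subset=["order_id"]).drop_duplicates("order_id")
daily = (orders.groupby(orders["order_date"].dt.date)["amount"]
               .sum()
               .reset_index(name="daily_revenue"))

# Load: write the curated result into the target database.
with sqlite3.connect("warehouse.db") as conn:
    daily.to_sql("daily_revenue", conn, if_exists="replace", index=False)
```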
These cities are known for their vibrant tech industries and have a high demand for datastage professionals.
The average salary range for DataStage professionals in India varies based on experience levels. Entry-level positions can expect to earn around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 12-15 lakhs per annum.
In the DataStage field, a typical career progression may look like: - Junior Developer - ETL Developer - Senior Developer - Tech Lead - Architect
As professionals gain experience and expertise in DataStage, they can move up the ladder to more senior and leadership roles.
In addition to proficiency in DataStage, employers often look for candidates with the following skills: - SQL - Data warehousing concepts - ETL tools like Informatica, Talend - Data modeling - Scripting languages like Python or Shell scripting
Having a diverse skill set can make a candidate more competitive in the job market.
As you explore job opportunities in DataStage in India, remember to showcase your skills and knowledge confidently during interviews. By preparing well and demonstrating your expertise, you can land a rewarding career in this growing field. Good luck with your job search!