5.0 - 10.0 years
0 - 1 Lacs
Hyderabad, Pune
Hybrid
Datastage Developer ETL Datastage, Datastage 5 to 9 yrs Loc: Pune & Hyderabad
Posted 2 months ago
3.0 - 8.0 years
0 - 1 Lacs
Hyderabad
Work from Office
Key Responsibilities
1. Incident Management: Monitor production systems for issues and respond promptly to incidents. Log, categorize, and prioritize incidents for resolution. Collaborate with development teams to address and resolve issues. Communicate with stakeholders regarding incident status and resolution timelines.
2. Root Cause Analysis (RCA): Conduct thorough investigations to identify the underlying causes of recurring issues. Implement long-term solutions to prevent future occurrences. Document findings and share insights with relevant teams.
3. System Monitoring & Performance Optimization: Utilize monitoring tools to track system health and performance. Identify and address performance bottlenecks or capacity issues. Ensure systems meet performance benchmarks and service level agreements (SLAs).
4. Release Management & Application Maintenance: Assist with the deployment of software updates, patches, and new releases. Ensure smooth transitions from development to production environments. Coordinate with cross-functional teams to minimize disruptions during releases.
5. User Support & Troubleshooting: Provide end-user support for technical issues. Investigate user-reported problems and offer solutions or workarounds. Maintain clear communication with users regarding issue status and resolution.
6. Documentation & Knowledge Sharing: Maintain detailed records of incidents, resolutions, and system configurations. Create and update operational runbooks, FAQs, and knowledge base articles. Share knowledge with team members to improve overall support capabilities.
Essential Tools & Technologies
- Monitoring & Alerting: Nagios, Datadog, New Relic
- Log Management & Analysis: Splunk, Elasticsearch, Graylog
- Version Control: Git, SVN
- Ticketing Systems: JIRA, ServiceNow
- Automation & Scripting: Python, shell scripting
- Database Management: SQL, Oracle, MySQL
Skills & Competencies
Technical Skills: Proficiency in system monitoring and troubleshooting. Strong understanding of application performance metrics. Experience with database management and query optimization. Familiarity with cloud platforms and infrastructure.
Soft Skills:
- Analytical Thinking: Ability to diagnose complex issues and develop effective solutions.
- Communication: Clear and concise communication with stakeholders at all levels.
- Teamwork: Collaborative approach to problem-solving and knowledge sharing.
- Adaptability: Flexibility to handle changing priorities and technologies.
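The log, categorize, and prioritize step of the incident-management workflow described above can be sketched in a few lines of Python. This is an illustrative example only, not part of the posting: the severity keywords and P1/P2/P3 labels are assumptions, and real teams would drive this from their ticketing tool (e.g. ServiceNow or JIRA) rather than keyword matching.

```python
# Minimal incident-triage sketch (illustrative; keyword lists are invented).
# Naive substring matching is used for brevity; a real triage rule set would
# be richer and live in the ticketing system.

SEVERITY_KEYWORDS = {
    "P1": ("outage", "data loss", "down"),   # service-impacting
    "P2": ("degraded", "slow", "timeout"),   # performance-impacting
}

def prioritize(description: str) -> str:
    """Assign a priority bucket from keywords in an incident description."""
    text = description.lower()
    for priority, keywords in SEVERITY_KEYWORDS.items():
        if any(k in text for k in keywords):
            return priority
    return "P3"  # default: routine incident

incidents = [
    "Nightly load job slow, SLA at risk",
    "Payments service down - full outage",
    "User cannot reset password",
]
triaged = {msg: prioritize(msg) for msg in incidents}
```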
Posted 2 months ago
7.0 - 12.0 years
5 - 15 Lacs
Pune, Chennai, Mumbai (All Areas)
Hybrid
Hexaware is hiring for DataStage Developer; the required details are below. Required Skills: DataStage, IBM DataStage, SQL, ETL. Total years of experience: 6 to 12 years. Relevant experience: minimum 4 years. Location: Pune, Mumbai, Chennai. Work Mode: Hybrid. Interested candidates, kindly share your updated CV to gopinathr6@hexaware.com along with the details below. Candidate Name: Current Company: Total Years of Experience: Relevant Experience in DataStage: Current Location: Preferred Location: Current CTC: Expected CTC: Notice Period: Regards, Gopinath R.
Posted 2 months ago
5.0 - 10.0 years
15 - 30 Lacs
Bengaluru
Work from Office
Dear Candidates, we are hiring for the position of ETL Developer; kindly find the details below.
Organization: Confidential (IT services MNC)
Job Location: Bangalore, Marathahalli
Experience Required: 5 - 10 years
Designation: ETL Developer
Mode of Job: Work From Office
Job Description for ETL Developer - BPCE ES Dataverse Team
Profile Required:
- Technical profile with 5+ years in data warehousing and BI
- Strong fundamentals of data warehousing and BI concepts
- Experience in data integration, governance and management
Skills Required (Mandatory):
- Minimum 5 years of experience in IBM DataStage 11.7 and SQL/PL-SQL
- Working knowledge of Oracle and PostgreSQL databases
- Hands-on experience in Unix shell scripting
Personal Skills:
- Good communication skills, written and verbal, with the ability to understand and interact with a diverse range of stakeholders
- Ability to raise factual alerts and risks when necessary
- Capability to work with cross-location team members/stakeholders to establish and maintain consistent delivery
Good To Have (Optional):
- Technical - Reporting: Power BI, SAP BO, Tableau
- Functional - Finance/Banking: asset finance / equipment finance / leasing
Role and Responsibilities:
- Responsible for developing data warehousing solutions (DataStage, Oracle, PostgreSQL) as per requirements.
- Must provide hands-on technical knowledge and take ownership while working with business users, project managers, technical leads, architects and testing teams.
- Provide guidance to IT management in establishing both a short-term roadmap and a long-term DW/BI strategy.
- Work closely with team members and stakeholders to ensure seamless development and delivery of assigned tasks.
- Assist the team/lead in team management, identifying training needs and inducting new starters.
- Take part in discussions on BI/DW forums alongside peers for the benefit of the organization.
Interested candidates, share your resume at anshu.baranwal@rigvedtech.com
Posted 3 months ago
3.0 - 5.0 years
5 - 7 Lacs
Bengaluru
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results
Required education: Bachelor's Degree
Preferred education: High School Diploma/GED
Required technical and professional expertise:
- Good hands-on experience in DBT is required; ETL (DataStage) and Snowflake preferred
- Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer
- Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed
- Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop and Java
Preferred technical and professional experience:
- You thrive on teamwork and have excellent verbal and written communication skills
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions
- Ability to communicate results to technical and non-technical audiences
Posted 3 months ago
3.0 - 7.0 years
0 - 2 Lacs
Chennai
Hybrid
Position Purpose
The requested position is a developer-analyst in an open environment, which requires knowledge of the mainframe, TSO, JCL and OPC environment.
Responsibilities
Direct Responsibilities
For a predefined applications scope, take care of:
- Design
- Implementation (coding/parametrization, unit test, assembly test, integration test, system test, support during functional/acceptance test)
- Roll-out support
- Documentation
- Continuous improvement
Ensure that SLA targets are met for the above activities. Hand over to Italian teams if knowledge and skills are not available in ISPL. Coordinate closely with Data Platform Teams and all other BNL BNP Paribas IT teams (incident coordination, security, infrastructure, development teams, etc.). Collaborate with and support Data Platform Teams in incident management, request management and change management.
Contributing Responsibilities
- Contribute to the knowledge transfer with the BNL Data Platform team
- Help build team spirit and integrate into the BNL BNP Paribas culture
- Contribute to incident analysis and associated problem management
- Contribute to the acquisition by the ISPL team of new skills & knowledge to expand its scope
Technical & Behavioral Competencies
Fundamental skills:
- IBM DataStage
- SQL
- Experience with data modeling and the ERwin tool
Important: knowledge of at least one of these database technologies is required: Teradata, Oracle, SQL Server.
Basic knowledge of mainframe usage: TSO, ISPF/S, Scheduler IWS, JCL
Nice to have:
- Knowledge of MS SSIS
- Experience with the ServiceNow ticketing system
- Knowledge of requirements collection, analysis, design, development and test activity
- Continuous improvement approaches
- Knowledge of Python
- Knowledge and experience with RedHat Linux, Windows, AIX, WAS, CFT
Posted 3 months ago
6.0 - 10.0 years
5 - 11 Lacs
Hyderabad
Work from Office
Greetings from NCG! We have an opening for a Snowflake Developer role in our Hyderabad office! Below is the JD for your reference.
Job Description: We are hiring an experienced Senior Data Engineer with strong expertise in IBM DataStage, the Azure Data Platform, and Power BI. The ideal candidate will be responsible for end-to-end data integration, transformation, and reporting solutions that drive business decisions.
Key Responsibilities:
- Design and develop robust ETL solutions using IBM DataStage for data extraction, transformation, and loading.
- Manage and orchestrate data workflows in Azure using Azure Data Factory, Azure SQL, Data Lake, etc.
- Build intuitive and dynamic Power BI dashboards and reports for various business stakeholders.
- Optimize data models and ensure performance tuning across the DataStage and Power BI platforms.
- Collaborate with business users, analysts, and data engineers to gather reporting requirements.
- Ensure adherence to data governance, security policies, and quality standards.
- Conduct unit testing and support UAT cycles for data pipelines and reports.
- Document ETL designs, data mappings, and visualization structures.
Required Skills:
- 6+ years of hands-on experience in ETL development using IBM DataStage.
- Strong experience with Azure data services: Azure Data Factory, Azure SQL DB, Blob Storage, etc.
- Advanced knowledge of Power BI, including DAX, Power Query, data modeling, and report publishing.
- Proficiency in writing SQL queries and managing large datasets.
- Experience in data warehousing concepts and data architecture design.
- Strong problem-solving skills and attention to detail.
For further information please contact HR: Harshini - 9663082098, Swathi - 9972784663. Thanks and regards, Chiranjeevi Nanjunda, Talent Acquisition Lead - NCG
Posted 3 months ago
6.0 - 11.0 years
12 - 22 Lacs
Chennai
Work from Office
Job Summary: We are looking for a seasoned ETL Engineer with hands-on experience in Talend or IBM DataStage, preferably both, to lead data integration efforts in the mortgage domain. The ideal candidate will play a key role in designing, developing, and managing scalable ETL solutions that support critical mortgage data processing and analytics workloads.
Key Responsibilities:
- End-to-end ETL solution development using Talend or DataStage.
- Design and implement robust data pipelines for mortgage origination, servicing, and compliance data.
- Collaborate with business stakeholders and data analysts to gather requirements and deliver optimized solutions.
- Perform code reviews, mentor junior team members, and ensure adherence to data quality and performance standards.
- Manage job orchestration, scheduling, and error-handling mechanisms.
- Document ETL workflows, data dictionaries, and system processes.
- Ensure data privacy and compliance requirements are embedded in all solutions.
Required Skills:
- Strong experience in ETL tools: Talend (preferred) or IBM DataStage.
- Solid understanding of the mortgage lifecycle and related data domains.
- Proficiency in SQL and experience with relational databases (e.g., Oracle, SQL Server, Snowflake).
- Familiarity with job scheduling tools, version control, and CI/CD pipelines.
- Excellent problem-solving, leadership, and communication skills.
Posted 3 months ago
4.0 - 9.0 years
18 - 25 Lacs
Bengaluru
Work from Office
ETL Developer
1. Skills Required
Mandatory:
- Minimum experience of 5 years in IBM DataStage 11.7 and SQL/PL-SQL
- Working knowledge of Oracle and PostgreSQL databases
- Hands-on experience in Unix shell scripting
Personal Skills:
- Good communication skills, written and verbal, with the ability to understand and interact with a diverse range of stakeholders
- Ability to raise factual alerts & risks when necessary
- Capability to work with cross-location team members/stakeholders in order to establish and maintain consistent delivery
Good To Have (Optional):
- Technical - Reporting: Power BI, SAP BO, Tableau
- Functional - Finance/Banking: asset finance / equipment finance / leasing
2. Role and Responsibilities
- Responsible for developing data warehousing solutions (DataStage, Oracle, PostgreSQL) as per requirements.
- Must provide hands-on technical knowledge and take ownership while working with business users, project managers, technical leads, architects and testing teams.
- Provide guidance to IT management in establishing both a short-term roadmap and a long-term DW/BI strategy.
- Work closely with team members and stakeholders to ensure seamless development and delivery of assigned tasks.
- Assist the team/lead in team management, identifying training needs and inducting new starters.
- Take part in discussions on BI/DW forums alongside peers for the benefit of the organization.
Please share your updated CV at Avani.Vibhute@rigvedtech.com
Posted 3 months ago
5.0 - 10.0 years
10 - 20 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Job Location: Chennai, Bangalore, Hyderabad
Experience: 5 - 15 yrs
Job Type: FTE
Shift Timing: 2 PM IST till 11 PM IST
Note: Looking only for immediate to 1-week joiners. Must be comfortable with a video discussion.
JD Key Skills: DataStage, SQL
Looking for a Sr. Data Analyst experienced in ETL/data warehousing technologies:
- Experienced data analyst in ETL/data warehousing
- Experience with SQL, DataStage, Autosys, Snowflake, AWS
- Knowledge of Agile execution/delivery processes/tools including Confluence, JIRA, SharePoint, and ServiceNow
The candidate needs to have the below:
- Minimum of 5+ years of data analyst experience
- Experienced in ETL (DataStage preferable, but fine with others as well)
- Experienced in data analysis (data cleansing, data validation, data mapping & solutioning, ETL QA)
- Experienced in SQL (Snowflake, SQL Server, Oracle SQL/PL-SQL - any of these)
- Should be capable of client-facing work on a day-to-day basis
- Knowledge of Investment Banking/Asset Management/Compliance Regulatory reporting - any of these is good to have
Contact Person - Amrita
Please share your updated profile to amrita.anandita@htcinc.com with the below-mentioned details:
Full Name (as per Aadhaar card) -
Total Exp. -
Rel. Exp. (DataStage) -
Rel. Exp. (SQL) -
Highest Education (if B.Tech/B.E., then specify) -
Notice Period -
If serving notice or not working, mention your last working day as per your relieving letter -
CCTC -
ECTC -
Current Location -
Preferred Location -
Posted 3 months ago
2.0 - 5.0 years
4 - 8 Lacs
Pune
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Good hands-on experience in DBT is required; ETL (DataStage) and Snowflake preferred
- Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer
- Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed
- Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop and Java
Preferred technical and professional experience:
- You thrive on teamwork and have excellent verbal and written communication skills
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions
- Ability to communicate results to technical and non-technical audiences
Posted 3 months ago
4.0 - 7.0 years
0 - 2 Lacs
Chennai
Hybrid
All applicants must apply on the BNP Paribas career site: https://bwelcome.hr.bnpparibas/su/a30f52f03625264e
Job ID - 48321590. Applicants can directly search for Job ID 48321590 on https://bwelcome.hr.bnpparibas/su/a30f52f03625264e and apply there.
Responsibilities
Direct Responsibilities
Coordinate closely with Data Platform Teams and all other BNL BNP Paribas IT teams (incident coordination, security, infrastructure, development teams, etc.). For a predefined applications scope, take care of:
- Ticket management
- Proposing solutions to improve an application
- Incident management (including problem determination)
- Request management
- Change management
Ensure that SLA targets are met for the above activities. Hand over to Italian teams if knowledge and skills are not available in ISPL.
Contributing Responsibilities
- Contribute to the definition of procedures and processes necessary for the team
- Help build team spirit and integrate into the BNL BNP Paribas culture
- Contribute to incident analysis and associated problem management
- Contribute to regular activity reporting and KPI calculation
- Contribute to the knowledge transfer with the BNL Data Platform team
- Contribute to the acquisition by the ISPL team of new skills & knowledge to expand its scope
Technical & Behavioral Competencies
Fundamental skills:
- Knowledge of mainframe usage: TSO, ISPF/S, Scheduler, JCL
- Knowledge of the IBM DataStage ETL tool
- Familiarity with database technology is required (Teradata, Oracle, DB2, SQL Server)
- SQL language, in order to execute basic scripts and queries
Basic experience with:
- ServiceNow ticketing system
- Aurelia Remedy ticketing system
Nice to have:
- General IT infrastructure knowledge
- Knowledge of requirements collection, analysis, design, development and test activity
- Continuous improvement approaches
Good written and spoken English; able to communicate efficiently; good team player.
Specific Qualifications (if required): Basic knowledge of the Italian language can be an advantage.
Posted 3 months ago
2.0 - 5.0 years
4 - 8 Lacs
Pune
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Good hands-on experience in DBT is required; ETL (DataStage) and Snowflake preferred
- Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer
- Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed
- Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop and Java
Preferred technical and professional experience:
- You thrive on teamwork and have excellent verbal and written communication skills
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions
- Ability to communicate results to technical and non-technical audiences
Posted 3 months ago
5.0 - 10.0 years
5 - 12 Lacs
Bengaluru
Work from Office
Primary (Must-Have) Skills
- Strong experience working with the ETL tool IBM InfoSphere DataStage to develop data pipelines and data warehousing solutions.
- Strong hands-on experience with Databricks.
- Strong hands-on experience with SQL and relational databases.
- Proactive, with strong communication and interpersonal skills to effectively collaborate with team members and stakeholders.
- Strong understanding of data processing concepts (ETL).
- The candidate should be prepared to sometimes step outside the developer role to gather and create their own analysis and requirements.
Secondary Skills
- Experience writing stored procedures in T-SQL.
- Moderate experience with AWS cloud services.
- Ability to write sufficient and comprehensive documentation about the data processing flow.
Posted 3 months ago
7.0 - 12.0 years
27 - 42 Lacs
Chennai
Work from Office
Ab Initio: Graph development, Ab Initio standard environment parameters, GDE (PDL, MFS concepts), EME basics, SDLC, data analysis
Database: Proficient in SQL; expert in DB load/unload utilities; relevant experience in Oracle, DB2, Teradata (preferred)
UNIX: Shell scripting (must); Unix utilities like sed, awk, perl, python
Informatica IICS: Good experience in designing and developing ETL mappings using IICS. Should be familiar with bulk loading concepts, Change Data Capture (CDC), data profiling and data validation concepts. Should have prior experience working with different types of data sources/targets. Understanding of configuration, migration and deployment of ETL mappings.
Teradata: Assist in the design, development, and testing of Teradata databases and ETL processes to support data integration and reporting. Collaborate with data analysts and other team members to understand data requirements and provide solutions.
DataStage: Overall experience of 5 years in DW/BI technologies and a minimum of 5 years of development experience in the ETL DataStage 8.x/9.x tool. Worked extensively with parallel jobs, sequences and preferably routines. Good conceptual knowledge of data warehousing and various methodologies.
Posted 3 months ago
2.0 - 5.0 years
4 - 8 Lacs
Bengaluru
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Good hands-on experience in DBT is required; ETL (DataStage) and Snowflake preferred
- Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer
- Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed
- Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop and Java
Preferred technical and professional experience:
- You thrive on teamwork and have excellent verbal and written communication skills
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions
- Ability to communicate results to technical and non-technical audiences
Posted Date not available
7.0 - 11.0 years
16 - 25 Lacs
Hyderabad
Work from Office
Key Responsibilities
- Translate requirements and data mapping documents into a technical design.
- Develop, enhance and maintain code following best practices and standards.
- Create and execute unit test plans; support regression and system testing efforts.
- Debug and solve issues found during testing and/or production.
- Communicate status, issues and blockers to the project team.
- Support continuous improvement by identifying and solving improvement opportunities.
Requirements
- Bachelor's degree or military experience in a related field (preferably computer science) and 5 years of experience in ETL development within a data warehouse.
- Deep understanding of enterprise data warehousing best practices and standards.
- Strong experience in software engineering, comprising designing, developing and operating robust and highly scalable cloud infrastructure services.
- Strong experience with Python, PySpark, DataStage ETL and SQL development.
- Proven experience in cloud infrastructure projects with hands-on migration expertise on public clouds such as AWS and Azure, preferably Snowflake.
- Knowledge of cybersecurity organization practices, operations, risk management processes, principles, architectural requirements, engineering, and threats and vulnerabilities, including incident response methodologies.
- Strong communication and interpersonal skills.
- Strong organization skills and the ability to work independently as well as with a team.
Mandatory Skills: ETL, data warehouse concepts, AWS, Glue, SQL, Python, Snowflake, CI/CD tools (Jenkins, GitHub)
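The extract-transform-load work this role centers on can be sketched in miniature with plain Python and sqlite3 (standing in for the DataStage/SQL stack the posting names). This is an illustrative example only: the staging/summary table names and columns are invented for the sketch, not taken from the posting.

```python
# Minimal ETL sketch (illustrative; table and column names are invented).
import sqlite3

conn = sqlite3.connect(":memory:")

# Extract: land raw rows in a staging table.
conn.execute("CREATE TABLE staging (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO staging VALUES (?, ?)",
    [("acme", 100.0), ("acme", 50.0), ("globex", 75.0)],
)

# Transform + load: aggregate staged rows into a warehouse-style summary table.
conn.execute(
    """CREATE TABLE summary AS
       SELECT customer, SUM(amount) AS total
       FROM staging
       GROUP BY customer"""
)
rows = dict(conn.execute("SELECT customer, total FROM summary"))
```

In a production pipeline the same shape appears at scale: an ETL tool or PySpark job stages the raw data, and set-based SQL performs the transform before loading the target table.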
Posted Date not available
8.0 - 13.0 years
14 - 24 Lacs
Bengaluru
Work from Office
Technical Skills: Strong knowledge of Talend/EMR/Airflow/DataStage/Hue/DataHub including its architecture, components (e.g., Designer, Director, Administrator), and administration. Proficiency in ETL processes, data warehousing concepts, and data integration methodologies. Platform Management: Install, configure, and manage IBM DataStage and its related components, such as Information Server. Experience in any reporting tools like PowerBI / Tableau/ Cognos/ MicroStrategy/ Sigma/ QlikView Experience in any analytics tools like Sagemaker, aiOla,SparkBeyond, Rstudio Performance Optimization: Monitor and tune DataStage job performance, system resource utilization (CPU, memory, storage), and troubleshoot bottlenecks. Environment Maintenance: Manage DataStage projects, environments, user access (including role-based access control), permissions, and project configurations. Problem Resolution: Diagnose and resolve issues with DataStage jobs, server performance, and connectivity to data sources. Software Updates & Patches: Apply DataStage software updates, patches, and service packs to ensure the environment is current and secure. Collaboration: Work closely with DataStage developers, data engineers, database administrators, and other teams to ensure data integration processes meet technical and business requirements. Security & Compliance: Enhance security practices, implement best practices for data management, and ensure compliance with enterprise technology standards and regulatory expectations. Documentation: Maintain technical documentation, including runbooks, disaster recovery plans, and job run reports. Monitoring & Reporting: Monitor job executions, generate reports on job runs, and create tickets for any IBM-related issues Experience with scripting languages (e.g., PL/SQL, PowerShell, Python, Bash) is often preferred. 
User and Group Management: Adding, deleting, and managing Hue users and groups, configuring group permissions, and importing users and groups from LDAP or other directories. Permissions Management: Defining which Hue applications and features are visible and available to users based on their group memberships. Hue Process Management: Monitoring and managing Hue processes to ensure optimal performance and availability, including utilizing the supervisor script. Configuration Management: Performing basic and advanced configuration of Hue to effectively administer the application within the cluster environment. This may include setting up Hue for security, configuring authentication, and managing its integration with other services like Impala and HiveServer2. Security Configuration: Implementing security measures, such as configuring user authorization and authentication within Hue, to protect data. Troubleshooting: Identifying and resolving issues related to Hue operation, user access, and performance. Communication: Excellent communication skills, both written and verbal, for interacting with users, other administrators, and potentially vendors. Documentation: Maintaining clear documentation related to Hue configuration, user management procedures, and other administrative tasks. Staying Current: Keeping abreast of new Hue features and best practices to ensure the platform is used effectively and efficiently. Hue Administration: Strong understanding of Hue's features, administration tools, and configuration options. User Management Expertise: Proficiency in managing users and groups within Hue and potentially other authentication systems like LDAP. Permissions and Security Configuration: Ability to configure user permissions and access control within Hue to ensure data security. Technical Skills: Comfortable with command-line interfaces for managing Hue processes and configuring its settings. 
Troubleshooting and Problem Solving: Ability to diagnose and resolve issues with Hue functionality or performance. Ability to work in a 24/7 support rotation and handle urgent production issues.
Responsibilities:
Installation, configuration, and support of the Talend platform (including servers and components).
User and project management within the Talend Administration Center (TAC).
Managing platform licenses, users, and project authorizations.
Setting up and managing shared development environments and Git branches.
Enforcing platform governance and standards.
Deploying and executing Talend jobs, routes, and services.
Monitoring job execution, troubleshooting issues, and ensuring successful runs.
Managing job scheduling and dependencies using tools like Job Conductor in TAC.
Monitoring platform health, capacity planning, and resource allocation.
Using tools like the Activity Monitoring Console (AMC) and Talend Log Server for monitoring and troubleshooting.
Identifying and resolving performance bottlenecks in Talend jobs and the overall environment.
Implementing continuous improvement and automation opportunities.
Executing disaster recovery and password rotation exercises.
Providing technical support to development and data engineering teams using Talend.
Assisting with audit requests.
Creating and maintaining documentation for platform setup, configurations, and troubleshooting.
Cloud-Specific Responsibilities (if applicable):
Managing Talend Cloud environments, including users, remote engines, and workspaces.
Publishing and managing artifacts within the cloud environment.
Managing and promoting environments and workspaces.
Education Qualification: Any degree from a reputed college.
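Managing job dependencies, as schedulers like Job Conductor do, comes down to running jobs in dependency order. A tool-agnostic Python sketch using a topological sort (the job names are illustrative, not tied to any Talend API):

```python
def run_order(deps):
    """Return a run order in which every job follows its dependencies.

    `deps` maps each job name to the set of jobs it depends on.
    Raises ValueError on circular dependencies.
    """
    order, done, visiting = [], set(), set()

    def visit(job):
        if job in done:
            return
        if job in visiting:
            raise ValueError(f"circular dependency involving {job!r}")
        visiting.add(job)
        for dep in sorted(deps.get(job, ())):   # run prerequisites first
            visit(dep)
        visiting.discard(job)
        done.add(job)
        order.append(job)

    for job in sorted(deps):
        visit(job)
    return order

# Hypothetical dependency graph: the load waits on both extracts.
deps = {
    "load_warehouse": {"extract_orders", "extract_customers"},
    "extract_orders": set(),
    "extract_customers": set(),
}
print(run_order(deps))
# ['extract_customers', 'extract_orders', 'load_warehouse']
```

Cycle detection matters in practice: a scheduler given a circular dependency should fail loudly rather than hang.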
Posted Date not available
2.0 - 5.0 years
4 - 8 Lacs
Bengaluru
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in choosing the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include:
Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements.
Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours.
Building teams and writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
Good hands-on experience in DBT is required; ETL (DataStage) and Snowflake are preferred.
Ability to use programming languages such as Java, Python, and Scala to build pipelines that extract and transform data from a repository to a data consumer.
Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed.
Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java.
Preferred technical and professional experience:
You thrive on teamwork and have excellent verbal and written communication skills.
Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions.
Ability to communicate results to technical and non-technical audiences.
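The extract/transform/load pipeline work described above can be sketched with a minimal, self-contained Python example (the schema and sample data are hypothetical):

```python
import csv
import io

def extract(csv_text):
    """Extract: parse CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: normalise names and cast amounts, dropping malformed rows."""
    out = []
    for row in rows:
        try:
            out.append({"name": row["name"].strip().title(),
                        "amount": float(row["amount"])})
        except (KeyError, ValueError):
            continue                      # skip records that fail validation
    return out

def load(rows, target):
    """Load: append cleaned rows to an in-memory target (stand-in for a table)."""
    target.extend(rows)
    return len(rows)

raw = "name,amount\n alice ,10.5\nbob,not_a_number\n"
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
print(loaded, warehouse)   # 1 [{'name': 'Alice', 'amount': 10.5}]
```

In a real pipeline the extract step would read from a repository and the load step would write to a data consumer, but the three-stage shape is the same.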
Posted Date not available