7 - 12 years
30 - 40 Lacs
Pune, Delhi NCR, Bengaluru
Hybrid
Support enhancements to the MDM platform. Develop pipelines using Snowflake, Python, SQL, and Airflow. Track system performance, troubleshoot issues, and resolve production incidents.
Required candidate profile: 5+ years of hands-on, expert-level experience with Snowflake, Python, and orchestration tools such as Airflow; a good understanding of the investment domain; experience with dbt, cloud platforms (AWS, Azure), and DevOps.
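As an illustration of the stack this role names, here is a minimal sketch of an Airflow DAG that chains two Snowflake steps; the DAG id, connection id, and table/stage names are hypothetical, not taken from the posting.

```python
# Minimal sketch of an Airflow DAG orchestrating Snowflake load steps.
# The connection id "snowflake_default" and all table/stage names are
# illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="mdm_daily_load",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load_staging = SnowflakeOperator(
        task_id="load_staging",
        snowflake_conn_id="snowflake_default",
        sql="COPY INTO staging.customers FROM @mdm_stage/customers/;",
    )

    merge_to_master = SnowflakeOperator(
        task_id="merge_to_master",
        snowflake_conn_id="snowflake_default",
        sql="""
            MERGE INTO master.customers m
            USING staging.customers s ON m.customer_id = s.customer_id
            WHEN MATCHED THEN UPDATE SET m.name = s.name
            WHEN NOT MATCHED THEN INSERT (customer_id, name)
                VALUES (s.customer_id, s.name);
        """,
    )

    # Load raw data first, then merge it into the master table.
    load_staging >> merge_to_master
```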
Posted 2 months ago
5 - 9 years
10 - 15 Lacs
Hyderabad
Work from Office
We are looking for a Talend Developer with strong ETL experience for data transformation and automation tasks. You will develop and maintain ETL pipelines using Talend Data Integration, ensuring efficient data extraction, transformation, and loading into PostgreSQL or other databases. You should be proficient in optimizing ETL workflows, handling large datasets, and troubleshooting performance issues. A basic understanding of Python or other scripting languages is an added advantage.
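Talend jobs are built in Talend Studio rather than in code, but since the posting lists Python scripting as a plus, here is a rough Python analogue of the PostgreSQL load step using psycopg2's COPY; the connection details, schema, and file name are hypothetical.

```python
# Rough Python analogue of a bulk load into PostgreSQL (the posting's
# Talend jobs would do this inside the Studio GUI). All connection
# parameters and names are illustrative only.
import psycopg2

conn = psycopg2.connect(
    host="localhost", dbname="warehouse", user="etl", password="secret"
)
try:
    with conn, conn.cursor() as cur:
        # COPY streams the whole file in one round trip; far faster
        # than row-by-row INSERT statements for large datasets.
        with open("customers.csv") as f:
            cur.copy_expert(
                "COPY staging.customers FROM STDIN WITH (FORMAT csv, HEADER)", f
            )
finally:
    conn.close()
```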
Posted 2 months ago
4 - 6 years
10 - 12 Lacs
Chandigarh
Work from Office
KEY TASKS & RESPONSIBILITIES
- Design, develop, test, and deploy highly efficient SQL code and data mapping code according to specifications.
- Design and develop ETL code in support of analytic software applications and related analysis projects.
- Work with Analytics developers, other team members, and clients to review business requirements and translate them into database objects.
- Research and utilize new technologies.
- Collaborate with the Quality Assurance team to test application functionality.
- Provide diagnostic support and fix defects as needed.
- Ensure compliance with eClinical Solutions/industry quality standards, regulations, guidelines, and procedures.
- Manage multiple timelines and deliverables (for single or multiple clients) and manage client communications as assigned.
- Provide programming solutions and support using the elluminate Data Intelligence Hub platform.
- Configure, migrate, and support the elluminate platform.
- Other duties as assigned.
Posted 2 months ago
6 - 8 years
5 - 9 Lacs
Mumbai
Work from Office
- Minimum of 6 years of experience in Talend + Snowflake.
- Capacity to work with onsite resources.
- Ability to work in one-team mode (part of a team whose clients are remote, in a multinational environment).
- Ability to work on multiple simultaneous projects.
- Clear communication with all stakeholders.
- Good experience with the Talend suite.
- Good experience with Snowflake stored procedures.
- Very good experience in SQL (Snowflake desired).
- Good knowledge of data modelling and data transformation (business layer).
Posted 2 months ago
8 - 12 years
7 - 11 Lacs
Pune, Hyderabad
Work from Office
8 years of experience; strong in Power BI development, SQL, and data modelling; banking experience.
Posted 2 months ago
2 - 4 years
3 - 8 Lacs
Pune
Work from Office
Key Responsibilities
- Design, develop, and maintain robust ETL pipelines for data ingestion, transformation, and loading into data warehouses.
- Optimize and improve data models to enhance performance and scalability.
- Collaborate with data analysts, scientists, and other stakeholders to understand data requirements and deliver solutions.
- Monitor and troubleshoot ETL workflows to ensure smooth operations and data quality.
- Implement and enforce best practices for data governance, security, and compliance.
- Analyze and resolve complex technical issues related to ETL processes.
- Document ETL processes, data architecture, and operational workflows.
- Real-time data processing: develop and optimize real-time data workflows using tools like Apache Kafka, Apache Flink, or AWS Kinesis (a sketch follows below).

Required Skills and Qualifications
- Bachelor's degree in Computer Science, Data Engineering, or a related field.
- 2+ years of experience in ETL development and data engineering.
- Proficiency in ETL tools such as Informatica, Talend, SSIS, or equivalent.
- Strong knowledge of SQL and database management systems (e.g., PostgreSQL, MySQL, SQL Server).
- Hands-on experience with data integration and transformation in cloud environments (AWS, Azure, or Google Cloud).
- Experience with data modeling and working with structured and unstructured data.
- Familiarity with programming languages like Python, Scala, or Java for data manipulation.
- NoSQL databases: hands-on experience with NoSQL databases like MongoDB, Cassandra, or DynamoDB.
- Modern data lakehouses: knowledge of modern data lakehouse platforms like Apache Iceberg, Snowflake, or Dremio.
- Excellent problem-solving and communication skills.

Preferred Skills
- Knowledge of Big Data technologies like Hadoop, Spark, or Kafka.
- Experience with data visualization tools (e.g., Tableau, Power BI).
- Familiarity with DevOps practices for data pipelines.
- Understanding of machine learning workflows and data preparation for AI models.
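As a hedged illustration of the real-time processing item above, here is a minimal kafka-python consumer sketch; the topic, broker address, and message schema are hypothetical.

```python
# Minimal sketch of real-time ingestion with kafka-python.
# Topic, broker address, and the handling logic are hypothetical.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders",                               # hypothetical topic
    bootstrap_servers=["localhost:9092"],
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
    group_id="etl-consumer",
)

for message in consumer:
    order = message.value
    # A transform/load step would go here (validation, enrichment, etc.)
    print(order["order_id"], order.get("amount"))
```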
Posted 2 months ago
8 - 10 years
15 - 20 Lacs
Noida
Work from Office
Job Description: We are looking for a highly skilled Senior System Engineer with expertise in server administration, automation, network security, and ETL development. The ideal candidate should have a strong understanding of system infrastructure design, security best practices, and database management.

Key Responsibilities:
- Administer and maintain Windows and Linux servers, ensuring high availability and performance.
- Develop and maintain automation scripts using Python, PowerShell, and Bash (a sketch follows below).
- Configure and manage VPNs, firewalls, and network security protocols.
- Implement server security best practices, including patch management, hardening, and access control.
- Design and manage ETL processes for data extraction, transformation, and loading.
- Work with SQL Server and PostgreSQL for database administration, optimization, and troubleshooting.
- Manage and support real-time data acquisition using OPC DA/HDA.
- Collaborate with cross-functional teams to design and implement scalable system infrastructure solutions.

Required Skills & Qualifications:
- 8+ years of experience in server administration (Windows and Linux).
- Strong scripting skills in Python, PowerShell, and Bash for automation.
- Expertise in network security, VPN configuration, and firewall management.
- Hands-on experience in ETL development and database management (SQL Server/PostgreSQL).
- Familiarity with real-time data acquisition using OPC DA/HDA.
- Knowledge of system infrastructure design principles and best practices.
- Experience in server security (patching, hardening, access control, monitoring, and compliance).
- Strong problem-solving and troubleshooting skills.

Preferred Qualifications:
- Experience with cloud platforms (AWS, Azure, or GCP).
- Knowledge of containerization (Docker, Kubernetes).
- Familiarity with CI/CD pipelines and DevOps tools.
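The automation bullet above is the sort of task a short Python script covers; below is a minimal sketch of a disk-usage health check using only the standard library. The threshold and path are illustrative, not from the posting.

```python
# Small server health-check sketch using only the standard library,
# in the spirit of the posting's automation-scripts bullet.
# The path and threshold are illustrative.
import shutil
import socket

DISK_ALERT_PCT = 90  # hypothetical alert threshold

def check_disk(path="/"):
    """Print an OK/ALERT line for the filesystem containing `path`."""
    usage = shutil.disk_usage(path)
    pct = usage.used / usage.total * 100
    status = "ALERT" if pct >= DISK_ALERT_PCT else "OK"
    print(f"[{status}] {path} is {pct:.1f}% full on {socket.gethostname()}")

if __name__ == "__main__":
    check_disk("/")
```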
Posted 2 months ago
3 - 5 years
8 - 14 Lacs
Ahmedabad
Work from Office
A Databricks Engineer is primarily responsible for designing, implementing, and maintaining data engineering solutions using the Databricks platform. Here's a detailed overview of the job profile.

Responsibilities:

Platform Implementation:
- Deploy and configure Databricks clusters and environments based on project requirements.
- Set up integrations with data sources, data lakes, and other platforms within the organization's ecosystem.

Data Pipeline Development:
- Design and develop scalable data pipelines using Databricks, Spark, and related technologies (a sketch follows below).
- Implement ETL (Extract, Transform, Load) processes to ingest, process, and transform data from various sources.

Performance Optimization:
- Tune and optimize Spark jobs and Databricks clusters for performance, scalability, and cost-efficiency.
- Monitor and troubleshoot performance issues, bottlenecks, and resource utilization.

Data Modeling and Architecture:
- Design data models and schemas to support analytical and reporting requirements.
- Collaborate with data architects and analysts to ensure data structures meet business needs.

Data Integration and Transformation:
- Integrate data from different sources and formats into unified datasets suitable for analysis and reporting.
- Implement data transformations and aggregations to prepare data for downstream analytics.

Security and Governance:
- Implement security policies and access controls within Databricks to protect sensitive data.
- Ensure compliance with data governance standards and regulatory requirements.

Automation and Orchestration:
- Automate deployment, monitoring, and management tasks using Databricks APIs, CLI, and infrastructure-as-code tools.
- Orchestrate data workflows and job scheduling to ensure timely execution and reliability.

Collaboration and Documentation:
- Collaborate with data scientists, analysts, and business stakeholders to understand requirements and deliver solutions.
- Document technical designs, processes, and configurations for knowledge sharing and future reference.

Skills:
- Databricks Expertise: In-depth knowledge of Databricks platform capabilities, Spark internals, and best practices.
- Big Data Technologies: Proficiency in Apache Spark, Scala/Python programming, and data processing frameworks.
- Cloud Platforms: Experience with cloud platforms (e.g., AWS, Azure, GCP) and services (e.g., S3, Azure Data Lake, BigQuery).
- Database Management: Understanding of database systems and SQL for data manipulation and querying.
- Data Engineering: Strong skills in data modeling, ETL development, and data pipeline optimization.
- Scripting and Automation: Ability to write scripts and automate tasks using Python, shell scripting, or similar.
- Problem-Solving: Analytical mindset and troubleshooting skills to resolve complex technical issues.
- Communication: Effective communication skills to collaborate with cross-functional teams and stakeholders.
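A minimal PySpark sketch of the ingest/transform/write pattern described above; the input path, column names, and output location are hypothetical, and on Databricks the `spark` session is already provided (the builder call is for local runs).

```python
# Minimal PySpark sketch of an ingest/transform/write pipeline.
# Paths, columns, and output location are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

raw = spark.read.json("/data/raw/events/")         # ingest

daily = (
    raw.filter(F.col("event_type") == "purchase")  # transform
       .withColumn("event_date", F.to_date("event_ts"))
       .groupBy("event_date", "product_id")
       .agg(F.sum("amount").alias("revenue"))
)

daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "/data/curated/daily_revenue/"                 # load
)
```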
Posted 2 months ago
3 - 6 years
4 - 9 Lacs
Pune
Work from Office
- Experience with manual testing.
- Experience with SQL and databases is essential.
- Experience with the Software Testing Life Cycle (STLC) and designing test scripts.
- Experience in ETL testing (at least 2 years).
- Experience in ETL test automation (a reconciliation sketch follows below).
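A common building block of ETL test automation is a source-to-target reconciliation check. Below is a minimal sketch, with sqlite3 standing in for the real source and target databases and hypothetical table names.

```python
# Sketch of an automated ETL reconciliation check: row counts and a
# numeric checksum compared between source and target tables.
# sqlite3 stands in for the real databases; names are hypothetical.
import sqlite3

def recon(conn, src_table, tgt_table):
    cur = conn.cursor()
    src_count = cur.execute(f"SELECT COUNT(*) FROM {src_table}").fetchone()[0]
    tgt_count = cur.execute(f"SELECT COUNT(*) FROM {tgt_table}").fetchone()[0]
    assert src_count == tgt_count, f"row count mismatch: {src_count} vs {tgt_count}"

    src_sum = cur.execute(f"SELECT SUM(amount) FROM {src_table}").fetchone()[0]
    tgt_sum = cur.execute(f"SELECT SUM(amount) FROM {tgt_table}").fetchone()[0]
    assert src_sum == tgt_sum, f"checksum mismatch: {src_sum} vs {tgt_sum}"
    print("reconciliation passed")

if __name__ == "__main__":
    db = sqlite3.connect(":memory:")
    db.executescript("""
        CREATE TABLE src(amount REAL); CREATE TABLE tgt(amount REAL);
        INSERT INTO src VALUES (10),(20); INSERT INTO tgt VALUES (10),(20);
    """)
    recon(db, "src", "tgt")
```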
Posted 2 months ago
6 - 11 years
7 - 11 Lacs
Karnataka
Work from Office
Requirements
- 12+ years in delivery, with at least 5+ years as a tech lead on ETL development, data warehousing, and cloud migration projects.
- Very good experience in end-to-end project management using Agile methodology in an onsite-offshore model.
- Experience leading a cross-functional team (dev, test, etc.) and delivering data engineering / cloud migration projects using Google Cloud technologies and services.
- Exposure to Google Cloud services: BigQuery, Dataflow, Airflow/Composer, GCS, IAM, CI/CD, etc. Google certification is a plus.
- Good understanding of data warehousing modelling concepts, SQL, and Python.

Responsibilities
- Provide technical and management leadership to project teams for delivery excellence.
- Provide technical guidance to the team on the ground and help troubleshoot and resolve problems to ensure timely delivery.
- Ensure technology expertise, effective process compliance, and standardization of process and delivery models.
- Independently review project health and recommend actions as needed.
- Measure technical project performance through project metrics and take corrective action.
- Evaluate the performance of team members and determine training needs.
- Identify training needs (if any) and coordinate training with the respective training groups or practice.
- Risk and change management.
- Effective communication and presentation.

Additional Details: Global Grade: D | Named job posting: No | Remote work possibility: No | Global Role Family: 60227 (P) Engagement Management | Local Role Name: 6381 Project Manager | Local Skills: 59074 Google Cloud Security | Languages Required: English
Posted 2 months ago
3 - 6 years
4 - 8 Lacs
Chennai
Work from Office
Ab Initio Developer | 4+ years | Up to 15 LPA | Trivandrum, Chennai | Skills: Ab Initio, Java, GDE, Unix | Notice period: Immediate to 30 days
Posted 2 months ago
5 - 8 years
4 - 8 Lacs
Bengaluru
Work from Office
Job Title: DataStage Admin

Responsibilities: A day in the life of an Infoscion. As part of the Infosys consulting team, your primary role is to get to the heart of customer issues, diagnose problem areas, design innovative solutions, and facilitate deployment resulting in client delight. You will develop proposals by owning parts of the proposal document and giving inputs on solution design based on your areas of expertise. You will plan configuration activities, configure the product as per the design, conduct conference room pilots, and assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates that suit the customer's budgetary requirements and are in line with the organization's financial guidelines. You will actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements:
- Primary skills: Technology -> Data Management - Data Integration Administration -> DataStage Administration
- Preferred skills: Technology -> Data Management - Data Integration Administration -> DataStage Administration

Additional Responsibilities:
- Ability to develop value-creating strategies and models that enable clients to innovate, drive growth, and increase their business profitability.
- Good knowledge of software configuration management systems.
- Awareness of the latest technologies and industry trends.
- Logical thinking and problem-solving skills, along with an ability to collaborate.
- Understanding of financial processes for various types of projects and the various pricing models available.
- Ability to assess current processes, identify improvement areas, and suggest technology solutions.
- Knowledge of one or two industry domains.
- Client-interfacing skills.
- Project and team management.

Educational Requirements: Bachelor of Engineering
Service Line: Data & Analytics
* Location of posting is subject to business requirements
Posted 2 months ago
8 - 11 years
10 - 13 Lacs
Hyderabad
Work from Office
Data Engineering Associate Advisor

Position Overview: Evernorth, a leading health services company, is looking for exceptional data engineers/developers for our Data and Analytics organization. In this role, you will actively participate with your development team on initiatives that support Evernorth's strategic goals, and work with subject matter experts to understand the business logic you will be engineering. As a Data Engineer, you will help develop an integrated architectural strategy to support next-generation reporting and analytical capabilities on an enterprise-wide scale. You will work in an agile environment, delivering user-oriented products that will be used internally and externally by our customers, clients, and providers. The Data Engineering Associate Advisor will play a crucial role in transforming raw data into actionable insights that drive business decisions. To be successful in this role, you must have an advanced understanding of ETL processes, have excellent communication skills, and be able to build and maintain relationships with both internal and external stakeholders. You will partner with key players in Customer Service, Client Service, Provider Service, and Workforce Planning, while drawing on support from Technology, Finance, Strategy, Operational Readiness, and Solutions Delivery.

Roles & Responsibilities:
- Design and implement ETL processes to extract, transform, and load data from various sources.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions.
- Optimize and maintain existing ETL workflows for improved efficiency and performance.
- Ensure data integrity and quality through rigorous testing and validation.
- Create and implement best practices to ensure successful delivery and support of a large data transformation layer.
- Collaborate with other team members to ensure timely and successful delivery of production-ready applications.
- Prepare detailed technical documentation.

Required Experience & Education:
- 8-11 years of relevant ETL development using Informatica and Teradata.
- Experience with relational database development (Teradata, Oracle, SQL Server) required.
- Strong understanding of database concepts and experience with query optimization and performance tuning, preferably in Teradata.
- Hands-on experience with scripting languages such as Shell and Python.
- AWS Cloud, Databricks, and PySpark experience is a strong plus.
- Experience with Informatica Cloud is a plus.
- Excellent problem-solving skills and the ability to analyze complex issues and implement effective solutions.

Required Skills:
- Informatica; SQL (SQL Server, Oracle, Teradata); performance tuning and issue resolution.
- Python, Databricks, DevOps, basic cloud experience (AWS/Azure).
- Teradata BTEQ, MultiLoad, and FastLoad experience is preferred.

Preferred Skills, Experience & Education:
- Cloud certification (AWS Cloud Practitioner or similar).
Posted 2 months ago
1 - 4 years
0 - 3 Lacs
Ahmedabad
Work from Office
Perks:
- 5-day work week
- Bi-weekly events
- Paid sick leave
- Casual leave and CL encashment
- Employee performance rewards
- Friendly work culture
- Medical insurance
- Fast-growth culture
- Flexibility

Key Skills:
- Proven data analysis experience
- Excellent knowledge of databases and MS Office
- Search engines, web analytics, and business research tools acumen
- Working knowledge of data warehousing, modelling, and mining
- Strong analytical and critical thinking
- Basic knowledge of digital marketing (SEO, PPC, etc.)

Responsibilities:
- Experience with real data in any domain
- Experience with ETL development; experience with Google Sheets and Apps Script, AppSheet, and Python (a sketch follows below)
- Innovative and strong analytical and algorithmic problem solving
- Familiarity with AI tools such as Hevo Data
- Familiarity with basic SQL queries
- Hands-on experience with advanced Excel and VBA
- Extensive experience solving analytical problems using quantitative approaches
- Experience in data visualization and presentation
- Excellent critical thinking skills, combined with the ability to present your conclusions clearly and compellingly, both verbally and in writing
- Excellent analytical and organizational skills
- Excellent verbal and written communication skills
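As a rough illustration of the Python and SQL skills listed above, a short pandas sketch that loads a CSV, derives a metric, and aggregates it with SQL via an in-memory SQLite database; the file and column names are hypothetical.

```python
# Sketch of the Python + SQL side of this analyst role: load a CSV,
# clean it with pandas, and run a basic SQL aggregation over it.
# The file name and columns are hypothetical.
import sqlite3

import pandas as pd

df = pd.read_csv("campaign_results.csv")       # hypothetical export
df = df.dropna(subset=["clicks", "cost"])
df["cpc"] = df["cost"] / df["clicks"]          # derived metric: cost per click

conn = sqlite3.connect(":memory:")
df.to_sql("campaigns", conn, index=False)

top = pd.read_sql_query(
    "SELECT channel, AVG(cpc) AS avg_cpc FROM campaigns "
    "GROUP BY channel ORDER BY avg_cpc",
    conn,
)
print(top)
```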
Posted 2 months ago
2 - 6 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Informatica PowerCenter
Good-to-have skills: NA
Minimum 2 year(s) of experience is required.
Educational Qualification: 15 years of regular education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Informatica PowerCenter. Your typical day will involve working with the development team, analyzing business requirements, and developing solutions to meet those requirements.

Roles & Responsibilities:
- Design, develop, and maintain ETL workflows using Informatica PowerCenter.
- Collaborate with cross-functional teams to analyze business requirements and develop solutions to meet them.
- Develop and maintain technical documentation, including design documents, test plans, and user manuals.
- Perform unit testing and support system testing and user acceptance testing.
- Provide technical support and troubleshooting for production issues.

Professional & Technical Skills:
- Must have: strong experience in Informatica PowerCenter.
- Good to have: experience in Oracle, SQL Server, or other relational databases.
- Experience in ETL development, data warehousing, and data modeling.
- Experience in Unix/Linux scripting.
- Experience in Agile development methodologies.

Additional Information:
- The candidate should have a minimum of 2 years of experience in Informatica PowerCenter.
- The ideal candidate will have a strong educational background in computer science or a related field, along with a proven track record of delivering high-quality software solutions.
- This position is based at our Bengaluru office.
Posted 2 months ago
7 - 12 years
9 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Teradata BI
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with teams to develop solutions that align with business needs and requirements.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the team in implementing innovative solutions.
- Conduct regular team meetings to ensure alignment and progress.
- Stay updated on industry trends and technologies to enhance team performance.

Professional & Technical Skills:
- Must have: proficiency in Teradata BI.
- Strong understanding of data warehousing concepts.
- Experience in ETL processes and data modeling.
- Knowledge of SQL and database management.
- Hands-on experience in developing BI solutions.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Teradata BI.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Posted 2 months ago
3 - 5 years
5 - 9 Lacs
Jaipur
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: SAP BusinessObjects Data Services
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years of full-time education with BE/B.Tech or equivalent

Key Responsibilities:
1) Work with functional SMEs to review requirements and field-mapping documents from source to target.
2) Create ETL jobs from scratch and implement simple-to-complex logic as per the mapping document.
3) Perform extraction, transformation/source-to-target mapping, and loading into files and databases.
4) Analyze the performance of existing jobs and work on enhancements based on new requirements.
5) Prepare FD, TD, UT, and cutover documentation.

Technical Experience:
1) More than 4 years of ETL development experience.
2) Should have worked independently on at least one end-to-end ETL development project.
3) Should understand basic ETL design concepts such as CDC, SCD, transpose, updates, and validation.
4) Experience with files (delimited, fixed-width, etc.) is an added advantage.

Professional Attributes:
1) Good communication skills.
2) Should be hands-on, with a collaborative and professional attitude.
3) Ability to learn new tools and technologies.
Posted 2 months ago
5 - 10 years
7 - 12 Lacs
Mumbai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Google BigQuery
Good-to-have skills: Teradata BI
Minimum 5 year(s) of experience is required.
Educational Qualification: minimum 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead application development projects.
- Conduct code reviews and ensure coding standards are met.

Professional & Technical Skills:
- Must have: proficiency in Google BigQuery (a query sketch follows below).
- Strong understanding of data warehousing concepts.
- Experience with cloud-based data platforms.
- Hands-on experience in SQL and database management.
- Good to have: experience with Teradata BI.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Google BigQuery.
- This position is based at our Mumbai office.
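For illustration only, a minimal sketch of querying BigQuery from Python with the google-cloud-bigquery client; the project, dataset, and table names are hypothetical, and application-default GCP credentials are assumed.

```python
# Minimal google-cloud-bigquery sketch of the querying work this role
# involves. All project/dataset/table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")

query = """
    SELECT order_date, SUM(amount) AS revenue
    FROM `my-analytics-project.sales.orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY order_date
    ORDER BY order_date
"""

# result() blocks until the job finishes, then yields Row objects.
for row in client.query(query).result():
    print(row.order_date, row.revenue)
```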
Posted 2 months ago
2 - 7 years
4 - 9 Lacs
Tamil Nadu
Work from Office
Responsibilities
Must have:
1. Strong expertise in SQL, Python, and PySpark.
2. Good knowledge of data warehousing techniques.
3. Good knowledge of AWS big data services and Snowflake.

- Design, develop, and maintain scalable data pipelines and architectures for data processing and integration.
- Implement data streaming solutions to handle real-time data ingestion and processing (a sketch follows below).
- Utilize Python and PySpark to develop and optimize data workflows.
- Leverage AWS services such as S3, Redshift, Glue, Kinesis, and Lambda for data storage, processing, and analytics.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
- Ensure data quality, integrity, and security across all data pipelines.
- Monitor and troubleshoot data pipeline performance and reliability.
- Mentor junior data engineers and provide technical guidance.
- Stay updated on the latest trends and technologies in data engineering and streaming.
- Explore and implement Generative AI (GenAI) solutions where applicable.

Qualifications
- Bachelor's degree in computer science or engineering.
- 8+ years of experience in an ETL development role.
- Experience working with AWS, PySpark, and real-time streaming pipelines.
- Ability to communicate effectively.
- Strong process documentation skills.

Additional Details: Global Grade: C | Named job posting: No | Remote work possibility: Yes | Global Role Family: 60242 (P) Data Management | Local Role Name: 60327 Data Engineer | Local Skills: 62276 AWS Batch | Languages Required: English
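As a hedged illustration of the real-time ingestion bullet above, a minimal boto3 sketch that pushes one event into a Kinesis stream; the stream name, region, and payload are hypothetical, and AWS credentials are assumed to come from the environment.

```python
# Sketch of pushing an event into an AWS Kinesis stream with boto3.
# Stream name, region, and payload are hypothetical.
import json

import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

event = {"order_id": "A-1001", "amount": 250.0}

kinesis.put_record(
    StreamName="orders-stream",             # hypothetical stream
    Data=json.dumps(event).encode("utf-8"),
    PartitionKey=event["order_id"],         # controls shard routing
)
```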
Posted 2 months ago
2 - 7 years
4 - 9 Lacs
Uttar Pradesh
Work from Office
8 years of experience; strong in Power BI development, SQL, and data modelling; banking experience.
Posted 2 months ago
2 - 7 years
4 - 9 Lacs
Dadra and Nagar Haveli, Chandigarh, Daman
Work from Office
Client: Persistent | Role: C2H | Location: Hybrid | Experience: 5-8 years | Budget: 20 LPA | POC: Bhajan

About The Role
Must-have skills: DataStage, SnowSQL
Skill description: ETL, data warehousing, IBM DataStage, testing, SnowSQL

Locations: Chandigarh, Dadra & Nagar Haveli, Daman, Diu, Goa, Jammu, Lakshadweep, New Delhi, Puducherry, Sikkim
Posted 2 months ago
2 - 7 years
4 - 9 Lacs
Chennai
Work from Office
Ab Initio Developer | 4+ years | Up to 15 LPA | Trivandrum, Chennai | Skills: Ab Initio, Java, GDE, Unix | Notice period: Immediate to 30 days
Posted 2 months ago
4 - 7 years
0 - 2 Lacs
Chennai
Hybrid
Position Purpose
The requested position is a developer-analyst in an open environment, which requires knowledge of the mainframe environment (TSO, JCL, OPC).

Direct Responsibilities
For a predefined application scope, take care of:
- Design
- Implementation (coding/parametrization, unit test, assembly test, integration test, system test, support during functional/acceptance test)
- Roll-out support
- Documentation
- Continuous improvement
Ensure that SLA targets are met for the above activities. Hand over to Italian teams if the knowledge and skills are not available in ISPL. Coordinate closely with the Data Platform Team and all other BNL BNP Paribas IT teams (incident coordination, security, infrastructure, development teams, etc.). Collaborate with and support the Data Platform Team on incident management, request management, and change management.

Contributing Responsibilities
- Contribute to knowledge transfer with the BNL Data Platform team.
- Help build team spirit and integrate into the BNL BNP Paribas culture.
- Contribute to incident analysis and associated problem management.
- Contribute to the ISPL team's acquisition of new skills and knowledge to expand its scope.

Technical & Behavioral Competencies
Fundamental skills:
- IBM DataStage
- SQL
- Experience with data modeling and the ERwin tool
- Knowledge of at least one of these database technologies is required: Teradata, Oracle, SQL Server
- Basic knowledge of mainframe usage: TSO, ISPF, IWS scheduler, JCL

Nice to have:
- Knowledge of MS SSIS
- Experience with the ServiceNow ticketing system
- Knowledge of requirements collection, analysis, design, development, and test activities
- Continuous improvement approaches
- Knowledge of Python
- Knowledge of and experience with RedHat Linux, Windows, AIX, WAS, CFT
Posted 2 months ago
8 - 11 years
10 - 20 Lacs
Bengaluru
Work from Office
- Support the full scope of the data load process, including client files, ETL/transform, database management, data export, and SLA tracking.
- Troubleshoot and resolve data load issues, including bad data, file mapping, and performance issues for SSIS packages and ETL processes, and review stored procedures and functions for issues (a sketch follows below).
- Develop automated data load programs using SSIS to process and load data sets from multiple sources.
- Participate in data mapping discussions with business units, clients, and vendors, and help devise solutions.
- Communicate with client or vendor technical teams regarding data transfer/load issues and resolution steps, following up on reported issues to track them to closure.
- Perform analysis of data sources and processes to ensure data integrity, completeness, and accuracy.
- Document data flows, ETL processes, and logic as flowcharts/process diagrams.
- Work in an agile environment and provide regular updates on the progress of projects.
- Perform other job-related duties as assigned.

Location: On-site, Jeddah (Middle East)
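SSIS packages themselves are authored in Visual Studio, but the troubleshooting described above (reviewing stored procedures, checking staged loads) is often scripted. A minimal pyodbc sketch against SQL Server follows; the server, database, procedure, and table names are hypothetical.

```python
# Sketch of the scripted side of SSIS troubleshooting: run a SQL Server
# validation procedure and check staging row counts with pyodbc.
# Server, database, and object names are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=etl-sql01;DATABASE=Staging;Trusted_Connection=yes;"
)
cur = conn.cursor()

# Hypothetical validation procedure that reports load anomalies.
cur.execute("EXEC dbo.usp_ValidateDailyLoad @LoadDate = ?", "2024-01-15")
for row in cur.fetchall():
    print(row)

cur.execute("SELECT COUNT(*) FROM dbo.ClientFileStaging")
print("staged rows:", cur.fetchone()[0])

conn.close()
```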
Posted 2 months ago
9 - 11 years
10 - 20 Lacs
Bengaluru
Work from Office
- Support the full scope of the data load process, including client files, ETL/transform, database management, data export, and SLA tracking.
- Troubleshoot and resolve data load issues, including bad data, file mapping, and performance issues for SSIS packages and ETL processes, and review stored procedures and functions for issues.
- Develop automated data load programs using SSIS to process and load data sets from multiple sources.
- Participate in data mapping discussions with business units, clients, and vendors, and help devise solutions.
- Communicate with client or vendor technical teams regarding data transfer/load issues and resolution steps, following up on reported issues to track them to closure.
- Perform analysis of data sources and processes to ensure data integrity, completeness, and accuracy.
- Document data flows, ETL processes, and logic as flowcharts/process diagrams.
- Work in an agile environment and provide regular updates on the progress of projects.
- Perform other job-related duties as assigned.

Location: On-site, Jeddah (Middle East)
Posted 2 months ago