
105 OLTP Jobs - Page 5

JobPe aggregates results for easy access to applications, but you apply directly on the original job portal.

10 - 15 years

10 - 14 Lacs

Kolkata, Mumbai, New Delhi

Work from Office


The position is part of the Solutions Integration practice, which focuses on integrating information, processes, and people through the application of multiple technologies. The candidate is expected to handle small to medium-scale consulting projects and should have skills in the design, development, integration, and deployment of data extraction/load programs. Previous experience within Banking and Financial Services is preferred. To be considered, a candidate should be available to travel (5% or more) and possess the skills listed below. The position is based in the C&R Software office in Bangalore, India, with a hybrid working model.

Position description - Solution Integration - Lead ETL Consultant - Band D

Role/responsibilities:
Design, develop, deploy, and support modules of our world-class enterprise-level solution for our international client base.
Drive the technical architecture and design process in line with client requirements.
Evaluate new design specifications, raise quality standards, and address architectural concerns.
Evaluate stability, compatibility, scalability, interoperability, and performance of the solution.
Own design aspects, performance, restartability, logging, error handling, and security for both on-premises and cloud customers.
Continually learn new technologies in related areas.
Act as the single point of contact (SPOC) for the technical implementation; work with the Project Manager to plan and deliver projects from requirements through Go-Live.
Be responsible for successfully delivering projects with accountability and ownership.
Understand the broader picture of the project, including the business, technical, and architectural aspects of implementations, and contribute accordingly.
Build reusable artefacts and apply them to reduce development, testing, deployment, and maintenance effort.
Work with multiple customers at the same time; adapt to SDLC, Iterative, and Agile methodologies.
Interact with clients and onsite team members to understand project requirements and goals.
Lead client workshops (face to face or over the phone) for consulting, and drive solutioning and issue resolution with the client.
Follow up on and escalate gaps, issues, and enhancements identified throughout the project and drive them to closure.
Display a high level of knowledge and consistent service in all interactions with the client.
Establish positive client relationships to facilitate our implementations.
Support client activities throughout the implementation life cycle, including testing phases; support and review the test strategy and planning for the end-to-end solution.
Lead the development of detailed business and technical specifications based on project requirements and turn them into data extraction/load programs (a small illustrative load-step sketch follows this description).
Program the ETL tool with business rules to be applied to data from source input to the target data repository.
Develop and help automate data extraction/load programs to run on a regular schedule.
Assist in managing daily, weekly, and monthly data operations and scheduled processes.
Perform data conversion, quality, and integrity checks for all programs and processes.
Mentor junior team members and be responsible for their deliverables.
Engage in pre-sales demonstrations, providing solutions and estimates.

In addition to these skills, the individual needs to be skilled in business analysis and knowledge acquisition.
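As a hedged illustration of the kind of restartable, logged extraction/load step described above (this is not C&R Software's actual tooling; the connection string, tables, and business rule are hypothetical placeholders), a minimal Python sketch might look like this:

```python
# Minimal sketch of a restartable, logged extract/load step. Hypothetical
# tables and DSN; in practice such a step often lives in the ETL tool itself.
import logging

import psycopg2  # assumes a PostgreSQL target, one of the databases named in the listing

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("etl_load")

DSN = "dbname=target_dw user=etl_user host=localhost"  # hypothetical connection


def load_increment(batch_date: str) -> None:
    """Load one day's rows from staging into the target, restartable by date."""
    conn = psycopg2.connect(DSN)
    try:
        with conn, conn.cursor() as cur:
            # Restartability: skip the batch if this date was already loaded.
            cur.execute("SELECT 1 FROM etl_batch_log WHERE batch_date = %s", (batch_date,))
            if cur.fetchone():
                log.info("Batch %s already loaded, skipping", batch_date)
                return

            # Example business rule applied in flight: only load accounts with a positive balance.
            cur.execute(
                """
                INSERT INTO target_accounts (account_id, balance, load_date)
                SELECT account_id, balance, %s
                FROM staging_accounts
                WHERE extract_date = %s AND balance > 0
                """,
                (batch_date, batch_date),
            )
            log.info("Loaded %d rows for %s", cur.rowcount, batch_date)

            # Record the batch so a rerun does not double-load.
            cur.execute("INSERT INTO etl_batch_log (batch_date) VALUES (%s)", (batch_date,))
    except Exception:
        log.exception("Load failed for %s; transaction rolled back", batch_date)
        raise
    finally:
        conn.close()


if __name__ == "__main__":
    load_increment("2024-01-31")
```

The checkpoint table and the transaction-per-batch pattern are one common way to get the restartability and error handling the role calls for; a scheduler would simply re-invoke the script for failed dates.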
An integration consultant interacts with clients (both business and technical personnel) on a constant basis. It is therefore necessary that an Integration Consultant have very good communication skills, listen carefully to clients, and facilitate information-gathering sessions.

Skills/Experience requirements:
Overall 10+ years of IT industry experience.
Undergraduate or graduate degree in Computer Science or Computer Applications, such as B.Sc. / B.C.A. / B.Tech. / B.E. / M.Sc. / M.Tech. / M.E. / M.C.A.
Strong experience in understanding business requirements and converting them into detailed functional and technical specifications.
7 years of experience with an ETL tool, preferably Kettle, with knowledge of Metadata Injection, Kettle DB logging, and Carte.
7 years of experience writing PL/SQL or T-SQL programs and queries on Oracle / SQL Server.
Strong knowledge of RDBMS concepts and OLTP system architecture.
Minimum 5 years of experience writing shell scripts on UNIX (Sun Solaris).
Competent with SQL/databases, SQL Server / Postgres, SSRS, and other analytical programs, with the desire and ability to understand new software applications.
Experience reviewing query performance and optimizing or developing more efficient code.
Experience creating table indexes to improve database performance (a small query-tuning sketch follows this listing).
Experience writing complex operations, views, stored procedures, triggers, and functions to support business needs in a high-availability environment.
Strong knowledge of source code control on any tool; knowledge of Git / Bitbucket is an added advantage.
Strong knowledge of XML and JSON structures and Jenkins.
Experience with job scheduling and working knowledge of at least one third-party scheduler.
Hands-on experience with AWS services such as RDS PostgreSQL, Aurora, and Lambda is preferred.
Ability to perform data research and root cause analysis on data issues and discrepancies.
Experience using SOAP and REST to access web services.
Experience with JavaScript, HTML, and CSS.
Excellent written and verbal communication skills.
Excellent interpersonal skills; comfortable establishing professional relationships, especially remotely (electronic, phone, written).
Proven ability to plan and execute effectively to meet critical, time-sensitive objectives.
Ability to work alone and independently.
Experience in the Banking or Financial industry is preferred.
Experience in SSRS report development.
Working knowledge of Python scripting is preferred.
Good mentorship skills.
Ability to deliver effectively in high-pressure situations.
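To illustrate the query-review and indexing experience asked for above, here is a hedged sketch (the table, column, and index names are hypothetical) that inspects a PostgreSQL plan and adds a supporting index; the same check would normally be run interactively in psql:

```python
# Sketch of a query-tuning check on PostgreSQL: inspect the plan, then add an
# index to support the filter. All object names are hypothetical placeholders.
import psycopg2

DSN = "dbname=target_dw user=etl_user host=localhost"  # hypothetical connection

QUERY = "SELECT account_id, balance FROM target_accounts WHERE load_date = %s"

with psycopg2.connect(DSN) as conn:
    with conn.cursor() as cur:
        # Review the current plan; a sequential scan here suggests a missing index.
        cur.execute("EXPLAIN (ANALYZE, BUFFERS) " + QUERY, ("2024-01-31",))
        for (line,) in cur.fetchall():
            print(line)

        # Add a supporting index so date-based lookups no longer scan the whole table.
        cur.execute(
            "CREATE INDEX IF NOT EXISTS idx_target_accounts_load_date "
            "ON target_accounts (load_date)"
        )

        # Re-check the plan after the index exists.
        cur.execute("EXPLAIN " + QUERY, ("2024-01-31",))
        for (line,) in cur.fetchall():
            print(line)
```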

Posted 1 month ago

Apply

2 - 6 years

5 - 9 Lacs

Hyderabad

Work from Office


AWS Data Engineer: *****************

As an AWS Data Engineer, you will contribute to our client and will have the responsibilities below:
Work with the technical development team and team lead to understand desired application capabilities.
Develop applications following established development lifecycles and continuous integration/deployment practices.
Integrate open-source components into data-analytic solutions.
Continuously learn and share learnings with others.

Required:
5+ years of directly applicable experience with a key focus on Glue, Python, AWS, and data pipeline creation.
Develop code using Python, such as:
Developing data pipelines from various external data sources to internal data stores.
Using Glue to extract data from the designated database (see the illustrative Glue job sketch below).
Developing Python APIs as needed.
Minimum 3 years of hands-on experience with Amazon Web Services, including EC2, VPC, S3, EBS, ELB, CloudFront, IAM, RDS, and CloudWatch.
Able to interpret business requirements and analyze, design, and develop applications on AWS Cloud and ETL technologies.
Able to design and architect serverless applications using AWS Lambda, EMR, and DynamoDB.
Ability to leverage AWS data migration tools and technologies, including Storage Gateway, Database Migration Service, and Import/Export services.
Understands relational database design, stored procedures, triggers, user-defined functions, and SQL jobs.
Familiar with CI/CD tools, e.g., Jenkins and UCD, for automated application deployments.
Understanding of OLAP, OLTP, Star Schema, Snowflake Schema, and logical/physical/dimensional data modeling.
Ability to extract data from multiple operational sources and load it into staging, data warehouse, data marts, etc., using SCD (Type 1 / Type 2 / Type 3 / Hybrid) loads.
Familiar with Software Development Life Cycle (SDLC) stages in Waterfall and Agile environments.

Nice to have:
Familiar with source control management tools for branching, merging, labeling/tagging, and integration, such as Git and SVN.
Experience working in UNIX/Linux environments.
Hands-on experience with IDEs such as Jupyter Notebook.

Education & Certification:
University degree or diploma and applicable years of experience.

Job Segment: Developer, Open Source, Data Warehouse, Cloud, Database, Technology
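As a hedged sketch of the Glue-plus-Python pipeline work described above (the catalog database, table, and S3 bucket names are hypothetical placeholders, not the client's actual setup), a minimal Glue ETL script might look like this:

```python
# Minimal AWS Glue ETL job sketch: read a catalogued source table, keep and
# retype a few columns, and write Parquet to S3. Names are hypothetical.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Extract: the source table is assumed to be registered in the Glue Data Catalog.
source = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_orders"
)

# Transform: keep only the columns the downstream model needs, casting as it goes.
mapped = ApplyMapping.apply(
    frame=source,
    mappings=[
        ("order_id", "string", "order_id", "string"),
        ("order_ts", "string", "order_ts", "timestamp"),
        ("amount", "double", "amount", "double"),
    ],
)

# Load: write Parquet to the curated S3 zone for downstream analytics.
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://example-curated-bucket/orders/"},
    format="parquet",
)

job.commit()
```

The same script structure also runs as a scheduled Glue job, which covers the pipeline-creation and CI/CD-deployment aspects of the role.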

Posted 1 month ago

Apply

12 - 22 years

35 - 60 Lacs

Chennai

Hybrid


Warm Greetings from SP Staffing Services Private Limited!!

We have an urgent opening with our CMMI Level 5 client for the position below. Please send your updated profile if you are interested.

Relevant Experience: 8 - 24 Yrs
Location: Pan India

Job Description:
The Data Modeler will be responsible for the design, development, and maintenance of data models and standards for enterprise data platforms.
Build dimensional data models applying best practices and providing business insights (see the illustrative star-schema sketch after this listing).
Build data warehouses and data marts (on Cloud) while performing data profiling and quality analysis.
Identify business needs and translate business requirements into Conceptual, Logical, Physical, semantic, multi-dimensional (star, snowflake), normalized/denormalized, and Data Vault 2.0 models for the project. Knowledge of Snowflake and dbt is an added advantage.
Create and maintain the Source to Target Data Mapping document for the project, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
Develop best practices for standard naming conventions and coding practices to ensure consistency of data models.
Gather and publish Data Dictionaries: maintain data models, capture data models from existing databases, and record descriptive information.
Work with the development team to implement data strategies, build data flows, and develop conceptual data models.
Create logical and physical data models using best practices to ensure high data quality and reduced redundancy; optimize and update them to support new and existing projects.
Perform data profiling, business domain modeling, logical data modeling, and physical dimensional data modeling and design.
Handle data design and performance optimization for large Data Warehouse solutions.
Understand data through profiling and analysis: metadata (formats, definitions, valid values, boundaries) and relationships/usage.
Create relational and dimensional structures for large (multi-terabyte) operational, analytical, warehouse, and BI systems.
Strong verbal and written communication skills are required.

If interested, please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in

With Regards,
Sankar G
Sr. Executive - IT Recruitment
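As a hedged illustration of the dimensional modelling this role describes (the tables, columns, and connection string are hypothetical, not the client's model), a minimal star-schema sketch in SQLAlchemy might look like this:

```python
# Minimal star-schema sketch: one SCD Type 2 customer dimension and one sales
# fact table keyed on surrogate keys. All names are hypothetical placeholders.
from sqlalchemy import (
    Column, Date, ForeignKey, Integer, MetaData, Numeric, String, Table, create_engine,
)

metadata = MetaData()

dim_customer = Table(
    "dim_customer", metadata,
    Column("customer_sk", Integer, primary_key=True),        # surrogate key
    Column("customer_id", String(32), nullable=False),       # natural/business key
    Column("customer_name", String(200)),
    Column("segment", String(50)),
    Column("effective_from", Date, nullable=False),          # SCD Type 2 validity window
    Column("effective_to", Date),
    Column("is_current", Integer, nullable=False, default=1),
)

fact_sales = Table(
    "fact_sales", metadata,
    Column("sales_sk", Integer, primary_key=True),
    Column("customer_sk", Integer, ForeignKey("dim_customer.customer_sk"), nullable=False),
    Column("date_sk", Integer, nullable=False),              # joins to a date dimension
    Column("order_amount", Numeric(18, 2), nullable=False),  # additive measure
)

if __name__ == "__main__":
    # Hypothetical warehouse connection; any SQLAlchemy-supported database works.
    engine = create_engine("postgresql+psycopg2://etl_user:secret@localhost/target_dw")
    metadata.create_all(engine)
```

Keeping surrogate keys in the dimension and validity columns for Type 2 history is the conventional pattern behind the star/snowflake and SCD requirements listed above.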

Posted 1 month ago

Apply

12 - 22 years

35 - 60 Lacs

Kolkata

Hybrid


Warm Greetings from SP Staffing Services Private Limited!!

We have an urgent opening with our CMMI Level 5 client for the position below. Please send your updated profile if you are interested.

Relevant Experience: 8 - 24 Yrs
Location: Pan India

Job Description:
The Data Modeler will be responsible for the design, development, and maintenance of data models and standards for enterprise data platforms.
Build dimensional data models applying best practices and providing business insights.
Build data warehouses and data marts (on Cloud) while performing data profiling and quality analysis.
Identify business needs and translate business requirements into Conceptual, Logical, Physical, semantic, multi-dimensional (star, snowflake), normalized/denormalized, and Data Vault 2.0 models for the project. Knowledge of Snowflake and dbt is an added advantage.
Create and maintain the Source to Target Data Mapping document for the project, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
Develop best practices for standard naming conventions and coding practices to ensure consistency of data models.
Gather and publish Data Dictionaries: maintain data models, capture data models from existing databases, and record descriptive information.
Work with the development team to implement data strategies, build data flows, and develop conceptual data models.
Create logical and physical data models using best practices to ensure high data quality and reduced redundancy; optimize and update them to support new and existing projects.
Perform data profiling, business domain modeling, logical data modeling, and physical dimensional data modeling and design.
Handle data design and performance optimization for large Data Warehouse solutions.
Understand data through profiling and analysis: metadata (formats, definitions, valid values, boundaries) and relationships/usage.
Create relational and dimensional structures for large (multi-terabyte) operational, analytical, warehouse, and BI systems.
Strong verbal and written communication skills are required.

If interested, please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in

With Regards,
Sankar G
Sr. Executive - IT Recruitment

Posted 1 month ago

Apply

12 - 22 years

35 - 60 Lacs

Noida

Hybrid


Warm Greetings from SP Staffing Services Private Limited!!

We have an urgent opening with our CMMI Level 5 client for the position below. Please send your updated profile if you are interested.

Relevant Experience: 8 - 24 Yrs
Location: Pan India

Job Description:
The Data Modeler will be responsible for the design, development, and maintenance of data models and standards for enterprise data platforms.
Build dimensional data models applying best practices and providing business insights.
Build data warehouses and data marts (on Cloud) while performing data profiling and quality analysis.
Identify business needs and translate business requirements into Conceptual, Logical, Physical, semantic, multi-dimensional (star, snowflake), normalized/denormalized, and Data Vault 2.0 models for the project. Knowledge of Snowflake and dbt is an added advantage.
Create and maintain the Source to Target Data Mapping document for the project, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
Develop best practices for standard naming conventions and coding practices to ensure consistency of data models.
Gather and publish Data Dictionaries: maintain data models, capture data models from existing databases, and record descriptive information.
Work with the development team to implement data strategies, build data flows, and develop conceptual data models.
Create logical and physical data models using best practices to ensure high data quality and reduced redundancy; optimize and update them to support new and existing projects.
Perform data profiling, business domain modeling, logical data modeling, and physical dimensional data modeling and design.
Handle data design and performance optimization for large Data Warehouse solutions.
Understand data through profiling and analysis: metadata (formats, definitions, valid values, boundaries) and relationships/usage.
Create relational and dimensional structures for large (multi-terabyte) operational, analytical, warehouse, and BI systems.
Strong verbal and written communication skills are required.

If interested, please forward your updated resume to sankarspstaffings@gmail.com / Sankar@spstaffing.in

With Regards,
Sankar G
Sr. Executive - IT Recruitment

Posted 1 month ago

Apply