
1035 Indexes Jobs - Page 22

Set up a job alert
JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Summary Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. Oracle Data Integrator (ODI) Specialist As an ODI Specialist, you will work with technical teams and projects to deliver ETL solutions on-premises and on Oracle cloud platforms for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building new ETL solutions or migrating an application to co-exist in the hybrid cloud (On-Premises and Cloud). Our teams have a diverse range of skills and we are always looking for new ways to innovate and help our clients succeed. Work You’ll Do As an ODI developer, you will have multiple responsibilities depending on project type. One type of project may involve migrating existing ETL to Oracle cloud infrastructure. Another type of project might involve building ETL solutions on both on-premises and Oracle Cloud. The key responsibilities may involve some or all of the areas listed below: Engage with clients to conduct workshops, understand business requirements and identify business problems to solve with integrations. Lead and build proofs-of-concept to showcase the value of ODI vs. other platforms. Socialize solution design and enable knowledge transfer. Drive train-the-trainer sessions to drive adoption of ODI. Partner with clients to drive outcomes and deliver value. Collaborate with cross-functional teams. Understand source applications and how they can be integrated. Analyze data sets to understand functional and business context. Create the Data Warehousing data model and integration design. Understand cross-functional processes such as Record to Report (RTR), Procure to Pay (PTP), Order to Cash (OTC), Acquire to Retire (ATR), Project to Complete (PTC). Communicate development status and risks to key stakeholders. Lead the team to design, build, test and deploy. Support client needs by delivering ODI jobs and frameworks. Merge, customize and deploy the ODI data model as per client business requirements. Deliver large/medium DWH programs, demonstrate expert core consulting skills and an advanced level of ODI, SQL, PL/SQL knowledge and industry expertise to support delivery to clients. Focus on designing, building, and documenting re-usable code artifacts. Track, report and optimize ODI job performance to meet client SLAs. Design and architect ODI projects including upgrades/migrations to cloud. Design and implement security in ODI. Identify risks and suggest mitigation plans. Lead the team and mentor junior practitioners. Produce high-quality code resulting from knowledge of the tool, code peer review, and automated unit test scripts. Perform system analysis, follow technical design and work on development activities. Participate in design meetings, daily standups, backlog grooming. Lead respective tracks in Scrum team meetings, including all Agile and Scrum related activities.
Reviews and evaluates designs and project activities for compliance with systems design and development guidelines and standards; provides tangible feedback to improve product quality and mitigate failure risk. Develop the environment strategy, build the environment, and execute migration plans. Validate that the environment meets all security and compliance controls. Lead the testing efforts during SIT and UAT by coordinating with functional teams and all stakeholders. Contribute to sales pursuits by helping the pursuit team understand the client request and propose robust solutions. Skills: Expertise in database development (SQL/PLSQL) for PL/SQL based applications. Experience in designing and developing Oracle objects such as Tables, Views, Indexes, Partitions, Stored Procedures & Functions in PL/SQL, Packages, Materialized Views and Analytical functions. Working knowledge of GIT or a similar source code control system. Experience creating PL/SQL packages, procedures, functions, triggers, views, and exception handling for retrieving, manipulating, checking and migrating complex datasets in Oracle. Experience in SQL tuning and optimization using explain plans and SQL trace files. Partitioning and indexing strategy for optimal performance. Good verbal and written communication in English; strong interpersonal, analytical and problem-solving abilities. Experience interacting with customers to understand business requirement documents and translate them into ETL specifications and high- and low-level design documents. Analytics & Cognitive Our Analytics & Cognitive team focuses on enabling our client’s end-to-end journey from On-Premises to Cloud, with opportunities in the areas of: Cloud Strategy, Op Model Transformation, Cloud Development, Cloud Integration & APIs, Cloud Migration, Cloud Infrastructure & Engineering, and Cloud Managed Services. We help our clients see the transformational capabilities of Cloud as an opportunity for business enablement and competitive advantage. The Analytics & Cognitive team supports our clients as they improve agility and resilience, and identifies opportunities to reduce IT operations spend through automation by enabling Cloud. We accelerate our clients towards a technology-driven future, leveraging vendor solutions and Deloitte-developed software products, tools, and accelerators. Technical Requirements Education: B.E./B.Tech/M.C.A./M.Sc (CS) 6+ years of ETL lead/developer experience and a minimum of 3-4 years' experience in Oracle Data Integrator (ODI). Expertise in the Oracle ODI toolset and Oracle PL/SQL. Minimum of 2-3 end-to-end DWH implementations. Should have experience in developing ETL processes - ETL control tables, error logging, auditing, data quality, etc. Should be able to implement reusability, parameterization, workflow design, etc. Knowledge of the ODI master and work repositories. Knowledge of data modelling and ETL design. Design and develop complex mappings, Process Flows and ETL scripts. Must be well versed and hands-on in using and customizing Knowledge Modules (KM). Setting up topology, building objects in Designer, monitoring Operator, different types of KMs, Agents, etc. Packaging components, database operations like aggregate, pivot, union, etc.
Using ODI mappings, error handling, automation using ODI, Load plans, Migration of Objects Design and develop complex mappings, Process Flows and ETL scripts Must be well versed and hands-on in using and customizing Knowledge Modules (KM) Experience of performance tuning of mappings Ability to design ETL unit test cases and debug ETL Mappings Expertise in developing Load Plans, Scheduling Jobs Integrate ODI with multiple Source / Target Experience in Data Migration using SQL loader, import/export Consulting Requirements 6-10 years of relevant consulting, industry or technology experience Proven experience assessing client’s workloads and technology landscape for Cloud suitability. Experience in defining new architectures and ability to drive project from architecture standpoint. Ability to quickly establish credibility and trustworthiness with key stakeholders in client organization. Strong problem solving and troubleshooting skills. Strong communicator Willingness to travel in case of project requirement. Preferred Experience in Oracle BI Apps Exposure to one or more of the following: Python, R or UNIX shell scripting. Expertise in database development (SQL/ PLSQL) for PL/SQL based applications. Experience in designing and developing Oracle objects such as Tables, Views, Indexes, Partitions, Stored Procedures & Functions in PL/SQL, Packages, Materialized Views and Analytical functions Working knowledge of GIT or similar source code control system Experience of creating PL/SQL packages, procedures, Functions, Triggers, views, and exception handling for retrieving, manipulating, checking and migrating complex datasets in oracle Experience in SQL tuning and optimization using explain plan and SQL trace files Partitioning and Indexing strategy for optimal performance Good verbal and written communication in English, Strong interpersonal, analytical and problem-solving abilities. Experience of interacting with customers in understanding business requirement documents and translating them into ETL specifications and High- and Low-level design documents Systematic problem-solving approach, coupled with strong communication skills Ability to debug and optimize code and automate routine tasks. Experience writing scripts in one or more languages such as Python, UNIX Scripting and/or similar. Experience working with technical customers. How You’ll Grow At Deloitte, we’ve invested a great deal to create a rich environment in which our professionals can grow. We want all our people to develop in their own way, playing to their own strengths as they hone their leadership skills. And, as a part of our efforts, we provide our professionals with a variety of learning and networking opportunities—including exposure to leaders, sponsors, coaches, and challenging assignments—to help accelerate their careers along the way. No two people learn in the same way. So, we provide a range of resources including live classrooms, team-based learning, and eLearning. DU: The Leadership Center in India, our state-of-the-art, world-class learning Center in the Hyderabad offices is an extension of the Deloitte University (DU) in Westlake, Texas, and represents a tangible symbol of our commitment to our people’s growth and development. Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Deloitte’s culture Our positive and supportive culture encourages our people to do their best work every day. 
We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices that can help them to be healthy, centered, confident, and aware. We offer well-being programs and are continuously looking for new ways to maintain a culture that is inclusive, invites authenticity, leverages our diversity, and where our people excel and lead healthy, happy lives. Corporate citizenship Deloitte is led by a purpose: to make an impact that matters. This purpose defines who we are and extends to relationships with our clients, our people and our communities. We believe that business has the power to inspire and transform. We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our community. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 302894
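The posting above asks for SQL tuning experience using explain plans and trace files. As a minimal illustration only, the following sketch generates and prints an Oracle execution plan from Python using the python-oracledb driver; the credentials, DSN, and the sales_fact table are placeholders, not details from the posting.

import oracledb  # pip install oracledb

# Connection details are placeholders; replace with real credentials and DSN.
conn = oracledb.connect(user="etl_user", password="secret", dsn="dbhost/ORCLPDB1")
cur = conn.cursor()

# Ask the optimizer for the plan of a hypothetical fact-table query.
cur.execute("EXPLAIN PLAN FOR SELECT order_id, SUM(amount) FROM sales_fact "
            "WHERE order_date >= DATE '2024-01-01' GROUP BY order_id")

# Read the formatted plan back via DBMS_XPLAN.
cur.execute("SELECT plan_table_output FROM TABLE(DBMS_XPLAN.DISPLAY())")
for (line,) in cur:
    print(line)

cur.close()
conn.close()

The output is the same plan text DBMS_XPLAN.DISPLAY returns in SQL*Plus; indexing or partitioning decisions would be guided by the access paths it shows.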

Posted 1 month ago

Apply

3.0 - 6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Summary Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. Oracle Data Integrator (ODI) / PL/SQL Specialist As an ODI Specialist, you will work with technical teams and projects to deliver ETL solutions on-premises and on Oracle cloud platforms for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building new ETL solutions or migrating an application to co-exist in the hybrid cloud (On-Premises and Cloud). Our teams have a diverse range of skills and we are always looking for new ways to innovate and help our clients succeed. Work You’ll Do As an ODI developer, you will have multiple responsibilities depending on project type. One type of project may involve migrating existing ETL to Oracle cloud infrastructure. Another type of project might involve building ETL solutions on both on-premises and Oracle Cloud. The key responsibilities may involve some or all of the areas listed below: Engage with clients to conduct workshops, understand business requirements and identify business problems to solve with integrations. Lead and build proofs-of-concept to showcase the value of ODI vs. other platforms. Socialize solution design and enable knowledge transfer. Drive train-the-trainer sessions to drive adoption of ODI. Partner with clients to drive outcomes and deliver value. Collaborate with cross-functional teams to understand source applications and how they can be integrated. Analyze data sets to understand functional and business context. Create the Data Warehousing data model and integration design. Understand cross-functional processes such as Record to Report (RTR), Procure to Pay (PTP), Order to Cash (OTC), Acquire to Retire (ATR), Project to Complete (PTC). Communicate development status and risks to key stakeholders. Lead the team to design, build, test and deploy. Support client needs by delivering ODI jobs and frameworks. Merge, customize and deploy the ODI data model as per client business requirements. Deliver large/medium DWH programs, demonstrate expert core consulting skills and an advanced level of ODI, SQL, PL/SQL knowledge and industry expertise to support delivery to clients. Focus on designing, building, and documenting re-usable code artifacts. Track, report and optimize ODI job performance to meet client SLAs. Design and architect ODI projects including upgrades/migrations to cloud. Design and implement security in ODI. Identify risks and suggest mitigation plans. Lead the team and mentor junior practitioners. Produce high-quality code resulting from knowledge of the tool, code peer review, and automated unit test scripts. Perform system analysis, follow technical design and work on development activities. Participate in design meetings, daily standups, backlog grooming. Lead respective tracks in Scrum team meetings, including all Agile and Scrum related activities.
Reviews and evaluates designs and project activities for compliance with systems design and development guidelines and standards; provides tangible feedback to improve product quality and mitigate failure risk. Develop the environment strategy, build the environment, and execute migration plans. Validate that the environment meets all security and compliance controls. Lead the testing efforts during SIT and UAT by coordinating with functional teams and all stakeholders. Contribute to sales pursuits by helping the pursuit team understand the client request and propose robust solutions. Ideally, you should also have: Expertise in database development (SQL/PLSQL) for PL/SQL based applications. Experience in designing and developing Oracle objects such as Tables, Views, Indexes, Partitions, Stored Procedures & Functions in PL/SQL, Packages, Materialized Views and Analytical functions. Working knowledge of GIT or a similar source code control system. Experience creating PL/SQL packages, procedures, functions, triggers, views, and exception handling for retrieving, manipulating, checking and migrating complex datasets in Oracle. Experience in SQL tuning and optimization using explain plans and SQL trace files. Partitioning and indexing strategy for optimal performance. Good verbal and written communication in English; strong interpersonal, analytical and problem-solving abilities. Experience interacting with customers to understand business requirement documents and translate them into ETL specifications and high- and low-level design documents. Analytics & Cognitive Our Analytics & Cognitive team focuses on enabling our client’s end-to-end journey from On-Premises to Cloud, with opportunities in the areas of: Cloud Strategy, Op Model Transformation, Cloud Development, Cloud Integration & APIs, Cloud Migration, Cloud Infrastructure & Engineering, and Cloud Managed Services. We help our clients see the transformational capabilities of Cloud as an opportunity for business enablement and competitive advantage. The Analytics & Cognitive team supports our clients as they improve agility and resilience, and identifies opportunities to reduce IT operations spend through automation by enabling Cloud. We accelerate our clients towards a technology-driven future, leveraging vendor solutions and Deloitte-developed software products, tools, and accelerators. Technical Requirements Education: B.E./B.Tech/M.C.A./M.Sc (CS) 3-6 years of ETL lead/developer experience and a minimum of 3-4 years' experience in Oracle Data Integrator (ODI). Expertise in the Oracle ODI toolset and Oracle PL/SQL. Minimum of 2-3 end-to-end DWH implementations. Should have experience in developing ETL processes - ETL control tables, error logging, auditing, data quality, etc. Should be able to implement reusability, parameterization, workflow design, etc. Knowledge of the ODI master and work repositories. Knowledge of data modelling and ETL design. Design and develop complex mappings, Process Flows and ETL scripts. Must be well versed and hands-on in using and customizing Knowledge Modules (KM). Setting up topology, building objects in Designer, monitoring Operator, different types of KMs, Agents, etc. Packaging components, database operations like aggregate, pivot, union, etc. Using ODI mappings, error handling, automation using ODI, Load plans, Migration of Objects. Design and develop complex mappings, Process Flows and ETL scripts.
Must be well versed and hands-on in using and customizing Knowledge Modules (KM) Experience of performance tuning of mappings Ability to design ETL unit test cases and debug ETL Mappings Expertise in developing Load Plans, Scheduling Jobs Integrate ODI with multiple Source / Target Experience in Data Migration using SQL loader, import/export. Consulting Requirements 3-10 years of relevant consulting, industry or technology experience Proven experience assessing client’s workloads and technology landscape for Cloud suitability. Experience in defining new architectures and ability to drive project from architecture standpoint. Ability to quickly establish credibility and trustworthiness with key stakeholders in client organization. Strong problem solving and troubleshooting skills. Strong communicator Willingness to travel in case of project requirement. Preferred Experience in Oracle BI Apps Exposure to one or more of the following: Python, R or UNIX shell scripting. Expertise in database development (SQL/ PLSQL) for PL/SQL based applications. Experience in designing and developing Oracle objects such as Tables, Views, Indexes, Partitions, Stored Procedures & Functions in PL/SQL, Packages, Materialized Views and Analytical functions. Working knowledge of GIT or similar source code control system Experience of creating PL/SQL packages, procedures, Functions, Triggers, views, and exception handling for retrieving, manipulating, checking and migrating complex datasets in oracle Experience in SQL tuning and optimization using explain plan and SQL trace files. Partitioning and Indexing strategy for optimal performance Good verbal and written communication in English, Strong interpersonal, analytical and problem-solving abilities. Experience of interacting with customers in understanding business requirement documents and translating them into ETL specifications and High- and Low-level design documents Systematic problem-solving approach, coupled with strong communication skills. Ability to debug and optimize code and automate routine tasks. Experience writing scripts in one or more languages such as Python, UNIX Scripting and/or similar. Experience working with technical customers. How You’ll Grow At Deloitte, we’ve invested a great deal to create a rich environment in which our professionals can grow. We want all our people to develop in their own way, playing to their own strengths as they hone their leadership skills. And, as a part of our efforts, we provide our professionals with a variety of learning and networking opportunities—including exposure to leaders, sponsors, coaches, and challenging assignments—to help accelerate their careers along the way. No two people learn in exactly the same way. So, we provide a range of resources including live classrooms, team-based learning, and eLearning. DU: The Leadership Center in India, our state-of-the-art, world-class learning Center in the Hyderabad offices is an extension of the Deloitte University (DU) in Westlake, Texas, and represents a tangible symbol of our commitment to our people’s growth and development. Benefits At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Deloitte’s culture Our positive and supportive culture encourages our people to do their best work every day. We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices that can help them to be healthy, centered, confident, and aware. 
We offer well-being programs and are continuously looking for new ways to maintain a culture that is inclusive, invites authenticity, leverages our diversity, and where our people excel and lead healthy, happy lives. Corporate citizenship Deloitte is led by a purpose: to make an impact that matters. This purpose defines who we are and extends to relationships with our clients, our people and our communities. We believe that business has the power to inspire and transform. We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our community. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 303091
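The ETL requirements above mention control tables, error logging, and auditing. A minimal sketch of that pattern, assuming a hypothetical etl_control table and a hypothetical pkg_sales_load.load_daily PL/SQL procedure, might look like this in Python with python-oracledb:

import oracledb  # pip install oracledb

conn = oracledb.connect(user="etl_user", password="secret", dsn="dbhost/ORCLPDB1")
cur = conn.cursor()

batch_id = 1001  # hypothetical run identifier
try:
    # Record the start of the load step in the control table.
    cur.execute(
        "INSERT INTO etl_control (batch_id, step_name, status, started_at) "
        "VALUES (:1, :2, 'RUNNING', SYSTIMESTAMP)", [batch_id, "LOAD_SALES"])
    # Invoke the PL/SQL procedure that performs the actual load.
    cur.callproc("pkg_sales_load.load_daily", [batch_id])
    cur.execute("UPDATE etl_control SET status = 'SUCCESS', ended_at = SYSTIMESTAMP "
                "WHERE batch_id = :1 AND step_name = :2", [batch_id, "LOAD_SALES"])
except oracledb.DatabaseError as exc:
    # Capture the error message for auditing, then re-raise for the scheduler.
    cur.execute("UPDATE etl_control SET status = 'FAILED', error_msg = :1 "
                "WHERE batch_id = :2 AND step_name = :3",
                [str(exc)[:4000], batch_id, "LOAD_SALES"])
    raise
finally:
    conn.commit()
    cur.close()
    conn.close()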

Posted 1 month ago

Apply

0 years

0 Lacs

India

Remote

Job Title: SQL Developer Trainee Location: Remote Job Type: Internship (Full-Time) Duration: 1–3 Months Stipend: ₹25,000/month Department: Data & Engineering Job Summary: We are seeking a detail-oriented and motivated SQL Developer Trainee to join our team remotely. This internship is designed for recent graduates or students who want to gain practical experience in database development, writing SQL queries, and working with data in real-world applications. Key Responsibilities: Write, test, and optimize SQL queries for data extraction and reporting Assist in designing and maintaining database structures (tables, views, indexes, etc.) Help ensure data integrity, accuracy, and security across systems Support the team in troubleshooting and debugging database-related issues Collaborate with developers and analysts to fulfill data requirements for projects Document query logic and database-related processes Qualifications: Bachelor’s degree (or final year student) in Computer Science, Information Technology, or related field Strong understanding of SQL and relational databases (e.g., MySQL, PostgreSQL, SQL Server) Familiarity with database design and normalization Analytical mindset with good problem-solving skills Ability to work independently in a remote setting Eagerness to learn and grow in a data-driven environment Preferred Skills (Nice to Have): Experience with procedures, triggers, or functions in SQL Exposure to BI/reporting tools (Power BI, Tableau, etc.) Understanding of data warehousing concepts Familiarity with cloud-based databases or platforms What We Offer: Monthly stipend of ₹25,000 Remote work opportunity Hands-on experience with real-world datasets and projects Mentorship and structured learning sessions Certificate of Completion Potential for full-time employment based on performance
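For a sense of the day-to-day work described above (writing queries, creating indexes, checking results), here is a small self-contained example using Python's built-in sqlite3 module; the table and column names are illustrative only.

import sqlite3

conn = sqlite3.connect(":memory:")  # throwaway in-memory database
cur = conn.cursor()

# A small table plus an index on the column we filter and group by.
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)")
cur.execute("CREATE INDEX idx_orders_customer ON orders (customer)")
cur.executemany("INSERT INTO orders (customer, amount) VALUES (?, ?)",
                [("acme", 120.0), ("acme", 80.5), ("globex", 42.0)])

# A typical reporting query: total spend per customer.
cur.execute("SELECT customer, SUM(amount) FROM orders GROUP BY customer ORDER BY 2 DESC")
print(cur.fetchall())  # [('acme', 200.5), ('globex', 42.0)]

conn.close()

The same GROUP BY reporting pattern carries over to MySQL, PostgreSQL, or SQL Server with only minor syntax changes.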

Posted 1 month ago

Apply

7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

About Client: Our client is a multinational IT services, technology products, and learning organization founded in 2002. They provide a range of services including consulting, staffing, and managed services for platforms like Salesforce, as well as digital engineering solutions. The company emphasizes collaboration, innovation, and adherence to industry standards to deliver high-quality solutions across various sectors, and has a global presence with offices in the US, India, Europe, Canada, and Singapore. Job Title: Sr. Java / Big Data Location: Chennai Experience: 7+ Years Job Type: Contract to hire Notice Period: Immediate joiners Key Skills:
1. Core Java: OOP principles, Collections (HashMap, List, Set, etc.), multithreading, Java memory management, garbage collection, exception handling, design patterns
2. Spring Boot: dependency injection, annotations (@Component, @Service, etc.), RESTful API development, exception handling, Spring Security, auto-configuration, Actuators, profiles
3. SQL: joins, aggregations, subqueries; query optimization, indexes, execution plans; complex SQL writing, stored procedures
4. Big Data tools: Hadoop, Spark (RDD, DataFrame APIs, transformations/actions), Hive; batch vs. streaming, partitioning, performance tuning; data ingestion tools (Kafka, Sqoop, Flume)
5. Architecture: microservices, message queues, event-driven design; scalability, fault tolerance, transaction management
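The Spark items above (DataFrame APIs, transformations/actions, partitioning) can be illustrated with a short PySpark sketch; the input path, column names, and output location are hypothetical.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-batch").getOrCreate()

# Any CSV with these columns would do; the path is a placeholder.
orders = spark.read.csv("/data/orders.csv", header=True, inferSchema=True)

# DataFrame transformations: filter, aggregate, then write partitioned output.
daily_totals = (orders
                .filter(F.col("status") == "COMPLETE")
                .groupBy("order_date")
                .agg(F.sum("amount").alias("total_amount")))

daily_totals.write.mode("overwrite").partitionBy("order_date").parquet("/data/out/daily_totals")

spark.stop()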

Posted 1 month ago

Apply

2.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Job Description Position: Software Engineer / Sr. Software Engineer Education Qualification: Any Graduate Minimum Years of Experience: 2+ Years Key Skills: MS SQL Server Type of Employment: Permanent Requirement: Immediate or max 15 days Location: Ahmedabad Responsibilities Responsible for working with the development team to develop, implement, and manage database models for core product development. Responsible for writing SQL database views, tables, and stored procedures to support engineering product development. Responsible for designing and maintaining SSIS, T-SQL, and SQL jobs. Responsible for developing and maintaining complex stored procedures for loading data into staging tables from OLTP and other intermediary systems. Responsible for analysis, design specifications, development, implementation, and maintenance of the DB. Responsible for designing partitioning of the DB for archive data. Responsible for ensuring that the best practices and standards established for the use of tools like SQL Server, SSIS, SSRS, and Excel Power Pivot/View/Map are incorporated in Data Analytics solution design. Responsible for documenting complex processes, business requirements and specifications. Requirements Technical Skills: Experience in database design, normalization, query design, and performance tuning. Proficient in writing complex Transact-SQL code. Proficient in MS SQL Server query tuning. Experience in writing stored procedures, functions, views and triggers. Experience with indexes, columnstore indexes, SQL Server column storage, and query execution plans. Provide authentication and authorization for the database. Develop best practices for database design and development activities. Experience in database migration activities. Strong analytical, multi-tasking and problem-solving skills. (ref:hirist.tech)
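As a rough illustration of the staging-table loading described above, the following Python sketch uses pyodbc to bulk-insert rows into a hypothetical stg.Orders table and then call a hypothetical T-SQL procedure that merges them onward; the connection string values are placeholders.

import pyodbc  # pip install pyodbc

# Connection string values are placeholders.
conn = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};"
                      "SERVER=sqlhost;DATABASE=StagingDB;Trusted_Connection=yes;")
cur = conn.cursor()
cur.fast_executemany = True  # speeds up bulk inserts into the staging table

rows = [(1, "2024-01-05", 199.99), (2, "2024-01-05", 15.50)]  # sample OLTP extract

# Load the staging table, then hand off to a stored procedure for the merge.
cur.executemany("INSERT INTO stg.Orders (OrderID, OrderDate, Amount) VALUES (?, ?, ?)", rows)
cur.execute("EXEC dbo.usp_LoadOrdersFromStaging")  # hypothetical T-SQL procedure
conn.commit()

cur.close()
conn.close()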

Posted 1 month ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description About Oracle Customer Success Services(CSS): As a key member of Oracle Customer Success Services, you will join an international network of experts dedicated to driving customer success through innovation and expertise. Our One Oracle approach ensures you will be part of a team delivering comprehensive, end-to-end services and solutions that accelerate the entire customer journey. Work alongside certified and experienced professionals, gaining exposure to cutting-edge technologies and methodologies, and enhancing your skills and credentials. Engage with a diverse range of customers, managing the full lifecycle of delivery and services, and ensuring each project has a tangible impact. Benefit from robust team support in a collaborative environment that prioritizes teamwork and mutual success. Join us in Oracle Customer Success Services and elevate your career with a company that values innovation, expertise, and customer-centric solutions Career Level - IC4 Mandatory Skills: Should have 8+ years of experience in Oracle SQL query tuning expertise. Strong expertise in writing and understanding of Oracle SQL/PLSQL Functional knowledge of Oracle Fusion Middleware products, Weblogic & Database products. Interpreting explain plans and able to read AWRs and ASH for identifying SQL query bottle necks. Ability to interpret explain plans for access path, predicate filtering and high knowledge of joins and orders in table access. Advanced knowledge of SQLHC, parallel query knowledge. Should have rich experience in factored / scalar sub-queries and ability to utilise them in sql queries. Strong decision making, convincing skills to bring down the complex discussions to closure supported by right arguments and facts. Understanding of database initialization parameters, logs and traces. Experience working on complex software development/integration projects Experience in scripting tool like Unix shell, Python, Perl Optional hands on troubleshooting Oracle OTBI & BIP report issues and performance related techniques on components like Data models/sets, LOV, parameters , bursting, FF, scheduling. Good to Have Skills: Understanding of AI and Machine learning will be an added advantage. Oracle SQL performance diagnostic tools like SQLHC Experience in using monitoring tools like OEM. Building synthetic test cases. Responsibilities Responsibilities: Optimize Oracle SQL: Diagnose and resolve individual SQL performance issues (you would need to apply hints, define new indexes, investigate optimiser/environment issues) . Early Adopter of Innovation: Explore and adopt new technologies like Oracle 23ai, Autonomous DB, Auto indexing, AI related tools. Analysis and Optimize Oracle BIP Reports: Should be able to help customer to identify performance issues with Oracle BIP reports functionality and sql queries involved with them. Optional knowledge of various components of BIP reports like Data models, Parameters, LOVs, Flex Fields, Bursting. Should be able to advise customers to use right sql queries as per Oracle best practices and which aids in completing the report optimally. OTBI and BIP reports: Knowledge of troubleshooting OTBI reports for advising customers to use performance enhancing configuration parameters. Qualifications Career Level - IC4 About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry-leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. 
We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
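One common starting point for the query-bottleneck analysis this role describes is ranking statements by elapsed time. A minimal sketch, assuming SELECT access to V$SQL and placeholder credentials:

import oracledb  # pip install oracledb

conn = oracledb.connect(user="perf_user", password="secret", dsn="dbhost/ORCLPDB1")
cur = conn.cursor()

# Top statements by total elapsed time; requires SELECT privilege on V$SQL.
cur.execute("""
    SELECT sql_id, executions,
           ROUND(elapsed_time / NULLIF(executions, 0) / 1000000, 3) AS avg_elapsed_sec
      FROM v$sql
     WHERE executions > 0
     ORDER BY elapsed_time DESC
     FETCH FIRST 10 ROWS ONLY""")

for sql_id, execs, avg_sec in cur:
    print(f"{sql_id}: {execs} executions, {avg_sec}s avg elapsed")

cur.close()
conn.close()

From the resulting sql_id values, a tuner would then pull plans via DBMS_XPLAN or SQLHC and look at AWR/ASH history, as the posting notes.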

Posted 1 month ago

Apply

5.0 years

10 - 14 Lacs

Hyderabad, Telangana, India

On-site

Minimum of 5 years' experience with MS SQL Server (experience with version 2016 or above is desired). Strong experience with stored procedures, triggers, views, CTEs, and correlated subqueries. Expert in troubleshooting and resolving database problems. Experience in Performance Tuning and Optimization (PTO) using native monitoring and troubleshooting tools. Knowledge of indexes, index management, and statistics. Experience working with small teams, with individual project responsibility. Ability to think strategically and abstractly. Detail-oriented with strong organizational skills. Excellent verbal and written communication skills. Nice to have: SSRS and SSIS. Skills: MS SQL Server, Performance Tuning, Dynamic Management Views, DML, DDL, Stored Procedures, T-SQL, JOINs, CTE, CROSS APPLY, Temp Tables and Tables

Posted 1 month ago

Apply

3.0 - 5.0 years

6 - 11 Lacs

Thiruvananthapuram

On-site

Experience Required: 3-5 years of hands-on experience in full-stack development, system design, and supporting AI/ML data-driven solutions in a production environment. Key Responsibilities Implementing Technical Designs: Collaborate with architects and senior stakeholders to understand high-level designs and break them down into detailed engineering tasks. Implement system modules and ensure alignment with architectural direction. Cross-Functional Collaboration: Work closely with software developers, data scientists, and UI/UX teams to translate system requirements into working code. Clearly communicate technical concepts and implementation plans to internal teams. Stakeholder Support: Participate in discussions with product and client teams to gather requirements. Provide regular updates on development progress and raise flags early to manage expectations. System Development & Integration: Develop, integrate, and maintain components of AI/ML platforms and data-driven applications. Contribute to scalable, secure, and efficient system components based on guidance from architectural leads. Issue Resolution: Identify and debug system-level issues, including deployment and performance challenges. Proactively collaborate with DevOps and QA to ensure resolution. Quality Assurance & Security Compliance: Ensure that implementations meet coding standards, performance benchmarks, and security requirements. Perform unit and integration testing to uphold quality standards. Agile Execution: Break features into technical tasks, estimate efforts, and deliver components in sprints. Participate in sprint planning, reviews, and retrospectives with a focus on delivering value. Tool & Framework Proficiency: Use modern tools and frameworks in your daily workflow, including AI/ML libraries, backend APIs, front-end frameworks, databases, and cloud services, contributing to robust, maintainable, and scalable systems. Continuous Learning & Contribution: Keep up with evolving tech stacks and suggest optimizations or refactoring opportunities. Bring learnings from the industry into internal knowledge-sharing sessions. Proficiency in using AI-copilots for Coding: Adaptation to emerging tools and knowledge of prompt engineering to effectively use AI for day-to-day coding needs. Technical Skills Hands-on experience with Python-based AI/ML development using libraries such as TensorFlow , PyTorch , scikit-learn , or Keras . Hands-on exposure to self-hosted or managed LLMs , supporting integration and fine-tuning workflows as per system needs while following architectural blueprints. Practical implementation of NLP/CV modules using tools like SpaCy , NLTK , Hugging Face Transformers , and OpenCV , contributing to feature extraction, preprocessing, and inference pipelines. Strong backend experience using Django , Flask , or Node.js , and API development (REST or GraphQL). Front-end development experience with React , Angular , or Vue.js , with a working understanding of responsive design and state management. Development and optimization of data storage solutions , using SQL (PostgreSQL, MySQL) and NoSQL (MongoDB, Cassandra), with hands-on experience configuring indexes, optimizing queries, and using caching tools like Redis and Memcached . Working knowledge of microservices and serverless patterns , participating in building modular services, integrating event-driven systems, and following best practices shared by architectural leads. 
Application of design patterns (e.g., Factory, Singleton, Observer) during implementation to ensure code reusability, scalability, and alignment with architectural standards. Exposure to big data tools like Apache Spark , and Kafka for processing datasets. Familiarity with ETL workflows and cloud data warehouse , using tools such as Airflow , dbt , BigQuery , or Snowflake . Understanding of CI/CD , containerization (Docker), IaC (Terraform), and cloud platforms (AWS, GCP, or Azure). Implementation of cloud security guidelines , including setting up IAM roles , configuring TLS/SSL , and working within secure VPC setups, with support from cloud architects. Exposure to MLOps practices , model versioning, and deployment pipelines using MLflow , FastAPI , or AWS SageMaker . Configuration and management of cloud services such as AWS EC2 , RDS , S3 , Load Balancers , and WAF , supporting scalable infrastructure deployment and reliability engineering efforts. Personal Attributes Proactive Execution and Communication: Able to take architectural direction and implement it independently with minimal rework with regular communication with stakeholders Collaboration: Comfortable working across disciplines with designers, data engineers, and QA teams. Responsibility: Owns code quality and reliability, especially in production systems. Problem Solver: Demonstrated ability to debug complex systems and contribute to solutioning. Preferred Skills: Key : Python, Django, Django ORM, HTML, CSS, Bootstrap, JavaScript, jQuery, Multi-threading, Multi-processing, Database Design, Database Administration, Cloud Infrastructure, Data Science, self-hosted LLMs Qualifications Bachelor’s or Master’s degree in Computer Science, Information Technology, Data Science, or a related field. Relevant certifications in cloud or machine learning are a plus. Package: 6-11 LPA Job Types: Full-time, Permanent Pay: ₹600,000.00 - ₹1,100,000.00 per year Schedule: Day shift Monday to Friday
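To make the stack above concrete, here is a small hypothetical sketch of serving a scikit-learn model behind a FastAPI endpoint; the toy iris model stands in for whatever model a real system would load from a persisted artifact.

# pip install fastapi uvicorn scikit-learn
from typing import List

from fastapi import FastAPI
from pydantic import BaseModel
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

# Train a toy model at startup; a real service would load a saved artifact instead.
data = load_iris()
model = LogisticRegression(max_iter=1000).fit(data.data, data.target)

app = FastAPI()

class Features(BaseModel):
    values: List[float]  # four iris measurements

@app.post("/predict")
def predict(features: Features):
    label = int(model.predict([features.values])[0])
    return {"class_index": label, "class_name": str(data.target_names[label])}

# Run with: uvicorn app:app --reload  (assuming this file is saved as app.py)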

Posted 1 month ago

Apply

10.0 years

4 Lacs

India

On-site

The Senior Platform Architect will work closely with cross-functional teams to ensure the platform is used effectively across multiple teams and to ensure the successful deployment of features that make it easy and motivating for healthcare consumers to find best-fit care when they need it most. The ideal candidate will have a strong background in healthcare and a deep understanding of the needs of patients, providers, and payers. This role serves as the primary Applications/Systems Platform Architect for all related projects and tasks on the platform across the Claims Cost Solutions division. This role will communicate and work collaboratively with internal software engineers, quality assurance engineers, product owners, project managers and project stakeholders for the implementation and sustainability of the platform, supporting existing data needs and contributing to resolution of major incidents. Major areas of focus will include interoperability, scalability, privacy and security. As a Senior Platform Architect, you’ll be a key member of a dynamic team of IT professionals responsible for leading the way on platform development and performance. Responsibilities Subject Matter Expert for the engineering team focused on all aspects of the product. Align the vision for platform development and execute on the strategy. Lead journey mapping activities. Develop and review solution artifacts and manage adherence to architecture standards. Perform risk analysis, including defining strategic direction, providing architecture leadership, and determining, exposing and communicating opportunities as needed. Drive adherence to existing and evolving security requirements to ensure a robust and secure environment in which to deploy applications. Develop high-quality object-oriented code using a mixture of internal and popular external frameworks. Analyze datasets and collaborate with analysts to devise ways to ingest data from a large number of data sources. Develop tools and features to enable customization of system functions and workflows without the need to create additional code. Produce technical specifications. Design and develop relational data structures needed to support new development efforts (including keys, indexes, triggers and stored procedures). Develop and maintain coding standards such as style guides.
Facilitate security assessments to ensure the system meets the overall security posture. Conduct code reviews for work submitted by junior team members. Coach and develop the skills of other team members. Perform application performance analysis and tuning. Conduct and participate in training sessions. Identify and recommend process improvements. Accurately estimate level of effort and forecast completion targets. Provide technology expertise in a variety of areas, provide direction and assistance to development team members, and monitor application architecture in relation to other areas of IT, the business community, and outside vendors to ensure the architecture remains relevant to business needs. Define and manage KPIs that measure success for platform initiatives. Additional Responsibilities Collaborate with the broader architecture community to provide input into IT strategies and standards, ensure solution reuse and eliminate solution redundancy across the enterprise. Research and stay abreast of technology trends and assist with the development of solution, integration, workflow, cloud, infrastructure (including software-defined storage and networking) and/or web service standards. Required Experience 10+ years of progressive experience in a combination of development, design, architecture and/or related IT disciplines Software development using Agile methodologies Experience in requirements gathering from project stakeholders Technical leadership experience, with proven success in the design and delivery of quality, scalable IT solutions, on budget and in compliance with applicable requirements Understanding of computer and network systems Building solutions with a focus on service architecture (e.g., microservices, SOA) Database design and optimization Experience with application and service modernization, including migration to cloud (e.g., AWS, Azure, GCP) CI/CD pipeline implementation using Azure DevOps or similar platforms Required technical tools: .NET, C#, SQL Server, Visual Studio technologies Kubernetes or similar container technologies Linux operating systems Open source technologies Windows operating systems Experience working with technical offshore teams Preferred Experience Healthcare IT products knowledge .NET to .NET Core migration knowledge/experience Exposure to multiple, diverse technical integrations, technologies and processing environments, including cloud and SaaS technologies Experience in design principles and practices, system development methodologies, and software life cycles Passion and ability to create and drive the strategy and architecture of enterprise-level technology strategies Ability to work with tight timelines, including working on multiple projects at the same time Familiarity with enterprise service bus architectures Knowledge of ANSI X12 (837, 834, 835) standards Understanding of the emerging technology landscape Queueing and RPC technologies Knowledge of: SAN, Network, Firewalls Job Type: Full-time Pay: From ₹35,000.00 per month Work Location: In person Speak with the employer +91 9925552425
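The queueing and event-driven items above can be illustrated with a short Python sketch using the pika client for RabbitMQ; the broker host, queue name, and the 837-style claim payload are placeholders, and a service on this team's .NET stack would use an equivalent client there.

import pika  # pip install pika

# Broker host and queue name are placeholders for illustration.
connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()

# Durable queue so messages survive a broker restart.
channel.queue_declare(queue="claims.intake", durable=True)

# Publish a hypothetical claim identifier for a downstream microservice to process.
channel.basic_publish(
    exchange="",
    routing_key="claims.intake",
    body=b'{"claim_id": "ABC123", "transaction": "837"}',
    properties=pika.BasicProperties(delivery_mode=2),  # persistent message
)

connection.close()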

Posted 1 month ago

Apply

50.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Your Team Responsibilities We are building cutting-edge software and data workflows to identify and analyze the exposure and impact of climate change and financially relevant ESG risks. We leverage artificial intelligence (AI) and alternative data to deliver dynamic, investment-relevant insights to power your investment decisions. Clients from across the capital ecosystem use our integrated data, analytical tools, indexes and insights for a clear view of the impact of ESG and climate risks on their investment portfolios. We are seeking an outstanding Software Engineer to join our ESG & Climate Application Development team in the Pune/Mumbai or Budapest offices. As part of a global team, you will collaborate in cross-functional teams to build and improve our industry-leading ESG and Climate solutions. Your Key Responsibilities Design, develop, test, and maintain software applications to meet project requirements. Collaborate with product managers and other stakeholders to gather and refine requirements. Participate in code reviews to maintain high coding standards and best practices. Troubleshoot and debug applications to resolve issues and improve performance. Document software designs, architectures, and processes for future reference. Support deployment and integration activities to ensure smooth implementation of software solutions. Expected skills and experience that will help you excel: Bachelor’s degree in computer science, mathematics, engineering, a related field, or equivalent experience. Strong communication, interpersonal and problem-solving skills. Good hands-on working experience in Python or Java. Experience building RESTful web services using FastAPI, Django or Flask. Good understanding of and hands-on experience with SQL/NoSQL databases. Good understanding of the importance of testing in software development and the use of unit testing frameworks like pytest/unittest. Hands-on experience with cloud technologies (Google Cloud or Azure preferred) and experience developing and managing microservices in the cloud. Experience with source code control systems, especially Git. Preferred Hands-on experience with data engineering technologies like Azure Databricks, Spark, or similar frameworks. Some DevOps experience and knowledge of security best practices. Exposure to the use of AI and LLMs to solve business problems is an added advantage. About MSCI What we offer you Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing. Flexible working arrangements, advanced technology, and collaborative workspaces. A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results. A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients. Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development. Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles. We actively nurture an environment that builds a sense of inclusion, belonging and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women’s Leadership Forum.
At MSCI we are passionate about what we do, and we are inspired by our purpose – to power better investment decisions. You’ll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer committed to diversifying its workforce. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries. To all recruitment agencies MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes. Note on recruitment scams We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com
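The posting above calls out unit testing with pytest/unittest; a minimal pytest example, with a stand-in function rather than any real MSCI code, looks like this:

# test_returns.py -- run with: pytest test_returns.py
import pytest

def simple_return(start_price: float, end_price: float) -> float:
    """Percentage return between two prices; a stand-in for real analytics code."""
    if start_price <= 0:
        raise ValueError("start_price must be positive")
    return (end_price - start_price) / start_price

def test_positive_return():
    assert simple_return(100.0, 110.0) == pytest.approx(0.10)

def test_invalid_start_price_rejected():
    with pytest.raises(ValueError):
        simple_return(0.0, 110.0)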

Posted 1 month ago

Apply

3.0 - 6.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Summary Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms; Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions; Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements. Oracle Data Integrator (ODI)/PL SQL Specialist As an ODI Specialist, you will work with technical teams and projects to deliver ETL solutions on-premises and on Oracle cloud platforms for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building new ETL solutions or migrating an application to co-exist in the hybrid cloud (on-premises and cloud). Our teams have a diverse range of skills, and we are always looking for new ways to innovate and help our clients succeed. Work You’ll Do As an ODI developer you will have multiple responsibilities depending on project type. One type of project may involve migrating existing ETL to Oracle cloud infrastructure; another might involve building an ETL solution on both on-premises and Oracle Cloud. The key responsibilities may involve some or all of the areas listed below: Engage with clients to understand business requirements, document user stories and focus on user experience; build proofs-of-concept to showcase the value of Oracle Analytics versus other platforms; socialize solution design and enable knowledge transfer; drive train-the-trainer sessions to drive adoption of OAC; partner with clients to drive outcomes and deliver value. Collaborate with cross-functional teams to understand dependencies on source applications; analyze data sets to understand functional and business context; understand the Data Warehousing data model and integration design; understand cross-functional processes such as Record to Report (RTR), Procure to Pay (PTP), Order to Cash (OTC), Acquire to Retire (ATR) and Project to Complete (PTC); communicate development status to key stakeholders. Technical Requirements: Education: B.E./B.Tech/M.C.A./M.Sc (CS) 3-6 years of ETL lead/developer experience, with a minimum of 3-4 years of experience in Oracle Data Integrator (ODI) Expertise in the Oracle ODI toolset and Oracle PL/SQL Minimum 2-3 end-to-end DWH implementations. Should have experience in developing ETL processes - ETL control tables, error logging, auditing, data quality, etc. Should be able to implement reusability, parameterization, workflow design, etc. 
Knowledge of ODI Master and Work repositories. Knowledge of data modelling and ETL design. Design and develop complex mappings, process flows and ETL scripts; must be well versed and hands-on in using and customizing Knowledge Modules (KM). Setting up topology, building objects in Designer, monitoring jobs in Operator, different types of KMs, Agents, etc. Packaging components, database operations like aggregate, pivot, union, etc. Using ODI mappings, error handling, automation using ODI, load plans, and migration of objects. Experience in performance tuning of mappings. Ability to design ETL unit test cases and debug ETL mappings. Expertise in developing load plans and scheduling jobs. Integrate ODI with multiple sources/targets. Experience in data migration using SQL Loader and import/export. Expertise in database development (SQL/PLSQL) for PL/SQL based applications. Experience in designing and developing Oracle objects such as Tables, Views, Indexes, Partitions, Stored Procedures and Functions in PL/SQL, Packages, Materialized Views and Analytical Functions. Working knowledge of Git or a similar source code control system. Experience creating PL/SQL packages, procedures, functions, triggers, views, and exception handling for retrieving, manipulating, checking and migrating complex datasets in Oracle. Experience in SQL tuning and optimization using explain plan and SQL trace files. Partitioning and indexing strategy for optimal performance. Good verbal and written communication in English; strong interpersonal, analytical and problem-solving abilities. Experience interacting with customers to understand business requirement documents and translate them into ETL specifications and high- and low-level design documents. Preferred: Experience in Oracle BI Apps. Exposure to one or more of the following: Python, R or UNIX shell scripting. Systematic problem-solving approach, coupled with strong communication skills. Ability to debug and optimize code and automate routine tasks. Experience writing scripts in one or more languages such as Python, UNIX scripting and/or similar. Experience working with technical customers. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. 
Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 302893
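For readers unfamiliar with the PL/SQL-adjacent work the requirements above describe, here is a small, hedged sketch of bulk-loading a staging table and invoking a stored procedure from Python with the python-oracledb driver. The table, procedure, and connection names are invented for illustration and are not part of any Deloitte framework.

```python
# Illustrative sketch only: array insert into a hypothetical staging table,
# then a call to a hypothetical PL/SQL merge procedure.
import oracledb

rows = [(1, "ORD-001", 250.00), (2, "ORD-002", 99.50)]

with oracledb.connect(user="etl_user", password="***", dsn="dbhost/orclpdb1") as conn:
    with conn.cursor() as cur:
        # Bulk (array) insert -- the client-side analogue of PL/SQL bulk binding.
        cur.executemany(
            "INSERT INTO stg_orders (order_id, order_ref, amount) VALUES (:1, :2, :3)",
            rows,
        )
        # Hypothetical packaged procedure that merges staging data into the target.
        cur.callproc("pkg_orders.merge_stg_to_dw")
    conn.commit()
```

In an ODI-centric delivery this logic would normally live in mappings, load plans, and Knowledge Modules rather than ad hoc scripts; the sketch only shows the underlying database interaction.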

Posted 1 month ago

Apply

3.0 - 6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Summary Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements Oracle Data Integrator (ODI)/ PL SQL Specialist As an ODI Specialist, you will work with technical teams and projects to deliver ETL solutions on-premises and Oracle cloud platforms for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building new ETL solutions, migrating an application to co-exist in the hybrid cloud (On-Premises and Cloud). Our teams have a diverse range of skills, and we are always looking for new ways to innovate and help our clients succeed. Work You’ll Do As an ODI developer you will have multiple responsibilities depending on project type. One type of project may involve migrating existing ETL to Oracle cloud infrastructure. Another type of project might involve building ETL solution on both on-premises and Oracle Cloud. The key responsibilities may involve some or all the areas listed below: Engage with clients to understand business requirements, document user stories and focus on user experience build Proof-of-concept to showcase value of Oracle Analytics vs other platforms socialize solution design and enable knowledge transfer drive train-the trainer sessions to drive adoption of OAC partner with clients to drive outcome and deliver value Collaborate with cross functional teams to understand dependencies on source applications analyze data sets to understand functional and business context understand Data Warehousing data model and integration design understand cross functional process such as Record to Report (RTR), Procure to Pay (PTP), Order to Cash (OTC), Acquire to Retire (ATR), Project to Complete (PTC) communicate development status to key stakeholders Technical Requirements: Education: B.E./B.Tech/M.C.A./M.Sc (CS) 3-6 years ETL Lead / developer experience and a minimum of 3-4 Years’ experience in Oracle Data Integrator (ODI) Expertise in the Oracle ODI toolset and Oracle PL/SQL, ODI Minimum 2-3 end to end DWH Implementation experience. Should have experience in developing ETL processes - ETL control tables, error logging, auditing, data quality, etc. Should be able to implement reusability, parameterization, workflow design, etc. 
Knowledge of ODI Master and Work repositories. Knowledge of data modelling and ETL design. Design and develop complex mappings, process flows and ETL scripts; must be well versed and hands-on in using and customizing Knowledge Modules (KM). Setting up topology, building objects in Designer, monitoring jobs in Operator, different types of KMs, Agents, etc. Packaging components, database operations like aggregate, pivot, union, etc. Using ODI mappings, error handling, automation using ODI, load plans, and migration of objects. Experience in performance tuning of mappings. Ability to design ETL unit test cases and debug ETL mappings. Expertise in developing load plans and scheduling jobs. Integrate ODI with multiple sources/targets. Experience in data migration using SQL Loader and import/export. Expertise in database development (SQL/PLSQL) for PL/SQL based applications. Experience in designing and developing Oracle objects such as Tables, Views, Indexes, Partitions, Stored Procedures and Functions in PL/SQL, Packages, Materialized Views and Analytical Functions. Working knowledge of Git or a similar source code control system. Experience creating PL/SQL packages, procedures, functions, triggers, views, and exception handling for retrieving, manipulating, checking and migrating complex datasets in Oracle. Experience in SQL tuning and optimization using explain plan and SQL trace files. Partitioning and indexing strategy for optimal performance. Good verbal and written communication in English; strong interpersonal, analytical and problem-solving abilities. Experience interacting with customers to understand business requirement documents and translate them into ETL specifications and high- and low-level design documents. Preferred: Experience in Oracle BI Apps. Exposure to one or more of the following: Python, R or UNIX shell scripting. Systematic problem-solving approach, coupled with strong communication skills. Ability to debug and optimize code and automate routine tasks. Experience writing scripts in one or more languages such as Python, UNIX scripting and/or similar. Experience working with technical customers. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. 
Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 302893

Posted 1 month ago

Apply

3.0 - 6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Summary Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements Oracle Data Integrator (ODI)/ PL SQL Specialist As an ODI Specialist, you will work with technical teams and projects to deliver ETL solutions on-premises and Oracle cloud platforms for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building new ETL solutions, migrating an application to co-exist in the hybrid cloud (On-Premises and Cloud). Our teams have a diverse range of skills, and we are always looking for new ways to innovate and help our clients succeed. Work You’ll Do As an ODI developer you will have multiple responsibilities depending on project type. One type of project may involve migrating existing ETL to Oracle cloud infrastructure. Another type of project might involve building ETL solution on both on-premises and Oracle Cloud. The key responsibilities may involve some or all the areas listed below: Engage with clients to understand business requirements, document user stories and focus on user experience build Proof-of-concept to showcase value of Oracle Analytics vs other platforms socialize solution design and enable knowledge transfer drive train-the trainer sessions to drive adoption of OAC partner with clients to drive outcome and deliver value Collaborate with cross functional teams to understand dependencies on source applications analyze data sets to understand functional and business context understand Data Warehousing data model and integration design understand cross functional process such as Record to Report (RTR), Procure to Pay (PTP), Order to Cash (OTC), Acquire to Retire (ATR), Project to Complete (PTC) communicate development status to key stakeholders Technical Requirements: Education: B.E./B.Tech/M.C.A./M.Sc (CS) 3-6 years ETL Lead / developer experience and a minimum of 3-4 Years’ experience in Oracle Data Integrator (ODI) Expertise in the Oracle ODI toolset and Oracle PL/SQL, ODI Minimum 2-3 end to end DWH Implementation experience. Should have experience in developing ETL processes - ETL control tables, error logging, auditing, data quality, etc. Should be able to implement reusability, parameterization, workflow design, etc. 
Knowledge of ODI Master and Work repositories. Knowledge of data modelling and ETL design. Design and develop complex mappings, process flows and ETL scripts; must be well versed and hands-on in using and customizing Knowledge Modules (KM). Setting up topology, building objects in Designer, monitoring jobs in Operator, different types of KMs, Agents, etc. Packaging components, database operations like aggregate, pivot, union, etc. Using ODI mappings, error handling, automation using ODI, load plans, and migration of objects. Experience in performance tuning of mappings. Ability to design ETL unit test cases and debug ETL mappings. Expertise in developing load plans and scheduling jobs. Integrate ODI with multiple sources/targets. Experience in data migration using SQL Loader and import/export. Expertise in database development (SQL/PLSQL) for PL/SQL based applications. Experience in designing and developing Oracle objects such as Tables, Views, Indexes, Partitions, Stored Procedures and Functions in PL/SQL, Packages, Materialized Views and Analytical Functions. Working knowledge of Git or a similar source code control system. Experience creating PL/SQL packages, procedures, functions, triggers, views, and exception handling for retrieving, manipulating, checking and migrating complex datasets in Oracle. Experience in SQL tuning and optimization using explain plan and SQL trace files. Partitioning and indexing strategy for optimal performance. Good verbal and written communication in English; strong interpersonal, analytical and problem-solving abilities. Experience interacting with customers to understand business requirement documents and translate them into ETL specifications and high- and low-level design documents. Preferred: Experience in Oracle BI Apps. Exposure to one or more of the following: Python, R or UNIX shell scripting. Systematic problem-solving approach, coupled with strong communication skills. Ability to debug and optimize code and automate routine tasks. Experience writing scripts in one or more languages such as Python, UNIX scripting and/or similar. Experience working with technical customers. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. 
Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 302893

Posted 1 month ago

Apply

3.0 - 6.0 years

0 Lacs

Greater Kolkata Area

On-site

Summary Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements Oracle Data Integrator (ODI)/ PL SQL Specialist As an ODI Specialist, you will work with technical teams and projects to deliver ETL solutions on-premises and Oracle cloud platforms for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building new ETL solutions, migrating an application to co-exist in the hybrid cloud (On-Premises and Cloud). Our teams have a diverse range of skills, and we are always looking for new ways to innovate and help our clients succeed. Work You’ll Do As an ODI developer you will have multiple responsibilities depending on project type. One type of project may involve migrating existing ETL to Oracle cloud infrastructure. Another type of project might involve building ETL solution on both on-premises and Oracle Cloud. The key responsibilities may involve some or all the areas listed below: Engage with clients to understand business requirements, document user stories and focus on user experience build Proof-of-concept to showcase value of Oracle Analytics vs other platforms socialize solution design and enable knowledge transfer drive train-the trainer sessions to drive adoption of OAC partner with clients to drive outcome and deliver value Collaborate with cross functional teams to understand dependencies on source applications analyze data sets to understand functional and business context understand Data Warehousing data model and integration design understand cross functional process such as Record to Report (RTR), Procure to Pay (PTP), Order to Cash (OTC), Acquire to Retire (ATR), Project to Complete (PTC) communicate development status to key stakeholders Technical Requirements: Education: B.E./B.Tech/M.C.A./M.Sc (CS) 3-6 years ETL Lead / developer experience and a minimum of 3-4 Years’ experience in Oracle Data Integrator (ODI) Expertise in the Oracle ODI toolset and Oracle PL/SQL, ODI Minimum 2-3 end to end DWH Implementation experience. Should have experience in developing ETL processes - ETL control tables, error logging, auditing, data quality, etc. Should be able to implement reusability, parameterization, workflow design, etc. 
Knowledge of ODI Master and Work repositories. Knowledge of data modelling and ETL design. Design and develop complex mappings, process flows and ETL scripts; must be well versed and hands-on in using and customizing Knowledge Modules (KM). Setting up topology, building objects in Designer, monitoring jobs in Operator, different types of KMs, Agents, etc. Packaging components, database operations like aggregate, pivot, union, etc. Using ODI mappings, error handling, automation using ODI, load plans, and migration of objects. Experience in performance tuning of mappings. Ability to design ETL unit test cases and debug ETL mappings. Expertise in developing load plans and scheduling jobs. Integrate ODI with multiple sources/targets. Experience in data migration using SQL Loader and import/export. Expertise in database development (SQL/PLSQL) for PL/SQL based applications. Experience in designing and developing Oracle objects such as Tables, Views, Indexes, Partitions, Stored Procedures and Functions in PL/SQL, Packages, Materialized Views and Analytical Functions. Working knowledge of Git or a similar source code control system. Experience creating PL/SQL packages, procedures, functions, triggers, views, and exception handling for retrieving, manipulating, checking and migrating complex datasets in Oracle. Experience in SQL tuning and optimization using explain plan and SQL trace files. Partitioning and indexing strategy for optimal performance. Good verbal and written communication in English; strong interpersonal, analytical and problem-solving abilities. Experience interacting with customers to understand business requirement documents and translate them into ETL specifications and high- and low-level design documents. Preferred: Experience in Oracle BI Apps. Exposure to one or more of the following: Python, R or UNIX shell scripting. Systematic problem-solving approach, coupled with strong communication skills. Ability to debug and optimize code and automate routine tasks. Experience writing scripts in one or more languages such as Python, UNIX scripting and/or similar. Experience working with technical customers. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. 
Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 302893

Posted 1 month ago

Apply

3.0 - 6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Summary Position Summary AI & Data In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets. AI & Data will work with our clients to: Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements Oracle Data Integrator (ODI)/ PL SQL Specialist As an ODI Specialist, you will work with technical teams and projects to deliver ETL solutions on-premises and Oracle cloud platforms for some of our Fortune 1000 clients. You will have the opportunity to contribute to work that may involve building new ETL solutions, migrating an application to co-exist in the hybrid cloud (On-Premises and Cloud). Our teams have a diverse range of skills, and we are always looking for new ways to innovate and help our clients succeed. Work You’ll Do As an ODI developer you will have multiple responsibilities depending on project type. One type of project may involve migrating existing ETL to Oracle cloud infrastructure. Another type of project might involve building ETL solution on both on-premises and Oracle Cloud. The key responsibilities may involve some or all the areas listed below: Engage with clients to understand business requirements, document user stories and focus on user experience build Proof-of-concept to showcase value of Oracle Analytics vs other platforms socialize solution design and enable knowledge transfer drive train-the trainer sessions to drive adoption of OAC partner with clients to drive outcome and deliver value Collaborate with cross functional teams to understand dependencies on source applications analyze data sets to understand functional and business context understand Data Warehousing data model and integration design understand cross functional process such as Record to Report (RTR), Procure to Pay (PTP), Order to Cash (OTC), Acquire to Retire (ATR), Project to Complete (PTC) communicate development status to key stakeholders Technical Requirements: Education: B.E./B.Tech/M.C.A./M.Sc (CS) 3-6 years ETL Lead / developer experience and a minimum of 3-4 Years’ experience in Oracle Data Integrator (ODI) Expertise in the Oracle ODI toolset and Oracle PL/SQL, ODI Minimum 2-3 end to end DWH Implementation experience. Should have experience in developing ETL processes - ETL control tables, error logging, auditing, data quality, etc. Should be able to implement reusability, parameterization, workflow design, etc. 
Knowledge of ODI Master and Work repositories. Knowledge of data modelling and ETL design. Design and develop complex mappings, process flows and ETL scripts; must be well versed and hands-on in using and customizing Knowledge Modules (KM). Setting up topology, building objects in Designer, monitoring jobs in Operator, different types of KMs, Agents, etc. Packaging components, database operations like aggregate, pivot, union, etc. Using ODI mappings, error handling, automation using ODI, load plans, and migration of objects. Experience in performance tuning of mappings. Ability to design ETL unit test cases and debug ETL mappings. Expertise in developing load plans and scheduling jobs. Integrate ODI with multiple sources/targets. Experience in data migration using SQL Loader and import/export. Expertise in database development (SQL/PLSQL) for PL/SQL based applications. Experience in designing and developing Oracle objects such as Tables, Views, Indexes, Partitions, Stored Procedures and Functions in PL/SQL, Packages, Materialized Views and Analytical Functions. Working knowledge of Git or a similar source code control system. Experience creating PL/SQL packages, procedures, functions, triggers, views, and exception handling for retrieving, manipulating, checking and migrating complex datasets in Oracle. Experience in SQL tuning and optimization using explain plan and SQL trace files. Partitioning and indexing strategy for optimal performance. Good verbal and written communication in English; strong interpersonal, analytical and problem-solving abilities. Experience interacting with customers to understand business requirement documents and translate them into ETL specifications and high- and low-level design documents. Preferred: Experience in Oracle BI Apps. Exposure to one or more of the following: Python, R or UNIX shell scripting. Systematic problem-solving approach, coupled with strong communication skills. Ability to debug and optimize code and automate routine tasks. Experience writing scripts in one or more languages such as Python, UNIX scripting and/or similar. Experience working with technical customers. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. 
Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 302893

Posted 1 month ago

Apply

7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Country India Working Schedule Full-Time Work Arrangement Hybrid Relocation Assistance Available Yes Posted Date 22-May-2025 Job ID 8827 Description And Requirements Position Summary We are seeking a forward-thinking and enthusiastic Engineering and Operations Specialist to manage and optimize our MongoDB and Splunk platforms. The ideal candidate will have in-depth experience in at least one of these technologies, with a preference for experience in both. Job Responsibilities Work on engineering and operational tasks for the MongoDB and Splunk platforms, ensuring high availability and stability. Continuously improve the stability of the environments, leveraging automation, self-healing mechanisms, and AIOps. Develop and implement automation using technologies such as Ansible, Python, and Shell. Manage CI/CD deployments and maintain code repositories. Utilize Infrastructure/Configuration as Code practices to streamline processes. Work closely with development teams to integrate database and observability/logging tools effectively. Manage design, distribution, performance, replication, security, availability, and access requirements for large and complex MongoDB databases (versions 6.0, 7.0, 8.0 and above) on Linux OS (on-premises and cloud-based). Design and develop physical layers of databases to support various application needs; implement backup, recovery, archiving, conversion strategies, and performance tuning; manage job scheduling, application releases, and database changes, and implement database and infrastructure security best practices to meet compliance requirements. Monitor and tune MongoDB and Splunk clusters for optimal performance, identifying bottlenecks and troubleshooting issues. Analyze database queries, indexing, and storage to ensure minimal latency and maximum throughput. Build, maintain, and standardize the Splunk platform, including forwarder deployment, configuration, dashboards, and maintenance across Linux OS. Debug production issues by analyzing logs directly and using tools like Splunk. Work in an Agile model with an understanding of Agile concepts and Azure DevOps. Learn new technologies based on demand and help team members by coaching and assisting. Education, Technical Skills & Other Critical Requirements Education Bachelor’s degree in Computer Science, Information Systems, or another related field with 7+ years of IT and infrastructure engineering work experience. MongoDB Certified DBA or Splunk Certified Administrator is a plus. Experience with cloud platforms like AWS, Azure, or Google Cloud. Experience (In Years) 7+ years total IT experience and 4+ years relevant experience in MongoDB, plus working experience as a Splunk administrator. Technical Skills In-depth experience with either MongoDB or Splunk, with a preference for exposure to both. Strong enthusiasm for learning and adopting new technologies. Experience with automation tools like Ansible, Python and Shell. Proficiency in CI/CD deployments, DevOps practices, and managing code repositories. Knowledge of Infrastructure/Configuration as Code principles. Developer experience is highly desired. Data engineering skills are a plus. Experience with other DB technologies and observability tools is a plus. Extensive experience managing and optimizing MongoDB databases, designing robust schemas, and implementing security best practices, ensuring high availability, data integrity, and performance for mission-critical applications. 
Working experience in database performance tuning with MongoDB tools and techniques. Management of database elements, including creation, alteration, deletion and copying of schemas, databases, tables, views, indexes, stored procedures, triggers, and declarative integrity constraints. Extensive experience in database backup and recovery strategy design, configuration and implementation using backup tools (mongodump, mongorestore) and Rubrik. Extensive experience configuring and enforcing SSL/TLS encryption for secure communication between MongoDB nodes. Working experience configuring and maintaining Splunk environments, developing dashboards, and implementing log management solutions to enhance system monitoring and security across Linux OS. Experience with Splunk migration and upgrades on standalone Linux OS and cloud platforms is a plus. Perform application administration for a single security information management system using Splunk. Working knowledge of Splunk Search Processing Language (SPL), architecture and the various components (indexer, forwarder, search head, deployment server). Extensive experience in both MongoDB and Splunk replication between primary and secondary servers to ensure high availability and fault tolerance. Managed infrastructure security policy per industry best practice by designing, configuring and implementing privileges and policies on the database using RBAC, as well as in Splunk. Scripting skills and automation experience using DevOps, repos and Infrastructure as Code. Working experience with containers (AKS and OpenShift) is a plus. Working experience with cloud platforms (Azure, Cosmos DB) is a plus. Strong knowledge of ITSM processes and tools (ServiceNow). Ability to work 24x7 rotational shifts to support the Database and Splunk platforms. Other Critical Requirements Strong problem-solving abilities and a proactive approach to identifying and resolving issues. Excellent communication and collaboration skills. Ability to work in a fast-paced environment and manage multiple priorities effectively. About MetLife Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World’s 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world’s leading financial services companies, providing insurance, annuities, employee benefits and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple - to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we’re inspired to transform the next century in financial services. At MetLife, it’s #AllTogetherPossible. Join us!
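As a concrete illustration of the routine MongoDB tasks this role covers, here is a minimal sketch using the official pymongo driver: connecting to a replica set, adding an index to cut query latency, and polling replica health. The cluster URI, database, collection, and field names are assumptions made up for the example.

```python
# Minimal pymongo sketch: replica-set connection, index creation, health check.
from pymongo import MongoClient, ASCENDING

client = MongoClient(
    "mongodb://node1:27017,node2:27017,node3:27017/?replicaSet=rs0&tls=true"
)

db = client["claims"]          # hypothetical database
policies = db["policies"]      # hypothetical collection

# Compound index supporting a common lookup pattern (indexing for low latency).
policies.create_index(
    [("policy_no", ASCENDING), ("effective_date", ASCENDING)],
    name="ix_policy_effdate",
)

# Replica-set health -- the kind of signal a self-healing/AIOps job would poll.
status = client.admin.command("replSetGetStatus")
for member in status["members"]:
    print(member["name"], member["stateStr"])
```

In practice this sort of check would be wrapped in Ansible or a scheduled automation job rather than run by hand, consistent with the Infrastructure-as-Code emphasis above.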

Posted 1 month ago

Apply

4.0 - 10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Role: Sr SQL Developer Experience: 4 to 10 Years Location: Hyderabad Work Timings: 1 PM to 10 PM Job Responsibilities: Mandatory Skills: 1. Strong proficiency with T-SQL. 2. Experience with the MS SQL Server relational database. 3. Experience in writing T-SQL queries, custom stored procedures, indexes, functions and triggers as per client requirements. 4. Good knowledge of and working experience in performance tuning. Desired Skills: Good knowledge of SQL Server 2008, 2008 R2, 2012 and 2014 with hands-on migration experience. Good understanding of background process functionality. Create, manage, and maintain tables using appropriate storage settings and create databases using the Database Configuration Assistant. Backup/recovery and performance tuning. Knowledge of backup and recovery options; can carry out basic recovery under guidance. Good knowledge and hands-on experience in tuning the database at the memory level; able to tweak SQL queries. Good understanding of the SQL Server architecture; can troubleshoot connectivity issues. Should have good administrative knowledge of the Windows OS. Should be able to administer and alter security and audit parameters under guidance. Good working knowledge of SQL Server Profiler and the Performance Monitor console; can carry out administrative jobs and is familiar with most of the options available in Profiler. Working knowledge of clustering, mirroring and log shipping. Knowledge of SQL Server high availability options such as clustering, log shipping, mirroring and replication. Please share your updated CV to hiring@paradigmit.com
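To make the T-SQL skill list above concrete, here is a short, hedged sketch driven from Python via pyodbc: creating a covering index and calling a parameterized stored procedure. The table, index, procedure, and connection details are illustrative only, not the client's schema.

```python
# Minimal pyodbc sketch of everyday T-SQL work: index creation + proc call.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=sqlhost;DATABASE=SalesDB;"
    "UID=app_user;PWD=***"
)
cur = conn.cursor()

# Covering nonclustered index of the kind used in performance tuning.
cur.execute("""
IF NOT EXISTS (SELECT 1 FROM sys.indexes WHERE name = 'IX_Orders_Customer')
    CREATE NONCLUSTERED INDEX IX_Orders_Customer
        ON dbo.Orders (CustomerId) INCLUDE (OrderDate, TotalAmount);
""")

# Parameterized call to a hypothetical stored procedure.
cur.execute("EXEC dbo.usp_GetCustomerOrders @CustomerId = ?", 42)
for row in cur.fetchall():
    print(row.OrderId, row.TotalAmount)

conn.commit()
conn.close()
```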

Posted 1 month ago

Apply

0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

Are you looking for a new career challenge? With LTIMindtree, are you ready to embark on a data-driven career? Working for a global leading manufacturing client – providing an engaging product experience through best-in-class PIM implementation and building rich, relevant, and trusted product information across channels and digital touchpoints so their end customers can make an informed purchase decision – will surely be a fulfilling experience. Location: Coimbatore Email: SUJATHA.GETARI@ltimindtree.com Gajula.Ramu@ltimindtree.com Shivalila.Yantettinawar@ltimindtree.com Diksha.Chauhan2@ltimindtree.com I.Balaji@ltimindtree.com Responsibilities Develop scalable pipelines to efficiently process and transform data using Spark. Design and develop a scalable and robust framework for generating PDF reports using Python and Spark. Utilize Snowflake and Spark SQL to perform aggregations on high volumes of data. Develop stored procedures, views, indexes, triggers and functions in the Snowflake database to maintain data and share it with downstream applications in the form of APIs. Use Snowflake features (Streams, Tasks, Snowpipe, etc.) wherever needed in the development flow. Leverage Azure Databricks and Data Lake for data processing and storage. Develop APIs using Python's Flask framework to support front-end applications. Collaborate with architects and business stakeholders to understand reporting requirements. Maintain and improve existing reporting pipelines and infrastructure. Qualifications Proven experience as a Data Engineer with a strong understanding of data pipelines and ETL processes. Proficiency in Python with experience in data manipulation libraries such as Pandas and NumPy. Experience with SQL, Snowflake and Spark for data querying and aggregations. Familiarity with Azure cloud services such as Data Factory, Databricks and Data Lake. Experience developing APIs using frameworks like Flask is a plus. Excellent communication and collaboration skills. Ability to work independently and manage multiple tasks effectively. Mandatory Skills: Python, SQL, Spark, Azure Data Factory, Azure Data Lake, Azure Databricks, Azure Service Bus and Azure Event Hubs. Why join us? Work on industry-leading implementations for Tier-1 clients. Accelerated career growth and global exposure. Collaborative, inclusive work environment rooted in innovation. Exposure to a best-in-class automation framework. Innovation-first culture: we embrace automation, AI insights and clean data. Know someone who fits this perfectly? Tag them – let’s connect the right talent with the right opportunity. DM or email to know more. Let’s build something great together
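For a flavour of the pipeline work described above, here is a minimal PySpark sketch: read raw transaction data, aggregate it, and write a curated output for downstream reporting. The storage paths, column names, and account names are assumptions, not the client's actual schema or environment.

```python
# Minimal PySpark sketch: read -> aggregate -> write for a reporting pipeline.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("product-sales-rollup").getOrCreate()

# Hypothetical raw zone in an Azure Data Lake account.
txns = spark.read.parquet("abfss://raw@datalake.dfs.core.windows.net/transactions/")

daily_sales = (
    txns.groupBy("product_id", "sale_date")
        .agg(
            F.sum("quantity").alias("units_sold"),
            F.sum("amount").alias("revenue"),
        )
)

# In the stack described above this might instead be pushed to Snowflake or
# exposed via a Flask API; plain Parquet output keeps the sketch self-contained.
daily_sales.write.mode("overwrite").parquet(
    "abfss://curated@datalake.dfs.core.windows.net/daily_sales/"
)
```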

Posted 1 month ago

Apply

7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Skills: T-SQL, SQL, SSIS, SSRS, High Availability (HA), Disaster Recovery, ETL, Greetings from Colan Infotech!! Designation - SQL DBA Experience - 7+ Years Job Location - Chennai Notice Period - Immediate to 15 Days Key Responsibilities Manage and maintain high-performance SQL Server databases supporting critical capital markets applications Perform backup, recovery, high availability (HA), and disaster recovery (DR) planning and implementation Optimize SQL queries, indexes, and database performance for large datasets typical of trading and market data systems Ensure data integrity and security in line with regulatory and compliance requirements Work closely with application development and infrastructure teams to support database integration and deployment Monitor database health, generate reports, and provide proactive solutions to potential issues Lead database upgrade and migration projects Support real-time data flows and batch processes used in trading, settlements, and market data analysis Implement and maintain replication, clustering, log shipping, and Always On availability groups. Required Skills & Qualifications 7+ years of experience as a MS SQL DBA Strong knowledge of SQL Server 2016/2019/2022, including internals, query tuning, and HA/DR features Experience working in Capital Markets, with understanding of trading systems, order management, or market data Solid understanding of T-SQL, performance tuning, and execution plans Familiarity with financial data handling, compliance (e.g., MiFID, FINRA), and low-latency data operations Experience with automation and scripting using PowerShell Strong troubleshooting and problem-solving skills Knowledge of data warehousing, ETL processes, and reporting tools (SSIS, SSRS) Ability to work in fast-paced, high-pressure financial environments Preferred Qualifications Experience with cloud-based SQL solutions (Azure SQL, AWS RDS, etc.) Exposure to DevOps practices and CI/CD for databases Certification in Microsoft SQL Server (e.g., MCSA, MCSE) Interested candidates send your updated resume to kumudha.r@colanonine.com
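The posting above calls for proactive health monitoring of Always On availability groups, typically automated in PowerShell; as a hedged illustration, the same replica-health query can be run from Python against the standard SQL Server DMVs. The listener name and connection details are placeholders.

```python
# Sketch of an Always On availability-group health poll via pyodbc.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=aglistener01;DATABASE=master;"
    "Trusted_Connection=yes"
)

query = """
SELECT ar.replica_server_name,
       rs.role_desc,
       rs.synchronization_health_desc
FROM sys.dm_hadr_availability_replica_states AS rs
JOIN sys.availability_replicas AS ar
  ON rs.replica_id = ar.replica_id;
"""

for server, role, health in conn.cursor().execute(query):
    print(f"{server:<20} {role:<12} {health}")
    # A proactive monitoring job would raise an alert when health != 'HEALTHY'.
```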

Posted 1 month ago

Apply

7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Role: Database Developer Location: Offshore (Chennai or Bangalore) We are looking for 7+ years of database development experience, with a minimum of 5+ years of relevant experience. Strong SQL experience in creating database objects like tables, stored procedures, DDL/DML triggers, views, indexes, cursors, functions and user-defined data types. Technical Skills: 7+ years of database development experience, with a minimum of 5+ years of relevant experience. Strong PL/SQL experience in creating database objects like tables, stored procedures, DDL/DML triggers, views, indexes, cursors, functions and user-defined data types. Expertise in using Oracle performance tuning concepts with Oracle hints and the EXPLAIN PLAN tool. Strong experience using SQL and PL/SQL features like built-in functions, analytical functions, cursors, cursor variables, native dynamic SQL, bulk binding techniques and packages/procedures/functions wherever applicable to process data in an efficient manner. Strong understanding of Data Warehousing and Extraction, Transformation, Loading (ETL). Sound understanding of RDBMS (Oracle). Should have used Oracle SQL Loader/external file utilities to load files. Good to have: experience with the Snowflake cloud data platform, including Snowflake utilities like SnowSQL and Snowpipe, and data loading within the cloud (AWS or Azure). Strong written and oral communication skills. Excellent problem-solving and quantitative skills. Demonstrated ability to work as part of a team. Investment management experience in the past. Process Skills: Ability to evaluate, analyze, design and implement solutions based on technical requirements. Develop and peer review LLDs (initiate/participate in peer reviews). Strong design and technical skills; ability to translate business needs into technical solutions and analyze the impact. Behavioral Skills: Resolve technical issues of projects and explore alternate designs. Participate as a team member and foster teamwork by inter-group coordination within the modules of the project. Effectively collaborate and communicate with stakeholders and ensure client satisfaction. Train and coach members of project groups to ensure effective knowledge management activity. Qualification: At least 7+ years of work experience. Education qualification: Any degree from a reputed college
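As an illustration of the analytic-function and batch-fetch skills listed above, here is a hedged sketch executed from Python with the python-oracledb driver. The table, columns, bind values, and connection string are invented for the example; in the role itself this logic would more likely sit in PL/SQL packages.

```python
# Sketch: running-total analytic query fetched in batches (akin to BULK COLLECT LIMIT).
import datetime
import oracledb

sql = """
SELECT trade_id,
       portfolio_id,
       trade_amount,
       SUM(trade_amount) OVER (PARTITION BY portfolio_id
                               ORDER BY trade_date) AS running_total
FROM   trades
WHERE  trade_date >= :start_date
"""

with oracledb.connect(user="rpt_user", password="***", dsn="dbhost/orclpdb1") as conn:
    with conn.cursor() as cur:
        cur.arraysize = 1000   # client-side batch fetch size
        cur.execute(sql, start_date=datetime.date(2024, 1, 1))
        for trade_id, portfolio_id, amount, running_total in cur:
            print(trade_id, portfolio_id, amount, running_total)
```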

Posted 1 month ago

Apply

2.0 years

0 Lacs

India

On-site

Hi, please find the job description below.

Job Title: GCP Data Modeler
Duration: Full Time
Location: Hybrid (Hyderabad, Chennai, Bengaluru, Pune, Nagpur)
Tech Stack: GCP, BigQuery, Dataflow, LookML, Looker, SQL, Python

Job Description: Senior Data Modeler with Expertise in GCP and Looker

Overview: We are seeking a highly skilled and experienced Data Modeler to join our data and analytics team. The ideal candidate will have deep expertise in data modeling, particularly with Google Cloud Platform (GCP), and a strong background in managing complex data projects. This role involves designing scalable data models, optimizing workflows, and ensuring seamless data integration to support strategic business decisions.

Key Responsibilities:
Data Modeling: Design, develop, and maintain conceptual, logical, and physical data models to support data warehousing and analytics needs. Ensure data models are scalable, efficient, and aligned with business requirements.
Database Design: Create and optimize database schemas, tables, views, indexes, and other database objects in Google BigQuery. Implement best practices for database design to ensure data integrity and performance.
ETL Processes: Design and implement ETL (Extract, Transform, Load) processes to integrate data from various source systems into BigQuery. Use tools like Google Cloud Dataflow, Apache Beam, or other ETL tools to automate data pipelines.
Data Integration: Work closely with data engineers to ensure seamless integration and consistency of data across different platforms. Integrate data from on-premises systems, third-party applications, and other cloud services into GCP.
Data Governance: Implement data governance practices to ensure data quality, consistency, and security. Define and enforce data standards, naming conventions, and documentation.
Performance Optimization: Optimize data storage, processing, and retrieval to ensure high performance and scalability. Use partitioning, clustering, and other optimization techniques in BigQuery.
Collaboration: Collaborate with business stakeholders, data scientists, and analysts to understand data requirements and translate them into effective data models. Provide technical guidance and mentorship to junior team members.
Data Visualization: Work with data visualization tools like Looker, Looker Studio, or Tableau to create interactive dashboards and reports. Develop LookML models in Looker to enable efficient data querying and visualization.
Documentation: Document data models, ETL processes, and data integration workflows. Maintain up-to-date documentation to facilitate knowledge sharing and onboarding of new team members.

Required Expertise:
Looker: 2-5+ years of strong proficiency in Looker, including LookML, dashboard creation, and report development.
BigQuery: 5+ years of extensive experience with Google BigQuery, including data warehousing, SQL querying, and performance optimization.
SQL & Python: 10+ years of SQL, including advanced SQL, and Python skills for data manipulation, querying, and modeling.
ETL: 10+ years of hands-on experience with ETL processes and tools for data integration from various source systems.
Cloud Services: Familiarity with Google Cloud Platform (GCP) services, particularly BigQuery, Cloud Storage, and Dataflow.
Data Modeling Techniques: Proficiency in data modeling techniques such as star schema, snowflake schema, normalized and denormalized models, and dimensional modeling. Knowledge of data modeling frameworks, including Data Mesh, Data Vault, Medallion architecture, and the Kimball and Inmon methodologies, is highly advantageous.
Problem-Solving: Excellent problem-solving skills and the ability to work on complex, ambiguous projects.
Communication: Strong communication and collaboration skills, with the ability to work effectively in a team environment.
Project Delivery: Proven track record of delivering successful data projects and driving business value through data insights.

Preferred Qualifications:
Education: Bachelor's or Master's degree in Data Science, Computer Science, Information Systems, or a related field.
Certifications: Google Cloud certification relevant to data modeling or data engineering.
Visualization Tools: Experience with other data visualization tools such as Looker, Looker Studio, and Tableau.
Programming: Familiarity with programming languages such as Python for data manipulation and analysis.
Data Warehousing: Knowledge of data warehousing concepts and best practices.
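To illustrate the partitioning and clustering techniques mentioned under Performance Optimization, here is a minimal BigQuery DDL sketch. The analytics.fact_sales table, its columns, and the expiration setting are hypothetical.

```sql
-- Minimal BigQuery sketch of a partitioned, clustered table (illustrative only).
CREATE TABLE IF NOT EXISTS analytics.fact_sales (
  sale_id      STRING,
  customer_id  STRING,
  sale_date    DATE,
  region       STRING,
  amount       NUMERIC
)
PARTITION BY sale_date            -- prune scans to the requested date range
CLUSTER BY region, customer_id    -- co-locate rows that are commonly filtered together
OPTIONS (
  partition_expiration_days = 730,
  description = 'Daily-partitioned, clustered sales fact table'
);

-- A query filtering on the partitioning column scans only the matching partitions
SELECT region, SUM(amount) AS total_amount
FROM analytics.fact_sales
WHERE sale_date BETWEEN '2024-01-01' AND '2024-01-31'
GROUP BY region;
```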

Posted 1 month ago

Apply

0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Company Description
WNS (Holdings) Limited (NYSE: WNS) is a leading Business Process Management (BPM) company. We combine our deep industry knowledge with technology and analytics expertise to co-create innovative, digital-led transformational solutions with clients across 10 industries. We enable businesses in Travel, Insurance, Banking and Financial Services, Manufacturing, Retail and Consumer Packaged Goods, Shipping and Logistics, Healthcare, and Utilities to re-imagine their digital future and transform their outcomes with operational excellence. We deliver an entire spectrum of BPM services in finance and accounting, procurement, customer interaction services and human resources, leveraging collaborative models that are tailored to address the unique business challenges of each client. We co-create and execute the future vision of 400+ clients with the help of our 44,000+ employees.

Job Description
Key Responsibilities:
ETL Development and Maintenance: Design, develop, and implement ETL processes using SSIS to support data integration and warehousing requirements. Maintain and enhance existing ETL workflows to ensure data accuracy and integrity. Collaborate with data analysts, data architects, and other stakeholders to understand data requirements and translate them into technical specifications. Extract, transform, and load data from various source systems into the data warehouse. Perform data profiling, validation, and cleansing to ensure high data quality. Monitor ETL processes to ensure timely and accurate data loads. Write and optimize complex SQL queries to extract and manipulate data. Work with SQL Server to manage database objects, indexes, and performance tuning. Ensure data security and compliance with industry standards and regulations.
Business Intelligence and Reporting: Develop and maintain interactive dashboards and reports using Power BI or SSRS. Collaborate with business users to gather requirements and create visualizations that provide actionable insights. Integrate Power BI with other data sources and platforms for comprehensive reporting.
Scripting and Automation: Utilize Python for data manipulation, automation, and integration tasks. Develop scripts to automate repetitive tasks and improve efficiency.
Insurance Domain Expertise: Leverage knowledge of insurance industry processes and terminology to effectively manage and interpret insurance data. Work closely with business users and stakeholders within the insurance domain to understand their data needs and provide solutions.

Qualifications
Required Skills and Qualifications:
Technical Skills: Proficient in SQL and experienced with SQL Server. Strong experience with SSIS for ETL development and data integration. Proficiency in Python for data manipulation and scripting. Experience with Power BI/SSRS for developing interactive dashboards and reports. Knowledge of data warehousing concepts and best practices.
Domain Knowledge: Solid understanding of insurance industry processes, terminology, and data structures. Experience working with insurance-related data, such as policies, claims, underwriting, and actuarial data.
Additional Skills: Strong problem-solving and analytical skills. Excellent communication and collaboration abilities.
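As an example of the data profiling and validation step described above, here is a minimal T-SQL sketch of checks that could run after an SSIS load. The stg.PolicyStaging table and its columns are hypothetical.

```sql
-- Minimal post-load validation sketch for an insurance staging table (illustrative only).

-- Rows with missing business keys or obviously invalid values
SELECT COUNT(*) AS invalid_rows
FROM stg.PolicyStaging
WHERE PolicyNumber IS NULL
   OR PremiumAmount < 0
   OR EffectiveDate > ExpiryDate;

-- Duplicate business keys that would violate the warehouse grain
SELECT PolicyNumber, COUNT(*) AS occurrences
FROM stg.PolicyStaging
GROUP BY PolicyNumber
HAVING COUNT(*) > 1;
```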

Posted 1 month ago

Apply

6.0 - 11.0 years

15 - 27 Lacs

Gurugram

Hybrid

Hi, wishes from GSN!!! Pleasure connecting with you!!!

We have been in Corporate Search Services, identifying and bringing in stellar, talented professionals for our reputed IT / non-IT clients in India, and have been successfully delivering on the varied needs of our clients for the last 20 years. At present, GSN is hiring an SSIS Developer for one of our leading MNC clients. PFB the details for your better understanding:

****** Looking for SHORT JOINERS ******

1. WORK LOCATION: Gurgaon
2. Job Role: SSIS Developer
3. EXPERIENCE: 6+ Yrs
4. CTC Range: Rs. 15 to Rs. 27 LPA
5. Work Type: WFO Hybrid

****** Looking for SHORT JOINERS ******

Job Description:
• Experience in MS SQL Server
• Designing, creating and maintaining databases
• Creating stored procedures and functions
• Hands-on writing of complex queries
• Ability to debug SQL procedures
• Ability to tune SQL Server performance
• Understanding of indexes and partitions
• Understanding of distributed database systems (Snowflake, Hyperscale)
• Understanding of Azure Data Factory (ADF) concepts
• SSIS

****** Looking for SHORT JOINERS ******

Note: kindly go through the GOOGLE reviews on www.gsnhr.net. Feel free to contact us for any queries. Apply ONLINE for an IMMEDIATE response.

Thanks & Rgds,
Kaviya K
GSN CONSULTING
Mob: 9150016092
Email: kaviya@gsnhr.net
Web: www.gsnhr.net
Google Review: https://g.co/kgs/UAsF9W
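To illustrate the SQL Server performance-tuning and indexing skills listed above, here is a minimal T-SQL sketch that surfaces missing-index suggestions from the built-in DMVs. The TOP 20 cut-off and the impact formula are illustrative heuristics only.

```sql
-- Minimal tuning sketch: missing-index suggestions recorded since the last restart (illustrative only).
SELECT TOP (20)
    mid.statement                         AS table_ref,
    mid.equality_columns,
    mid.inequality_columns,
    mid.included_columns,
    migs.user_seeks,
    migs.avg_total_user_cost * migs.avg_user_impact * migs.user_seeks AS rough_impact
FROM sys.dm_db_missing_index_group_stats AS migs
JOIN sys.dm_db_missing_index_groups      AS mig ON mig.index_group_handle = migs.group_handle
JOIN sys.dm_db_missing_index_details     AS mid ON mid.index_handle       = mig.index_handle
ORDER BY rough_impact DESC;
```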

Posted 1 month ago

Apply

3.0 - 8.0 years

15 - 27 Lacs

Kolkata, Bengaluru, Mumbai (All Areas)

Hybrid

Designing & developing Oracle objects such as Tables, Views, Indexes, Stored Procedures & Functions in PL/SQL, Packages in PL/SQL, Materialized Views, Dynamic SQL.Design & develop Oracle objects for high-performing database batch processes, Required Candidate profile Exp. in analyzing, designing building of complex database transaction, reporting solutions Performance tuning, monitoring slow performing SQL.Database design for transactional &reporting applications

Posted 1 month ago

Apply