
996 Adf Jobs - Page 4

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

8.0 years

0 Lacs

Greater Kolkata Area

On-site


Location: PAN India
Duration: 6 Months
Experience Required: 7–8 years

Job Summary
We are looking for an experienced SSAS Developer with strong expertise in developing both OLAP and Tabular models using SQL Server Analysis Services (SSAS), along with advanced ETL development skills using tools such as SSIS, Informatica, or Azure Data Factory. The ideal candidate will be well versed in T-SQL, dimensional modeling, and building high-performance, scalable data solutions.

Key Responsibilities
- Design, build, and maintain SSAS OLAP cubes and Tabular models
- Create complex DAX and MDX queries for analytical use cases
- Develop robust ETL workflows and pipelines using SSIS, Informatica, or ADF
- Collaborate with cross-functional teams to translate business requirements into BI solutions
- Optimize SSAS models for scalability and performance
- Implement best practices in data modeling, version control, and deployment automation
- Support dashboarding and reporting needs via Power BI, Excel, or Tableau
- Maintain and troubleshoot data quality, performance, and integration issues

Must-Have Skills
- Hands-on experience with SSAS (Tabular and Multidimensional)
- Proficiency in DAX, MDX, and T-SQL
- Advanced ETL skills using SSIS, Informatica, or Azure Data Factory
- Knowledge of dimensional modeling (star and snowflake schemas)
- Experience with Azure SQL / MS SQL Server
- Familiarity with Git and CI/CD pipelines

Nice to Have
- Exposure to cloud data platforms (Azure Synapse, Snowflake, AWS Redshift)
- Working knowledge of Power BI or similar BI tools
- Understanding of Agile/Scrum methodology
- Bachelor's degree in Computer Science, Information Systems, or equivalent

Posted 4 days ago

Apply
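The posting above asks for dimensional modeling with star schemas. As a rough illustration of what that means in practice, here is a minimal sketch using Python's built-in sqlite3; the table and column names are hypothetical, chosen only to show the fact/dimension pattern:

```python
import sqlite3

# Minimal star schema: one fact table referencing two dimension tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date (
    date_key   INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20240115
    full_date  TEXT,
    year       INTEGER,
    month      INTEGER
);
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,
    name        TEXT,
    category    TEXT
);
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity    INTEGER,
    amount      REAL
);
""")
cur.execute("INSERT INTO dim_date VALUES (20240115, '2024-01-15', 2024, 1)")
cur.execute("INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware')")
cur.execute("INSERT INTO fact_sales VALUES (20240115, 1, 3, 29.97)")

# A typical analytical query joins the central fact table to its dimensions.
cur.execute("""
SELECT d.year, p.category, SUM(f.amount)
FROM fact_sales f
JOIN dim_date d    ON f.date_key = d.date_key
JOIN dim_product p ON f.product_key = p.product_key
GROUP BY d.year, p.category
""")
print(cur.fetchall())
```

The same join-the-fact-to-its-dimensions shape is what SSAS cubes and Tabular models materialize for DAX/MDX queries; a snowflake schema simply normalizes the dimension tables further.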

40.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Description As a member of the Support organization, your focus is to deliver post-sales support and solutions to the Oracle customer base while serving as an advocate for customer needs. This involves resolving post-sales non-technical customer inquiries via phone and electronic means, as well as, technical questions regarding the use of and troubleshooting for our Electronic Support Services. A primary point of contact for customers, you are responsible for facilitating customer relationships with Support and providing advice and assistance to internal Oracle employees on diverse customer situations and escalated issues. The Fusion Supply Chain / Manufacturing Support Team is expanding to support our rapidly increasing customer base in the Cloud (SaaS), as well as growing numbers of on-premise accounts. The team partners with Oracle Development in supporting early adopters and many other new customers. This is a unique opportunity to be part of the future of Oracle Support and help shape the product and the organization to benefit our customers and our employees. This position is for supporting Fusion Applications, particularly under the Fusion SCM modules - Fusion SCM Planning, Fusion SCM Manufacturing, Fusion SCM Maintenance. 
Research, resolve and respond to complex issues across the Application product lines and product boundaries in accordance with current standards Demonstrate strong follow-through and consistently keep commitments to customers and employees Ensure that each and every customer is handled with a consummately professional attitude and the highest possible level of service Take ownership and responsibility for priority customer cases where and when required Review urgent and critical incidents for quality Queue reviews with analysts to ensure quality and efficiency of support Report high visibility cases, escalation, customer trends to management Act as information resource to the management team Contribute to an environment that encourages information sharing, team-based resolution activity, cross training and an absolute focus on resolving customer cases as quickly and effectively as possible Participate in projects that enhance the quality or efficiency of support Participate in system and release testing, as needed Act as a role model and mentor for other analysts Perform detailed technical analysis and troubleshooting using SQL, PL/SQL,Java, ADF, Redwood, VBCS, SOA and Rest API Participate in after hour support as required. Work with Oracle Development/Support Development for product related issues Demonstrate core competencies (employ sound business judgment, creative and innovative ways to solve problems, strong work ethic and do whatever it takes to get the job done) Knowledge of Business process and functional knowledge required for our support organization for Maintenance Module Asset Management: Oversee the entire lifecycle of physical assets to optimize utilization and visibility into maintenance operations.Track and manage enterprise-owned and customer-owned assets, including Install Base Assets. 
Preventive maintenance/Maintenance Program: Define and generate daily preventive maintenance forecasts for affected assets within maintenance-enabled organizations. Utilize forecasts to create preventive maintenance work orders, reducing workload for planners and enhancing program auditing, optimization, and exception management. Work Definition: Identify and manage Maintenance Work Areas based on physical, geographical, or logical groupings of work centers. Define and manage resources, work centers, and standard operations. Develop reusable operation templates (standard operations) detailing operation specifics and required resources. Apply standard operations to multiple maintenance work definitions and work orders. Work Order creation, scheduling and Dispatch: Track material usage and labour hours against planned activities. Manage component installation and removal. Conduct inspections and ensure seamless execution of work orders. Work Order Transactions: Apply knowledge of operation pull, assembly pull, and backflush concepts. Execute operation transactions to update dispatch status in count point operations. Manage re- sequenced operations within work order processes. Charge maintenance work orders for utilized resources and ensure accurate transaction recording. Technical skills required for our support organization for Maintenance Module SQL and PL/SQL REST API - creating, different methods and testing via POSTMAN Knowledge of JSON format Knowledge of WSDL, XML and SOAP Webservices Oracle SOA - Composites, Business Events, debugging via SOA Composite trace and logs Java and Oracle ADF Oracle Visual Builder Studio (good to have) Page Composer(Fusion Apps) : Customize existing UI (good to have) Application Composer(Fusion Apps) - sandbox, creating custom object and fields, dynamic page layout and Object Functions (good to have) Career Level - IC3 Responsibilities RESPONSIBILITIES As a Sr. 
Support Engineer, you will be the technical interface to customer) for resolution of problems related to the maintenance and use of Oracle products. Have an understanding of all Oracle products in their competencies and in-depth knowledge of several products and/or platforms. Also, you should be highly experienced in multiple platforms and be able to complete assigned duties with minimal direction from management. In this position, you will routinely act independently while researching and developing solutions to customer issues. Research, resolve and respond to complex issues across the Application product lines and product boundaries in accordance with current standards Demonstrate strong follow-through and consistently keep commitments to customers and employees Ensure that each and every customer is handled with a consummately professional attitude and the highest possible level of service Take ownership and responsibility for priority customer cases where and when required Review urgent and critical incidents for quality Queue reviews with analysts to ensure quality and efficiency of support Report high visibility cases, escalation, customer trends to management Act as information resource to the management team Contribute to an environment that encourages information sharing, team-based resolution activity, cross training and an absolute focus on resolving customer cases as quickly and effectively as possible Participate in projects that enhance the quality or efficiency of support Participate in system and release testing, as needed Act as a role model and mentor for other analysts Perform detailed technical analysis and troubleshooting using SQL, Java, ADF, Redwood, VBCS, SOA and Rest API Participate in after hour support as required. 
Work with Oracle Development/Support Development for product related issues Demonstrate core competencies (employ sound business judgment, creative and innovative ways to solve problems, strong work ethic and do whatever it takes to get the job done) Knowledge of Business process and functional knowledge required for our support organization for Maintenance Module Asset Management: Oversee the entire lifecycle of physical assets to optimize utilization and visibility into maintenance operations.Track and manage enterprise-owned and customer-owned assets, including Install Base Assets. Preventive maintenance/Maintenance Program: Define and generate daily preventive maintenance forecasts for affected assets within maintenance-enabled organizations. Utilize forecasts to create preventive maintenance work orders, reducing workload for planners and enhancing program auditing, optimization, and exception management. Work Definition: Identify and manage Maintenance Work Areas based on physical, geographical, or logical groupings of work centers. Define and manage resources, work centers, and standard operations. Develop reusable operation templates (standard operations) detailing operation specifics and required resources. Apply standard operations to multiple maintenance work definitions and work orders. Work Order creation, scheduling and Dispatch: Track material usage and labour hours against planned activities. Manage component installation and removal. Conduct inspections and ensure seamless execution of work orders. Work Order Transactions: Apply knowledge of operation pull, assembly pull, and backflush concepts. Execute operation transactions to update dispatch status in count point operations. Manage re- sequenced operations within work order processes. Charge maintenance work orders for utilized resources and ensure accurate transaction recording. 
Technical skills required for our support organization for Maintenance Module SQL and PL/SQL REST API - creating, different methods and testing via POSTMAN Knowledge of JSON format Knowledge of WSDL, XML and SOAP Webservices Oracle SOA - Composites, Business Events, debugging via SOA Composite trace and logs Java and Oracle ADF Oracle Visual Builder Studio (good to have) Page Composer(Fusion Apps) : Customize existing UI (good to have) Application Composer(Fusion Apps) - sandbox, creating custom object and fields, dynamic page layout and Object Functions (good to have) Qualifications Career Level - IC3 About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry-leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. 
Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 4 days ago


20.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site


We are seeking an experienced Enterprise Architect with expertise in SAP ECC and SuccessFactors to lead the development and maintenance of our enterprise architecture strategy. This strategic role involves collaborating with stakeholders, aligning technology with business needs, and ensuring scalable, secure, and efficient enterprise-level implementations. About RWS Technology Services – India RWS Technology Services provides end-to-end business technology solutions. Our team of experts provides a wide portfolio of services around digital technologies and technology operations to help organizations stay ahead of the curve, lower their total cost of ownership, and improve efficiencies. How we help - RWS Technology Services offers state-of-the-art technology solutions across the product lifecycle management process – all the way from consulting, concept, design, and development to maintenance and optimization. We specialize in helping companies excel in the global, fast-paced technology landscape by supporting them in every aspect of customer interaction: Globalization, Digitization, Customer Experience Management, Business Process Automation, and Technology Infrastructure Modernization. Why choose RWS? - Innovative: RWS understands the needs of our customers to use the best talent, latest technologies, and solutions to help create connected customer experiences. We help our clients differentiate themselves by making their product engineering capabilities more data driven, powered by AI, and supported by cloud services and intelligent edge devices. Tailored: RWS Technology Services has been delivering technology services and solutions to start-ups, mid-sized companies, and Fortune 500 corporations for over 20 years now. Our technology experience across all key industries ensures tailored application development to meet the unique business needs of our clients. 
Our group is led by dedicated on-shore and off-shore project management teams of highly experienced professionals specializing in both agile and waterfall methodologies. We understand complex technology deployments and have a proven record of managing business-critical, time-sensitive, and highly secure deployments that scale with your business growth. Key Responsibilities Define and maintain the enterprise architecture strategy and roadmap. Collaborate with stakeholders to translate business requirements into scalable technical solutions. Ensure alignment with industry standards, IT best practices, and security frameworks. Design and implement secure, scalable, and high-performing enterprise solutions. Evaluate emerging technologies and recommend adoption where beneficial. Establish and enforce technical standards, policies, and best practices. Provide architectural guidance to development teams for optimal solution design. Ensure solutions align with business continuity and disaster recovery plans. Skills & Experience RWS is looking for candidates with 15+ years of relevant experience who can join us on a part-time, freelance, or contract basis. Bachelor’s degree in Computer Science, Information Technology, or a related field. 15+ years of experience in technology architecture, including 5+ years in an enterprise architect role. Strong expertise in SAP ECC and SuccessFactors architecture, data models, and integrations. Familiarity with Azure, ADF or AppFabric for data integration. Experience with Power BI for data visualization. Proficiency in cloud computing, microservices architecture, and containerization. Experience with enterprise integration technologies such as ESBs and API gateways. Strong understanding of IT security and experience designing secure solutions. Experience in agile environments and DevOps methodologies. Excellent communication, stakeholder management, and problem-solving skills. 
Ability to work effectively in cross-functional, fast-paced environments. Life at RWS RWS is a content solutions company, powered by technology and human expertise. We grow the value of ideas, data and content by making sure organizations are understood. Everywhere. Our proprietary technology, 45+ AI patents and human experts help organizations bring ideas to market faster, build deeper relationships across borders and cultures, and enter new markets with confidence – growing their business and connecting them to a world of opportunities. It’s why over 80 of the world’s top 100 brands trust RWS to drive innovation, inform decisions and shape brand experiences. With 60+ global locations, across five continents, our teams work with businesses across almost all industries. Innovating since 1958, RWS is headquartered in the UK and publicly listed on AIM, the London Stock Exchange regulated market (RWS.L). RWS Values We Partner, We Pioneer, We Progress – and we’ll Deliver together. For further information, please visit: RWS RWS embraces DEI and promotes equal opportunity; we are an Equal Opportunity Employer and prohibit discrimination and harassment of any kind. RWS is committed to the principle of equal employment opportunity for all employees and to providing employees with a work environment free of discrimination and harassment. All employment decisions at RWS are based on business needs, job requirements and individual qualifications, without regard to race, religion, nationality, ethnicity, sex, age, disability, or sexual orientation. RWS will not tolerate discrimination based on any of these characteristics. Recruitment Agencies: RWS Holdings PLC does not accept agency resumes. Please do not forward any unsolicited resumes to any RWS employees. Any unsolicited resume received will be treated as the property of RWS and Terms & Conditions associated with the use of such resume will be considered null and void. RWS. Smarter content starts here. 
www.rws.com

Posted 4 days ago


10.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Description Key Responsibilities Develop and maintain supply chain analytics to monitor operational performance and trends. Lead and participate in Six Sigma and supply chain improvement initiatives. Ensure data integrity and consistency across all analytics and reporting platforms. Design and implement reporting solutions for key supply chain KPIs. Analyze KPIs to identify improvement opportunities and develop actionable insights. Build and maintain repeatable, scalable analytics using business systems and BI tools. Conduct scenario modeling and internal/external benchmarking. Provide financial analysis to support supply chain decisions. Collaborate with global stakeholders to understand requirements and deliver impactful solutions. Qualifications Bachelor’s degree in Engineering, Computer Science, Supply Chain, or a related field. Relevant certifications in BI tools, Agile methodologies, or cloud platforms are a plus. This position may require licensing for compliance with export controls or sanctions regulations. Experience 8–10 years of total experience, with at least 6 years in a relevant analytics or supply chain role. Proven experience in leading small teams and managing cross-functional projects. Technical Skills Expertise in: SQL, SQL Server, SSIS, SSAS, Power BI. Advanced DAX development for complex reporting needs. Performance optimization for SQL and SSAS environments. Cloud and Data Engineering: Azure Synapse, Azure Data Factory (ADF), Python, Snowflake Agile methodology: Experience working in Agile teams and sprints. Job Supply Chain Planning Organization Cummins Inc. Role Category Hybrid Job Type Exempt - Experienced ReqID 2415717 Relocation Package No
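The KPI reporting duties described above boil down to aggregation queries over operational data. A minimal sketch follows, using SQLite as a stand-in for the SQL Server stack named in the listing; the `shipments` table, its columns, and the on-time-delivery KPI are illustrative assumptions, not part of the role's actual schema:

```python
import sqlite3

# Hypothetical operational data: one row per shipment.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE shipments (supplier TEXT, promised_date TEXT, delivered_date TEXT);
INSERT INTO shipments VALUES
  ('Acme', '2024-01-10', '2024-01-09'),
  ('Acme', '2024-01-15', '2024-01-18'),
  ('Beta', '2024-01-12', '2024-01-12');
""")

# On-time delivery rate per supplier: share of shipments delivered
# on or before the promised date (the comparison yields 0 or 1,
# so AVG gives the rate directly).
rows = conn.execute("""
    SELECT supplier,
           ROUND(AVG(delivered_date <= promised_date) * 100, 1) AS on_time_pct
    FROM shipments
    GROUP BY supplier
    ORDER BY supplier
""").fetchall()
print(rows)  # [('Acme', 50.0), ('Beta', 100.0)]
```

The same shape of query, pointed at a warehouse table and surfaced through Power BI, is the repeatable analytics building block the listing asks for.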

Posted 4 days ago


5.0 years

0 Lacs

Bhopal, Madhya Pradesh, India

On-site


At Iron Mountain we know that work, when done well, makes a positive impact for our customers, our employees, and our planet. That’s why we need smart, committed people to join us. Whether you’re looking to start your career or make a change, talk to us and see how you can elevate the power of your work at Iron Mountain. We provide expert, sustainable solutions in records and information management, digital transformation services, data centers, asset lifecycle management, and fine art storage, handling, and logistics. We proudly partner every day with our 225,000 customers around the world to preserve their invaluable artifacts, extract more from their inventory, and protect their data privacy in innovative and socially responsible ways. Are you curious about being part of our growth story while evolving your skills in a culture that will welcome your unique contributions? If so, let's start the conversation. About the role: As a Senior Executive – Digital Solutions at Iron Mountain, you will be primarily responsible for managing scanning and digitization projects at both customer sites and IMI facilities. This includes supervising and coordinating in-house teams as well as vendor resources, ensuring seamless, high-quality, and on-time project delivery aligned with the defined scope of work. You will also handle key project milestones such as Proof of Concept (POC), User Acceptance Testing (UAT), and Work Completion Certifications (WCC). Additionally, you will support vertical leads in achieving monthly, quarterly, and annual revenue targets. You should be collaborative, open to automation opportunities, and comfortable working with advanced scanning and production imaging equipment. Qualifications and Skills: Target-driven and self-motivated team player with a strong understanding of scanning, digitization, metadata handling, Document Management Systems (DMS), workflow processes, and automation of repetitive tasks. 
Prior experience managing scanning and digitization projects involving both in-house and outsourced/vendor teams. Minimum 2–5 years of relevant industry experience, preferably having led teams of 50+ members. Proficient in Google Sheets and skilled in MIS reporting. Education: Graduation is mandatory; an MBA in Operations is preferred. Familiarity with production scanners such as ADF, Overhead, Flatbed, BookEye, etc. Customer-focused mindset with a willingness to relocate based on project requirements. A proven track record in digitization projects will be an added advantage. Category: Operations Group Iron Mountain is a global leader in storage and information management services trusted by more than 225,000 organizations in 60 countries. We safeguard billions of our customers’ assets, including critical business information, highly sensitive data, and invaluable cultural and historic artifacts. Take a look at our history here. Iron Mountain helps lower cost and risk, comply with regulations, recover from disaster, and enable digital and sustainable solutions, whether in information management, digital transformation, secure storage and destruction, data center operations, cloud services, or art storage and logistics. Please see our Values and Code of Ethics for a look at our principles and aspirations in elevating the power of our work together. If you have a physical or mental disability that requires special accommodations, please let us know by sending an email to accommodationrequest@ironmountain.com. See the Supplement to learn more about Equal Employment Opportunity. Iron Mountain is committed to a policy of equal employment opportunity. We recruit and hire applicants without regard to race, color, religion, sex (including pregnancy), national origin, disability, age, sexual orientation, veteran status, genetic information, gender identity, gender expression, or any other factor prohibited by law. 
To view the Equal Employment Opportunity is the Law posters and the supplement, as well as the Pay Transparency Policy Statement, CLICK HERE Requisition: J0088899

Posted 4 days ago


10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Job Summary: We are hiring an experienced Application Security Engineer specializing in Java ADF and Jasper Reports, with a strong track record of resolving Vulnerability Assessment and Penetration Testing (VAPT) findings. The ideal candidate must have secured complex enterprise applications, including online payments and eCommerce systems, particularly on legacy stacks such as Java 1.7, MySQL 5.5, and JBoss 7.1. This role is hands-on and remediation-focused, requiring deep understanding of secure development and hardening in deprecated environments. Key Responsibilities: Lead remediation of high-priority VAPT findings in large-scale enterprise systems. Secure passwords and PII data at all stages: At view/input: masking, form validation, secure front-end patterns In transit: TLS, secure headers, HTTPS enforcement At rest: encryption, proper salting and hashing (e.g., bcrypt, SHA-256) Fix injection attacks (SQLi, XSS, LDAPi, command injection), CSRF, clickjacking, IDOR, and other OWASP Top 10 issues. Apply secure API integration practices: auth tokens, rate limiting, input validation. Harden session and cookie management (HttpOnly, Secure, SameSite attributes, session fixation prevention). Review and fix insecure code in ADF Faces, Task Flows, Bindings, BC4J, and Jasper Reports. Secure Jasper Reports generation and access (parameter validation, report-level authorization, export sanitization). Work hands-on with legacy platforms: Java 1.7, MySQL 5.5, JBoss 7.1 — applying secure remediation without disrupting production. Strengthen security of online payment/eCommerce systems with proven compliance (e.g., PCI-DSS). Maintain detailed remediation logs, documentation, and evidence for audits and compliance (GDPR, DPDPA, STQC, etc.). 
Technical Skills: Java EE, Oracle ADF (ADF Faces, Task Flows, BC4J), Jasper Reports Studio/XML Strong debugging skills in Java 1.7, MySQL 5.5, JBoss 7.1 Secure development lifecycle practices with a focus on legacy modernization Strong grounding in OWASP Top 10, SANS 25, CVSS, and secure coding principles Experience in PII handling, data masking, salting, and hashing Proficiency in OAuth2, SAML, JWT, and RBAC security models Performance improvement and application profiling Expertise in analyzing application, system, and security logs to identify and fix issues Ability to ensure application stability and high availability Be the champion/lead and guide the team in fixing issues PHP experience is a plus, especially in legacy web app environments Required Experience: 5–10+ years in application development and security Demonstrated experience remediating security vulnerabilities in eCommerce and payment platforms Ability to work independently in production environments with deprecated technologies Preferred Qualifications / Plus: B.E./B.Tech/MCA in Computer Science, IT, or Cybersecurity Use of AI tools for identifying and fixing issues is a real plus Any VAPT or Application Security certification is a plus (e.g., CEH, OSCP, CSSLP, GWAPT, Oracle Certified Expert) Familiarity with compliance standards: PCI-DSS, GDPR, DPDPA, STQC Proficiency with security tools: Fortify, ZAP, SonarQube, Checkmarx, Burp Suite Soft Skills: Strong problem-solving and diagnostic capabilities, especially in large monolithic codebases Good documentation and communication skills for cross-functional collaboration Able to work under pressure, troubleshoot complex issues, and deliver secure code fixes rapidly
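The "at rest" requirement above (encryption, proper salting and hashing) can be sketched with Python's standard library. PBKDF2-HMAC stands in here for the bcrypt mentioned in the listing, since bcrypt is a third-party package; the function names and iteration count are illustrative, not the role's actual code:

```python
import hashlib
import hmac
import os

def hash_password(password: str, iterations: int = 100_000):
    """Derive a salted hash; store the salt and digest, never the password."""
    salt = os.urandom(16)  # unique random salt per user defeats rainbow tables
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes,
                    iterations: int = 100_000) -> bool:
    """Re-derive with the stored salt and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, digest)  # avoids timing side channels

salt, digest = hash_password("s3cret")
print(verify_password("s3cret", salt, digest))  # True
print(verify_password("wrong", salt, digest))   # False
```

The constant-time comparison and per-user salt are the two details VAPT reports most often flag when plain `==` or a shared salt is used.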

Posted 4 days ago


3.0 years

0 Lacs

Greater Chennai Area

On-site


Who You'll Work With Driving lasting impact and building long-term capabilities with our clients is not easy work. You are the kind of person who thrives in a high performance/high reward culture - doing hard things, picking yourself up when you stumble, and having the resilience to try another way forward. In return for your drive, determination, and curiosity, we'll provide the resources, mentorship, and opportunities you need to become a stronger leader faster than you ever thought possible. Your colleagues—at all levels—will invest deeply in your development, just as much as they invest in delivering exceptional results for clients. Every day, you'll receive apprenticeship, coaching, and exposure that will accelerate your growth in ways you won’t find anywhere else. When you join us, you will have Continuous learning Our learning and apprenticeship culture, backed by structured programs, is all about helping you grow while creating an environment where feedback is clear, actionable, and focused on your development. The real magic happens when you take the input from others to heart and embrace the fast-paced learning experience, owning your journey. A voice that matters From day one, we value your ideas and contributions. You’ll make a tangible impact by offering innovative ideas and practical solutions. We not only encourage diverse perspectives, but they are critical in driving us toward the best possible outcomes. Global community With colleagues across 65+ countries and over 100 different nationalities, our firm’s diversity fuels creativity and helps us come up with the best solutions for our clients. Plus, you’ll have the opportunity to learn from exceptional colleagues with diverse backgrounds and experiences. 
World-class benefits On top of a competitive salary (based on your location, experience, and skills), we provide a comprehensive benefits package, which includes medical, dental, mental health, and vision coverage for you, your spouse/partner, and children. Your Impact As a Data Engineer I at McKinsey & Company, you will play a key role in designing, building, and deploying scalable data pipelines and infrastructure that enable our analytics and AI solutions. You will work closely with product managers, developers, asset owners, and client stakeholders to turn raw data into trusted, structured, and high-quality datasets used in decision-making and advanced analytics. Your core responsibilities will include Developing robust, scalable data pipelines for ingesting, transforming, and storing data from multiple structured and unstructured sources using Python/SQL. Creating and optimizing data models and data warehouses to support reporting, analytics, and application integration. Working with cloud-based data platforms (AWS, Azure, or GCP) to build modern, efficient, and secure data solutions. Contributing to R&D projects and internal asset development. Contributing to infrastructure automation and deployment pipelines using containerization and CI/CD tools. Collaborating across disciplines to integrate data engineering best practices into broader analytical and generative AI (gen AI) workflows. Supporting and maintaining data assets deployed in client environments with a focus on reliability, scalability, and performance. Furthermore, you will have the opportunity to explore and contribute to solutions involving generative AI, such as vector embeddings, retrieval-augmented generation (RAG), semantic search, and LLM-based prompting, especially as we integrate gen AI capabilities into our broader data ecosystem. Your Qualifications and Skills Bachelor’s degree in computer science, engineering, mathematics, or a related technical field (or equivalent practical experience). 
3+ years of experience in data engineering, analytics engineering, or a related technical role. Strong Python programming skills with demonstrated experience building scalable data workflows and ETL/ELT pipelines. Proficient in SQL with experience designing normalized and denormalized data models. Hands-on experience with orchestration tools such as Airflow, Kedro, or Azure Data Factory (ADF). Familiarity with cloud platforms (AWS, Azure, or GCP) for building and managing data infrastructure. Discernible communication skills, especially around breaking down complex structures into digestible and relevant points for a diverse set of clients and colleagues, at all levels. High-value personal qualities including critical thinking and creative problem-solving skills; an ability to influence and work in teams. An entrepreneurial mindset and ownership mentality are a must, along with a desire to learn and develop within a dynamic, self-led organization. Hands-on experience with containerization technologies (Docker, Docker Compose). Hands-on experience with automation frameworks (GitHub Actions, CircleCI, Jenkins, etc.). Exposure to generative AI tools or concepts (e.g., OpenAI, Cohere, embeddings, vector databases). Experience working in Agile teams and contributing to design and architecture discussions. Contributions to open-source projects or active participation in data engineering communities.
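The Python/SQL pipeline work described above follows a standard extract-transform-load shape. A minimal sketch under stated assumptions: the sample records, validation rule, and `sales` table are hypothetical, and SQLite stands in for a real warehouse target:

```python
import sqlite3

# Extract: raw records as they might arrive from an upstream source.
raw = [
    {"id": 1, "amount": "120.50", "region": "south"},
    {"id": 2, "amount": "bad",    "region": "north"},  # dirty row
    {"id": 3, "amount": "75.00",  "region": "south"},
]

def transform(records):
    """Validate types and normalise values; skip rows that fail checks."""
    for r in records:
        try:
            yield (r["id"], float(r["amount"]), r["region"].upper())
        except ValueError:
            continue  # a real pipeline would route this to a quarantine table

# Load: insert the cleaned rows into the target store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, amount REAL, region TEXT)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", transform(raw))

total = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region").fetchall()
print(total)  # [('SOUTH', 195.5)]
```

Orchestrators like Airflow or ADF schedule and retry exactly this kind of function chain; the generator-based transform keeps memory flat for large extracts.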

Posted 4 days ago


5.0 years

0 Lacs

India

Remote


Job Title: Senior Data Engineer Experience: 5+ Years Location: Remote Contract Duration: Short Term Work Time: IST Shift Job Description We are seeking a skilled and experienced Senior Data Engineer to develop scalable and optimized data pipelines using the Databricks Lakehouse platform. The role requires proficiency in Apache Spark, PySpark, cloud data services (AWS, Azure, GCP), and solid programming knowledge in Python and Java. The engineer will collaborate with cross-functional teams to design and deliver high-performing data solutions. Responsibilities Data Pipeline Development Build efficient ETL/ELT workflows using Databricks and Spark for batch and streaming data Utilize Delta Lake and Unity Catalog for structured data management Optimize Spark jobs using tuning techniques such as caching, partitioning, and serialization Cloud-Based Implementation Develop and deploy data workflows on AWS (S3, EMR, Glue), Azure (ADLS, ADF, Synapse), and/or GCP (GCS, Dataflow, BigQuery) Manage and optimize data storage, access control, and orchestration using native cloud tools Implement data ingestion and querying with Databricks Auto Loader and SQL Warehousing Programming and Automation Write clean, reusable, and production-grade code in Python and Java Automate workflows using orchestration tools like Airflow, ADF, or Cloud Composer Implement testing, logging, and monitoring mechanisms Collaboration and Support Work closely with data analysts, scientists, and business teams to meet data requirements Support and troubleshoot production workflows Document solutions, maintain version control, and follow Agile/Scrum methodologies Required Skills Technical Skills Databricks: Experience with notebooks, cluster management, Delta Lake, Unity Catalog, and job orchestration Spark: Proficient in transformations, joins, window functions, and tuning Programming: Strong in PySpark and Java, with data validation and error handling expertise Cloud: Experience with AWS, Azure, or GCP data 
services and security frameworks Tools: Familiarity with Git, CI/CD, Docker (preferred), and data monitoring tools Experience 5–8 years in data engineering or backend development Minimum 1–2 years of hands-on experience with Databricks and Spark Experience with large-scale data migration, processing, or analytics projects Certifications (Optional but Preferred) Databricks Certified Data Engineer Associate Working Conditions Full-time remote work with availability during IST hours Occasional on-site presence may be required during client visits No regular travel required On-call support expected during deployment phases
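The partitioning mentioned among the Spark tuning techniques above can be illustrated without a cluster: records are routed to partitions by hashing a key, so each partition can then be aggregated independently and in parallel. This plain-Python sketch mimics that shuffle step; the record layout is hypothetical and the real mechanism in Spark is `repartition`/`partitionBy`:

```python
from collections import defaultdict

def hash_partition(records, key, n_partitions):
    """Route each record to a partition by hashing its key, the way a
    shuffle does; an even key distribution avoids skewed partitions."""
    parts = defaultdict(list)
    for rec in records:
        parts[hash(rec[key]) % n_partitions].append(rec)
    return parts

orders = [{"customer": c, "value": v}
          for c, v in [("a", 10), ("b", 20), ("a", 5), ("c", 7)]]
parts = hash_partition(orders, "customer", 4)

# All rows for one key land in one partition, so per-key totals can be
# computed partition-by-partition (in parallel on a real cluster).
totals = {}
for recs in parts.values():
    for r in recs:
        totals[r["customer"]] = totals.get(r["customer"], 0) + r["value"]
print(totals)
```

Skew shows up when one key dominates: that partition's task runs long while the rest sit idle, which is why tuning work often starts by inspecting key distributions.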

Posted 4 days ago


0 years

0 Lacs

India

Remote


Required Skills: YOE-8+ Mode of work: Remote Design, develop, modify, and test software applications for the healthcare industry in an agile environment. Duties include: Develop, support/maintain, and deploy software to support a variety of business needs. Provide technical leadership in the design, development, testing, deployment and maintenance of software solutions. Design and implement platform and application security for applications. Perform advanced query analysis and performance troubleshooting. Coordinate with senior-level stakeholders to ensure the development of innovative software solutions to complex technical and creative issues. Re-design software applications to improve maintenance cost, testing functionality, platform independence and performance. Manage user stories and project commitments in an agile framework to rapidly deliver value to customers. Deploy and operate software solutions using a DevOps model. Required skills: Azure Delta Lake, ADF, Databricks, PySpark, Oozie, Airflow, Big Data technologies (HBase, Hive), CI/CD (GitHub/Jenkins)

Posted 4 days ago


2.0 - 3.0 years

0 Lacs

Gurgaon, Haryana, India

On-site


Job Title: Middleware Administrator – L1 Roles And Responsibility We are looking for a passionate candidate who can perform Middleware L1 level tasks. Eligibility SN Skill Experience 1 Middleware Administrator – L1 B.E. / B.Tech/BCA (On-Site), Relevant Certification (Preference). 2-3 years relevant experience, ITIL trained, OEM certified on minimum one technology. Technology: Oracle Forms, Oracle Fusion Middleware Desired Skills & Experience ✓ Should be a team player ✓ Communication and Problem-Solving – should have good communication skills and the ability to solve problems ✓ Process Knowledge – working knowledge of an ITSM tool & knowledge of ITIL processes, i.e. SR, Incident, Change, Release & Problem Management etc. ✓ Should have a collaborative approach and adaptability Technical Skills ✓ Oracle Applications: Oracle Forms 10g, Oracle SSO 10g, OID 10g, Oracle Portal 10g, Oracle Reports 10g, Internet Application Server (OAS) 10.1.2.2.0, Oracle Web Server (OWS) 10.1.2.2.0, Oracle WebCenter Portal 12.2.1.3, Oracle Access Manager 12.2.1.3, Oracle Internet Directory 12.2.1.3, Oracle WebLogic Server 12.2.1.3, Oracle HTTP Server 12.2.1.3, Oracle ADF 12.2.1.3 (Fusion Middleware), Oracle Forms 12.2.1.3, Oracle Reports 12.2.1.3, mobile apps, Tomcat etc. ✓ Microsoft Applications: Windows IIS, portal, Web Cache, BizTalk application and DNS applications ✓ Operating systems: RHEL 7, 8, 9 ✓ Tools & Utilities: ITSM Tools (ServiceNow, Symphony SUMMIT), JIRA Key Responsibilities Application Monitoring Services ✓ Monitor application response times from the end-user perspective in real time and alert organizations when performance is unacceptable. By alerting the user to problems and intelligently segmenting response times, it should quickly expose problem sources and minimize the time necessary for resolution. ✓ It should allow specific application transactions to be captured and monitored separately. 
This allows administrators to select the most important operations within business-critical applications to be measured and tracked individually. ✓ It should use baseline-oriented thresholds to raise alerts when application response times deviate from acceptable levels. This allows IT administrators to respond quickly to problems and minimize the impact on service delivery. ✓ It should automatically segment response-time information into network, server and local workstation components to easily identify the source of bottlenecks. ✓ Monitoring of applications, including Oracle Forms 10g, Oracle SSO 10g, OID 10g, Oracle Portal 10g, Oracle Reports 10g, Internet Application Server (OAS) 10.1.2.2.0, Oracle Web Server (OWS) 10.1.2.2.0, Oracle WebCenter Portal 12.2.1.3, Oracle Access Manager 12.2.1.3, Oracle Internet Directory 12.2.1.3, Oracle WebLogic Server 12.2.1.3, Oracle HTTP Server 12.2.1.3, Oracle ADF 12.2.1.3 (Fusion Middleware), Oracle Forms 12.2.1.3, Oracle Reports 12.2.1.3, mobile apps, Windows IIS, portal, Web Cache, BizTalk application and DNS applications, Tomcat etc. ✓ Shutdown and start-up of applications, generation of MIS reports, monitoring of application load, user account management scripts execution, analysing system events, monitoring of error logs etc. ✓ Compliance with the daily health checklist, portal updates ✓ Logging of system events and incidents ✓ SR and incident ticket updates in the Symphony iServe tool Application Release Management ✓ Scheduling, coordinating and managing releases for applications ✓ Take application code backup, place new code and restart the services
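The baseline-oriented thresholds described above mean an alert fires when a response time deviates from its own historical norm rather than from a fixed limit. A minimal sketch, assuming a simple 3-sigma rule over a window of recent samples (the sample latencies and the rule itself are illustrative, not part of any specific monitoring tool):

```python
from statistics import mean, stdev

def baseline_alert(samples, latest, k=3.0):
    """Flag a response-time sample that sits more than k standard
    deviations above the baseline built from recent samples."""
    mu, sigma = mean(samples), stdev(samples)
    return latest > mu + k * sigma

history = [210, 195, 205, 200, 190, 208, 199, 203]  # ms, normal operation
print(baseline_alert(history, 950))  # True  -> raise an incident
print(baseline_alert(history, 215))  # False -> within normal variation
```

Because the threshold is derived from each application's own history, a slow Forms screen and a fast REST endpoint can share one alerting rule without hand-tuned limits per application.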

Posted 4 days ago


3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY GDS – Data and Analytics (D And A) – Azure Data Engineer - Senior As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Healthcare, Retail, Manufacturing and Auto, Supply Chain, and Finance. The opportunity We’re looking for candidates with strong technology and data understanding in the big data engineering space and proven delivery capability. This is a fantastic opportunity to be part of a leading firm as well as a part of a growing Data and Analytics team. 
Your Key Responsibilities Develop & deploy big data pipelines in a cloud environment using Azure Cloud services ETL design, development and migration of existing on-prem ETL routines to Cloud Service Interact with senior leaders, understand their business goals, contribute to the delivery of the workstreams Design and optimize model codes for faster execution Skills And Attributes For Success Overall 3+ years of IT experience with 2+ years of relevant experience in Azure Data Factory (ADF) and good hands-on exposure to the latest ADF version Hands-on experience with Azure Functions & Azure Synapse (formerly SQL Data Warehouse) Should have project experience in Azure Data Lake / Blob (storage purposes) Should have a basic understanding of Batch Account configuration and various control options Sound knowledge of Databricks & Logic Apps Should be able to coordinate independently with business stakeholders, understand the business requirements, and implement the requirements using ADF To qualify for the role, you must have Be a computer science graduate or equivalent with 3-7 years of industry experience Have working experience in an Agile-based delivery methodology (preferable) Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution. Excellent communicator (written and verbal, formal and informal). Participate in all aspects of the Big Data solution delivery life cycle including analysis, design, development, testing, production deployment, and support. Ideally, you’ll also have Client management skills What We Look For People with technical experience and enthusiasm to learn new things in this fast-moving environment What Working At EY Offers At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. 
Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around Opportunities to develop new skills and progress your career The freedom and flexibility to handle your role in a way that’s right for you EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 4 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Project Description: Our client is an EU subsidiary of a global financial bank working in multiple markets and asset classes. The bank's data store has been transformed into a data warehouse (DWH), which is the central source for regulatory reporting. It is also intended to be the core data integration platform, providing data not only for regulatory reporting but also for risk modelling, portfolio analysis, ad hoc analysis and reporting (Finance, Risk, other), MI reporting, data quality management, etc. Due to the high volume of regulatory requirements, many regulatory projects are in progress to reflect new requirements in existing regulatory reports and to develop new regulatory reports on MDS. Examples are IFRS9, AnaCredit, IRRBB, the new Deposit Guarantee Schemes Directive (DGSD), the Bank Data Retrieval Portal (BDRP), and the Fundamental Review of the Trading Book (FRTB). The DWH / ETL Tester will work closely with the development team to design and build interfaces and to integrate data from a variety of internal and external data sources into the new enterprise data warehouse environment. The ETL Tester will be primarily responsible for testing the enterprise data warehouse using automation, within industry-recognized ETL standards, architecture, and best practices. Responsibilities: Testing the bank's data warehouse system changes (user stories), supporting IT integration testing in TST, and supporting business stakeholders with user acceptance testing. It is a hands-on position: you will be required to write and execute test cases and build test automation where applicable. Overall Purpose of Job - Test the MDS data warehouse system - Validate regulatory reports - Support IT and business stakeholders during the UAT phase - Contribute to improvement of testing and development processes - Work as part of a cross-functional team and take ownership of tasks - Contribute to testing deliverables. 
- Ensure the implementation of test standards and best practices for the agile model and contribute to their development. - Engage with internal stakeholders in various areas of the organization to seek alignment and collaboration. - Deal with external stakeholders / vendors. - Identify risks / issues and present associated mitigating actions, taking into account the criticality of the underlying business domain. - Contribute to continuous improvement of standard testing processes. Additional responsibilities include working closely with the systems analysts and application developers, using functional design documentation and technical specifications to facilitate the creation and execution of manual and automated test scripts, performing data analysis and creation of test data, tracking and helping resolve defects, and ensuring that all testing is conducted and documented in adherence with the bank's standards. Mandatory Skills: Data Warehouse (DWH) ETL Test Management Mandatory Skills Description: Must have experience/expertise: Tester, Test Automation, Data Warehouse, Banking Technical: - At least 5 years of testing experience, of which at least 2 years in the finance industry, with good knowledge of data warehouse and RDBMS concepts. - Strong SQL scripting knowledge and hands-on experience with ETL and databases. - Expertise in modern cloud-based data warehouse solutions - ADF, Snowflake, GCP, etc. - Hands-on expertise in writing complex SQL using multiple JOINs and complex functions to test various transformations and ETL requirements. - Knowledge of and experience in creating test automation for database and ETL testing regression suites. - Automation using Selenium with Python (or JavaScript), Python scripts, shell scripts. - Knowledge of framework design and REST API testing of databases using Python. - Experience using the Atlassian tool set, Azure DevOps, and code and version management - Git, Bitbucket, Azure Repos, etc. 
- Help and provide inputs for the creation of a test plan addressing the needs of cloud-based ETL pipelines. Non-Technical: - Able to work in an agile environment - Experience working on high-priority projects (high pressure on delivery) - Some flexibility outside 9-5 working hours (Netherlands time zone) - Able to work in a demanding environment with a pragmatic, "can do" attitude - Able to work independently and also to collaborate across the organization - Highly developed problem-solving skills with minimal supervision - Able to easily adapt to new circumstances / technologies / procedures - Stress-resistant and constructive, whatever the context - Able to align with existing standards and act with attention to detail. Nice-to-Have Skills Description: - Experience with financial regulatory reports - Experience in test automation for data warehouses (using Bamboo) Software skills: - Bitbucket - Bamboo - Azure tech stack - Azure Data Factory - WKFS OneSumX report generator - Analytics tools such as Power BI / Excel / SSRS / SSAS, WinSCP
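The SQL-heavy reconciliation work this role describes (comparing row counts and aggregates between a source system and its warehouse copy after each load) needs very little machinery to automate. A hedged sketch using Python's stdlib sqlite3 as a stand-in for the real source and DWH engines; the `trades` table and `amount` column are hypothetical names chosen for illustration:

```python
import sqlite3

def reconcile(src: sqlite3.Connection, dwh: sqlite3.Connection,
              table: str, amount_col: str) -> bool:
    """Post-load regression check: the warehouse copy must match the
    source on row count and on a column aggregate (a cheap checksum).
    Engines and schema here are illustrative stand-ins."""
    query = f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {table}"
    return src.execute(query).fetchone() == dwh.execute(query).fetchone()

if __name__ == "__main__":
    src = sqlite3.connect(":memory:")
    dwh = sqlite3.connect(":memory:")
    for conn in (src, dwh):
        conn.execute("CREATE TABLE trades (id INTEGER, amount REAL)")
        conn.executemany("INSERT INTO trades VALUES (?, ?)",
                         [(1, 5.0), (2, 7.5)])
    print(reconcile(src, dwh, "trades", "amount"))  # True: load is consistent
    dwh.execute("DELETE FROM trades WHERE id = 2")  # simulate a dropped row
    print(reconcile(src, dwh, "trades", "amount"))  # False: mismatch caught
```

In a regression suite each such check becomes one assertion per table, so a broken transformation fails the build rather than surfacing in a regulatory report.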

Posted 4 days ago

Apply

3.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Linkedin logo

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY GDS – Data and Analytics (D And A) – Azure Data Engineer - Senior As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key business and functions like Banking, Insurance, Manufacturing, Healthcare, Retail, Manufacturing and Auto, Supply Chain, and Finance. The opportunity We’re looking for candidates with strong technology and data understanding in big data engineering space, having proven delivery capability. This is a fantastic opportunity to be part of a leading firm as well as a part of a growing Data and Analytics team. 
Your Key Responsibilities Develop and deploy big data pipelines in a cloud environment using Azure Cloud services ETL design, development, and migration of existing on-prem ETL routines to cloud services Interact with senior leaders, understand their business goals, and contribute to the delivery of the workstreams Design and optimize model code for faster execution Skills And Attributes For Success Overall 3+ years of IT experience, with 2+ years of relevant experience in Azure Data Factory (ADF) and hands-on exposure to the latest ADF version Hands-on experience with Azure Functions and Azure Synapse (formerly SQL Data Warehouse) Project experience with Azure Data Lake / Blob storage Basic understanding of Batch Account configuration and the various control options Sound knowledge of Databricks and Logic Apps Able to coordinate independently with business stakeholders, understand the business requirements, and implement them using ADF To qualify for the role, you must Be a computer science graduate or equivalent with 3-7 years of industry experience Have working experience in an Agile-based delivery methodology (preferable) Have a flexible and proactive, self-motivated working style with strong personal ownership of problem resolution Be an excellent communicator (written and verbal, formal and informal) Participate in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support Ideally, you’ll also have Client management skills What We Look For People with technical experience and enthusiasm to learn new things in this fast-moving environment What Working At EY Offers At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. 
Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around Opportunities to develop new skills and progress your career The freedom and flexibility to handle your role in a way that’s right for you EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 4 days ago

Apply
