2.0 - 4.0 years
5 - 6 Lacs
Calcutta
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Oracle
Management Level: Associate

Job Description & Summary
At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC focus on utilising and managing the Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Managing business performance in today's complex and rapidly changing business environment is crucial for any organization's short-term and long-term success. Ensuring that the end-to-end Oracle Fusion technical landscape can seamlessly adapt to this changing business environment is therefore crucial from a process and compliance perspective. As part of the Technology Consulting - Business Applications - Oracle practice team, we leverage opportunities around digital disruption, new-age operating models and best-in-class practices to deliver technology-enabled transformation to our clients.

Responsibilities:
Extensive experience in Oracle ERP/Fusion SaaS/PaaS project implementations as a technical developer.
Completed at least 2 full Oracle Cloud (Fusion) implementations.
Extensive knowledge of the database structure for ERP/Oracle Cloud (Fusion).
Extensively worked on BI Publisher reports, FBDI/OTBI Cloud and Oracle Integration (OIC).

Mandatory skill sets: BI Publisher reports, FBDI/OTBI Cloud and Oracle Integration (OIC)
Preferred skill sets: database structure for ERP/Oracle Cloud (Fusion)
Years of experience required: Minimum 2 to 4 years of Oracle Fusion experience
Education Qualification: Graduate/Post Graduate
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor Degree
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Oracle
Optional Skills: Accepting Feedback, Active Listening, Business Transformation, Communication, Design Automation, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Strategic Technology Planning, Teamwork, Well Being
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
Posted 1 month ago
0 years
0 Lacs
Bengaluru East, Karnataka, India
On-site
Primary skills: Technology -> Oracle Industry Solutions -> Retail Merchandise

Technical Skills:
o Proficiency in programming languages like Python and R for data manipulation and analysis
o Expertise in machine learning algorithms and statistical modeling techniques
o Familiarity with data warehousing and data pipelines
o Experience with data visualization tools like Tableau or Power BI
o Experience in cloud platforms and their AI services (e.g., ADF, Databricks, Azure)

Consulting Skills:
o Hypothesis-driven problem solving
o Go-to-market pricing and revenue growth execution
o Advisory, presentation, data storytelling
o Project leadership and execution

A day in the life of an Infoscion
As part of the Infosys consulting team, your primary role would be to lead the engagement effort of providing high-quality and value-adding consulting solutions to customers at different stages - from problem definition to diagnosis to solution design, development and deployment. You will review the proposals prepared by consultants, provide guidance, and analyze the solutions defined for the client business problems to identify any potential risks and issues. You will identify change management requirements and propose a structured approach to the client for managing the change using multiple communication mechanisms. You will also coach and create a vision for the team, provide subject matter training for your focus areas, and motivate and inspire team members through effective and timely feedback and recognition for high performance. You would be a key contributor in unit-level and organizational initiatives with the objective of providing high-quality, value-adding consulting solutions to customers while adhering to the guidelines and processes of the organization. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Good knowledge of software configuration management systems
Strong business acumen, strategy and cross-industry thought leadership
Awareness of latest technologies and industry trends
Logical thinking and problem-solving skills along with an ability to collaborate
Knowledge of two or three industry domains
Understanding of the financial processes for various types of projects and the various pricing models available
Client interfacing skills
Knowledge of SDLC and agile methodologies
Project and team management
Posted 1 month ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Overview
Seeking an Associate Manager, Data Operations, to support our growing data organization. In this role, you will assist in maintaining data pipelines and corresponding platforms (on-prem and cloud) while working closely with global teams on DataOps initiatives.
Support the day-to-day operations of data pipelines, ensuring data governance, reliability, and performance optimization on Microsoft Azure. Hands-on experience with Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and real-time streaming architectures is preferred.
Assist in ensuring the availability, scalability, automation, and governance of enterprise data pipelines supporting analytics, AI/ML, and business intelligence.
Contribute to DataOps programs, aligning with business objectives, data governance standards, and enterprise data strategy.
Help implement real-time data observability, monitoring, and automation frameworks to improve data reliability, quality, and operational efficiency.
Support the development of governance models and execution roadmaps to enhance efficiency across Azure, AWS, GCP, and on-prem environments.
Work on CI/CD integration, data pipeline automation, and self-healing capabilities to improve enterprise-wide DataOps processes.
Collaborate with cross-functional teams to support and maintain next-generation Data & Analytics platforms while promoting an agile and high-performing DataOps culture.
Assist in the adoption of Data & Analytics technology transformations, ensuring automation for proactive issue identification and resolution.
Partner with cross-functional teams to support process improvements, best practices, and operational efficiencies within DataOps.

Responsibilities
Assist in the implementation and optimization of enterprise-scale data pipelines using Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and Azure Stream Analytics.
Support data ingestion, transformation, orchestration, and storage workflows, ensuring data reliability, integrity, and availability.
Help ensure seamless batch, real-time, and streaming data processing, focusing on high availability and fault tolerance.
Contribute to DataOps automation efforts, including CI/CD for data pipelines, automated testing, and version control using Azure DevOps and Terraform.
Collaborate with Data Engineering, Analytics, AI/ML, CloudOps, and Business Intelligence teams to support data-driven decision-making.
Assist in aligning DataOps practices with regulatory and security requirements by working with IT, data stewards, and compliance teams.
Support data operations and sustainment activities, including testing and monitoring processes for global products and projects.
Participate in data capture, storage, integration, governance, and analytics efforts, working alongside cross-functional teams.
Assist in managing day-to-day DataOps activities, ensuring adherence to service-level agreements (SLAs) and business requirements.
Engage with SMEs and business stakeholders to ensure data platform capabilities align with business needs.
Contribute to Agile work intake and execution processes, helping to maintain efficiency in data platform teams.
Help troubleshoot and resolve issues related to cloud infrastructure and data services in collaboration with technical teams.
Support the development and automation of operational policies and procedures, improving efficiency and resilience.
Assist in incident response and root cause analysis, contributing to self-healing mechanisms and mitigation strategies.
Foster a customer-centric approach, advocating for operational excellence and continuous improvement in service delivery.
Help build a collaborative, high-performing team culture, promoting automation and efficiency within DataOps.
Adapt to shifting priorities and support cross-functional teams in maintaining productivity and achieving business goals.
Utilize technical expertise in cloud and data operations to support service reliability and scalability.

Qualifications
5+ years of technology work experience in a large-scale global organization, with CPG industry experience preferred.
5+ years of experience in Data & Analytics roles, with hands-on expertise in data operations and governance.
2+ years of experience working within a cross-functional IT organization, collaborating with multiple teams.
Experience in a lead or senior support role, with a focus on DataOps execution and delivery.
Strong communication skills, with the ability to collaborate with stakeholders and articulate technical concepts to non-technical audiences.
Analytical and problem-solving abilities, with a focus on prioritizing customer needs and operational improvements.
Customer-focused mindset, ensuring high-quality service delivery and operational efficiency.
Growth mindset, with a willingness to learn and adapt to new technologies and methodologies in a fast-paced environment.
Experience supporting data operations in a Microsoft Azure environment, including data pipeline automation.
Familiarity with Site Reliability Engineering (SRE) principles, such as monitoring, automated issue remediation, and scalability improvements.
Understanding of operational excellence in complex, high-availability data environments.
Ability to collaborate across teams, building strong relationships with business and IT stakeholders.
Basic understanding of data management concepts, including master data management, data governance, and analytics.
Knowledge of data acquisition, data catalogs, data standards, and data management tools.
Strong execution and organizational skills, with the ability to follow through on operational plans and drive measurable results.
Adaptability in a dynamic, fast-paced environment, with the ability to shift priorities while maintaining productivity.
Posted 1 month ago
10.0 - 12.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Overview
The Software Engineering Manager will play a pivotal role in software development activities and long-term initiative planning and collaboration across the Strategy & Transformation (S&T) organization. Software Engineering is the cornerstone of scalable digital transformation across PepsiCo's value chain. This leader will deliver the end-to-end software development experience, deliver high-quality software as part of the DevOps process, and have accountability for our business operations. The leader in this role will be a highly experienced Software Engineering Manager, hands-on with Java/Python/Azure technologies, who will lead the design, development and support of our Integration platform. This role is critical in shaping our integration landscape, establishing development best practices, and mentoring a world-class engineering team. This role will play a key leadership role in a product-focused, high-growth startup/enterprise environment, owning end-to-end integration services.

Responsibilities
Support and guide a team of engineers in developing and maintaining Digital Products and Applications (DPA).
Oversee the comprehensive development of integration services for the Integration platform utilizing Java and Python on Azure.
Design scalable, performant, and secure systems ensuring maintainability and quality.
Establish code standards and best practices; conduct code reviews and technical audits.
Advise on the selection of tools, libraries, and frameworks.
Research emerging technologies and provide recommendations for their adoption.
Uphold high standards of integration services and performance across platforms.
Foster partnerships with User Experience, Product Management, IT, Data & Analytics, Emerging Tech, Innovation, and Process Engineering teams to deliver the Digital Products portfolio.
Create a roadmap and schedule for implementation based on business requirements and strategy.
Demonstrate familiarity with AI tools and platforms such as OpenAI (GPT-3/4, Assistants API), Anthropic, or similar LLM providers.
Integrate AI capabilities into applications, including AI copilots and AI agents, smart chatbots, automated data processors, and content generators.
Understand prompt engineering, context handling, and AI output refinement.
Lead multi-disciplinary, high-performance work teams distributed across remote locations effectively.
Build, manage, develop, and mentor a team of engineers.
Engage with executives throughout the company to advocate the narrative surrounding software engineering.
Expand DPA capabilities through a customer-focused, services-driven digital solutions platform leveraging data and AI to deliver automated and personalized experiences.
Manage and appropriately escalate delivery impediments, risks, issues, and changes associated with engineering initiatives to stakeholders.
Collaborate with key business partners to recommend solutions that best meet the strategic needs of the business.
Qualifications
Bachelor's or Master's degree in Computer Science, Engineering, or a related field
10-12 years of software design and development (Java, Spring Boot, Python)
8-10 years of Java/Python development, with enterprise-grade applications expertise
3-5 years of microservices development and RESTful API design
3-5 years with cloud-native solutions (Azure preferred; AWS, Google Cloud)
Strong understanding of web protocols, REST APIs, SOA
3-5 years as a lead developer, mentoring teams and driving technical direction
Proficient with relational databases (Oracle, MSSQL, MySQL) and NoSQL databases (Couchbase, MongoDB)
Exposure to ADF or ADB
Experience with Azure Kubernetes Service or equivalent
Knowledge of event-driven architecture and message brokers (Kafka, ActiveMQ)
Data integration experience across cloud and on-prem systems
Deep understanding of CI/CD pipelines and DevOps automation
Ability to write high-quality, secure, scalable code
Experience delivering mission-critical, high-throughput systems
Strong problem-solving, communication, and stakeholder collaboration skills
Experience in Scaled Agile (SAFe) as a technical lead
Knowledge of the Salesforce ecosystem (Sales Cloud/CRM) is a plus
Posted 1 month ago
7.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Manager – Azure Data Architect
As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Manufacturing, Healthcare, Retail and Auto, Supply Chain, and Finance.

The opportunity
We’re looking for Managers (Big Data Architects) with strong technology and data understanding and proven delivery capability. This is a fantastic opportunity to be part of a leading firm as well as part of a growing Data and Analytics team.

Your Key Responsibilities
Develop standardized practices for delivering new products and capabilities using Big Data & cloud technologies, including data acquisition, transformation, analysis, modelling, governance & data management skills.
Interact with senior client technology leaders, understand their business goals, create and propose solutions, estimate effort, build architectures, and develop and deliver technology solutions.
Define and develop client-specific best practices around data management within a cloud environment.
Recommend design alternatives for data ingestion, processing and provisioning layers.
Design and develop data ingestion programs to process large data sets in batch mode using ADB, ADF, PySpark, Python, Synapse.
Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming and related technologies.
Have managed teams and have experience in end-to-end delivery.
Have experience of building technical capability and teams to deliver.

Skills And Attributes For Success
Strong understanding of and familiarity with all Cloud Ecosystem components.
Strong understanding of underlying Cloud Architectural concepts and distributed computing paradigms.
Experience in the development of large-scale data processing.
Experience with CI/CD pipelines for data workflows in Azure DevOps.
Hands-on programming experience in ADB, ADF, Synapse, Python, PySpark, SQL.
Hands-on expertise in cloud services like AWS and/or the Microsoft Azure ecosystem.
Solid understanding of ETL methodologies in a multi-tiered stack with Data Modelling & Data Governance.
Experience with BI and data analytics databases.
Experience in converting business problems/challenges to technical solutions considering security, performance, scalability etc.
Experience in enterprise-grade solution implementations.
Experience in performance benchmarking of enterprise applications.
Strong stakeholder, client, team, process & delivery management skills.

To qualify for the role, you must have
Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
Excellent communicator (written and verbal, formal and informal).
Ability to multi-task under pressure and work independently with minimal supervision.
Strong verbal and written communication skills.
Must be a team player and enjoy working in a cooperative and collaborative team environment.
Adaptable to new technologies and standards.
Participate in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support.
Minimum 7 years of hands-on experience in one or more of the above areas.
Minimum 8-11 years of industry experience.

Ideally, you’ll also have
Project management skills
Client management skills
Solutioning skills
Nice to have: Knowledge of data security best practices; Knowledge of Data Architecture Design Patterns

What We Look For
People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
Support, coaching and feedback from some of the most engaging colleagues around
Opportunities to develop new skills and progress your career
The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 month ago
8.0 years
0 Lacs
Gurugram, Haryana, India
On-site
JOB SUMMARY:
The Data Engineer interprets data requirements for a specific data/tech product and drives the design, development and implementation of relevant data models based on both external and internal assets. They develop and maintain the required enablers and platforms in a data lake environment, securing data accessibility and integrity throughout all relevant processes.

KEY RESPONSIBILITIES:
Engage with key stakeholders to identify data requirements for a specific data/tech product.
Design, build and maintain systems that capture, collect, manage, and convert raw data into usable information, securing quality and integrity (implementation of specific software for appropriate data management).
Develop mechanisms to ingest, analyze, validate, normalize and clean data, supporting key user needs (standardization, customization), and build interfaces and retention models which require synthesizing or anonymizing data.
Implement and maintain relevant procedures to secure data accessibility and quality (on new data sources uncovered by data scientists).
Secure effective integration of built models/systems within the PR environment, connecting with relevant architects/engineers, and drive continuous improvement initiatives (including maintenance).
Support data teams at key steps, sharing relevant insights/expertise (advice on data sourcing and preparation to data scientists, and on data analytics & visualization concepts, methods & techniques).
Provide data engineering best practices and bring forward new ways of thinking around data to improve business outcomes.
Mentor other Data Engineers, supporting them in complex scenarios, leveraging past experiences and developing new standards.
Participate in transversal data engineering initiatives (market intelligence, cross-product/family initiatives) as needed, and continuously develop their own skills based on industry trends/enterprise needs.

GEOGRAPHICAL SCOPE:
Scope: Global
Travel: Very Limited

INTERACTIONS:
Reporting Line (direct/indirect): Reports to the Data Engineering Chapter Lead, working in a matrix organization
Key internal stakeholders: Squad Members (Data or GES, including Data Scientists/Analysts, Data Architect), BI Analysts, Data Governance Team, Product Owners, Product Managers, etc.
Key external stakeholders: Data Engineering Supplier, External Data Providers for product scopes

FUNCTIONAL SKILLS:
Core on-cloud data engineering skills, including data extraction & storage, data transform & load.
Data tools: Azure, SQL, Snowflake, Python, DBT, Lakehouse Architecture, Databricks, ADF, LogicApp, API Management and Azure Functions
Project management & support: Jira projects & service desk, Confluence, SharePoint
Mastery of data governance, architecture & security principles
Background in software engineering/development (scripting & querying...)
Knowledge of innovative technologies is a plus
Strong communication skills, with the ability to talk with both technical & non-technical stakeholders
Agile ways of working (collaboration, CI/CD)

PAST EXPERIENCE:
Bachelor's or Master's in Computer Science
8 years of experience as a Data Engineer
Experience in an FMCG/CPG company is a strong plus
Lead and coordination experience for other data engineers
Posted 1 month ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Description and Requirements
We are seeking a skilled and experienced Azure Data Factory/Synapse Engineer with expertise in SQL and pipelines to join our dynamic team. As an Azure Data Engineer, you will be responsible for developing and implementing dynamic pipelines for the data integration platform.
Interact with the stakeholders/data engineering manager to understand the ad-hoc and strategic data/project requirements and provide logical and long-term technical solutions.
Work independently on basic to intermediate level data extraction, validation, and manipulation assignments using SQL, Python and ADF/Synapse.
Work on maintaining and supporting the day-to-day operations revolving around DW management and cleanups on the Azure Cloud platform.
Write SQL scripts to update and verify the data loads and perform data validations (an illustrative SQL sketch appears below).
Use Git and GitHub to log the development work and manage deployments.
Effectively manage the evolving priorities and maintain clear communication with the stakeholders and/or Data Engineering Manager involved.

About MetLife
Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World’s 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world’s leading financial services companies, providing insurance, annuities, employee benefits and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple - to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we’re inspired to transform the next century in financial services. At MetLife, it’s #AllTogetherPossible. Join us!
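For illustration only, a minimal sketch of the kind of load-validation SQL such a role involves; the stg.customer_load and dw.customer tables and the load_batch_id column are hypothetical assumptions, not details from the posting:

-- Compare staging and warehouse row counts for a given load batch.
SELECT
    (SELECT COUNT(*) FROM stg.customer_load WHERE load_batch_id = 20240601) AS staged_rows,
    (SELECT COUNT(*) FROM dw.customer       WHERE load_batch_id = 20240601) AS loaded_rows;

-- Duplicate-key check in the target table.
SELECT customer_id, COUNT(*) AS occurrences
FROM dw.customer
WHERE load_batch_id = 20240601
GROUP BY customer_id
HAVING COUNT(*) > 1;

-- Null-key check.
SELECT COUNT(*) AS null_keys
FROM dw.customer
WHERE load_batch_id = 20240601 AND customer_id IS NULL;

Queries like these would typically be parameterized and triggered from an ADF or Synapse pipeline step after each load.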
Posted 1 month ago
6.0 - 11.0 years
8 - 13 Lacs
Mumbai
Work from Office
Intermediate Azure Developer
We're the obstacle overcomers, the problem get-arounders. From figuring it out to getting it done, our innovative culture demands "yes and how!" We are UPS. We are the United Problem Solvers.

About this role:
The Intermediate Azure Developer will analyze business requirements, translating those requirements into Azure-specific solutions using the Azure toolsets (out of the box, configuration, customization). He/She should have the following: experience in designing and building a solution using an Azure declarative and programmatic approach; knowledge of integrating Azure with Salesforce, on-premise legacy systems, and other cloud solutions; and experience with integration middleware and Enterprise Service Bus. He/She should also have experience in translating design requirements or agile user stories into Azure-specific solutions, consuming or sending messages in XML/JSON format to third parties using SOAP and REST APIs, and expertise in Azure PaaS Service SDKs, Storage, Fluent API, integrating with Azure App Services, microservices on Azure, API Management, Event Hub, Service Bus & Message Queues, Azure Storage, Key Vaults and Application Insights, Azure Jobs, etc. He/She collaborates with teams and supports emerging technologies to ensure effective communication and achievement of objectives.

Additional details:
Will be working on a global deployment of Azure Platform Management to 40 countries and corresponding languages, 1,000 locations, and 25,000 users.
Develop large-scale distributed software services and solutions using Azure technologies.
Develop best-in-class engineering services that are well-defined, modularized, secure, reliable, configurable, flexible, diagnosable, actively monitored, and reusable.
Experience with the use of various Azure PaaS Service SDKs, Web API, Storage, App Insights, Fluent API, etc.
Experience with integrations with Azure App Services, Azure Serverless, microservices on Azure, Event Hub, Service Bus & Message Queues, Azure Storage, Key Vaults and Application Insights.
Hands-on experience with Azure Jobs, Databricks, Notebooks, PySpark scripting, ADF (Azure Data Factory), SQL, and Power BI.
Hands-on experience with Azure DevOps, building CI/CD, Azure support, code management branching, etc.
Good knowledge of programming and querying SQL Server databases.
Ensure comprehensive test coverage to validate the functionality and performance of developed solutions.
Performs tasks within planned durations and established deadlines.
Collaborates with teams to ensure effective communication in supporting the achievement of objectives.
Strong ability to debug and resolve issues/defects.
Author technical approach and design documentation.
Collaborate with the offshore team on design discussions and development items.

Minimum Qualifications:
Experience in designing and building a solution using an Azure declarative and programmatic approach.
Experience with integration middleware and Enterprise Service Bus.
Experience in consuming or sending messages in XML/JSON format to third parties using SOAP and REST APIs.
Experience with the use of various Azure PaaS Service SDKs, SQL and Web API, such as Storage, App Insights, Fluent API, etc.
Preferably 6+ years of development experience.
Minimum 4+ years of hands-on experience in development and integrations with Azure App Services, Azure Serverless, microservices on Azure, API Management, Event Hub, Function Apps, Web Jobs, Service Bus & Message Queues, Azure Storage, Key Vaults and Application Insights, Azure Jobs, Databricks, Notebooks, PySpark scripting, ADF (Azure Data Factory), Runbooks, and Power BI.
Experience with Azure DevOps, building CI/CD, Azure support, code management branching, Jenkins, Kubernetes, etc.
Good knowledge of programming and querying SQL Server databases.
Experience with Agile development.
Must be detail-oriented.
Self-motivated learner.
Ability to collaborate with others.
Excellent written and verbal communication skills.
Bachelor's degree and/or Master's degree in Computer Science or a related discipline, or the equivalent in education and work experience.

Azure Certifications:
Azure Fundamentals (mandatory)
Azure Administrator Associate (desired)
Azure Developer Associate (mandatory)

This position offers an exceptional opportunity to work for a Fortune 50 industry leader. If you are selected, you will join our dynamic technology team in making a difference to our business and customers. Do you think you have what it takes? Prove it! At UPS, ambition knows no time zone.

BASIC QUALIFICATIONS:
If required and where permitted by applicable law, employees must be fully vaccinated for COVID-19 by their date of hire/placement to be considered for employment. Fully vaccinated means two weeks after receiving the second shot for Pfizer and Moderna, or two weeks after Johnson & Johnson.
Posted 1 month ago
4.0 - 6.0 years
6 - 8 Lacs
Pune
Work from Office
Ability to take full ownership and deliver a component or functionality. Support the team to deliver project features with high quality and provide technical guidance. Responsible for working effectively individually and with team members toward customer satisfaction and success.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise: SQL, ADF, Azure Databricks
Preferred technical and professional experience: PostgreSQL, MSSQL, Eureka, Hystrix, Zuul/API Gateway, in-memory storage
Posted 1 month ago
0 years
0 Lacs
Bengaluru East, Karnataka, India
On-site
Req ID: 321505
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Test Analyst to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Job Duties:
Understand business requirements and develop test cases.
Work with the tech team and client to validate and finalise test cases.
Use Jira or an equivalent test management tool to record test cases, expected results and outcomes, and assign defects.
Run tests in the testing phases – SIT and UAT.
Test reporting & documentation.
Basic knowledge of Snowflake, SQL, ADF (optional) and Fivetran (optional).

Minimum Skills Required:
Test case development
Jira knowledge for recording test cases, expected results and outcomes, and assigning defects
Test reporting & documentation
Basic knowledge of Snowflake, SQL, ADF (optional) and Fivetran (optional)

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com
NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
Posted 1 month ago
12.0 - 15.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Role: Technical Architect – Microsoft Dynamics 365 & Power Apps
Must have skills: Dynamics 365, Power Platform
Experience: 12-15 years
Location: Hyderabad, India

We are seeking a dynamic Technical Architect with expertise in Microsoft Dynamics 365 and Power Apps to lead innovative solutions, drive system integrations, and optimize architectures for business transformation.

Key Responsibilities:
• Proficient in Dynamics 365 Customer Engagement (Sales and Service).
• Hands-on experience with Dynamics 365 in both Online and On-Premises environments.
• Strong skills in C#, .NET, and JavaScript.
• Familiarity with MS SQL, CRM SDK, and MSD Developer Toolkit.
• Knowledgeable in SSRS (SQL Server Reporting Services).
• Expertise in integration architecture, covering REST, OData, WebAPI, middleware tools, and SSIS packages.
• Experience with Azure components for integration (ADF, Azure Service Bus, Azure Apps, Azure Functions, Azure Data Lake, etc.).
• Familiarity with Azure Active Directory and Azure DevOps.
• Proficiency in Power Apps and Power Automate for system integrations.
• Experience with data migration, analysis, mappings, and harmonization using both out-of-the-box (OOB) and third-party tools.
• Understanding of reporting architecture, including SSRS and Power BI.
• Experience with CRM instance management and cloud services.
• Utilize expertise in Dynamics 365 CE CRM to enhance project efficiency.
• Plan and establish Dynamics 365 Customer Engagement solutions, focusing on Sales and Service modules.
• Lead the design of architecture, configuration, and customization in Dynamics 365 CE.
• Provide technical insights and best practices for seamless system integration and data migration.
• Collaborate closely with diverse teams to gather and analyze business requirements.
• Respond to requests for proposals (RFPs) and quotes (RFQs) by formulating solution proposals that showcase potential outcomes.
• Lead technical workshops to translate business needs into practical architectural and development solutions.
• Develop and present proofs of concept (PoCs) to illustrate proposed solutions to clients.
• Ensure the technical success of Dynamics 365 CE projects, overseeing the process from pre-sales to delivery.
• Contribute to the development of scalable solutions aligned with the product roadmap.
• Stay abreast of the latest features in the Dynamics 365 platform, including Sales Insights, Customer Service Insights, and Customer Voice.
• Strong problem-solving and analytical skills.
• Ability to think independently and be solution-driven.
• Familiarity with agile methodologies.
Posted 1 month ago
15.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
AES_325 Technical Architect – Microsoft Dynamics 365 & Power Apps

Why Join Us
Lead transformative projects with cutting-edge Microsoft technologies.
Collaborate with high-performing teams across digital innovation initiatives.
Competitive compensation and opportunities for career advancement.
Engage with complex enterprise architectures and solve meaningful business challenges.

Role Overview
We are looking for a seasoned Technical Architect with 12–15 years of experience in Microsoft Dynamics 365 and Power Platform. The role is ideal for a visionary professional ready to lead solution design, integration, and architecture for enterprise-level CRM and Power Apps implementations.
Location: East Coast, USA

Key Responsibilities
Architect, design, and implement scalable Dynamics 365 Customer Engagement (Sales and Service) solutions.
Lead customization, configuration, and integration efforts for Dynamics 365 Online and On-Premise environments.
Drive end-to-end system integration using REST, OData, WebAPI, and middleware tools.
Leverage Power Platform capabilities (Power Apps, Power Automate) for process automation and app development.
Lead technical workshops and present solution architectures, PoCs, and technical proposals to clients.
Collaborate with cross-functional teams to gather business requirements and translate them into technical designs.
Ensure seamless data migration, harmonization, and reporting through tools like SSRS and Power BI.
Utilize Azure services (ADF, Service Bus, Apps, Functions, Data Lake) for cloud-based integration and deployment.
Oversee Dynamics 365 CRM instance management and lifecycle in cloud environments.
Maintain up-to-date knowledge on platform advancements including Sales Insights, Customer Service Insights, and Customer Voice.

Technical Skills
Deep expertise in Dynamics 365 CE architecture and deployment.
Proficiency in C#, .NET, JavaScript, MS SQL, CRM SDK, and MSD Developer Toolkit.
Experience with Azure Active Directory, Azure DevOps, and SSIS packages.
Solid grasp of agile development methodologies.

What We’re Looking For
A problem-solver with strong analytical and independent thinking abilities.
A collaborative leader who can bridge business needs with technical solutions.
Excellent communication and stakeholder engagement skills.

Ready to make an impact? Apply now and join us on our journey!
Posted 1 month ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Our technology services client is seeking multiple Azure Data & Analytics Engineers to join their team on a contract basis. These positions offer a strong potential for conversion to full-time employment upon completion of the initial contract period. Below are further details about the role:

Role: Azure Data & Analytics Engineer
Mandatory Skills: Agile Methodologies, Python, Databricks, Azure Cloud, Data Factory, Data Validations
Experience: 5-8 years
Location: PAN India
Notice Period: 0-15 days

Job Description:
5 years of software solution development using an agile, DevOps, product model that includes designing, developing, and implementing large-scale applications or data engineering solutions.
5+ years of data analytics experience using SQL.
5+ years of full-stack development experience, preferably in Azure.
5+ years of cloud development (prefer Microsoft Azure) including Azure EventHub, Azure Data Factory, Azure Functions, ADX, ASA, Azure Databricks, Azure DevOps, Azure Blob Storage, Azure Power Apps, and Power BI.
1+ years of FastAPI experience is a plus.
Airline industry experience.

Skills, Licenses & Certifications:
Expertise with the Azure technology stack for data management, data ingestion, capture, processing, curation and creating consumption layers.
Azure Development Track Certification (preferred)
Spark Certification (preferred)

If you are interested, share the updated resume to sushmitha.r@s3staff.com
Posted 1 month ago
10.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Mars Data is hiring for full-time Dot Net Developer positions in Trivandrum/Kochi locations.

Skills: .NET/.NET Core 6/8+, T-SQL, Azure Cloud Services, Azure DevOps, React.js/Angular.js, C#, xUnit, MSTest, RDBMS, AWS, CI/CD, SDLC, RESTful API, PowerShell, Agile/Scrum/Jira

Job Title: Dot Net Developer
Location: Trivandrum/Kochi
Job type: Full Time
Working hours: 8 hours, mid shift
Notice Period: Immediate
Relevant Experience: 10+ years

Introduction
Candidates should have 10+ years of experience in the IT industry with strong .NET/.NET Core, Azure Cloud Services and Azure DevOps skills. This is a client-facing role and hence requires strong communication skills. This is for a US client and the resource should be hands-on, with experience in coding and Azure Cloud.

Responsibilities include
• Develop, enhance, document, and maintain application features in .NET Core 6/8+, C#, REST API, T-SQL and AngularJS/React JS
• Application support & API integrations with third-party solutions/services
• Understand technical project priorities, implementation dependencies, risks and issues
• Participate and develop code as part of a unified development group, working the whole technological stack
• Identify, prioritize and execute tasks in the software development life cycle
• Work with the team to define, design, and deliver on new features
• Broad and extensive knowledge of the software development life cycle (SDLC) with software development models like Agile and Scrum, and Jira-based workflows

Primary Skills
• Develop high-quality software design and architecture
• 10+ years of development experience in C#, .NET technologies and SQL, with at least 2 years working with Azure Cloud Services
• Expertise in C#, .NET Core 6.0/8.0 or higher, Entity Framework, EF Core, microservices, Azure Cloud services, Azure DevOps and SOA
• Ability to lead, inspire and motivate teams through effective communication and established credibility
• Guide the team to write reusable, testable, performant and efficient code
• Proficient in writing unit test cases using xUnit and MSTest
• Build standards-based frameworks and libraries to support a large-scale application
• Expertise in RDBMS including MS SQL Server, with thorough knowledge of writing SQL queries, stored procedures, views, functions, packages, cursors, tables and object types
• Experience in large-scale software development
• Prior experience in application support & API integrations
• Knowledge of architectural styles and design patterns, and experience in designing solutions
• Strong debugging and problem-solving skills
• Effective communication skills, technical documentation, leadership and ownership qualities

Azure Skills
• Azure messaging services - Service Bus or Event Grid, Event Hub
• Azure Storage Account - Blobs, Tables, Queues etc.
• Azure Functions / Durable Functions
• Azure ADF and Logic Apps
• Azure DevOps - CI/CD pipelines (classic/YAML)
• Application Insights, Azure Monitor, Key Vault and SQL Azure

Secondary Skills
• Good knowledge of JavaScript, React JS, jQuery, Angular and other front-end technologies
• API Management - APIM
• Azure containerization and container orchestration

Contact #8825984917 or send your resume to hr@marsdata.in
Posted 1 month ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Description
We are seeking an experienced Azure Cloud & Database Administrator to join our dynamic team. The ideal candidate will be responsible for configuring, managing, and supporting Azure cloud services and SQL Server databases, along with handling cloud migrations and DevOps pipelines. This role requires a proactive professional with strong problem-solving skills and hands-on experience in cloud solutions and database administration.

Responsibilities:
Configure, manage, and maintain various Azure services to ensure optimal performance and availability.
Perform hands-on administration of SQL Server databases, including backup and disaster recovery planning.
Manage and support Azure Data Lake (ADLS) and Azure Data Factory (ADF) solutions.
Lead the migration of on-premises instances to sustainable and scalable cloud-based architectures.
Design, implement, and manage pipelines using Terraform and Azure DevOps for infrastructure automation and deployment.
Execute remediation activities, including the creation and configuration of new resources such as Azure Databricks (ADB), ADLS, and ADF.
Configure mount points and manage storage solutions within Azure environments.
Facilitate notebook migrations between different Databricks instances.
Conduct proactive monitoring, troubleshooting, and resolution of real-time issues in Azure environments and SQL databases.
Analyze system requirements, refine and automate recurring processes, and maintain clear documentation of changes and procedures.
Collaborate with development teams to assist in query tuning and schema optimization.
Provide 24x7 support for critical production systems to ensure minimal downtime and business continuity.
Implement and manage security processes and services within Azure Storage and related cloud environments.

Requirements:
Bachelor's degree in Information Technology, Computer Science, or a related field.
Proven hands-on experience in Azure cloud services, database administration, and DevOps practices.
Strong understanding of cloud migration, storage solutions, security best practices, and disaster recovery planning.
Experience with infrastructure-as-code tools such as Terraform.
Strong analytical and problem-solving abilities with attention to detail.
Excellent communication and collaboration skills.
Posted 1 month ago
3.0 - 6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
We are seeking an experienced and highly skilled Oracle Fusion Technical Consultant to join our dynamic team. The ideal candidate will have a strong technical foundation in Oracle technologies with hands-on experience in Oracle Fusion Cloud Applications. This role demands excellent problem-solving abilities, a deep understanding of Oracle Fusion Financial modules, and the capability to work on integration and reporting requirements for our business applications.

Responsibilities
As a Technical Consultant, you will be responsible for designing, developing, and supporting various Oracle Fusion technical components including integrations, reports, and data migrations. You will collaborate closely with business analysts, functional consultants, and end-users to understand business processes and translate them into technical solutions.

Key Responsibilities:
Develop and maintain custom and standard reports using BI Publisher (BIP), OTBI (Oracle Transactional Business Intelligence), and FRS (Financial Reporting Studio) tools (an illustrative report-style query appears after this listing).
Build and manage integrations using Oracle Integration Cloud (OIC) for both inbound and outbound scenarios, including handling of seeded and custom integrations.
Perform technical development using Oracle Database SQL and PL/SQL for various data extraction, transformation, and reporting needs.
Customize and personalize the Fusion Applications UI through sandbox-based customization techniques.
Conduct data migration activities using FBDI (File-Based Data Import) and ADFdi (ADF Desktop Integrator) tools for bulk data uploads and updates.
Design, develop, and manage REST APIs and SOAP Web Services for interfacing Oracle Fusion Applications with external systems.
Understand and work with Oracle Fusion Financials data structures and standard tables to build efficient queries, reports, and integrations.
Provide technical production support, troubleshooting, and performance tuning of Fusion Applications and integrations.
Work closely with cross-functional teams to analyze business requirements and provide optimal technical solutions.
Stay updated with the latest Oracle Cloud Infrastructure (OCI) offerings and implement best practices for secure and scalable solutions.
Prepare technical design documents, unit test cases, and deployment documentation for all deliverables.
Participate in system upgrades, patches, and enhancement projects as needed.

Skills & Qualifications:
3 to 6 years of hands-on experience as an Oracle Fusion Technical Consultant.
Strong technical expertise in Oracle Database SQL and PL/SQL.
Extensive experience in report development using BI Publisher (BIP), OTBI, and FRS.
Solid experience working with Oracle Integration Cloud (OIC) for custom and seeded integrations.
Well-versed with sandbox-based customizations/personalizations in Oracle Fusion Applications.
Practical experience in data migration activities using FBDI and ADFdi tools.
Hands-on experience with REST APIs and SOAP Web Services.
Good understanding of Oracle Fusion Financial modules' data structures and business processes.
Familiarity with Oracle Cloud Infrastructure (OCI) fundamentals.
Strong analytical, problem-solving, and troubleshooting skills.
Excellent communication skills, both written and verbal.
Ability to work independently as well as collaboratively in a team.

Skills (Nice to Have):
Oracle Fusion
Exposure to Oracle Fusion HCM or SCM modules.
Experience working with Agile/Scrum methodologies.
Familiarity with Application Security Console, HCM Extracts, or VBCS will be a plus.
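For illustration only, a minimal sketch of the kind of SQL used behind a BI Publisher data model for Fusion Financials reporting; the table and column names follow the commonly documented GL journal schema and the :p_period_name bind parameter is a hypothetical assumption, to be verified against the actual Fusion data model:

-- Journal lines with their header context for a given accounting period.
SELECT jh.je_header_id,
       jh.name        AS journal_name,
       jh.period_name,
       jl.je_line_num,
       jl.entered_dr,
       jl.entered_cr
FROM   gl_je_headers jh
JOIN   gl_je_lines   jl ON jl.je_header_id = jh.je_header_id
WHERE  jh.period_name = :p_period_name
ORDER BY jh.je_header_id, jl.je_line_num;

In a BIP data model, a query like this would typically be parameterized and paired with a report layout template.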
Posted 1 month ago
5.0 years
0 Lacs
Greater Kolkata Area
On-site
Job Description

Responsibilities:
Architect and design end-to-end data solutions on the cloud platform, focusing on data warehousing and big data platforms.
Collaborate with clients, developers, and architecture teams to understand requirements and translate them into effective data solutions.
Develop high-level and detailed data architecture and design documentation.
Implement data management and data governance strategies, ensuring compliance with industry standards.
Architect both batch and real-time data solutions, leveraging cloud-native services and technologies.
Design and manage data pipeline processes for historic data migration and data integration.
Collaborate with business analysts to understand domain data requirements and incorporate them into the design deliverables.
Drive innovation in data analytics by leveraging cutting-edge technologies and methodologies.
Demonstrate excellent verbal and written communication skills to communicate complex ideas and concepts effectively.
Stay updated on the latest advancements in data analytics, data architecture, and data management techniques.

Requirements
Minimum of 5 years of experience in a Data Architect role, supporting warehouse and cloud data platforms/environments (Azure).
Extensive experience with common Azure services such as ADLS, Synapse, Databricks, Azure SQL etc.
Experience with Azure services such as ADF, PolyBase, Azure Stream Analytics.
Proven expertise in Databricks architecture, Delta Lake, Delta Sharing, Unity Catalog, data pipelines, and Spark tuning.
Strong knowledge of Power BI architecture, DAX, and dashboard optimization.
In-depth experience with SQL, Python, and/or PySpark.
Hands-on knowledge of data governance, lineage, and cataloging tools such as Azure Purview and Unity Catalog.
Experience in implementing CI/CD pipelines for data and BI components (e.g., using DevOps or GitHub).
Experience in building semantic models in Power BI.
Strong expertise in data exploration using SQL and a deep understanding of data relationships.
Extensive knowledge and implementation experience in data management, governance, and security frameworks.
Proven experience in creating high-level and detailed data architecture and design documentation.
Strong aptitude for business analysis to understand domain data requirements.
Proficiency in data modeling using any modeling tool for conceptual, logical, and physical models is preferred.
Hands-on experience with architecting end-to-end data solutions for both batch and real-time designs.
Ability to collaborate effectively with clients, developers, and architecture teams to implement enterprise-level data solutions.
Familiarity with Data Fabric and Data Mesh architecture is a plus.
Excellent verbal and written communication skills.
Posted 1 month ago
0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Job Description
Role: Azure Databricks Developer

Overview: Data Engineer with good experience in Azure Databricks and Python.
Must have: Databricks, Python, Azure
Good to have: ADF

Requirements
The candidate must be proficient in Databricks.
Understands where to obtain information needed to make the appropriate decisions.
Demonstrates ability to break down a problem into manageable pieces and implement effective, timely solutions.
Identifies the problem versus the symptoms.
Manages problems that require involvement of others to solve.
Reaches sound decisions quickly.
Develops solutions to meet business needs that reflect a clear understanding of the objectives, practices and procedures of the corporation, department and business unit.

Roles & Responsibilities
Provides innovative and cost-effective solutions using Databricks.
Optimizes the use of all available resources.
Develops solutions to meet business needs that reflect a clear understanding of the objectives, practices and procedures of the corporation, department and business unit.
Learns and adapts quickly to new technologies as per the business need.
Develops a team of operations excellence, building tools and capabilities that the development teams leverage to maintain high levels of performance, scalability, security and availability.

Skills
The candidate must have 7-10 years of experience in Databricks Delta Lake.
Hands-on experience with Azure.
Experience in Python scripting.
Relevant experience with ETL methods and with retrieving data from dimensional data models and data warehouses.
Strong experience with relational databases and data access methods, especially SQL.
Knowledge of Azure architecture and design.
Posted 1 month ago
3.0 - 5.0 years
0 Lacs
Serilingampalli, Telangana, India
On-site
Key Accountabilities
Using Microsoft Azure data PaaS services, design, build, modify, and support data pipelines leveraging Databricks and Power BI in a medallion architecture setting (an illustrative bronze-to-silver sketch follows this listing).
If necessary, create prototypes to validate proposed ideas and solicit input from stakeholders.
Excellent grasp of and expertise with test-driven development and continuous integration processes.
Analysis and Design – Converts high-level design to low-level design and implements it.
Collaborate with Team Leads to define/clarify business requirements, estimate development costs, and finalize work plans.
Run unit and integration tests on all created code – Create and run unit and integration tests throughout the development lifecycle.
Benchmark application code proactively to prevent performance and scalability concerns.
Collaborate with the Quality Assurance Team on issue reporting, resolution, and change management.
Support and Troubleshooting – Assist the Operations Team with any environmental issues that arise during application deployment in the Development, QA, Staging, and Production environments.
Assist other teams in resolving issues that may develop as a result of applications or the integration of multiple components.

Knowledge And Experience
Understanding of design concepts and architectural basics.
Knowledge of performance engineering.
Understanding of quality processes and estimation methods.
Fundamental grasp of the project domain.
The ability to transform functional and non-functional needs into system requirements.
The ability to develop and code complicated applications is required.
The ability to create test cases and scenarios based on specifications.
Solid knowledge of SDLC and agile techniques.
Knowledge of current technology and trends.
Logical thinking and problem-solving abilities, as well as the capacity to collaborate.

Primary skills: Cloud Platform, Azure, Databricks, ADF, ADO.
Sought: SQL, Python, Power BI.
General Knowledge: PowerApps, Java.
3-5 years of experience in software development with a minimum of 2 years of cloud computing.

Education
Bachelor of Science in Computer Science, Engineering, or a related technical field.
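For illustration, a minimal sketch of a bronze-to-silver transformation in a medallion setting, written in Databricks SQL; the bronze/silver schemas and the order columns are hypothetical assumptions, not details from the posting:

-- Promote cleansed, de-duplicated order events from the bronze (raw) layer to the silver (curated) layer.
CREATE OR REPLACE TABLE silver.orders AS
SELECT
    order_id,
    customer_id,
    CAST(order_ts AS TIMESTAMP)      AS order_ts,
    CAST(amount   AS DECIMAL(18, 2)) AS amount
FROM bronze.orders_raw
WHERE order_id IS NOT NULL
QUALIFY ROW_NUMBER() OVER (PARTITION BY order_id ORDER BY order_ts DESC) = 1;

A gold layer would then aggregate silver tables into reporting marts consumed by Power BI.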
Posted 1 month ago
5.0 years
0 Lacs
India
On-site
Our technology services client is seeking multiple Data Analytics professionals with SQL, Databricks, and ADF to join their team on a contract basis. These positions offer strong potential for conversion to full-time employment upon completion of the initial contract period. Further details about the role:

Role: Data Analytics with SQL, Databricks, ADF
Mandatory Skills: SQL, Databricks, ADF
Experience: 5-7 Years
Location: Pan India
Notice Period: Immediate to 15 Days

Required Qualifications:
- 5 years of software solution development using an agile, DevOps, product-model approach that includes designing, developing, and implementing large-scale applications or data engineering solutions.
- 5+ years of data analytics experience using SQL.
- 5+ years of full-stack development experience, preferably in Azure.
- 5+ years of cloud development (Microsoft Azure preferred), including Azure Event Hubs, Azure Data Factory, Azure Functions, ADX, ASA, Azure Databricks, Azure DevOps, Azure Blob Storage, Azure Power Apps, and Power BI (an illustrative Event Hubs sketch follows below).
- 1+ years of FastAPI experience is a plus.
- Airline industry experience.
- Expertise with the Azure technology stack for data management, data ingestion, capture, processing, curation, and creating consumption layers.
- Azure Development Track certification (preferred).
- Spark certification (preferred).

If you are interested, kindly share an updated resume to Sathwik@s3staff.com
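As a small illustration of one of the Azure services listed above, the sketch below consumes events from Azure Event Hubs with the azure-eventhub Python SDK. The connection string, hub name, and consumer group are placeholders; a production pipeline would typically add a checkpoint store and forward events to downstream storage such as ADX or a Databricks bronze table.

```python
# Hedged sketch: reading telemetry from Azure Event Hubs with the azure-eventhub SDK.
# Connection string, hub name, and consumer group are placeholders.
from azure.eventhub import EventHubConsumerClient

CONN_STR = "<event-hubs-namespace-connection-string>"
EVENTHUB_NAME = "telemetry"  # hypothetical hub name


def on_event(partition_context, event):
    # In a real pipeline this is where records would be parsed and forwarded downstream.
    print(f"partition {partition_context.partition_id}: {event.body_as_str()}")
    partition_context.update_checkpoint(event)


client = EventHubConsumerClient.from_connection_string(
    CONN_STR, consumer_group="$Default", eventhub_name=EVENTHUB_NAME
)

with client:
    # starting_position="-1" reads from the beginning of each partition
    client.receive(on_event=on_event, starting_position="-1")
```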
Posted 1 month ago
7.0 - 12.0 years
15 - 30 Lacs
Pune, Chennai, Bengaluru
Hybrid
Job Role: Lead - Data Engineer
Location: Bangalore, Mumbai, Chennai, Hyderabad, Pune
Work Mode: Hybrid
Shift: 4 pm - 1.30 am (cab drop)
Experience: 7 to 12 years (Technical Lead)
Skills: Snowflake, ADF (Azure Data Factory), migration, technical lead/team handling (good to have)

Job Overview: The Lead Data Engineer will implement monitoring solutions, optimize ETL jobs for performance, and provide comprehensive support from data ingestion and migration through to final output.

Role & responsibilities:
- Lead the design, development, and maintenance of a scalable Snowflake data solution serving our enterprise data & analytics team.
- Architect and implement data pipelines, ETL/ELT workflows, and data warehouse solutions using Snowflake and related technologies.
- Design and implement data pipelines for ingesting data from various sources (databases, cloud storage, streaming platforms) into Snowflake.
- Deep understanding of Snowflake's architecture, features, and best practices.
- Implement data transformations and quality checks within ADF pipelines; experience migrating SSIS jobs to ADF.
- Experience with the various ingestion methods in Snowflake (COPY, Snowpipe, dynamic tables, streams and tasks); a brief illustrative sketch follows below.
- Deep understanding of EDW concepts, CDC mechanisms, and dimensional modelling concepts.
- Optimize and maintain the Snowflake data warehouse and query performance.
- Integrate Snowflake with various data sources and third-party tools.
- Collaborate with cross-functional teams to define and implement data solutions.
- Provide technical leadership and mentorship to junior and mid-level data engineers.

Preferred candidate profile:
- Experience implementing CI/CD pipelines for Snowflake using Azure DevOps.
- 7+ years of in-depth data engineering experience, with at least 3 years of dedicated experience engineering solutions in a Snowflake environment.
- Proficiency in SQL and Python or other scripting languages.
- Strong experience with cloud platforms (preference for Azure) and their data services.
- Proficiency in ETL/ELT development using tools such as Azure Data Factory.
- Experience in data migration to the cloud.
- Knowledge of data security best practices and relevant regulations.
- SnowPro Core certification.
- Candidates serving notice or able to join immediately are highly preferred.
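As a brief, hedged illustration of one of the Snowflake ingestion methods mentioned above, the sketch below runs a COPY INTO load from an external stage, driven from Python with the Snowflake connector. The account, credentials, warehouse, stage, and table names are placeholders, not details from the role.

```python
# Sketch of a COPY INTO ingestion into Snowflake using the Python connector.
# All identifiers and credentials below are placeholders for illustration only.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="STAGING",
)

COPY_SQL = """
COPY INTO staging.orders_raw
FROM @azure_landing_stage/orders/
FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
ON_ERROR = 'ABORT_STATEMENT'
"""

with conn.cursor() as cur:
    cur.execute(COPY_SQL)
    print(cur.fetchall())  # COPY INTO returns per-file load results

conn.close()
```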
Posted 1 month ago
8.0 - 12.0 years
12 - 16 Lacs
Pune
Work from Office
Roles & Responsibilities:
- Design and develop end-to-end data solutions using PySpark, Python, SQL, and Kafka, leveraging Microsoft Fabric's capabilities (an illustrative streaming sketch follows below).

Requirements:
- Hands-on experience with Microsoft Fabric, including Lakehouse, Data Factory, and Synapse.
- Strong expertise in PySpark and Python for large-scale data processing and transformation.
- Deep knowledge of Azure data services (ADLS Gen2, Azure Databricks, Synapse, ADF, Azure SQL, etc.).
- Experience in designing, implementing, and optimizing end-to-end data pipelines on Azure.
- Understanding of Azure infrastructure setup (networking, security, and access management) is good to have.
- Healthcare domain knowledge is a plus but not mandatory.
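A minimal sketch of the kind of PySpark plus Kafka pipeline this role describes: reading a topic with Structured Streaming and appending it to a Lakehouse Delta table. The broker address, topic, checkpoint path, and table name are assumptions for the example only.

```python
# Illustrative Structured Streaming job: Kafka topic -> Lakehouse Delta table.
# Broker, topic, checkpoint path, and table name are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # supplied by the Fabric/Synapse Spark runtime

# Read the raw Kafka stream
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "orders")
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers key/value as binary, so cast them to strings before landing
parsed = events.select(
    F.col("key").cast("string").alias("event_key"),
    F.col("value").cast("string").alias("payload"),
    "timestamp",
)

# Append into a Delta table; the checkpoint keeps the stream restartable
query = (
    parsed.writeStream.format("delta")
    .option("checkpointLocation", "Files/checkpoints/orders_stream")
    .outputMode("append")
    .toTable("orders_raw")
)
query.awaitTermination()
```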
Posted 1 month ago
3.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Oracle Technical Consultant (Fusion Middleware) – Senior

Job Summary: We are seeking a proactive and experienced Oracle Technical Consultant (OCI) – Staff Level to join our team. The ideal candidate will have expertise in Oracle EBS/Cloud, a strong technical background, and experience in executing client transformation projects in ERP, Supply Chain, Procurement, and IT support.

Primary Responsibilities and Accountabilities:
- Assist in executing client transformation-related engagements in the areas of Supply Chain, Procurement, Risk & Compliance, and ERP/IT support.
- Ensure high-quality work delivery.
- Identify engagement-related risks and escalate issues as appropriate.
- Establish and maintain strong relationships with clients (process owners/functional heads) and internal teams.
- Support Managers in meeting both client and internal KPIs.

Experience:
- Core experience in Oracle technical activities.
- Up to 3 years of relevant experience working in Oracle HCM (EBS/Fusion).
- Experience in at least one full life-cycle implementation.
- Experience with Oracle Fusion Middleware components such as Oracle SOA Suite, Oracle Service Bus (OSB), and Oracle BPM.
- Experience in major industry sectors such as Retail, Government, Energy, Real Estate, Oil & Gas, and Power & Utilities.

Competencies / Skills:
- Proficiency in Java, J2EE, and WebLogic Server.
- Experience with Oracle WebCenter and Oracle Identity Management.
- Knowledge of Oracle ADF (Application Development Framework).
- Experience with Oracle API Gateway and Oracle API Management.
- Familiarity with Oracle GoldenGate for data integration and replication.
- Experience with cloud-native development and microservices architecture.
- Knowledge of containerization technologies like Docker and Kubernetes.
- Experience with CI/CD pipelines and tools like Jenkins, Git, and Maven.
- Understanding of security best practices and compliance standards in middleware environments.

Education:
- Graduate with a degree such as B.Tech or MBA; Oracle certified.
- Bachelor’s degree in IT, Computer Science, or a related technical discipline.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 month ago
0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Hello, good day!

Minimum 10 years of hands-on experience with the subject. We are currently looking for an individual with the following skill set. Kindly reach out to me if you possess these skills. The required profile is shared below for your reference.

Responsibilities / Expectations from the Role:
- Develop data models and patterns for data platform solutions.
- Conceptualize and design a centralized data lake solution to bring in data from disparate sources.
- Implement security on the data lake solution.
- Present agreed solutions to the business to get buy-in on the strategy and roadmap.

Technical:
- Experience in data-intensive activities (data modeling, ER diagrams, ETL, data mapping, data governance and security).
- Experience in end-to-end implementation of a large data lake or data warehouse solution.
- Experience with Azure Data Lake and Unity Catalog (a brief illustrative sketch follows below).
- Strong knowledge of data governance.
- Hands-on experience with ETL tools (ADF) and Spark pools.
- Strong knowledge of and experience in data modeling.
- Strong knowledge of data security, encryption, and monitoring.
- Knowledge of DevOps.
- Analytical and problem-solving skills, with a high degree of initiative and the flexibility to be available over extended hours.
- Ability to communicate with business SMEs.
- Understanding of design patterns and data models.
- Experienced with the onsite-offshore delivery model.
- Technical background, ideally within the managed services and IT outsourcing industry.
- Certification on data platform solutions.

Behavioral:
- Create detailed requirement analyses and provide solutions.
- Strong analytical thinking and problem-solving skills.
- Strong communication, presentation, and writing skills.
- Proactive, with a can-do attitude.

Good to have:
- Experience with cloud tools and technologies such as ADO, ServiceNow, Azure Monitoring, performance management, analytics, etc.
- Understanding of Agile processes.
- Maintains broad and current knowledge of the cloud platform industry.
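To make the Azure Data Lake and Unity Catalog point above concrete, here is a short, hedged sketch of registering externally stored Delta data as a governed Unity Catalog table and granting read access, executed as Spark SQL from a Databricks notebook. The catalog, schema, table, storage path, and group names are all hypothetical.

```python
# Sketch: registering lake data under Unity Catalog governance from a Databricks notebook.
# Catalog, schema, table, path, and group names are placeholders for illustration.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

spark.sql("CREATE CATALOG IF NOT EXISTS lakehouse")
spark.sql("CREATE SCHEMA IF NOT EXISTS lakehouse.curated")

# Delta data already landed in ADLS becomes a governed external table
spark.sql("""
    CREATE TABLE IF NOT EXISTS lakehouse.curated.customers
    USING DELTA
    LOCATION 'abfss://curated@examplelake.dfs.core.windows.net/customers'
""")

# Unity Catalog handles fine-grained access control centrally
spark.sql("GRANT SELECT ON TABLE lakehouse.curated.customers TO `data_analysts`")
```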
Posted 1 month ago