
472 Data Services Jobs - Page 17

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

0.0 - 5.0 years

1 - 5 Lacs

Bengaluru

Work from Office

Naukri logo

Job Title: Data Engineer - DBT (Data Build Tool)
Experience: 0-5 Years
Location: Bengaluru

Job Responsibilities:
- Assist in the design and implementation of a Snowflake-based analytics solution (data lake and data warehouse) on AWS
- Requirements definition, source data analysis and profiling, logical and physical design of the data lake and data warehouse, and design of data integration and publication pipelines
- Develop Snowflake deployment and usage best practices
- Help educate the rest of the team on the capabilities and limitations of Snowflake
- Build and maintain data pipelines adhering to the suggested enterprise architecture principles and guidelines
- Design, build, test, and maintain data management systems
- Work in sync with internal and external team members such as data architects, data scientists, and data analysts to handle technical issues
- Act as a technical leader within the team
- Work in an Agile/Lean model and deliver quality deliverables on time
- Translate complex functional requirements into technical solutions

Expertise and Qualifications

Essential skills, education and experience:
- B.E. / B.Tech. / MCA or equivalent degree, along with 4-7 years of experience in Data Engineering
- Strong experience with DBT concepts such as model building and configuration, incremental load strategies, macros, and DBT tests
- Strong experience in SQL
- Strong experience in AWS
- Creation and maintenance of an optimal data pipeline architecture for ingestion and processing of data
- Creation of the necessary infrastructure for ETL jobs from a wide range of data sources using Talend, DBT, S3, and Snowflake
- Experience with data storage technologies such as Amazon S3, SQL, and NoSQL
- Technical awareness of data modeling
- Experience working with stakeholders in different time zones

Good to have:
- AWS data services development experience
- Working knowledge of big data technologies
- Experience collaborating with data quality and data governance teams
- Exposure to reporting tools such as Tableau
- Apache Airflow, Apache Kafka (nice to have)
- In-depth understanding of the payments domain (CRM, Accounting, etc.); regulatory reporting exposure

Other skills:
- Good communication skills; team player; problem solver
- Willingness to learn new technologies, share ideas, and assist other team members as needed
- Strong analytical and problem-solving skills; ability to define problems, collect data, establish facts, and draw conclusions
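The incremental load strategies mentioned above can be illustrated with a high-water-mark filter, the pattern that incremental models in tools like DBT typically generate. This is a plain-Python sketch with hypothetical rows, not DBT syntax:

```python
from datetime import datetime

# Hypothetical source and target tables, represented as lists of row dicts.
source_rows = [
    {"id": 1, "updated_at": datetime(2024, 1, 1)},
    {"id": 2, "updated_at": datetime(2024, 2, 1)},
    {"id": 3, "updated_at": datetime(2024, 3, 1)},
]
target_rows = [
    {"id": 1, "updated_at": datetime(2024, 1, 1)},
]

def incremental_load(source, target):
    """Append only rows newer than the latest timestamp already loaded."""
    if target:
        high_water_mark = max(row["updated_at"] for row in target)
        new_rows = [r for r in source if r["updated_at"] > high_water_mark]
    else:
        # First run: nothing loaded yet, so do a full load.
        new_rows = list(source)
    target.extend(new_rows)
    return target

incremental_load(source_rows, target_rows)
# target_rows now holds ids 1, 2 and 3; only 2 and 3 were processed this run.
```

The point of the pattern is that each run touches only rows newer than what is already in the target, which is what makes large daily loads tractable.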

Posted 1 month ago

Apply

10.0 - 15.0 years

12 - 17 Lacs

Indore, Hyderabad, Ahmedabad

Work from Office


Job Title: Technical Architect / Solution Architect / Data Architect (Data Analytics)
Notice Period: Immediate to 15 Days
Experience: 9+ Years

Job Summary: We are looking for a highly technical and experienced Data Architect / Solution Architect / Technical Architect with expertise in Data Analytics. The candidate should have strong hands-on experience in solutioning, architecture, and cloud technologies to drive data-driven decisions.

Key Responsibilities:
- Design, develop, and implement end-to-end data architecture solutions.
- Provide technical leadership in Azure, Databricks, Snowflake, and Microsoft Fabric.
- Architect scalable, secure, and high-performing data solutions.
- Work on data strategy, governance, and optimization.
- Implement and optimize Power BI dashboards and SQL-based analytics.
- Collaborate with cross-functional teams to deliver robust data solutions.

Primary Skills Required:
- Data Architecture & Solutioning
- Azure Cloud (Data Services, Storage, Synapse, etc.)
- Databricks & Snowflake (Data Engineering & Warehousing)
- Power BI (Visualization & Reporting)
- Microsoft Fabric (Data & AI Integration)
- SQL (Advanced Querying & Optimization)

Looking for immediate to 15-day joiners!

Posted 1 month ago

Apply

2.0 - 5.0 years

3 - 6 Lacs

Bengaluru

Work from Office


Summary: Kroll Agency and Trustee Services provides conflict-free, flexible, and highly efficient administrative and trustee services to the global loan and bond markets. As a leading independent service provider, we specialize in the administration of privately placed notes, restructuring situations, and syndicated, bilateral, and private credit transactions. Our team of industry-leading experts, coupled with our high-touch service, speed of execution, and 24/7 responsiveness, sets us apart from other providers. To learn more, please visit https://www.kroll.com/en/services/agency-and-trustee-services

We are currently hiring for a position within the Data Services process. This role requires working closely with Transaction Managers (Front Office), external clients, KYC, and the Operations team to ensure portfolio administration tasks are completed. The individual will be responsible for completing many key portfolio and transaction tasks, for reporting aged items, and for owning their resolution. As such, they must be detail oriented, organized, and able to maintain accurate and complete records at all times. The ideal candidate is a proactive and meticulous critical thinker with sound judgement, tact, and diplomacy; strong analytical skills and an ability to identify issues; and the ability to act independently (decision making) while also being a team player.

Responsibilities:

Static Management:
- Data input and maintenance on a proprietary loan administration platform: accurately entering and updating loan data in a specialized software system designed for managing loan information.
- Monitor the Data Services inbox for receipt of documents and queries, promptly identifying and addressing incoming requests or issues so that all communications are answered in a timely manner.
- Collaborate with various teams within the organization to understand data flows and processes, working closely with different departments to gain a comprehensive understanding of how data moves through various systems.
- Ongoing maintenance of lender/borrower contact static set-up in LIQ and other systems, ensuring contact details are accurate and that any changes are reflected across all platforms to facilitate efficient communication.
- Manage ad-hoc tasks and ensure completion within the expected turnaround time (TAT). This requires effective time management and organizational skills to maintain overall workflow efficiency.
- Functional knowledge of Loan IQ and the lending domain is a plus.
- Knowledge of building different types of payment instruction in the Loan IQ application: static set-up of remittance instructions and customers, and performing callbacks to confirm the payment and admin details of borrowers/lenders, ensuring accuracy and preventing potential discrepancies in transactions.

Support Functions:
- Attend the daily WIP call and update the team on workstreams. Effective collaboration ensures that all teams are aligned and that any potential bottlenecks or inefficiencies in data handling are identified and addressed.
- Work closely with the management accounting and operations teams to help resolve queries arising from the payments and operations teams.
- Work with management to assist in the production of the monthly MIS report.
- Manage ad-hoc transaction activity: assist the transaction management team in administering ad-hoc and unscheduled transaction activity as directed and in accordance with procedures.
- Reporting and compliance: complete and deliver regular reports and action points to management, and work to improve procedures and processes based on regular findings.

Process Management:
- Demonstrate high regard for the organization's policies and procedures.
- Demonstrate accountability and ownership.
- Identify, analyze, prioritize, treat, and monitor risks.
- Effectively manage risk and business goals.
- Manage process controls effectively.

Requirements:
- Bachelor's degree in commerce/finance, or relevant experience
- Experience in lending operations
- Experience with loan systems such as Loan IQ
- Strong oral and written communication skills
- Ability to work overtime as needed to support the team and ensure critical work is performed
- Flexibility to work in shifts
- Ability to manage sensitive and confidential information

#LI-IK1 #LI-Hybrid

Posted 1 month ago

Apply

2.0 - 4.0 years

11 - 15 Lacs

Bengaluru

Work from Office


Skills: DevOps, Data Services -> TDM (Test Data Management), Agile Coach -> Consulting Process (Agile), DevOps -> TOSCA, Automation Testing

Responsibilities

A day in the life of a finserv consultant: As part of the finserv consulting team, your primary role will be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation, and design and deployment. You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs. You will create requirement specifications from business needs, define the to-be processes, and produce detailed functional designs based on requirements. You will support configuring solution requirements on the products; understand any issues, diagnose their root cause, seek clarifications, and then identify and shortlist solution alternatives. You will also contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Posted 1 month ago

Apply

5.0 - 8.0 years

7 - 11 Lacs

Hyderabad

Work from Office


A catastrophe modeling job involves analyzing and assessing the potential impact of catastrophic events (e.g., natural disasters such as earthquakes, floods, and hurricanes) on assets, infrastructure, and populations. The role typically includes developing, refining, and applying mathematical models to predict and evaluate risks, helping companies (such as insurers or government agencies) prepare for and mitigate the financial impact of such events. Responsibilities may also include data analysis, scenario testing, and collaborating with cross-functional teams to inform risk management strategies. Proficiency in data science, programming, and a strong understanding of geophysical or environmental factors are often required.

Skill Set Required (Mandatory):
- 5 to 8 years of experience
- Hands-on experience with AIR (Touchstone / TS Re and CATRADER) software
- Experience in the CAT modeling industry; ability to understand and interpret CAT modeling losses
- Understanding of policy structure (layers, limits, deductibles) and how it works in the insurance industry
- Insurance and reinsurance subject matter; underwriting concepts
- Attention to detail and superior communication skills
- Experience in open market and binder account processing and auditing
- Proficiency in Excel and SQL; analytical skills

Desirable Skill Set (Add-on):
- Writing macros using VB scripts; underwriting concepts

The position in the Data Services team offers an interesting range of responsibilities, including: cleansing, augmenting, validating, and preparing catastrophe model exposure data for different lines of business; applying insurance and reinsurance policy conditions; analyzing client exposure data against different perils; quantifying natural catastrophe risk using catastrophe modeling software; reviewing work (accounts) done by analysts; and maintaining client turnaround time and quality at all times. Candidates should be able to understand and interpret the losses, and understand the Touchstone product and database structure. Additional duties: maintain/manage the account log sheet, assign work to team members, audit/review accounts done by risk analysts, manage the workflow in the absence of the Team Lead/Manager, and raise client queries, with attention to detail and superior communication skills.
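The policy terms above (deductibles, limits, and layer attachment points) determine how a modeled gross loss translates into an insured or reinsured loss. A minimal sketch in Python, with purely illustrative figures:

```python
def layer_loss(gross_loss, attachment, limit):
    """Loss to a single (re)insurance layer: the part of the gross loss
    above the attachment point, capped at the layer limit."""
    return max(0.0, min(gross_loss - attachment, limit))

def net_of_deductible(gross_loss, deductible, policy_limit):
    """Insured loss after a deductible, capped at the policy limit."""
    return max(0.0, min(gross_loss - deductible, policy_limit))

# A "5M xs 10M" layer: attaches at 10M, pays up to 5M.
assert layer_loss(8_000_000, 10_000_000, 5_000_000) == 0.0         # below attachment
assert layer_loss(12_000_000, 10_000_000, 5_000_000) == 2_000_000  # partial
assert layer_loss(20_000_000, 10_000_000, 5_000_000) == 5_000_000  # layer exhausted
```

Platforms such as Touchstone apply these financial terms across entire event loss tables; the helpers here only show the per-loss arithmetic.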

Posted 1 month ago

Apply

5.0 - 8.0 years

6 - 10 Lacs

Hyderabad

Work from Office


A catastrophe modeling job involves analyzing and assessing the potential impact of catastrophic events (e.g., natural disasters such as earthquakes, floods, and hurricanes) on assets, infrastructure, and populations. The role typically includes developing, refining, and applying mathematical models to predict and evaluate risks, helping companies (such as insurers or government agencies) prepare for and mitigate the financial impact of such events. Responsibilities may also include data analysis, scenario testing, and collaborating with cross-functional teams to inform risk management strategies. Proficiency in data science, programming, and a strong understanding of geophysical or environmental factors are often required.

Skill Set Required (Mandatory):
- 5 to 8 years of experience
- Hands-on experience with AIR (Touchstone / TS Re and CATRADER) software
- Experience in the CAT modeling industry; ability to understand and interpret CAT modeling losses
- Understanding of policy structure (layers, limits, deductibles) and how it works in the insurance industry
- Insurance and reinsurance subject matter; underwriting concepts
- Attention to detail and superior communication skills
- Experience in open market and binder account processing and auditing
- Proficiency in Excel and SQL; analytical skills

Desirable Skill Set (Add-on):
- Writing macros using VB scripts; underwriting concepts

The position in the Data Services team offers an interesting range of responsibilities, including: cleansing, augmenting, validating, and preparing catastrophe model exposure data for different lines of business; applying insurance and reinsurance policy conditions; analyzing client exposure data against different perils; quantifying natural catastrophe risk using catastrophe modeling software; reviewing work (accounts) done by analysts; and maintaining client turnaround time and quality at all times. Candidates should be able to understand and interpret the losses, and understand the Touchstone product and database structure. Additional duties: maintain/manage the account log sheet, assign work to team members, audit/review accounts done by risk analysts, manage the workflow in the absence of the Team Lead/Manager, and raise client queries, with attention to detail and superior communication skills.

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Gurugram

Work from Office


Data Engineer - Immediate Joiner

Do you love working with data and building scalable solutions that can handle large volumes of data? Are you passionate about helping companies make data-driven decisions and achieve their goals? If so, we are looking for a talented Data Engineer to join our team! We are Uptitude, a fast-growing start-up with a global client base, headquartered in London, UK, and we are looking for someone to join us full time in our cool office in Gurugram.

About Uptitude: Uptitude is a forward-thinking consultancy that specializes in providing exceptional data and business intelligence solutions to clients worldwide. Our team is passionate about empowering businesses with data-driven insights, enabling them to make informed decisions and achieve remarkable results. At Uptitude, we embrace a vibrant and inclusive culture, where innovation, excellence, and collaboration thrive. We are seeking a highly skilled Data Engineer to join our team in the next month.

Responsibilities:
- Design and implement modern data architectures using Azure Data Services (Azure Data Lake, Azure Data Factory, Azure Databricks, Azure Synapse Analytics, etc.).
- Develop batch data processing solutions, ETL processes, and automated workflows.
- Work collaboratively with data analysts, Power BI developers, and stakeholders to understand data needs and deliver comprehensive data solutions.
- Ensure data solutions are scalable, repeatable, effective, and meet the expectations of business goals and strategies.
- Keep abreast of industry best practices.
- Troubleshoot and debug data issues, ensuring robust and error-free data pipelines.

Requirements:
- Minimum of 3 years of experience as a Data Engineer or in a similar role, with a proven track record working with Azure services (certification as an Azure Data Engineer Associate is an asset).
- Strong proficiency in SQL and experience with structured and unstructured data models.
- In-depth knowledge of Azure data services and tools (Azure SQL Database, Azure Data Factory, Data Lake, Databricks, Azure Synapse Analytics).
- Proficiency in a scripting language such as Python.
- Understanding of big data technologies (Hadoop, Spark) and working experience with PySpark.
- Experience using Azure DevOps to implement CI/CD pipelines is desirable.
- Excellent problem-solving abilities and strong analytical skills.
- Excellent communication and teamwork skills, with the ability to interact at all levels of the organization.
- Working knowledge of Power BI is desirable.

Company Values: At Uptitude, we embrace a set of core values that guide our work and define our culture. As a Data Engineer, you should align with these values:
1. Be Awesome: Strive for excellence in everything you do, continuously improving your skills and delivering exceptional results.
2. Step Up: Take ownership of challenges, be proactive, and seek opportunities to contribute beyond your role.
3. Make a Difference: Embrace innovation, think creatively, and contribute to the success of our clients and the company.
4. Have Fun: Foster a positive and enjoyable work environment, celebrating achievements and building strong relationships.

Benefits: Uptitude values its employees and offers a competitive benefits package, including:
- Competitive salary commensurate with experience and qualifications.
- Private health insurance coverage.
- Offsite trips to encourage team building and knowledge sharing.
- Quarterly team outings to unwind and celebrate achievements.
- Corporate English lessons with a UK instructor.

If you are passionate about Data Engineering and want to be part of a team that is making a real impact, we want to hear from you! At Uptitude, we are committed to building a team of talented and passionate individuals who are dedicated to our mission and share our values of innovation, collaboration, and customer success. If this sounds like you, please apply today!
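The batch ETL responsibilities above boil down to an extract, validate/transform, load sequence. A toy sketch in plain Python (the CSV sample, column names, and warehouse stand-in are all hypothetical; a production pipeline would use services like Data Factory or Databricks):

```python
import csv
import io

# Hypothetical raw extract; in practice this would come from a source system
# or a data lake landing zone.
raw_csv = """order_id,amount
1001,250.00
1002,
1003,99.50
"""

def extract(text):
    """Parse the raw CSV into row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Cast valid rows to typed records; quarantine rows failing validation."""
    clean, rejected = [], []
    for row in rows:
        if row["amount"]:
            clean.append({"order_id": int(row["order_id"]),
                          "amount": float(row["amount"])})
        else:
            rejected.append(row)  # keep for later inspection, don't silently drop
    return clean, rejected

def load(rows):
    # Stand-in for a warehouse write; here it just returns the batch.
    return rows

clean, rejected = transform(extract(raw_csv))
loaded = load(clean)
# loaded: orders 1001 and 1003; rejected: the row with a missing amount.
```

Quarantining bad rows instead of failing the whole batch is one common way to keep pipelines "robust and error-free" while still surfacing data issues for debugging.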

Posted 1 month ago

Apply

7.0 - 14.0 years

9 - 16 Lacs

Chennai

Work from Office


- 5+ years of hands-on experience in ABAP development, with a strong focus on RICEF objects (Reports, Interfaces, Conversions, Enhancements, and Forms).
- In-depth knowledge of Object-Oriented ABAP (OO ABAP) programming for modern SAP development.
- Proven experience with the Clean Core approach and implementing SAP standard solutions with minimal customization, ensuring long-term sustainability and ease of upgrades.
- SAP HANA expertise, including experience developing AMDP procedures and CDS views for complex requirements.
- Proficiency in developing applications using the ABAP RESTful Application Programming Model (RAP).
- Strong experience with Core Data Services (CDS) views and OData services for integration and data modeling.
- Experience working with SAP Fiori and SAP UI5, including the development of OData services to integrate the backend with frontend UI applications.
- EML (Entity Manipulation Language) for advanced transactional scenarios in both standard and custom APIs.
- RAP-based APIs (managed and unmanaged scenarios) with actions, authorization checks, and field validation in RAP APIs and CDS.
- Fiori-based custom reports and message implementation in RAP.
- Deep understanding of SAP SD, MM, and FI modules and business processes.
- Excellent analytical and problem-solving skills, with the ability to debug, troubleshoot, and resolve technical issues efficiently.
- Hands-on experience with SAP NetWeaver Gateway for managing OData services.
- Strong communication and collaboration skills to work effectively with both technical and functional teams.
- Ability to handle complex end-to-end implementations and deliver high-quality results in a fast-paced environment.

Posted 1 month ago

Apply

5 - 10 years

7 - 12 Lacs

Bengaluru

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Google Cloud Data Services
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your day will involve overseeing the application development process and ensuring seamless communication within the team and with stakeholders.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead the application development process
- Ensure effective communication within the team and with stakeholders

Professional & Technical Skills:
- Must-have skills: proficiency in Google Cloud Data Services
- Strong understanding of cloud computing principles
- Experience with cloud-based application development
- Knowledge of data storage and processing in the cloud environment

Additional Information:
- The candidate should have a minimum of 5 years of experience in Google Cloud Data Services
- This position is based at our Bengaluru office
- 15 years of full-time education is required

Qualifications: 15 years of full-time education

Posted 1 month ago

Apply

5 - 10 years

7 - 12 Lacs

Hyderabad

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: SAP BusinessObjects Data Services
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with team members, analyzing requirements, and developing solutions to meet business needs.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead and mentor junior team members
- Conduct regular code reviews to ensure quality standards are met

Professional & Technical Skills:
- Must-have skills: proficiency in SAP BusinessObjects Data Services
- Strong understanding of ETL processes
- Experience in data modeling and database design
- Knowledge of SAP BusinessObjects reporting tools
- Hands-on experience in troubleshooting and debugging applications

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP BusinessObjects Data Services
- This position is based at our Hyderabad office
- 15 years of full-time education is required

Qualifications: 15 years of full-time education

Posted 1 month ago

Apply

8 - 13 years

13 - 18 Lacs

Mumbai, Hyderabad, Bengaluru

Work from Office


Impact: This position offers the opportunity to lead a team of highly motivated individuals and contribute to achieving the team's goals. You will lead efforts to improve data accuracy, completeness, and timeliness through collaboration, innovation, and the execution of ad-hoc projects, with a focus on acquiring and collecting public and private data. Through strong stakeholder relationships and a deep understanding of market dynamics, we position ourselves as trusted partners, equipping clients with the intelligence needed to navigate opportunities and risks in a competitive environment.

Responsibilities:
- Formulate and implement data-driven strategies that balance technical and product knowledge, collaborating with multiple teams to create best-in-class solutions.
- Oversee and implement data quality projects that align with evolving business priorities, ensuring high standards of data integrity.
- Identify opportunities for new datasets within the market landscape and support the development of strategies to incorporate them into existing frameworks.
- Demonstrate empathy and support team members, especially during challenging times, promoting a culture of well-being and collaboration.
- Encourage team motivation, facilitate career progression discussions, and execute succession planning to nurture talent within the team.
- Enhance the technical skills of the team, preparing them for future growth and evolving industry demands.
- Establish SMART objectives for team members, actively manage performance, and communicate the Pay for Performance culture and its linkage to rewards.
- Track and communicate team performance metrics, including time utilization and quality statistics, while setting challenging benchmarks for resource efficiency.
- Mentor the team on industry trends and large-scale data projects, providing guidance on business initiatives.
- Manage short-term and long-term projects from resource planning to execution, collaborating closely with the Data Management team to ensure alignment and effectiveness.
- Drive constructive conversations with the leadership team and stakeholders across various locations, ensuring alignment on goals and expectations.
- Advocate for a culture of innovation by understanding processes and workflows, generating ideas to eliminate content gaps and establish best practices. Foster a lean mindset to improve operational efficiency.
- Ensure all critical timelines and requirements for business-as-usual workflows, KPIs, and projects are met, demonstrating problem-solving capabilities at all levels.
- As a people leader, embody and promote the organization's values, culture, and strategic objectives, setting an example for the team.

What we are looking for:
- Prior leadership experience in data services, with a strong focus on people management. Knowledge or experience in the industry is preferred.
- In-depth understanding of the mechanics of the capital markets domain, with the ability to quantify trends impacting the industry and provide insightful analysis.
- Proven operational management skills with a keen attention to detail, gained within a respected data company, ensuring effective oversight of data quality and performance.
- Experience in introducing and monitoring Key Performance Indicators (KPIs) and performance metrics, facilitating continuous improvement and accountability within the team.
- Capacity to give and receive constructive feedback, providing coaching to team members to foster their professional growth and development.
- Exceptional oral and written communication skills, enabling clear articulation of complex data insights and fostering effective stakeholder engagement.
- Willingness to work across various shifts, including night shifts on a rotational or as-needed basis, demonstrating adaptability to meet business needs.
- High ethical standards both personally and professionally, ensuring transparency and integrity within the team.
- Strong collaboration skills with the ability to work effectively within cross-functional teams and build relationships with various stakeholders.
- Comfort with change management processes, adapting to evolving business needs and driving innovation within the team.
- Familiarity with additional analytical tools or programming languages that enhance data analysis capabilities.
- Experience in managing projects from inception to completion, including the ability to prioritize tasks and manage resources effectively.
- Understanding of cultural differences and the ability to navigate them effectively in a global work environment.
- Commitment to continuous learning and professional development in data analysis and emerging technologies.
- A results-oriented approach, focusing on achieving goals and delivering measurable outcomes.

Preferred Qualifications:
- A minimum of 8 years of experience working closely with senior leaders and decision-makers, demonstrating the ability to influence and drive strategic initiatives.
- Proven experience in establishing and nurturing trust with business heads, fostering long-lasting business relationships that benefit both the organization and stakeholders.
- Comfort with a high degree of autonomy, effectively managing priorities from multiple internal and external stakeholders to achieve organizational goals.
- Basic knowledge of SQL and generative AI is desirable, providing a foundation for data analysis and innovative solutions.
- Familiarity with data visualization tools, enabling effective communication of insights through visual storytelling.
- Possession of a Green Belt certification and exposure to Lean concepts, indicating a commitment to process improvement and operational efficiency.

Our People: We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all: from finding new ways to measure sustainability, to analyzing the energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global. Our benefits include:
- Health & Wellness: health care coverage designed for the mind and body.
- Flexible Downtime: generous time off helps keep you energized for your time on.
- Continuous Learning: access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: it's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: from retail discounts to referral incentive awards, small perks can make a big difference.

For more information on benefits by country visit:

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Equal Opportunity Employer: S&P Global is an equal opportunity employer, and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law poster describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision.

10 - Officials or Managers (EEO-2 Job Categories - United States of America), DTMGOP103.2 - Middle Management Tier II (EEO Job Group), SWP Priority Ratings (Strategic Workforce Planning)

Posted 1 month ago

Apply

4 - 9 years

14 - 18 Lacs

Noida

Work from Office

Naukri logo

Who We Are Build a brighter future while learning and growing with a Siemens company at the intersection of technology, community and sustainability. Our global team of innovators is always looking to create meaningful solutions to some of the toughest challenges facing our world. Find out how far your passion can take you. What you need * BS in an Engineering or Science discipline, or equivalent experience * 7+ years of software/data engineering experience using Java, Scala, and/or Python, with at least 5 years' experience in a data-focused role * Experience in data integration (ETL/ELT) development using multiple languages (e.g., Java, Scala, Python, PySpark, SparkSQL) * Experience building and maintaining data pipelines supporting a variety of integration patterns (batch, replication/CDC, event streaming) and data lake/warehouse in production environments * Experience with AWS-based data services technologies (e.g., Kinesis, Glue, RDS, Athena, etc.) and Snowflake CDW * Experience working on larger initiatives building and rationalizing large-scale data environments with a wide variety of data pipelines, possibly with internal and external partner integrations, would be a plus * Willingness to experiment and learn new approaches and technology applications * Knowledge and experience with various relational databases and demonstrable proficiency in SQL and supporting analytics uses and users * Knowledge of software engineering and agile development best practices * Excellent written and verbal communication skills The Brightly culture We're guided by a vision of community that serves the ambitions and wellbeing of all people, and our professional communities are no exception. We model that ideal every day by being supportive, collaborative partners to one another, conscientiously making space for our colleagues to grow and thrive. 
Our passionate team is driven to create a future where smarter infrastructure protects the environments that shape and connect us all. That brighter future starts with us.
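The integration patterns listed in the requirements above (batch, replication/CDC, event streaming) share a common core: applying an ordered change stream to a keyed target table. A minimal sketch of that idea in plain Python, with hypothetical record shapes (real pipelines would use Spark, Kinesis, or a replication tool rather than in-memory dicts):

```python
def apply_cdc_events(target, events):
    """Apply a batch of CDC events (insert/update/delete) to a keyed target table.

    `target` is a dict keyed by primary key; `events` is an ordered list of
    (op, key, row) tuples, as a replication tool might emit them.
    """
    for op, key, row in events:
        if op in ("insert", "update"):
            target[key] = row          # upsert: last write wins
        elif op == "delete":
            target.pop(key, None)      # tolerate deletes for unseen keys
        else:
            raise ValueError(f"unknown CDC operation: {op}")
    return target

# Replay a small change stream onto a target table
table = {1: {"name": "Ada"}}
events = [
    ("update", 1, {"name": "Ada L."}),
    ("insert", 2, {"name": "Grace"}),
    ("delete", 1, None),
]
apply_cdc_events(table, events)
# table is now {2: {"name": "Grace"}}
```

The same ordered-upsert logic underlies both batch replication and streaming consumers; only the delivery mechanism changes.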

Posted 1 month ago

Apply

4 - 9 years

25 - 30 Lacs

Hyderabad

Work from Office

Naukri logo

Overview In this role, we are seeking an Associate Manager - Offshore Program & Delivery Management to oversee program execution, governance, and service delivery across DataOps, BIOps, AIOps, MLOps, Data IntegrationOps, SRE, and Value Delivery programs. This role requires expertise in offshore execution, cost optimization, automation strategies, and cross-functional collaboration to enhance operational excellence. Collaborate with global teams to support Data & Analytics transformation efforts and ensure sustainable, scalable, and cost-effective operations. Support the standardization and automation of pipeline workflows, report generation, and dashboard refreshes to enhance efficiency. Manage and support DataOps programs, ensuring alignment with business objectives, data governance standards, and enterprise data strategy. Assist in real-time monitoring, automated alerting, and self-healing mechanisms to improve system reliability and performance. Contribute to the development and enforcement of governance models and operational frameworks to streamline service delivery and execution roadmaps. Assist in proactive issue identification and self-healing automation, enhancing the sustainment capabilities of the PepsiCo Data Estate. Responsibilities Support DataOps and SRE operations, assisting in offshore delivery of DataOps, BIOps, Data IntegrationOps, and related initiatives. Assist in implementing governance frameworks, tracking KPIs, and ensuring adherence to operational SLAs. Contribute to process standardization and automation efforts, improving service efficiency and scalability. Collaborate with onshore teams and business stakeholders, ensuring alignment of offshore activities with business needs. Monitor and optimize resource utilization, leveraging automation and analytics to improve productivity. Support continuous improvement efforts, identifying operational risks and ensuring compliance with security and governance policies. 
Assist in managing day-to-day DataOps activities, including incident resolution, SLA adherence, and stakeholder engagement. Participate in Agile work intake and management processes, contributing to strategic execution within data platform teams. Provide operational support for cloud infrastructure and data services, ensuring high availability and performance. Document and enhance operational policies and crisis management functions, supporting rapid incident response. Promote a customer-centric approach, ensuring high service quality and proactive issue resolution. Assist in team development efforts, fostering a collaborative and agile work environment. Adapt to changing priorities, supporting teams in maintaining focus on key deliverables. Qualifications 6+ years of technology experience in a global organization, preferably in the CPG industry. 4+ years of experience in Data & Analytics, with a foundational understanding of data engineering, data management, and operations. 3+ years of cross-functional IT experience, working with diverse teams and stakeholders. 1-2 years of leadership or coordination experience, supporting team operations and service delivery. Strong communication and collaboration skills, with the ability to convey technical concepts to non-technical audiences. Customer-focused mindset, ensuring high-quality service and responsiveness to business needs. Experience in supporting technical operations for enterprise data platforms, preferably in a Microsoft Azure environment. Basic understanding of Site Reliability Engineering (SRE) practices, including incident response, monitoring, and automation. Ability to drive operational stability, supporting proactive issue resolution and performance optimization. Strong analytical and problem-solving skills, with a continuous improvement mindset. Experience working in large-scale, data-driven environments, ensuring smooth operations of business-critical solutions. 
Ability to support governance and compliance initiatives, ensuring adherence to data standards and best practices. Familiarity with data acquisition, cataloging, and data management tools. Strong organizational skills, with the ability to manage multiple priorities effectively.

Posted 1 month ago

Apply

5 - 10 years

22 - 27 Lacs

Hyderabad, Bengaluru

Work from Office

Naukri logo

Location: Hyderabad, Bangalore - India Function: HV Product Management Requisition ID: 1033000 The Company We’re Hitachi Vantara, the data foundation trusted by the world’s innovators. Our resilient, high-performance data infrastructure means that customers – from banks to theme parks – can focus on achieving the incredible with data. If you’ve seen the Las Vegas Sphere, you’ve seen just one example of how we empower businesses to automate, optimize, innovate – and wow their customers. Right now, we’re laying the foundation for our next wave of growth. We’re looking for people who love being part of a diverse, global team – and who get excited about making a real-world impact with data. The Team The VSP 360 team is focused on building an intelligent, hybrid cloud platform that integrates observability, automation, protection, and data insights. As part of this mission, we are expanding platform capabilities to include rich data services integrations that enhance visibility, governance, compliance, and cyber resilience. This team works cross-functionally with engineering, partner ecosystems, and customer-facing teams to deliver seamless experiences and actionable insights from a wide range of data services and third-party platforms. The Role As the Product Manager for Data Services within the VSP 360 platform, you will lead the strategy and execution for integrating a diverse set of data services that drive data intelligence, governance, and protection. This includes managing platform-level integrations with services such as data classification, data cataloging, PII detection, cyber resilience, and third-party data protection solutions. You’ll collaborate with internal and external stakeholders to define use cases, capture integration requirements, and drive partner enablement. Your role will focus on building scalable APIs and workflows that bring context-rich insights and automation to the forefront of hybrid cloud storage management. 
You will be responsible for managing the backlog in Aha!, coordinating cross-functional execution, and ensuring customer-facing outcomes around security, compliance, and operational efficiency. What You’ll Bring 5+ years of product management experience in data services, storage, or enterprise software Strong understanding of data classification, cataloging, governance, and PII/security frameworks Familiarity with cyber resilience concepts and tools Experience integrating third-party solutions (e.g., Commvault, Veeam) into a platform environment Proven ability to define APIs and workflows for data services integration Agile product management experience with tools like Aha!, Jira, or equivalent Ability to balance technical requirements with customer value and usability Strong collaboration and communication skills across product, engineering, and partners Strategic mindset with experience driving partner ecosystems and joint solutions Passion for delivering customer-centric solutions with measurable business impact About us We’re a global team of innovators. Together, we harness engineering excellence and passion for insight to co-create meaningful solutions to complex challenges. We turn organizations into data-driven leaders that can make a positive impact on their industries and society. If you believe that innovation can inspire the future, this is the place to fulfil your purpose and achieve your potential. #LI-SP7 Championing diversity, equity, and inclusion

Posted 1 month ago

Apply

2 - 3 years

11 - 16 Lacs

Noida

Work from Office

Naukri logo

Arcadis is the world's leading company delivering sustainable design, engineering, and consultancy solutions for natural and built assets. We are more than 36,000 people, in over 70 countries, dedicated to improving quality of life. Everyone has an important role to play. With the power of many curious minds, together we can solve the world's most complex challenges and deliver more impact together. Role description As an Assistant Global Power Platform Developer at Arcadis, you will be responsible for designing, developing, and implementing custom solutions using Microsoft Power Platform. You will work closely with our four Business Areas and associated stakeholders to establish governance as well as guidance, training, and compliance guardrails. Role accountabilities: Work closely with Power Platform leads to oversee the usage and manage Power Platform. Work closely with Power Platform leads and business analysts to gather requirements and understand how to leverage Power Platform as a solution to business needs. Design, develop, secure, and extend Power Platform solutions and reusable solution components based on business requirements. Develop and maintain documentation related to the solutions you create. Design and develop UI/UX experiences that are engaging and user-friendly. Provide technical support as well as guidance to the business as needed through our Community of Practice or Ticketing Tool. Nurture the Community of Arcadis Power Platform Developers by providing best practices, guidelines, and guardrails through SharePoint articles, documents, training, and hosting Monthly Global Calls. Keep up to date with the latest Power Platform features and technologies. Align with, steward, and promote organizational best practices and governance (e.g., InfoSec, Privacy, Data Sovereignty, Access Rights & Permissions). Work with the wider team to define, create, and promote best practices to ensure organizational compliance. 
Collaborate with Global IT and business stakeholders to identify opportunities for process automation and optimization using Power Platform. Conduct thorough testing and quality assurance of developed solutions to ensure they meet requirements and are free of bugs and errors. Qualifications & Experience: Education : Bachelor's degree in Information Technology, Computer Science, or a suitably related field. Technical Experience : Minimum of 2 years' development experience working within Power Platform and Dataverse/Common Data Service. Holding a Microsoft Certified: Power Platform Developer Associate certification is an advantage. Experience in designing, developing, and implementing Power Apps (Canvas and Model-Driven Apps), Dataverse, Power Automate (Cloud and Desktop flows), Power Pages, and/or Copilot Studio (formerly Power Virtual Agents). Experience with Microsoft Azure, Dynamics 365, and/or Microsoft 365 (SharePoint and Teams) is preferable. Experience with Azure DevOps and Power Platform pipelines to manage the Power Platform solution lifecycle is preferable. Experience with GitHub is preferable. Development experience using JavaScript, JSON, TypeScript, C#, HTML, .NET, Azure, Microsoft 365, RESTful web services, ASP.NET and/or Power Query is an advantage. Methodology Experience : Experience working in the software development lifecycle using an Agile approach is preferable. Soft Skills : Excellent stakeholder management skills with experience working with a diverse set of stakeholders. Maintains an awareness of developing technologies and their application and takes responsibility for personal development. Familiarity with data modelling, database design, and data integration techniques. Strong problem-solving and analytical skills, with the ability to think creatively and innovate. Excellent communication and collaboration skills, with the ability to work effectively in a team environment. Why Arcadis? 
We can only achieve our goals when everyone is empowered to be their best. We believe everyone's contribution matters. It's why we are pioneering a skills-based approach, where you can harness your unique experience and expertise to carve your career path and maximize the impact we can make together. You'll do meaningful work, and no matter what role, you'll be helping to deliver sustainable solutions for a more prosperous planet. Make your mark, on your career, your colleagues, your clients, your life and the world around you. Together, we can create a lasting legacy. Join Arcadis. Create a Legacy. Our Commitment to Equality, Diversity, Inclusion & Belonging We want you to be able to bring your best self to work every day, which is why we take equality and inclusion seriously and hold ourselves to account for our actions. Our ambition is to be an employer of choice and provide a great place to work for all our people. At Arcadis, you will have the opportunity to build the career that is right for you. Because each Arcadian has their own motivations, their own career goals. And, as a people-first business, it is why we will take the time to listen, to understand what you want from your time here, and provide the support you need to achieve your ambitions.

Posted 1 month ago

Apply

9 - 12 years

0 - 0 Lacs

Bengaluru

Work from Office

Naukri logo

Senior Data Engineer Job Summary: We are seeking an experienced and highly motivated Senior Azure Data Engineer to join a Data & Analytics team. The ideal candidate will be a hands-on technical leader responsible for designing, developing, implementing, and managing scalable, robust, and secure data solutions on the Microsoft Azure platform. This role involves leading a team of data engineers, setting technical direction, ensuring the quality and efficiency of data pipelines, and collaborating closely with data scientists, analysts, and business stakeholders to meet data requirements. Key Responsibilities: Lead, mentor, and provide technical guidance to a team of Azure Data Engineers. Design, architect, and implement end-to-end data solutions on Azure, including data ingestion, transformation, storage (lakes/warehouses), and serving layers. Oversee and actively participate in the development, testing, and deployment of robust ETL/ELT pipelines using key Azure services. Establish and enforce data engineering best practices, coding standards, data quality checks, and monitoring frameworks. Ensure data solutions are optimized for performance, cost, scalability, security, and reliability. Collaborate effectively with data scientists, analysts, and business stakeholders to understand requirements and deliver effective data solutions. Manage, monitor, and troubleshoot Azure data platform components and pipelines. Contribute to the strategic technical roadmap for the data platform. Qualifications & Experience: Experience: Minimum 6-8+ years of overall experience in data engineering roles. Minimum 3-4+ years of hands-on experience designing, implementing, and managing data solutions specifically on the Microsoft Azure cloud platform. Proven experience (1-2+ years) in a lead or senior engineering role, demonstrating mentorship and technical guidance capabilities. 
Education: Bachelor's degree in Computer Science, Engineering, Information Technology, or a related quantitative field (or equivalent practical experience). Technical Skills: Core Azure Data Services: Deep expertise in Azure Data Factory (ADF), Azure Synapse Analytics (SQL Pools, Spark Pools), Azure Databricks, Azure Data Lake Storage (ADLS Gen2). Data Processing & Programming: Strong proficiency with Spark (using PySpark or Scala) and expert-level SQL skills. Proficiency in Python is highly desired. Data Architecture & Modelling: Solid understanding of data warehousing principles (e.g., Kimball), dimensional modelling, ETL/ELT patterns, and data lake design. Databases: Experience with relational databases (e.g., Azure SQL Database) and familiarity with NoSQL concepts/databases is beneficial. Version Control: Proficiency with Git for code management. Leadership & Soft Skills: Excellent leadership, mentoring, problem-solving, and communication skills, with the ability to collaborate effectively across various teams. Required Skills: 1. Azure component proficiency - Azure Synapse Analytics (High), Azure Data Factory (High), Azure SQL (High), ADLS Storage (High), Azure DevOps - CI/CD (High), Azure Databricks (Medium-High), Azure Logic Apps (Medium-High), Azure Fabric (good to have, not mandatory), Azure Functions (good to have, not mandatory), Azure Purview (good to have, not mandatory). 2. Good experience in data extraction patterns via ADF - API, files, databases. 3. Data masking in Synapse, RBAC. 4. Experience in data warehousing - Kimball modelling. 5. Good communication and collaboration skills.
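As an illustration of the data-masking skill listed above: in Azure Synapse this is typically configured declaratively as Dynamic Data Masking on columns rather than written in application code, but the core idea can be sketched in plain Python (the column names and masking rule here are hypothetical):

```python
def mask_value(value: str, visible_suffix: int = 4, mask_char: str = "x") -> str:
    """Mask all but the last `visible_suffix` characters of a value."""
    if len(value) <= visible_suffix:
        return mask_char * len(value)
    return mask_char * (len(value) - visible_suffix) + value[-visible_suffix:]

def mask_rows(rows, sensitive_columns):
    """Return copies of `rows` with the listed columns masked."""
    return [
        {col: (mask_value(str(val)) if col in sensitive_columns else val)
         for col, val in row.items()}
        for row in rows
    ]

rows = [{"id": 1, "email": "ada@example.com", "card": "4111111111111111"}]
masked = mask_rows(rows, {"email", "card"})
# masked[0]["card"] == "xxxxxxxxxxxx1111"; "id" is left untouched
```

In Synapse, the equivalent policy is attached to the column itself and enforced at query time, with RBAC deciding which principals see unmasked data.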

Posted 1 month ago

Apply

5 - 7 years

15 - 20 Lacs

Hyderabad

Work from Office

Naukri logo

Job Summary We are seeking a skilled and detail-oriented Azure Data Engineer to join our data team. In this role, you will be responsible for designing, building, and maintaining scalable data pipelines and solutions on the Microsoft Azure cloud platform. You will collaborate with data analysts, the reporting team, and business stakeholders to ensure efficient data availability, quality, and governance. Experience Level: Mid-Level/Senior Must have skills: Strong hands-on experience with Azure Data Factory, Azure Data Lake Storage, and Azure SQL. Good to have skills: Working knowledge of Databricks, Azure Synapse Analytics, Azure Functions, Logic App workflows, Log Analytics and Azure DevOps. Roles and Responsibilities Design and implement scalable data pipelines using Azure Data Factory, Azure SQL, Databricks, and other Azure services. Develop and maintain data lakes and data warehouses on Azure. Integrate data from various on-premises and cloud-based sources. Create and manage ETL/ELT processes, ensuring data accuracy and performance. Optimize and troubleshoot data pipelines and workflows. Ensure data security, compliance, and governance. Collaborate with business stakeholders to define data requirements and deliver actionable insights. Monitor and maintain Azure data services performance and cost-efficiency.
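Incremental ETL/ELT processes of the kind described above commonly use a high-watermark pattern: persist the largest modification timestamp seen so far and, on each run, extract only rows newer than it. A minimal sketch of that logic in plain Python (in Azure Data Factory this maps to a watermark Lookup plus a filtered Copy activity; the field names are illustrative):

```python
def incremental_extract(source_rows, last_watermark):
    """Return rows modified after `last_watermark` and the new watermark.

    `source_rows` are dicts carrying a monotonically increasing
    `modified_at` value (an integer timestamp here for simplicity).
    """
    new_rows = [r for r in source_rows if r["modified_at"] > last_watermark]
    new_watermark = max((r["modified_at"] for r in new_rows), default=last_watermark)
    return new_rows, new_watermark

source = [
    {"id": 1, "modified_at": 100},
    {"id": 2, "modified_at": 205},
    {"id": 3, "modified_at": 310},
]
rows, wm = incremental_extract(source, last_watermark=200)
# rows contains ids 2 and 3; the watermark advances to 310
```

Persisting `wm` between runs (in a control table or pipeline variable) is what makes the pipeline restartable without re-reading the full source.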

Posted 1 month ago

Apply

10 - 20 years

37 - 50 Lacs

Pune, Bangalore Rural, Gurugram

Hybrid

Naukri logo

Job Summary: We are looking for an experienced and dynamic AWS Data Architect/Lead Data Engineer to lead the design and implementation of data solutions in the cloud. This role will focus on leveraging AWS technologies to create scalable, reliable, and optimized data architectures that drive business insights and data-driven decision-making. As an AWS Data Architect, you will play a pivotal role in shaping the data strategy, implementing best practices, and ensuring the seamless integration of AWS-based data platforms, with a focus on services like Amazon Redshift, Aurora, and other AWS data services.

Posted 1 month ago

Apply

7 - 9 years

0 - 0 Lacs

Thiruvananthapuram

Work from Office

Naukri logo

SUMMARY: This position is responsible for the design, development and management of data workflows and pipelines in our Integrated Data and Analytics environment. The role is a blend of technical leadership, team leadership, and hands-on development. RESPONSIBILITIES/TASKS: Data cataloguing, lineage and governance. A strong data engineer in the Azure stack with exposure to Unity Catalog. EMPLOYMENT QUALIFICATIONS: - Bachelor's degree in a related field. A relevant combination of education and experience may be considered in lieu of a degree. - Continuous learning, as defined by the Company's learning philosophy, is required. - Certification or progress toward certification is highly preferred and encouraged. EXPERIENCE: - 10 years' experience in application development. - At least five years of experience in Azure big data services/technologies (Databricks, Azure, Python, unit testing, SQL and Azure Data Lake). - Mandatory Skills: Azure Data Engineering, Databricks, Unity Catalog, Python, SQL. Required Skills: Azure Data Engineering, Databricks, Unity Catalog, Python
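Since the role centres on data cataloguing, lineage and governance (e.g., Unity Catalog), here is a hypothetical, hand-rolled lineage graph in plain Python to show the concept; Unity Catalog captures table-level lineage like this automatically from query execution, and the three-level table names below are only illustrative:

```python
from collections import defaultdict

class LineageGraph:
    """Record which upstream datasets each dataset was derived from."""

    def __init__(self):
        self._upstream = defaultdict(set)

    def record(self, output_table, input_tables):
        """Note that `output_table` was built from `input_tables`."""
        self._upstream[output_table].update(input_tables)

    def upstream_of(self, table):
        """Return all transitive upstream dependencies of `table`."""
        seen, stack = set(), [table]
        while stack:
            for parent in self._upstream[stack.pop()]:
                if parent not in seen:
                    seen.add(parent)
                    stack.append(parent)
        return seen

lineage = LineageGraph()
lineage.record("gold.sales_summary", ["silver.sales"])
lineage.record("silver.sales", ["bronze.raw_sales"])
# upstream_of("gold.sales_summary") == {"silver.sales", "bronze.raw_sales"}
```

Transitive upstream queries like this are what make impact analysis possible: before changing a bronze table, you can enumerate every downstream consumer.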

Posted 1 month ago

Apply

7 - 12 years

5 - 9 Lacs

Bengaluru

Work from Office

Naukri logo

Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Databricks Unified Data Analytics Platform, Microsoft Azure Data Services, PySpark Good to have skills : NA Minimum 7.5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will work on developing solutions to enhance business processes and meet application needs. Roles & Responsibilities: Expected to be an SME Collaborate with and manage the team to perform Responsible for team decisions Engage with multiple teams and contribute to key decisions Provide solutions to problems for their immediate team and across multiple teams Lead the team in implementing innovative solutions Conduct regular team meetings to ensure progress and address any challenges Mentor junior team members to enhance their skills Professional & Technical Skills: Must Have Skills: Proficiency in Databricks Unified Data Analytics Platform, Microsoft Azure Data Services, PySpark Strong understanding of cloud-based data services Experience in building and optimizing data pipelines Proficient in data modeling and database design Knowledge of data security and compliance standards Additional Information: The candidate should have a minimum of 7.5 years of experience in Databricks Unified Data Analytics Platform. This position is based at our Bengaluru office. A 15 years full-time education is required. Qualification 15 years full time education

Posted 1 month ago

Apply

7 - 12 years

5 - 9 Lacs

Bengaluru

Work from Office

Naukri logo

Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : PySpark, Microsoft Azure Databricks, Microsoft Azure Data Services Good to have skills : NA Minimum 7.5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As an Application Developer, you will be involved in designing, building, and configuring applications to meet business process and application requirements. Your typical day will revolve around creating innovative solutions to address specific business needs and ensuring seamless application functionality. Roles & Responsibilities: Expected to be an SME Collaborate with and manage the team to perform Responsible for team decisions Engage with multiple teams and contribute to key decisions Provide solutions to problems for their immediate team and across multiple teams Lead the development and implementation of new software applications Conduct code reviews and provide feedback to team members Stay updated on industry trends and best practices to enhance application development processes Professional & Technical Skills: Must Have Skills: Proficiency in PySpark, Microsoft Azure Data Services, Microsoft Azure Databricks Strong understanding of cloud-based data services and platforms Experience in building and optimizing data pipelines for large-scale data processing Knowledge of data warehousing concepts and ETL processes Familiarity with data modeling and database design principles Additional Information: The candidate should have a minimum of 7.5 years of experience in PySpark. This position is based at our Bengaluru office. A 15 years full-time education is required. Qualification 15 years full time education

Posted 1 month ago

Apply

5 - 10 years

5 - 9 Lacs

Bengaluru

Work from Office

Naukri logo

Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Databricks Unified Data Analytics Platform, Microsoft Azure Data Services, PySpark Good to have skills : NA Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with teams to develop innovative solutions and ensure seamless application functionality. Roles & Responsibilities: Expected to be an SME Collaborate with and manage the team to perform Responsible for team decisions Engage with multiple teams and contribute to key decisions Provide solutions to problems for their immediate team and across multiple teams Lead the team in implementing best practices for application development Conduct regular code reviews and provide constructive feedback Stay updated on industry trends and technologies to enhance application performance Professional & Technical Skills: Must Have Skills: Proficiency in Databricks Unified Data Analytics Platform, Microsoft Azure Data Services, PySpark Strong understanding of cloud-based data services and architecture Experience in developing and optimizing data pipelines Knowledge of data modeling and database design principles Familiarity with DevOps practices for continuous integration and deployment Additional Information: The candidate should have a minimum of 5 years of experience in Databricks Unified Data Analytics Platform. This position is based at our Bengaluru office. A 15 years full-time education is required. Qualification 15 years full time education

Posted 1 month ago

Apply

3 - 8 years

5 - 9 Lacs

Navi Mumbai

Work from Office

Naukri logo

Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : SAP BusinessObjects Data Services Good to have skills : NA Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality. Roles & Responsibilities: Expected to perform independently and become an SME. Active participation and contribution in team discussions is required. Contribute to providing solutions to work-related problems. Develop and implement SAP ABAP solutions for HANA. Collaborate with cross-functional teams to analyze and address application requirements. Conduct code reviews and ensure adherence to coding standards. Troubleshoot and resolve technical issues in applications. Stay updated on industry trends and best practices in SAP ABAP development. Professional & Technical Skills: Must Have Skills: Proficiency in SAP ABAP Development for HANA. Strong understanding of SAP ABAP programming concepts. Experience in developing and optimizing SAP applications. Knowledge of SAP HANA database and integration with ABAP. Hands-on experience in performance tuning and debugging SAP ABAP code. Additional Information: The candidate should have a minimum of 3 years of experience in SAP ABAP Development for HANA. This position is based at our Mumbai office. A 15 years full-time education is required. Qualification 15 years full time education

Posted 1 month ago

Apply

1 - 4 years

4 - 8 Lacs

Pune

Work from Office

Naukri logo

Project Role : Software Development Engineer Project Role Description : Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work. Must have skills : SAP BusinessObjects Data Services Good to have skills : NA Educational Qualification : 15 years of full time education Summary : As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code across multiple clients. Your day will involve collaborating with team members to perform maintenance and enhancements, ensuring that the applications meet the evolving needs of users and stakeholders. You will also be responsible for troubleshooting issues and implementing solutions that enhance the overall functionality and performance of the applications you work on. Roles & Responsibilities: Expected to be an SME. Collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Provide solutions to problems for their immediate team and across multiple teams. Facilitate knowledge sharing sessions to enhance team capabilities. Monitor project progress and ensure timely delivery of milestones. Professional & Technical Skills: Must Have Skills: Proficiency in SAP BusinessObjects Data Services. Strong understanding of data integration and transformation processes. Experience with ETL (Extract, Transform, Load) processes. Familiarity with database management systems and SQL. Ability to troubleshoot and resolve data-related issues. Additional Information: The candidate should have a minimum of 5 years of experience in SAP BusinessObjects Data Services. This position is based at our Pune office. A 15 years of full time education is required. Qualification 15 years of full time education

Posted 1 month ago

Apply

7 - 12 years

13 - 23 Lacs

Kolkata, Pune, Bengaluru

Hybrid

Naukri logo

We have a drive this coming Saturday (26th Apr '25) at our Bangalore, Pune and Kolkata locations. Roles and Responsibilities Design, develop, and implement SAP Cloud Platform Integration (CPI) solutions to integrate various systems and applications. Collaborate with cross-functional teams to identify business requirements and design scalable integration architectures. Develop data services using SAP technologies such as BTP, DCP, IDP, OData, REST, SOAP, XML, etc. to enable seamless data exchange between systems. Implement SCPI (SAP Cloud Platform Integration) projects end-to-end, including configuration, testing, deployment and maintenance. Provide technical guidance on best practices for integrating complex enterprise-wide integrations.

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies