Home
Jobs

1079 Data Integration Jobs - Page 37

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 - 6.0 years

7 - 10 Lacs

Bengaluru

Work from Office

Naukri logo

Skills: PL/SQL Developer

Responsibilities:
- Design and develop PL/SQL programs and packages.
- Optimize SQL queries for performance.
- Analyze and debug complex stored procedures.
- Collaborate with frontend and application teams for data integration.
- Support production issues and data migration tasks.
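The query-optimization duty above can be illustrated with a small, self-contained sketch. It uses Python's bundled sqlite3 rather than Oracle, and the table and column names are invented for the example; the point it shows, that an index turns a full-table scan into an index lookup, carries over to tuning Oracle SQL.

```python
import sqlite3

# Hypothetical table for illustration only; not from the posting.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer, amount) VALUES (?, ?)",
    [(f"cust{i % 100}", i * 1.5) for i in range(1000)],
)

def query_plan(sql):
    """Return the first step of SQLite's query plan for `sql`."""
    return conn.execute("EXPLAIN QUERY PLAN " + sql).fetchone()[3]

before = query_plan("SELECT * FROM orders WHERE customer = 'cust7'")
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer)")
after = query_plan("SELECT * FROM orders WHERE customer = 'cust7'")

print(before)  # full-table scan before the index exists
print(after)   # index search once the index exists
```

The same workflow (inspect the plan, add or adjust an index, re-check the plan) is the core loop of SQL performance tuning on any engine.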

Posted 1 month ago

Apply

3 - 7 years

3 - 7 Lacs

Bengaluru

Work from Office


Hyperion Essbase Developer | Full-time | Department: Enterprise Applications

Company Description
Version 1 has celebrated over 26 years in Technology Services and continues to be trusted by global brands to deliver solutions that drive customer success. Version 1 has several strategic technology partners, including Microsoft, AWS, Oracle, Red Hat, OutSystems and Snowflake. We're also an award-winning employer, reflecting how employees are at the heart of Version 1. We've been awarded Innovation Partner of the Year Winner 2023 (Oracle EMEA Partner Awards), Global Microsoft Modernising Applications Partner of the Year Award 2023, AWS Collaboration Partner of the Year - EMEA 2023, and Best Workplaces for Women by Great Place To Work in UK and Ireland 2023. As a consultancy and service provider, Version 1 is a digital-first environment, and we do things differently. We're focused on our core values; using these, we've seen significant growth across our practices, and our Digital, Data and Cloud team is preparing for the next phase of expansion. This creates new opportunities for driven and skilled individuals to join one of the fastest-growing consultancies globally.

About The Role
A Hyperion Essbase Developer will be responsible for designing, developing, and maintaining Oracle Hyperion Planning and Essbase applications. This role requires expertise in multidimensional databases, OLAP technologies, and financial data modeling.

Technical Responsibilities
- Essbase Development: Design and develop BSO (Block Storage Option) and ASO (Aggregate Storage Option) cubes. Implement calculation scripts, business rules, and member formulas. Optimize cube performance using indexing, partitioning, and aggregation techniques.
- Hyperion Planning: Configure and maintain Hyperion Planning applications. Develop data forms, task lists, and workflow processes.
- Data Integration & Automation: Implement ETL processes using FDMEE (Financial Data Quality Management Enterprise Edition). Develop SQL scripts for data extraction and transformation. Automate data loads, metadata updates, and security provisioning.
- Security & Performance Optimization: Manage user roles, access permissions, and authentication. Optimize query performance using Essbase tuning techniques. Monitor system health and troubleshoot performance issues.

Qualifications
- Oracle Hyperion Planning & Essbase
- Essbase Calculation Scripts & Business Rules
- SQL & PL/SQL
- FDMEE & Data Integration
- EPM Automate
- Smart View & Financial Reporting
- Metadata Management & Security Configuration

Additional Information
At Version 1, we believe in providing our employees with a comprehensive benefits package that prioritises their well-being, professional growth, and financial stability. One of our standout advantages is the ability to work with a hybrid schedule along with business travel, allowing our employees to strike a balance between work and life. We also offer a range of tech-related benefits, including an innovative Tech Scheme to help keep our team members up to date with the latest technology. We prioritise the health and safety of our employees, providing private medical and life insurance coverage, as well as free eye tests and contributions towards glasses. Our team members can also stay ahead of the curve with incentivised certifications and accreditations, including AWS, Microsoft, Oracle, and Red Hat. Our employee-designed Profit Share scheme divides a portion of our company's profits each quarter amongst employees. We are dedicated to helping our employees reach their full potential, offering Pathways Career Development Quarterly, a programme designed to support professional growth.

Posted 1 month ago

Apply

9 - 12 years

12 - 16 Lacs

Hyderabad

Work from Office


Overview
We are seeking an Associate Manager - Data IntegrationOps to support and assist in managing data integration and operations (IntegrationOps) programs within our growing data organization. In this role, you will help maintain and optimize data integration workflows, ensure data reliability, and support operational excellence. This position requires a solid understanding of enterprise data integration, ETL/ELT automation, cloud-based platforms, and operational support.

- Support the management of Data IntegrationOps programs by assisting in aligning with business objectives, data governance standards, and enterprise data strategies.
- Monitor and enhance data integration platforms by implementing real-time monitoring, automated alerting, and self-healing capabilities to help improve uptime and system performance, under the guidance of senior team members.
- Assist in developing and enforcing data integration governance models, operational frameworks, and execution roadmaps to ensure smooth data delivery across the organization.
- Support the standardization and automation of data integration workflows, including report generation and dashboard refreshes.
- Collaborate with cross-functional teams to help optimize data movement across cloud and on-premises platforms, ensuring data availability, accuracy, and security.
- Provide assistance in Data & Analytics technology transformations by supporting full sustainment capabilities, including data platform management and proactive issue identification with automated solutions.
- Contribute to promoting a data-first culture by aligning with PepsiCo's Data & Analytics program and supporting global data engineering efforts across sectors.
- Support continuous improvement initiatives to help enhance the reliability, scalability, and efficiency of data integration processes.
- Engage with business and IT teams to help identify operational challenges and provide solutions that align with the organization's data strategy.
- Develop technical expertise in ETL/ELT processes, cloud-based data platforms, and API-driven data integration, working closely with senior team members.
- Assist with monitoring, incident management, and troubleshooting in a data operations environment to ensure smooth daily operations.
- Support the implementation of sustainable solutions for operational challenges by helping analyze root causes and recommending improvements.
- Foster strong communication and collaboration skills, contributing to effective engagement with cross-functional teams and stakeholders.
- Demonstrate a passion for continuous learning and adapting to emerging technologies in data integration and operations.

Responsibilities
- Support and maintain data pipelines using ETL/ELT tools such as Informatica IICS, PowerCenter, DDH, SAP BW, and Azure Data Factory under the guidance of senior team members.
- Assist in developing API-driven data integration solutions using REST APIs and Kafka to ensure seamless data movement across platforms.
- Contribute to the deployment and management of cloud-based data platforms like Azure Data Services, AWS Redshift, and Snowflake, working closely with the team.
- Help automate data pipelines and participate in implementing DevOps practices using tools like Terraform, GitOps, Kubernetes, and Jenkins.
- Monitor system reliability using observability tools such as Splunk, Grafana, Prometheus, and other custom monitoring solutions, reporting issues as needed.
- Assist in end-to-end data integration operations by testing and monitoring processes to maintain service quality and support global products and projects.
- Support the day-to-day operations of data products, ensuring SLAs are met and assisting in collaboration with SMEs to fulfill business demands.
- Support incident management processes, helping to resolve service outages and ensuring the timely resolution of critical issues.
- Assist in developing and maintaining operational processes to enhance system efficiency and resilience through automation.
- Collaborate with cross-functional teams like Data Engineering, Analytics, AI/ML, CloudOps, and DataOps to improve data reliability and contribute to data-driven decision-making.
- Work closely with teams to troubleshoot and resolve issues related to cloud infrastructure and data services, escalating to senior team members as necessary.
- Support building and maintaining relationships with internal stakeholders to align data integration operations with business objectives.
- Engage directly with customers, actively listening to their concerns, addressing challenges, and helping set clear expectations.
- Promote a customer-centric approach by contributing to efforts that enhance the customer experience and empower the team to advocate for customer needs.
- Assist in incorporating customer feedback and business priorities into operational processes to ensure continuous improvement.
- Contribute to the work intake and Agile processes for data platform teams, ensuring operational excellence through collaboration and continuous feedback.
- Support the execution of Agile frameworks, helping drive a culture of adaptability, efficiency, and learning within the team.
- Help align the team with a shared vision, ensuring a collaborative approach while contributing to a culture of accountability.
- Mentor junior technical team members, supporting their growth and ensuring adherence to best practices in data integration.
- Contribute to resource planning by helping assess team capacity and ensuring alignment with business objectives.
- Remove productivity barriers in an agile environment, assisting the team to shift priorities as needed without compromising quality.
- Support continuous improvement in data integration processes by helping evaluate and suggest optimizations to enhance system performance.
- Leverage technical expertise in cloud and computing technologies to support business goals and drive operational success.
- Stay informed on emerging trends and technologies, helping bring innovative ideas to the team and supporting ongoing improvements in data operations.

Qualifications
- 9+ years of technology work experience in a large-scale, global organization; CPG (Consumer Packaged Goods) industry preferred.
- 4+ years of experience in Data Integration, Data Operations, and Analytics, supporting and maintaining enterprise data platforms.
- 4+ years of experience working in cross-functional IT organizations, collaborating with teams such as Data Engineering, CloudOps, DevOps, and Analytics.
- 1+ years of leadership/management experience supporting technical teams and contributing to operational efficiency initiatives.
- 4+ years of hands-on experience in monitoring and supporting SAP BW processes for data extraction, transformation, and loading (ETL), including managing process chains and batch jobs to ensure smooth data load operations and identifying failures for quick resolution; debugging and troubleshooting data load failures and performance bottlenecks in SAP BW systems; and validating data consistency and integrity between source systems and BW targets.
- Strong understanding of SAP BW architecture, InfoProviders, DSOs, Cubes, and MultiProviders.
- Knowledge of SAP BW process chains and event-based triggers to manage and optimize data loads.
- Exposure to SAP BW on HANA and knowledge of SAP's modern data platforms.
- Basic knowledge of integrating SAP BW with other ETL/ELT tools like Informatica IICS, PowerCenter, DDH, and Azure Data Factory.
- Knowledge of ETL/ELT tools such as Informatica IICS, PowerCenter, Teradata, and Azure Data Factory.
- Hands-on knowledge of cloud-based data integration platforms such as Azure Data Services, AWS Redshift, Snowflake, and Google BigQuery.
- Familiarity with API-driven data integration (e.g., REST APIs, Kafka) and supporting cloud-based data pipelines.
- Basic proficiency in Infrastructure-as-Code (IaC) tools such as Terraform, GitOps, Kubernetes, and Jenkins for automating infrastructure management.
- Understanding of Site Reliability Engineering (SRE) principles, with a focus on proactive monitoring and process improvements.
- Strong communication skills, with the ability to explain technical concepts clearly to both technical and non-technical stakeholders.
- Ability to effectively advocate for customer needs and collaborate with teams to ensure alignment between business and technical solutions.
- Interpersonal skills to help build relationships with stakeholders across both business and IT teams.
- Customer Obsession: enthusiastic about ensuring high-quality customer experiences and continuously addressing customer needs.
- Ownership Mindset: willingness to take responsibility for issues and drive timely resolutions while maintaining service quality.
- Ability to support and improve operational efficiency in large-scale, mission-critical systems.
- Some experience leading or supporting technical teams in a cloud-based environment, ideally within Microsoft Azure.
- Able to deliver operational services in fast-paced, transformation-driven environments.
- Proven capability in balancing business and IT priorities, executing solutions that drive mutually beneficial outcomes.
- Basic experience with Agile methodologies, and an ability to collaborate effectively across virtual teams and different functions.
- Understanding of master data management (MDM), data standards, and familiarity with data governance and analytics concepts.
- Openness to learning new technologies, tools, and methodologies to stay current in the rapidly evolving data space.
- Passion for continuous improvement and keeping up with trends in data integration and cloud technologies.
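The source-vs-target validation called for above can be sketched as a minimal reconciliation check: aggregate a measure per key on both sides and flag any key whose totals disagree. This is a plain-Python illustration with invented record layouts and tolerance; a real check would read from the source system and the BW target rather than in-memory lists.

```python
from collections import defaultdict

def summarize(rows, key="region", measure="amount"):
    """Aggregate a measure per key, mimicking a per-dimension checksum."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[key]] += row[measure]
    return dict(totals)

def reconcile(source_rows, target_rows, tolerance=0.01):
    """Return (key, source_total, target_total) for every mismatched key."""
    src, tgt = summarize(source_rows), summarize(target_rows)
    mismatches = []
    for key in sorted(set(src) | set(tgt)):
        s, t = src.get(key, 0.0), tgt.get(key, 0.0)
        if abs(s - t) > tolerance:
            mismatches.append((key, s, t))
    return mismatches

# Hypothetical sample data: the APAC totals disagree after a partial load.
source = [{"region": "EMEA", "amount": 100.0}, {"region": "APAC", "amount": 50.0}]
target = [{"region": "EMEA", "amount": 100.0}, {"region": "APAC", "amount": 45.0}]
print(reconcile(source, target))  # [('APAC', 50.0, 45.0)]
```

Comparing per-key aggregates rather than raw rows keeps the check cheap enough to run after every load while still localizing which slice of data drifted.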

Posted 1 month ago

Apply

5 - 7 years

7 - 9 Lacs

Hyderabad

Work from Office


Overview
We are seeking an Associate Manager - Data IntegrationOps to support and assist in managing data integration and operations (IntegrationOps) programs within our growing data organization. In this role, you will help maintain and optimize data integration workflows, ensure data reliability, and support operational excellence. This position requires a solid understanding of enterprise data integration, ETL/ELT automation, cloud-based platforms, and operational support.

- Support the management of Data IntegrationOps programs by assisting in aligning with business objectives, data governance standards, and enterprise data strategies.
- Monitor and enhance data integration platforms by implementing real-time monitoring, automated alerting, and self-healing capabilities to help improve uptime and system performance, under the guidance of senior team members.
- Assist in developing and enforcing data integration governance models, operational frameworks, and execution roadmaps to ensure smooth data delivery across the organization.
- Support the standardization and automation of data integration workflows, including report generation and dashboard refreshes.
- Collaborate with cross-functional teams to help optimize data movement across cloud and on-premises platforms, ensuring data availability, accuracy, and security.
- Provide assistance in Data & Analytics technology transformations by supporting full sustainment capabilities, including data platform management and proactive issue identification with automated solutions.
- Contribute to promoting a data-first culture by aligning with PepsiCo's Data & Analytics program and supporting global data engineering efforts across sectors.
- Support continuous improvement initiatives to help enhance the reliability, scalability, and efficiency of data integration processes.
- Engage with business and IT teams to help identify operational challenges and provide solutions that align with the organization's data strategy.
- Develop technical expertise in ETL/ELT processes, cloud-based data platforms, and API-driven data integration, working closely with senior team members.
- Assist with monitoring, incident management, and troubleshooting in a data operations environment to ensure smooth daily operations.
- Support the implementation of sustainable solutions for operational challenges by helping analyze root causes and recommending improvements.
- Foster strong communication and collaboration skills, contributing to effective engagement with cross-functional teams and stakeholders.
- Demonstrate a passion for continuous learning and adapting to emerging technologies in data integration and operations.

Responsibilities
- Support and maintain data pipelines using ETL/ELT tools such as Informatica IICS, PowerCenter, DDH, SAP BW, and Azure Data Factory under the guidance of senior team members.
- Assist in developing API-driven data integration solutions using REST APIs and Kafka to ensure seamless data movement across platforms.
- Contribute to the deployment and management of cloud-based data platforms like Azure Data Services, AWS Redshift, and Snowflake, working closely with the team.
- Help automate data pipelines and participate in implementing DevOps practices using tools like Terraform, GitOps, Kubernetes, and Jenkins.
- Monitor system reliability using observability tools such as Splunk, Grafana, Prometheus, and other custom monitoring solutions, reporting issues as needed.
- Assist in end-to-end data integration operations by testing and monitoring processes to maintain service quality and support global products and projects.
- Support the day-to-day operations of data products, ensuring SLAs are met and assisting in collaboration with SMEs to fulfill business demands.
- Support incident management processes, helping to resolve service outages and ensuring the timely resolution of critical issues.
- Assist in developing and maintaining operational processes to enhance system efficiency and resilience through automation.
- Collaborate with cross-functional teams like Data Engineering, Analytics, AI/ML, CloudOps, and DataOps to improve data reliability and contribute to data-driven decision-making.
- Work closely with teams to troubleshoot and resolve issues related to cloud infrastructure and data services, escalating to senior team members as necessary.
- Support building and maintaining relationships with internal stakeholders to align data integration operations with business objectives.
- Engage directly with customers, actively listening to their concerns, addressing challenges, and helping set clear expectations.
- Promote a customer-centric approach by contributing to efforts that enhance the customer experience and empower the team to advocate for customer needs.
- Assist in incorporating customer feedback and business priorities into operational processes to ensure continuous improvement.
- Contribute to the work intake and Agile processes for data platform teams, ensuring operational excellence through collaboration and continuous feedback.
- Support the execution of Agile frameworks, helping drive a culture of adaptability, efficiency, and learning within the team.
- Help align the team with a shared vision, ensuring a collaborative approach while contributing to a culture of accountability.
- Mentor junior technical team members, supporting their growth and ensuring adherence to best practices in data integration.
- Contribute to resource planning by helping assess team capacity and ensuring alignment with business objectives.
- Remove productivity barriers in an agile environment, assisting the team to shift priorities as needed without compromising quality.
- Support continuous improvement in data integration processes by helping evaluate and suggest optimizations to enhance system performance.
- Leverage technical expertise in cloud and computing technologies to support business goals and drive operational success.
- Stay informed on emerging trends and technologies, helping bring innovative ideas to the team and supporting ongoing improvements in data operations.

Qualifications
- 5+ years of technology work experience in a large-scale, global organization; CPG (Consumer Packaged Goods) industry preferred.
- 4+ years of experience in Data Integration, Data Operations, and Analytics, supporting and maintaining enterprise data platforms.
- 4+ years of experience working in cross-functional IT organizations, collaborating with teams such as Data Engineering, CloudOps, DevOps, and Analytics.
- 3+ years of hands-on experience in MQ & WebLogic administration.
- 1+ years of leadership/management experience supporting technical teams and contributing to operational efficiency initiatives.
- Knowledge of ETL/ELT tools such as Informatica IICS, PowerCenter, SAP BW, Teradata, and Azure Data Factory.
- Hands-on knowledge of cloud-based data integration platforms such as Azure Data Services, AWS Redshift, Snowflake, and Google BigQuery.
- Familiarity with API-driven data integration (e.g., REST APIs, Kafka) and supporting cloud-based data pipelines.
- Basic proficiency in Infrastructure-as-Code (IaC) tools such as Terraform, GitOps, Kubernetes, and Jenkins for automating infrastructure management.
- Understanding of Site Reliability Engineering (SRE) principles, with a focus on proactive monitoring and process improvements.
- Strong communication skills, with the ability to explain technical concepts clearly to both technical and non-technical stakeholders.
- Ability to effectively advocate for customer needs and collaborate with teams to ensure alignment between business and technical solutions.
- Interpersonal skills to help build relationships with stakeholders across both business and IT teams.
- Customer Obsession: enthusiastic about ensuring high-quality customer experiences and continuously addressing customer needs.
- Ownership Mindset: willingness to take responsibility for issues and drive timely resolutions while maintaining service quality.
- Ability to support and improve operational efficiency in large-scale, mission-critical systems.
- Some experience leading or supporting technical teams in a cloud-based environment, ideally within Microsoft Azure.
- Able to deliver operational services in fast-paced, transformation-driven environments.
- Proven capability in balancing business and IT priorities, executing solutions that drive mutually beneficial outcomes.
- Basic experience with Agile methodologies, and an ability to collaborate effectively across virtual teams and different functions.
- Understanding of master data management (MDM), data standards, and familiarity with data governance and analytics concepts.
- Openness to learning new technologies, tools, and methodologies to stay current in the rapidly evolving data space.
- Passion for continuous improvement and keeping up with trends in data integration and cloud technologies.
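The incident-management and automation duties described above often come down to a retry-then-alert pattern: re-run a failed pipeline step with backoff, and page an on-call channel only when retries are exhausted. The sketch below is a generic illustration; the `flaky_extract` step and the `alert` hook are invented stand-ins, not any specific tool's API.

```python
import time

def run_with_retry(step, retries=3, base_delay=0.01, alert=print):
    """Run `step`, retrying with exponential backoff; alert on final failure."""
    for attempt in range(1, retries + 1):
        try:
            return step()
        except Exception as exc:
            if attempt == retries:
                alert(f"step failed after {retries} attempts: {exc}")
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))

# Simulated extraction job that fails twice before succeeding.
calls = {"n": 0}

def flaky_extract():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient connection error")
    return ["row1", "row2"]

result = run_with_retry(flaky_extract)
print(result)  # succeeds on the third attempt
```

Keeping the retry policy in one wrapper means every pipeline step gets consistent backoff and alerting behavior instead of ad-hoc error handling.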

Posted 1 month ago

Apply

- 5 years

10 - 14 Lacs

Chennai

Work from Office


Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients' most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.

About The Role
Role Purpose: The purpose of this role is to design, test, and maintain software programs for operating systems or applications which need to be deployed at a client end, and to ensure they meet 100% quality assurance parameters.

Mandatory Skills
- Dataiku for ETL operations, and preferably other ETL tools like Informatica.
- Good Python coding, SQL, and Git skills.
- Proven experience as a Data Engineer, in Data Integration, or as a Data Analyst.

Preferred Skills
- Banking domain exposure and AI/ML, Data Science exposure.

- 7+ years of experience, including 2+ years of experience delivering projects on the Dataiku platform.
- Proficiency in configuring and optimizing Dataiku's architecture, including data connections, security settings, and workflow management.
- Hands-on experience with Dataiku recipes, Designer nodes, API nodes, and Automation nodes, including deployment.
- Expertise in Python scripting, automation, and development of custom workflows in Dataiku.
- Collaborate with data analysts, business stakeholders, and the client to gather and understand requirements.
- Contribute to developments in the Dataiku environment to apply data integration with given logic to fulfil bank regulatory requirements and other customer requirements.
- Gather, analyse, and interpret requirement specifications received directly from the client.
- Ability to work independently and effectively in a fast-paced, dynamic environment.
- Strong analytical and problem-solving skills.
- Familiarity with agile development methodologies.
- Participate in the CR/Production deployment implementation process through Azure DevOps.

Deliverables (Performance Parameter - Measure):
1. Continuous Integration, Deployment & Monitoring of Software - 100% error-free onboarding and implementation, throughput %, adherence to the schedule/release plan.
2. Quality & CSAT - on-time delivery, manage software, troubleshoot queries, customer experience, completion of assigned certifications for skill upgradation.
3. MIS & Reporting - 100% on-time MIS and report generation.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention: of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA; as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
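A Dataiku Python recipe typically reads an input dataset, applies a transformation, and writes an output dataset. The sketch below mimics that read-transform-write shape with plain Python so it stays self-contained; in Dataiku itself the read/write would go through the `dataiku` package (e.g. `dataiku.Dataset`), and the account records and filtering rule here are invented for illustration.

```python
def read_input():
    # Stand-in for reading the recipe's input dataset.
    return [
        {"account": "A-1", "balance": "1200.50", "status": "active"},
        {"account": "A-2", "balance": "-75.00", "status": "closed"},
        {"account": "A-3", "balance": "310.00", "status": "active"},
    ]

def transform(rows):
    """Keep active accounts and cast balances to float, as a hypothetical
    regulatory extract might require."""
    return [
        {"account": r["account"], "balance": float(r["balance"])}
        for r in rows
        if r["status"] == "active"
    ]

def write_output(rows):
    # Stand-in for writing the recipe's output dataset; returns row count.
    return len(rows)

written = write_output(transform(read_input()))
print(written, "rows written")  # 2 rows written
```

Keeping the transformation in a pure function like `transform` makes the business logic unit-testable outside the Dataiku flow.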

Posted 1 month ago

Apply

3 - 8 years

7 - 12 Lacs

Mumbai, Gurugram, Delhi / NCR

Work from Office


Banking - Data & AI Consultant

Find endless opportunities to solve our clients' toughest challenges, as you work with exceptional people, the latest tech, and leading companies.

Practice: Banking, Industry Consulting, Strategy & Consulting Global Network (earlier named Capability Network) | Areas of Work: Data Driven Consulting & Business Analytics | Level: Consultant | Location: Delhi, Gurgaon, Mumbai | Years of Exp: 3-9 years

Explore an Exciting Career at Accenture
Are you an outcome-oriented problem solver? Do you enjoy working on transformation strategies for global clients? Does working in an inclusive and collaborative environment spark your interest? Then Accenture is the right place for you to explore limitless possibilities. As a part of the agile and distributed Strategy & Consulting Global Network team, you will contribute as a strategic partner, helping shape new business models, collaborate with clients in unconventional ways, rediscover their purpose, and commit to constant innovation.
Bring in your banking and analytics expertise with a global perspective to enable banks and payment providers:
- Engage deeply with C-level executives to define and execute enterprise-wide Data & AI Strategy programs, often becoming the "trusted advisor" on D&A strategy topics.
- Provide tailored advice and best practices to help customers implement mission-critical reforms and advances in analytics roadmap execution.
- Work closely with clients to solve complex business problems and deliver business value by leveraging data and analytics capabilities.
- Advise clients on analytics-driven revenue growth, intelligent operations, fraud, risk and regulatory management, and future-ready platforms.
- Design and implement campaigns around new products, leveraging deep industry and client data.
- Define and implement a Digital Signals-to-Sales framework at scale.
- Lead project delivery and provide support in areas such as process improvement, systems enhancement, and user training.
- Interface with cross-functional client teams to perform project management activities.
- Support leaders in drafting winning pitches and discovering new business opportunities.
- Mentor team members and contribute to their professional development.
- Support asset development, and contribute to thought leadership, staffing efforts, and capability development.
- Estimate resource effort and cost budgets to complete a project within a defined scope.
- Provide subject matter expertise, guidance, and recommendations that will drive the success and growth of the solutions and programs in the market.
Bring your best skills forward to excel in the role:
- Impeccable team management skills, with an ability to engage effectively with multiple stakeholders.
- Ability to lead client workshops and engage with stakeholders.
- Ability to solve complex business problems through a methodical and structured solutioning approach and deliver client delight.
- Strong analytical and writing skills to build viewpoints on industry trends.
- Developing and implementing data analyses, data collection systems, and other strategies that optimize statistical efficiency and quality.
- Ability to translate business requirements into non-technical, lay terms.
- Knowledge of programming languages such as SQL, Oracle, and Python is good to have.
- Knowledge of data visualisation tools such as Power BI, Qlik, and Tableau is good to have.
- Excellent communication, interpersonal, and presentation skills.
- Cross-cultural competence with an ability to thrive in a dynamic environment.

Read more about us.

Qualifications
Your experience counts!
- MBA from a Tier-1 B-school.
- 3-9 years of work experience in Data and AI solutions delivery/implementation at top strategy, management, or technology consulting firms, and/or analytics firms, for banks or financial services companies.
- Responsible for delivering and supporting pre-sales opportunities for Data, Analytics, and AI solutions, working with the broader team in-country, across geographies, and with our ecosystem.
- Experience combining technical expertise with advanced product experience to deliver Data & AI solutions that solve business problems.
- Experience with data integration and governance, preferably from data warehousing and migration projects.
- Identify problems or areas of improvement for the client using data analytics and modelling techniques, and propose corrective solutions and an implementable roadmap.
- Experience with Data Science and AI concepts.
- Generation of new business by identifying opportunities in existing accounts.
- Lead and support the delivery of technical solutions throughout each stage of the full systems lifecycle during the development of software solutions.
- Participate in the development of engagement work plans; identify resource requirements and complete assigned tasks to budget and plan.
- Effectively identify and communicate any necessary changes to engagement scope; help set and manage clients' expectations to ensure that all customers are delighted with Data and AI products and services.
- Demonstrated ability to work collaboratively with business and technology practitioners and cross-functional teams to design and implement comprehensive strategy and operating models that enable Data & AI solutions for the banking industry.

Posted 1 month ago

Apply

3 - 8 years

5 - 10 Lacs

Pune

Work from Office


Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Ab Initio
Good-to-have skills: No Function Specialty
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for ensuring the smooth functioning of applications and their alignment with business needs.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather and analyze requirements.
- Design, develop, and test applications based on business requirements.
- Troubleshoot and debug applications to ensure their smooth functioning.
- Ensure the security and integrity of applications by implementing appropriate measures.
- Document application design, development, and maintenance processes.

Professional & Technical Skills:
- Must-have skills: proficiency in Ab Initio.
- Strong understanding of data integration and ETL concepts.
- Experience in designing and developing Ab Initio graphs and plans.
- Knowledge of database concepts and SQL.
- Experience with version control systems such as Git.
- Good-to-have skills: experience with data warehousing concepts.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Ab Initio.
- This position is based at our Pune office.
- A 15 years full-time education is required.

Qualifications: 15 years full-time education

Posted 1 month ago

5 - 10 years

8 - 14 Lacs

Surat

Work from Office

Job Summary : We are seeking a driven and experienced Manager/Senior Manager - QlikSense to lead our Business Intelligence (BI) team. You will play a pivotal role in driving the adoption and strategic use of QlikSense across the organization, empowering data-driven decision making. This role requires a strong understanding of QlikSense development, data integration, and a proven ability to manage and lead a team. Responsibilities : Strategic Leadership : - Develop and implement a comprehensive QlikSense strategy aligned with the organization's business objectives. - Identify opportunities to leverage QlikSense for advanced analytics and data visualization. - Advocate for the value of BI and ensure user adoption of QlikSense across the organization. QlikSense Development and Management : - Lead the development and deployment of robust and scalable QlikSense applications. - Ensure data quality and integrity within QlikSense applications. - Implement security measures and access controls for QlikSense applications. Team Management : - Lead, mentor, and coach a team of QlikSense developers and analysts. - Foster a collaborative and results-oriented team environment. - Delegate tasks and provide clear performance expectations. Collaboration and Communication : - Partner with business stakeholders to understand their data needs and translate them into actionable insights through QlikSense applications. - Effectively communicate complex data and analytics concepts to diverse audiences. - Develop and maintain strong relationships with IT and other departments. Innovation and Continuous Improvement : - Stay abreast of the latest QlikSense features and functionalities. - Identify and implement new technologies and best practices to enhance QlikSense development and user experience. - Continuously improve the efficiency and effectiveness of BI processes within the organization. 
Qualifications: - Master's degree in Business Administration, Information Technology, or a related field (or equivalent experience). - Minimum 5 years of experience in Business Intelligence or Data Analytics. - Proven experience managing and leading a team of BI professionals. - In-depth knowledge of QlikSense development, data modeling, and scripting techniques. - Strong understanding of data integration concepts and tools. - Excellent communication, collaboration, and interpersonal skills. - Experience working with stakeholders at all levels to translate data into actionable insights. - Proven ability to manage projects effectively and deliver results on time and within budget.

Posted 1 month ago

2 - 5 years

7 - 11 Lacs

Mumbai, Hyderabad

Work from Office

EDS Specialist - NAV02JP. Company: Worley. Primary Location: IND-MM-Navi Mumbai. Job: Engineering Design Systems (EDS). Schedule: Full-time. Employment Type: Employee. Job Level: Experienced. Job Posting: Apr 4, 2025. Unposting Date: May 30, 2025. Reporting Manager Title: Senior Engineering Design Systems Specialist. We deliver the world's most complex projects. Work as part of a collaborative and inclusive team. Enjoy a varied and challenging role. Building on our past. Ready for the future. Worley is a global professional services company of energy, chemicals and resources experts headquartered in Australia. Right now, we're bridging two worlds as we accelerate to more sustainable energy sources, while helping our customers provide the energy, chemicals and resources that society needs now. We partner with our customers to deliver projects and create value over the life of their portfolio of assets. We solve complex problems by finding integrated data-centric solutions from the first stages of consulting and engineering to installation and commissioning, to the last stages of decommissioning and remediation. Join us and help drive innovation and sustainability in our projects. The Role: As an EDS Specialist with Worley, you will work closely with our existing team to deliver projects for our clients while continuing to develop your skills and experience. The Asset Information Management Administrator is responsible for project set-up, maintenance and support of the system. The successful applicant will require significant SQL skills as well as the ability to work as part of a team to deliver integrated solutions. He or she will work across all aspects of data integration for complex systems, including design and engineering systems, data management, and extraction and validation of legacy data. Being part of the AIM team will provide the successful applicant with a network of support, but it is also a clear requirement of this role to be able to work independently.
This role offers a significant opportunity to work with cutting-edge technologies on some of the largest engineering projects in the world. About You: To be considered for this role it is envisaged you will possess the following attributes: Certification in AVEVA AIM-A administration or related technologies. Experience with 3D visualization tools and techniques. Knowledge of SQL databases and query optimization techniques. Knowledge of programming using SQL, Python, PowerShell and VBA. Associate or bachelor's degree (or equivalent experience) in a relevant engineering discipline. Proven experience of 10 years or more in engineering, design, procurement or construction within a relevant industry (e.g. Oil & Gas, Energy or Marine). Proven experience of 3 years or more with AVEVA products or similar technologies. Proven experience in AVEVA AIM-A administration, including configuration, deployment, and support. Proficiency in PowerShell programming, with experience developing scripts and tools for automation. Proficiency in SQL programming, with experience developing tables, queries, views and stored procedures for data storage, analysis, and reporting. Familiarity with AVEVA ISM and its integration capabilities. Proficiency in creating dashboards and reports using Power BI or similar BI tools. Strong analytical skills and experience with data analysis techniques and tools. Moving forward together: We want our people to be energized and empowered to drive sustainable impact. So, our focus is on a values-inspired culture that unlocks brilliance through belonging, connection and innovation. We're building a diverse, inclusive and respectful workplace. Creating a space where everyone feels they belong, can be themselves, and are heard. And we're not just talking about it; we're doing it.
We're reskilling our people, leveraging transferable skills, and supporting the transition of our workforce to become experts in today's low carbon energy infrastructure and technology. Whatever your ambition, there's a path for you here. And there's no barrier to your potential career success. Join us to broaden your horizons, explore diverse opportunities, and be part of delivering sustainable change. Worley takes personal data protection seriously and respects EU and local data protection laws. You can read our full Recruitment Privacy Notice here. Please note: if you are being represented by a recruitment agency you will not be considered; to be considered you will need to apply directly to Worley.
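The SQL responsibilities called out for this role (developing tables, queries and views for reporting) follow a common pattern; a minimal sketch using Python's built-in sqlite3 (the schema is hypothetical, and a real AIM deployment would use its own database engine, not SQLite):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical tag register, loosely in the spirit of an asset-information schema.
cur.execute("CREATE TABLE tags (tag_no TEXT PRIMARY KEY, discipline TEXT, status TEXT)")
cur.executemany("INSERT INTO tags VALUES (?, ?, ?)", [
    ("P-1001", "Piping",     "Verified"),
    ("P-1002", "Piping",     "Unverified"),
    ("E-2001", "Electrical", "Verified"),
])

# A view summarising verification progress per discipline, as a dashboard might need.
cur.execute("""
    CREATE VIEW verification_summary AS
    SELECT discipline,
           COUNT(*)                                             AS total_tags,
           SUM(CASE WHEN status = 'Verified' THEN 1 ELSE 0 END) AS verified_tags
    FROM tags
    GROUP BY discipline
""")

rows = cur.execute("SELECT * FROM verification_summary ORDER BY discipline").fetchall()
print(rows)  # [('Electrical', 1, 1), ('Piping', 2, 1)]
```

Putting the aggregation in a view keeps reporting queries simple and lets BI tools such as Power BI read one stable object instead of repeating the CASE logic.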

Posted 1 month ago

3 - 5 years

5 - 9 Lacs

Bengaluru

Work from Office

About The Role
Role Purpose: The purpose of this role is to interpret data and turn it into information (reports, dashboards, interactive visualizations, etc.) that can offer ways to improve a business, thus affecting business decisions.
Do:
1. Manage the technical scope of the project in line with the requirements at all stages:
a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends
b. Develop record management processes and policies
c. Build and maintain relationships at all levels within the client base and understand their requirements
d. Provide sales data, proposals, data insights and account reviews to the client base
e. Identify areas to increase efficiency and automation of processes
f. Set up and maintain automated data processes
g. Identify, evaluate and implement external services and tools to support data validation and cleansing
h. Produce and track key performance indicators
2. Analyze data sets and provide adequate information:
a. Liaise with internal and external clients to fully understand data content
b. Design and carry out surveys and analyze survey data as per the customer requirement
c. Analyze and interpret complex data sets relating to the customer's business and prepare reports for internal and external audiences using business analytics reporting tools
d. Create data dashboards, graphs and visualizations to showcase business performance and provide sector and competitor benchmarking
e. Mine and analyze large datasets, draw valid inferences and present them successfully to management using a reporting tool
f. Develop predictive models and share insights with the clients as per their requirement
Deliver:
Performance Parameter: Analyzes data sets and provides relevant information to the client. Measure: number of automations done, on-time delivery, CSAT score, zero customer escalations, data accuracy.
Mandatory Skills: Tableau. Experience: 3-5 Years.
Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 1 month ago

5 - 8 years

3 - 7 Lacs

Hyderabad

Work from Office

About The Role
Role Purpose: The purpose of this role is to interpret data and turn it into information (reports, dashboards, interactive visualizations, etc.) that can offer ways to improve a business, thus affecting business decisions.
Do:
1. Manage the technical scope of the project in line with the requirements at all stages:
a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends
b. Develop record management processes and policies
c. Build and maintain relationships at all levels within the client base and understand their requirements
d. Provide sales data, proposals, data insights and account reviews to the client base
e. Identify areas to increase efficiency and automation of processes
f. Set up and maintain automated data processes
g. Identify, evaluate and implement external services and tools to support data validation and cleansing
h. Produce and track key performance indicators
2. Analyze data sets and provide adequate information:
a. Liaise with internal and external clients to fully understand data content
b. Design and carry out surveys and analyze survey data as per the customer requirement
c. Analyze and interpret complex data sets relating to the customer's business and prepare reports for internal and external audiences using business analytics reporting tools
d. Create data dashboards, graphs and visualizations to showcase business performance and provide sector and competitor benchmarking
e. Mine and analyze large datasets, draw valid inferences and present them successfully to management using a reporting tool
f. Develop predictive models and share insights with the clients as per their requirement
Deliver:
Performance Parameter: Analyzes data sets and provides relevant information to the client. Measure: number of automations done, on-time delivery, CSAT score, zero customer escalations, data accuracy.
Mandatory Skills: Geographic Info. Systems (Car support).
Experience: 5-8 Years.

Posted 1 month ago

5 - 8 years

3 - 7 Lacs

Hyderabad

Work from Office

About The Role
Role Purpose: The purpose of this role is to interpret data and turn it into information (reports, dashboards, interactive visualizations, etc.) that can offer ways to improve a business, thus affecting business decisions.
Do:
1. Manage the technical scope of the project in line with the requirements at all stages:
a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends
b. Develop record management processes and policies
c. Build and maintain relationships at all levels within the client base and understand their requirements
d. Provide sales data, proposals, data insights and account reviews to the client base
e. Identify areas to increase efficiency and automation of processes
f. Set up and maintain automated data processes
g. Identify, evaluate and implement external services and tools to support data validation and cleansing
h. Produce and track key performance indicators
2. Analyze data sets and provide adequate information:
a. Liaise with internal and external clients to fully understand data content
b. Design and carry out surveys and analyze survey data as per the customer requirement
c. Analyze and interpret complex data sets relating to the customer's business and prepare reports for internal and external audiences using business analytics reporting tools
d. Create data dashboards, graphs and visualizations to showcase business performance and provide sector and competitor benchmarking
e. Mine and analyze large datasets, draw valid inferences and present them successfully to management using a reporting tool
f. Develop predictive models and share insights with the clients as per their requirement
Deliver:
Performance Parameter: Analyzes data sets and provides relevant information to the client. Measure: number of automations done, on-time delivery, CSAT score, zero customer escalations, data accuracy.
Mandatory Skills: Database Architecting. Experience: 5-8 Years.

Posted 1 month ago

5 - 8 years

3 - 7 Lacs

Hyderabad

Work from Office

About The Role
Role Purpose: The purpose of this role is to interpret data and turn it into information (reports, dashboards, interactive visualizations, etc.) that can offer ways to improve a business, thus affecting business decisions.
Do:
1. Manage the technical scope of the project in line with the requirements at all stages:
a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends
b. Develop record management processes and policies
c. Build and maintain relationships at all levels within the client base and understand their requirements
d. Provide sales data, proposals, data insights and account reviews to the client base
e. Identify areas to increase efficiency and automation of processes
f. Set up and maintain automated data processes
g. Identify, evaluate and implement external services and tools to support data validation and cleansing
h. Produce and track key performance indicators
2. Analyze data sets and provide adequate information:
a. Liaise with internal and external clients to fully understand data content
b. Design and carry out surveys and analyze survey data as per the customer requirement
c. Analyze and interpret complex data sets relating to the customer's business and prepare reports for internal and external audiences using business analytics reporting tools
d. Create data dashboards, graphs and visualizations to showcase business performance and provide sector and competitor benchmarking
e. Mine and analyze large datasets, draw valid inferences and present them successfully to management using a reporting tool
f. Develop predictive models and share insights with the clients as per their requirement
Deliver:
Performance Parameter: Analyzes data sets and provides relevant information to the client. Measure: number of automations done, on-time delivery, CSAT score, zero customer escalations, data accuracy.
Mandatory Skills: Business Analyst / Data Analyst (Maps).
Experience: 5-8 Years.

Posted 1 month ago

4 - 6 years

30 - 34 Lacs

Bengaluru

Work from Office

Overview Annalect is seeking a hands-on Data QA Manager to lead and elevate data quality assurance practices across our growing suite of software and data products. This is a technical leadership role embedded within our Technology teams, focused on establishing best-in-class data quality processes that enable trusted, scalable, and high-performance data solutions. As a Data QA Manager, you will drive the design, implementation, and continuous improvement of end-to-end data quality frameworks, with a strong focus on automation, validation, and governance. You will work closely with data engineering, product, and analytics teams to ensure data integrity, accuracy, and compliance across complex data pipelines, platforms, and architectures, including Data Mesh and modern cloud-based ecosystems. This role requires deep technical expertise in SQL, Python, data testing frameworks like Great Expectations, data orchestration tools (Airbyte, DbT, Trino, Starburst), and cloud platforms (AWS, Azure, GCP). You will lead a team of Data QA Engineers while remaining actively involved in solution design, tool selection, and hands-on QA execution. Responsibilities Key Responsibilities: Develop and implement a comprehensive data quality strategy aligned with organizational goals and product development initiatives. Define and enforce data quality standards, frameworks, and best practices, including data validation, profiling, cleansing, and monitoring processes. Establish data quality checks and automated controls to ensure the accuracy, completeness, consistency, and timeliness of data across systems. Collaborate with Data Engineering, Product, and other teams to design and implement scalable data quality solutions integrated within data pipelines and platforms. Define and track key performance indicators (KPIs) to measure data quality and effectiveness of QA processes, enabling actionable insights for continuous improvement. 
Generate and communicate regular reports on data quality metrics, issues, and trends to stakeholders, highlighting opportunities for improvement and mitigation plans. Maintain comprehensive documentation of data quality processes, procedures, standards, issues, resolutions, and improvements to support organizational knowledge-sharing. Provide training and guidance to cross-functional teams on data quality best practices, fostering a strong data quality mindset across the organization. Lead, mentor, and develop a team of Data QA Analysts/Engineers, promoting a high-performance, collaborative, and innovative culture. Provide thought leadership and subject matter expertise on data quality, influencing technical and business stakeholders toward quality-focused solutions. Continuously evaluate and adopt emerging tools, technologies, and methodologies to advance data quality assurance capabilities and automation. Stay current with industry trends, innovations, and evolving best practices in data quality, data engineering, and analytics to ensure cutting-edge solutions.
Qualifications / Required Skills: 11+ years of hands-on experience in Data Quality Assurance, Data Test Automation, Data Comparison, and Validation across large-scale datasets and platforms. Strong proficiency in SQL for complex data querying, data validation, and data quality investigations across relational and distributed databases. Deep knowledge of data structures, relational and non-relational databases, stored procedures, packages, functions, and advanced data manipulation techniques. Practical experience with leading data quality tools such as Great Expectations, DbT tests, and data profiling and monitoring solutions. Experience with data mesh and distributed data architecture principles for enabling decentralized data quality frameworks. Hands-on experience with modern query engines and data platforms, including Trino/Presto, Starburst, and Snowflake.
Experience working with data integration and ETL/ELT tools such as Airbyte, AWS Glue, and DbT for managing and validating data pipelines. Strong working knowledge of Python and related data libraries (e.g., Pandas, NumPy, SQLAlchemy) for building data quality tests and automation scripts.
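The validation work described above (profiling, completeness, uniqueness and range rules) can be sketched in plain Python; this only illustrates the idea behind tools like Great Expectations, not their API, and the column names are invented:

```python
# Hypothetical records to validate, as a QA check might pull from a pipeline.
records = [
    {"order_id": 1, "country": "IN", "amount": 250.0},
    {"order_id": 2, "country": "IN", "amount": 99.0},
    {"order_id": 3, "country": "US", "amount": -5.0},   # violates the range rule
]

def expect_unique(rows, col):
    """Uniqueness check: no duplicate values in the column."""
    values = [r[col] for r in rows]
    return len(values) == len(set(values))

def expect_not_null(rows, col):
    """Completeness check: every row has a value for the column."""
    return all(r.get(col) is not None for r in rows)

def expect_between(rows, col, lo, hi):
    """Range check: return the rows that fall outside [lo, hi]."""
    return [r for r in rows if not (lo <= r[col] <= hi)]

assert expect_unique(records, "order_id")
assert expect_not_null(records, "country")
bad = expect_between(records, "amount", 0, 10_000)
print([r["order_id"] for r in bad])  # [3]
```

Real data-quality frameworks wrap exactly these kinds of predicates in declarative suites, then schedule them inside pipelines and report failures as metrics rather than raising immediately.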

Posted 1 month ago

7 - 11 years

11 - 16 Lacs

Mumbai

Work from Office

At Sogeti, we believe the best is inside every one of us. Whether you are early in your career or at the top of your game, we'll encourage you to fulfill your potential to be better. Through our shared passion for technology, our entrepreneurial culture, and our focus on continuous learning, we'll provide everything you need to do your best work and become the best you can be. About The Role: A Data Platform Engineer specialises in the design, build, and maintenance of cloud-based data infrastructure and platforms for data-intensive applications and services. They develop Infrastructure as Code and manage the foundational systems and tools for efficient data storage, processing, and management. This role involves architecting robust and scalable cloud data infrastructure, including selecting and implementing suitable storage solutions, data processing frameworks, and data orchestration tools. Additionally, a Data Platform Engineer ensures the continuous evolution of the data platform to meet changing data needs and leverage technological advancements, while maintaining high levels of data security, availability, and performance. They are also tasked with creating and managing processes and tools that enhance operational efficiency, including optimising data flow and ensuring seamless data integration, all of which are essential for enabling developers to build, deploy, and operate data-centric applications efficiently.
About The Role - Grade Specific: An expert on the principles and practices associated with data platform engineering, particularly within cloud environments, who demonstrates proficiency in specific technical areas related to cloud-based data infrastructure, automation, and scalability. Key responsibilities encompass:
- Team Leadership and Management: Supervising a team of platform engineers, with a focus on team dynamics and the efficient delivery of cloud platform solutions.
- Technical Guidance and Decision-Making: Providing technical leadership and making pivotal decisions concerning platform architecture, tools, and processes. Balancing hands-on involvement with strategic oversight.
- Mentorship and Skill Development: Guiding team members through mentorship, enhancing their technical proficiencies, and nurturing a culture of continual learning and innovation in platform engineering practices.
- In-Depth Technical Proficiency: Possessing a comprehensive understanding of platform engineering principles and practices, and demonstrating expertise in crucial technical areas such as cloud services, automation, and system architecture.
- Community Contribution: Making significant contributions to the development of the platform engineering community, staying informed about emerging trends, and applying this knowledge to drive enhancements in capability.
Skills (competencies): Part of the Capgemini Group, Sogeti makes business value through technology for organizations that need to implement innovation at speed and want a local partner with global scale. With a hands-on culture and close proximity to its clients, Sogeti implements solutions that will help organizations work faster, better, and smarter. By combining its agility and speed of implementation through a DevOps approach, Sogeti delivers innovative solutions in quality engineering, cloud and application development, all driven by AI, data and automation.

Posted 1 month ago

2 - 7 years

6 - 10 Lacs

Bengaluru

Work from Office

Hello Talented Techie! We provide support in Project Services and Transformation, Digital Solutions and Delivery Management. We offer joint operations and digitalization services for Global Business Services and work closely alongside the entire Shared Services organization. We make efficient use of the possibilities of new technologies such as Business Process Management (BPM) and Robotics as enablers for efficient and effective implementations. We are looking for a Data Engineer (AWS, Confluent & SnapLogic). Data Integration: Integrate data from various Siemens organizations into our data factory, ensuring seamless data flow and real-time data fetching. Data Processing: Implement and manage large-scale data processing solutions using AWS Glue, ensuring efficient and reliable data transformation and loading. Data Storage: Store and manage data in a large-scale data lake, utilizing Iceberg tables in Snowflake for optimized data storage and retrieval. Data Transformation: Apply various data transformations to prepare data for analysis and reporting, ensuring data quality and consistency. Data Products: Create and maintain data products that meet the needs of various stakeholders, providing actionable insights and supporting data-driven decision-making. Workflow Management: Use Apache Airflow to orchestrate and automate data workflows, ensuring timely and accurate data processing. Real-time Data Streaming: Utilize Confluent Kafka for real-time data streaming, ensuring low-latency data integration and processing. ETL Processes: Design and implement ETL processes using SnapLogic, ensuring efficient data extraction, transformation, and loading. Monitoring and Logging: Use Splunk for monitoring and logging data processes, ensuring system reliability and performance. You'd describe yourself as: Experience: 3+ relevant years of experience in data engineering, with a focus on AWS Glue, Iceberg tables, Confluent Kafka, SnapLogic, and Airflow.
Technical Skills: Proficiency in AWS services, particularly AWS Glue. Experience with Iceberg tables and Snowflake. Knowledge of Confluent Kafka for real-time data streaming. Familiarity with SnapLogic for ETL processes. Experience with Apache Airflow for workflow management. Understanding of Splunk for monitoring and logging. Programming Skills: Proficiency in Python, SQL, and other relevant programming languages. Data Modeling: Experience with data modeling and database design. Problem-Solving: Strong analytical and problem-solving skills, with the ability to troubleshoot and resolve data-related issues. Preferred Qualities: Attention to Detail: Meticulous attention to detail, ensuring data accuracy and quality. Communication Skills: Excellent communication skills, with the ability to collaborate effectively with cross-functional teams. Adaptability: Ability to adapt to changing technologies and work in a fast-paced environment. Team Player: Strong team player with a collaborative mindset. Continuous Learning: Eagerness to learn and stay updated with the latest trends and technologies in data engineering. Create a better #TomorrowWithUs! This role, based in Bangalore, is an individual contributor position. You may be required to visit other locations within India and internationally. In return, you'll have the opportunity to work with teams shaping the future. At Siemens, we are a collection of over 312,000 minds building the future, one day at a time, worldwide. We value your unique identity and perspective and are fully committed to providing equitable opportunities and building a workplace that reflects the diversity of society. Come bring your authentic self and create a better tomorrow with us. Find out more about Siemens careers at: www.siemens.com/careers
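Orchestrators such as Airflow execute tasks in dependency order; here is a stdlib-only sketch of that core idea (the task names are hypothetical, and real Airflow DAGs use its operator classes, which are not shown here):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline tasks and their upstream dependencies, in the spirit
# of an Airflow DAG: extract feeds transform and audit, which both feed load.
dag = {
    "extract":   set(),
    "transform": {"extract"},
    "audit":     {"extract"},
    "load":      {"transform", "audit"},
}

run_log = []

def run(task):
    # A real orchestrator would invoke an operator here and record its state.
    run_log.append(task)

# static_order() yields tasks so every task runs after all its dependencies.
for task in TopologicalSorter(dag).static_order():
    run(task)

print(run_log)  # e.g. ['extract', 'transform', 'audit', 'load']
```

The value an orchestrator adds on top of this ordering is scheduling, retries, backfills and per-task state, but the dependency graph above is the mental model behind every DAG definition.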

Posted 1 month ago

3 - 5 years

6 - 10 Lacs

Bengaluru

Work from Office

Hello Talented Techie! We provide support in Project Services and Transformation, Digital Solutions and Delivery Management. We offer joint operations and digitalization services for Global Business Services and work closely alongside the entire Shared Services organization. We make efficient use of the possibilities of new technologies such as Business Process Management (BPM) and Robotics as enablers for efficient and effective implementations. We are looking for a Data Engineer: a skilled Data Architect/Engineer with strong expertise in AWS and data lake solutions. If you're passionate about building scalable data platforms, this role is for you. Your responsibilities will include: Architect & Design: Build scalable and efficient data solutions using AWS services like Glue, Redshift, S3, Kinesis (Apache Kafka), DynamoDB, Lambda, Glue Streaming ETL, and EMR. Real-Time Data Integration: Integrate real-time data from multiple Siemens orgs into our central data lake. Data Lake Management: Design and manage large-scale data lakes using S3, Glue, and Lake Formation. Data Transformation: Apply transformations to ensure high-quality, analysis-ready data. Snowflake Integration: Build and manage pipelines for Snowflake, using Iceberg tables for best performance and flexibility. Performance Tuning: Optimize pipelines for speed, scalability, and cost-effectiveness. Security & Compliance: Ensure all data solutions meet security standards and compliance guidelines. Team Collaboration: Work closely with data engineers, scientists, and app developers to deliver full-stack data solutions. Monitoring & Troubleshooting: Set up monitoring tools and quickly resolve pipeline issues when needed. You'd describe yourself as: Experience: 3+ years of experience in data engineering or cloud solutioning, with a focus on AWS services. Technical Skills: Proficiency in AWS services such as AWS API, AWS Glue, Amazon Redshift, S3, Apache Kafka and Lake Formation.
Experience with real-time data processing and streaming architectures. Big Data Querying Tools: Solid understanding of big data querying tools (e.g., Hive, PySpark). Programming: Strong programming skills in languages such as Python, Java, or Scala for building and maintaining scalable systems. Problem-Solving: Excellent problem-solving skills and the ability to troubleshoot complex issues. Communication: Strong communication skills, with the ability to work effectively with both technical and non-technical stakeholders. Certifications: AWS certifications are a plus. Create a better #TomorrowWithUs!
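Streaming architectures like those listed above typically aggregate events into time windows; a minimal tumbling-window sketch in plain Python (the event data is invented, and real systems do this with Kafka consumers or Spark rather than a simple loop):

```python
from collections import defaultdict

# Hypothetical event stream: (epoch_seconds, value) pairs arriving in order.
events = [(0, 10.0), (30, 5.0), (61, 7.0), (119, 3.0), (125, 1.0)]
WINDOW = 60  # tumbling window width in seconds

sums = defaultdict(float)
for ts, value in events:
    window_start = (ts // WINDOW) * WINDOW   # assign each event to its window
    sums[window_start] += value

result = sorted(sums.items())
print(result)  # [(0, 15.0), (60, 10.0), (120, 1.0)]
```

Production streaming engines add the hard parts this sketch omits: out-of-order events, watermarks, and state that survives restarts.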

Posted 1 month ago

Apply

4 - 9 years

14 - 18 Lacs

Noida

Work from Office

Naukri logo

Who We Are Build a brighter future while learning and growing with a Siemens company at the intersection of technology, community and sustainability. Our global team of innovators is always looking to create meaningful solutions to some of the toughest challenges facing our world. Find out how far your passion can take you. What you need * BS in an Engineering or Science discipline, or equivalent experience * 7+ years of software/data engineering experience using Java, Scala, and/or Python, with at least 5 years' experience in a data-focused role * Experience in data integration (ETL/ELT) development using multiple languages (e.g., Java, Scala, Python, PySpark, SparkSQL) * Experience building and maintaining data pipelines supporting a variety of integration patterns (batch, replication/CDC, event streaming) and data lake/warehouse in production environments * Experience with AWS-based data services technologies (e.g., Kinesis, Glue, RDS, Athena, etc.) and Snowflake CDW * Experience working on larger initiatives building and rationalizing large-scale data environments with a large variety of data pipelines, possibly with internal and external partner integrations, would be a plus * Willingness to experiment and learn new approaches and technology applications * Knowledge and experience with various relational databases and demonstrable proficiency in SQL and supporting analytics uses and users * Knowledge of software engineering and agile development best practices * Excellent written and verbal communication skills The Brightly culture We're guided by a vision of community that serves the ambitions and wellbeing of all people, and our professional communities are no exception. We model that ideal every day by being supportive, collaborative partners to one another, conscientiously making space for our colleagues to grow and thrive. 
Our passionate team is driven to create a future where smarter infrastructure protects the environments that shape and connect us all. That brighter future starts with us.
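The replication/CDC integration pattern named in the requirements above applies a change feed of inserts, updates, and deletes to a target store. A minimal, hypothetical sketch (in-memory, keyed by primary key; a production pipeline would merge into a warehouse table instead):

```python
def apply_cdc(target, changes):
    """Apply a CDC change feed to a target keyed by primary key.
    Each change carries an op code: 'I' insert, 'U' update, 'D' delete."""
    for change in changes:
        op, key, row = change["op"], change["key"], change.get("row")
        if op in ("I", "U"):
            target[key] = row          # upsert semantics
        elif op == "D":
            target.pop(key, None)      # idempotent delete
    return target
```

Treating inserts and updates identically (upsert) and making deletes idempotent keeps the apply step safe to re-run after a partial failure, which matters for exactly-once-looking results over at-least-once delivery.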

Posted 1 month ago

Apply

2 - 5 years

11 - 16 Lacs

Bengaluru

Work from Office

Naukri logo

Technical Expert BSI We are looking for a Technical Expert to be part of our Business Solutions Integrations team in the Analytics, Data and Integration stream. Position Snapshot Location: Bengaluru Type of Contract: Permanent Stream: Analytics, Data and Integration Type of work: Hybrid Work Language: Fluent Business English The role The Integration Technical Expert will work in the Business Solution Integration (BSI) team, focused on the product engineering and operations of the Data Integration, Digital Integration, and Process Integration products and the initiatives where these products are used. You will work together with the Product Manager and Product Owners, as well as various other counterparts, on the evolution of the DI, PI, and Digital products. You will work with architects on orchestrating the design of the integration solutions. You will also act as the first point of contact for project teams to manage demand and will help drive the transition from engineering to sustain as per the BSI standards. You will work with Operations Managers and Sustain teams on the orchestration of operations activities, proposing improvements for better performance of the platforms. What you’ll do Work with architects to understand and orchestrate the design choices between the different Data, Process and Digital Integration patterns for fulfilling the data needs. Translate the various requirements into deliverables for the development and implementation of Process, Data and Digital Integration solutions, following up the requests to get the work done. Design, develop, and implement integration solutions using ADF, LTRS, Data Integration, SAP PO, CPI, Logic Apps, MuleSoft, and Confluent. Work with the Operations Managers and Sustain teams to orchestrate performance and operational issues. We offer you We offer more than just a job. We put people first and inspire you to become the best version of yourself. 
Great benefits, including a competitive salary and a comprehensive social benefits package. We have one of the most competitive pension plans on the market, as well as flexible remuneration with tax advantages: health insurance, restaurant card, mobility plan, etc. Personal and professional growth through ongoing training and constant career opportunities, reflecting our conviction that people are our most important asset. Minimum qualifications Minimum of 7 years of industry experience in software delivery projects Experience in project and product management, agile methodologies and solution delivery at scale. Skilled and experienced Technical Integration Expert with experience across various integration platforms and tools, including ADF, LTRS, Data Integration, SAP PO, CPI, Logic Apps, MuleSoft, and Confluent. Ability to contribute to a high-performing, motivated workgroup by applying interpersonal and collaboration skills to achieve goals. Fluency in English with excellent oral and written communication skills. Experience in working with cultural diversity: respect for various cultures and understanding how to work with a variety of cultures in the most effective way. Bonus points if you have: Experience with the Azure platform (especially with Data Factory) Experience with Azure DevOps and with ServiceNow Experience with Power Apps and Power BI About the IT Hub We are a team of IT professionals from many countries and diverse backgrounds, each with unique missions and challenges in the biggest health, nutrition and wellness company of the world. We innovate every day through forward-looking technologies to create opportunities for Nestlé's digital challenges with our consumers, customers and at the workplace. We collaborate with our business partners around the world to deliver standardized, integrated technology products and services to create tangible business value. About Nestlé We are Nestlé, the largest food and beverage company. 
We are approximately 275,000 employees strong, driven by the purpose of enhancing the quality of life and contributing to a healthier future. Our values are rooted in respect: respect for ourselves, respect for others, respect for diversity and respect for our future. With more than CHF 94.4 billion in sales in 2022, we have an expansive presence, with 344 factories in 77 countries. Want to learn more? Visit us at www.nestle.com. We encourage the diversity of applicants across gender, age, ethnicity, nationality, sexual orientation, social background, religion or belief and disability. Step outside your comfort zone; share your ideas, way of thinking and working to make a difference to the world, every single day. You own a piece of the action – make it count. Join IT Hub Nestlé #beaforceforgood How we will proceed You send us your CV → We contact relevant applicants → Interviews → Feedback → Job offer communication to the finalist → First working day

Posted 1 month ago

Apply

1 - 5 years

9 - 13 Lacs

Bengaluru

Work from Office

Naukri logo

Technical Expert BSI We are looking for a Technical Expert to be part of our Business Solutions Integrations team in the Analytics, Data and Integration stream. Position Snapshot Location: Bengaluru Type of Contract: Permanent Stream: Analytics, Data and Integration Type of work: Hybrid Work Language: Fluent Business English The role The Integration Technical Expert will work in the Business Solution Integration (BSI) team, focused on the product engineering and operations of the Data Integration, Digital Integration, and Process Integration products and the initiatives where these products are used. You will work together with the Product Manager and Product Owners, as well as various other counterparts, on the evolution of the DI, PI, and Digital products. You will work with architects on orchestrating the design of the integration solutions. You will also act as the first point of contact for project teams to manage demand and will help drive the transition from engineering to sustain as per the BSI standards. You will work with Operations Managers and Sustain teams on the orchestration of operations activities, proposing improvements for better performance of the platforms. What you’ll do Work with architects to understand and orchestrate the design choices between the different Data, Process and Digital Integration patterns for fulfilling the data needs. Translate the various requirements into deliverables for the development and implementation of Process, Data and Digital Integration solutions, following up the requests to get the work done. Design, develop, and implement integration solutions using SAP PO, CPI, Logic Apps, ADF, LTRS, Data Integration, MuleSoft, and Confluent. Work with the Operations Managers and Sustain teams to orchestrate performance and operational issues. We offer you We offer more than just a job. We put people first and inspire you to become the best version of yourself. 
Great benefits, including a competitive salary and a comprehensive social benefits package. We have one of the most competitive pension plans on the market, as well as flexible remuneration with tax advantages: health insurance, restaurant card, mobility plan, etc. Personal and professional growth through ongoing training and constant career opportunities, reflecting our conviction that people are our most important asset. Minimum qualifications Minimum of 7 years of industry experience in software delivery projects Experience in project and product management, agile methodologies and solution delivery at scale. Skilled and experienced Technical Integration Expert with experience across various integration platforms and tools, including SAP PO, CPI, Logic Apps, ADF, LTRS, Data Integration, MuleSoft, and Confluent. Ability to contribute to a high-performing, motivated workgroup by applying interpersonal and collaboration skills to achieve goals. Fluency in English with excellent oral and written communication skills. Experience in working with cultural diversity: respect for various cultures and understanding how to work with a variety of cultures in the most effective way. Bonus points if you have: Experience with the Azure platform (especially with Data Factory) Experience with Azure DevOps and with ServiceNow Experience with Power Apps and Power BI About the IT Hub We are a team of IT professionals from many countries and diverse backgrounds, each with unique missions and challenges in the biggest health, nutrition and wellness company of the world. We innovate every day through forward-looking technologies to create opportunities for Nestlé's digital challenges with our consumers, customers and at the workplace. We collaborate with our business partners around the world to deliver standardized, integrated technology products and services to create tangible business value. About Nestlé We are Nestlé, the largest food and beverage company. 
We are approximately 275,000 employees strong, driven by the purpose of enhancing the quality of life and contributing to a healthier future. Our values are rooted in respect: respect for ourselves, respect for others, respect for diversity and respect for our future. With more than CHF 94.4 billion in sales in 2022, we have an expansive presence, with 344 factories in 77 countries. Want to learn more? Visit us at www.nestle.com. We encourage the diversity of applicants across gender, age, ethnicity, nationality, sexual orientation, social background, religion or belief and disability. Step outside your comfort zone; share your ideas, way of thinking and working to make a difference to the world, every single day. You own a piece of the action – make it count. Join IT Hub Nestlé #beaforceforgood How we will proceed You send us your CV → We contact relevant applicants → Interviews → Feedback → Job offer communication to the finalist → First working day

Posted 1 month ago

Apply

2 - 7 years

10 - 15 Lacs

Pune

Work from Office

Naukri logo

About The Role : Job Title: EPM Database Developer Location: Pune, India Role Description Testing ETL processes to ensure data accuracy, completeness and integrity throughout the extraction, transformation and loading processes. Scripting knowledge to automate the processes. Expert in tools like SQL, Python/Java, ETL testing frameworks, Tableau or Google Looker. Develop and automate test scripts for ETL workflows and data pipelines to ensure database testing coverage. Identify quality issues and collaborate with ETL developers and data engineers to understand data requirements. What we'll offer you As part of our flexible scheme, here are just some of the benefits that you'll enjoy: Best-in-class leave policy Gender-neutral parental leaves 100% reimbursement under childcare assistance benefit (gender neutral) Sponsorship for industry-relevant certifications and education Employee Assistance Program for you and your family members Comprehensive hospitalization insurance for you and your dependents Accident and term life insurance Complimentary health screening for 35 yrs. and above Your key responsibilities Design databases that are functional, dependable, and stable. Partner with development teams to design data management and governance protocols, manage the information lifecycle, and design infrastructure and data integration controls. Understand business strategy and cascade business intelligence needs to the database level. Execute tests and update databases in accordance with requests. Develop and maintain complex models and logical database designs. Create and maintain the ETL system to ensure data accuracy and integrity throughout the ETL process, including data validation, cleansing, deduplication, and error handling, to ensure reliable and usable data. Monitor its performance, updating ETL scripts and workflows as business requirements change, and ensuring the system scales with data growth. 
Perform performance tuning to optimize ETL processes for speed and efficiency, addressing bottlenecks and ensuring the ETL system can handle the volume, velocity, and variety of data. Create technical and training guides. Support users' data management. Verify that all database programs adhere to the organization's performance standards. Conduct research and make recommendations for fresh database services, products, and methods. Your skills and experience Proven work experience (10+ years) as a database developer. Proficient with relational databases (e.g. Oracle, SQL Server, MySQL, PostgreSQL). In-depth understanding of data management (e.g. permissions, recovery, security and monitoring). Experience in designing and implementing database structures and automating data flows and DBA tasks. Experience with data modeling and schema design. Demonstrated experience in tuning databases and their objects. Understanding of database design practices, including database normalization concepts. Familiarity with working with JavaScript, HTML, .NET Framework, and Oracle. Excellent analytical and organisational skills. Understanding of the needs of front-end users and a problem-solving mindset. Excellent verbal and written communication skills. Degree in Computer Science or relevant field How we'll support you Training and development to help you excel in your career Coaching and support from experts in your team A culture of continuous learning to aid progression A range of flexible benefits that you can tailor to suit your needs About us and our teams Please visit our company website for further information: https://www.db.com/company/company.htm
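ETL test automation of the kind this role describes typically codifies data-quality checks (completeness, deduplication) as executable assertions over a loaded batch. A minimal, illustrative validation helper with hypothetical field names (`id` as the natural key):

```python
def validate_batch(rows, required_fields):
    """Return (index, issue) findings for a loaded batch:
    missing/empty required fields and duplicate natural keys."""
    findings = []
    seen_ids = set()
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) in (None, ""):
                findings.append((i, f"missing {field}"))
        if row.get("id") in seen_ids:
            findings.append((i, "duplicate id"))
        seen_ids.add(row.get("id"))
    return findings
```

In practice such checks would run as post-load steps in the test framework, with findings failing the pipeline run or feeding an error-handling table rather than being returned in memory.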

Posted 1 month ago

Apply

3 - 8 years

18 - 33 Lacs

Bengaluru

Work from Office

Naukri logo

About The Engineering Team at Traveloka is the backbone of our innovation, building scalable and high-performance systems that power millions of users worldwide. With a relentless focus on scalability and performance, we ensure a seamless experience for travelers. Our dedication to excellence and problem-solving makes us instrumental in shaping Traveloka's future as a leader in the digital travel space. You'll be joining a team that has built industry-leading, high-performance backend systems. Our engineers work on cutting-edge technologies, solving complex challenges in distributed computing, API design, and high-traffic systems. Backend engineering is evolving rapidly, and we’re looking for developers who are eager to build the next big thing in scalable backend architecture. What You'll be Doing: Design, build, and maintain scalable backend applications, APIs, and data integrations. Write clean, efficient, and maintainable code while adhering to best practices. Improve and optimize existing systems for better performance and reliability. Monitor system performance, troubleshoot issues, and respond to alerts proactively. Collaborate with cross-functional teams to deliver high-impact solutions. Conduct code reviews, unit testing, and integration testing to ensure software quality. Participate in architectural discussions and propose innovative solutions for high-traffic environments. Contribute to post-mortem analyses and continuously improve system resilience. Requirements Bachelor’s degree in Computer Science from a reputable university. Minimum 5 years of experience in software engineering, particularly in backend development. Proficiency in Go, Java, and Python. Ability to dive deep and work across backend services, ensuring efficiency, scalability, and maintainability. Experience designing scalable and maintainable architectures. Eagerness to continuously learn—whether it’s technology-related, product-related, or beyond. 
A strong sense of ownership and accountability for both the product(s) and assigned tasks. Fluency in English, both spoken and written. Prior technical engineering experience or relevant work experience is a plus. If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!

Posted 1 month ago

Apply

2 - 5 years

10 - 14 Lacs

Kochi

Work from Office

Naukri logo

We are looking for a skilled ETL Developer with 2 to 5 years of experience to join our team in Bengaluru. The ideal candidate will have hands-on experience in developing data integration routines using Azure Data Factory, Azure Databricks, Scala/PySpark notebooks, Azure PaaS SQL, and Azure BLOB Storage. ### Roles and Responsibilities Convert business and technical requirements into appropriate technical solutions and implement features using Azure Data Factory, Databricks, and Azure Data Lake Store. Implement data integration features using Azure Data Factory, Azure Databricks, and Scala/PySpark notebooks. Set up and maintain Azure PaaS SQL databases and database objects, including Azure BLOB Storage. Create complex queries, including dynamic queries, for data ingestion. Own project tasks and ensure timely completion. Maintain effective communication within the team, with peers, leadership teams, and other IT groups. ### Job Requirements Bachelor's degree in Computer Science or equivalent. Minimum 2-5 years of experience as a software developer. Hands-on experience in developing data integration routines using Azure Data Factory, Azure Databricks, Scala/PySpark notebooks, Azure PaaS SQL, and Azure BLOB Storage. Experience/knowledge in Azure Data Lake and related services. Ability to take accountability for quality technical deliverables to agreed schedules and estimates. Strong verbal and written communication skills. Must be an outstanding team player. Ability to manage and prioritize workload. Quick learner with a 'can-do' attitude. Flexible and able to quickly adapt to change.
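The "dynamic queries for data ingestion" responsibility above usually means assembling SQL from metadata while keeping values parameterized. A hedged sketch (hypothetical helper, not any employer's codebase): identifiers are whitelist-validated and values are returned as bind parameters, never string-interpolated.

```python
import re

# Identifiers (table/column names) must match a safe pattern; anything
# else is rejected rather than quoted, avoiding SQL injection via metadata.
_IDENT = re.compile(r"^[A-Za-z_][A-Za-z0-9_]*$")

def build_ingest_query(table, columns, filters):
    """Build a parameterized SELECT for ingesting from a source table.
    Returns (sql, params); values in `filters` become bind parameters."""
    for name in [table, *columns, *filters]:
        if not _IDENT.match(name):
            raise ValueError(f"invalid identifier: {name}")
    where = " AND ".join(f"{c} = ?" for c in filters) or "1=1"
    sql = f"SELECT {', '.join(columns)} FROM {table} WHERE {where}"
    return sql, [filters[c] for c in filters]
```

The same pattern carries over to Databricks/PySpark ingestion, where the generated statement would be passed to the driver with its parameter list rather than formatted inline.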

Posted 1 month ago

Apply

2 - 5 years

7 - 11 Lacs

Noida

Work from Office

Naukri logo

We are looking for a highly skilled and experienced Staff Consultant with expertise in Oracle Analytics Cloud to join our team. The ideal candidate will have 2-5 years of experience in data warehousing and business intelligence projects, along with a strong track record of delivering solutions to different lines of business. ### Roles and Responsibilities Design, build, and enable solutions on various analytics solutions. Engage with customers to discover business problems and goals. Develop solutions using different cloud services. Deliver PoCs tailored to customer needs. Run and deliver customer hands-on workshops. Collaborate with all roles at the customer, including executives, architects, technical staff, and business representatives. ### Job Requirements Expertise in Oracle's analytics offerings: Oracle Analytics Cloud, Data Visualization, OBIEE, Fusion Analytics for Warehouse. Solution design skills to provide expertise and guide customers for specific needs. Hands-on experience with analytics and data warehousing report/solution development. Experience in configuring OBIEE/OAC security (authentication and authorization – object-level and data-level security) and tuning reports. Knowledge in developing the Oracle BI Repository (RPD). Solid knowledge of data extraction using SQL. Good understanding of Oracle Applications – Oracle E-Business Suite or Oracle ERP, Oracle HCM (the Oracle Cloud SaaS offering) is preferable. Deep knowledge of database and cloud concepts, Autonomous Data Warehouse (ADW), and data integration tools such as ODI, Informatica, etc., is an added advantage. B.E. / B.Tech. / Master’s degree required.

Posted 1 month ago

Apply

12 - 17 years

11 - 15 Lacs

Kochi

Work from Office

Naukri logo

We are looking for a highly motivated and experienced professional with 12 to 17 years of experience to join our team as a Manager/Lead Consultant, Digital Transformation (SAP BW/SAC Analytics & Planning specialist), FAAS, GDS in Bengaluru. ### Roles and Responsibilities Lead SAP Analytics Cloud (SAC) and SAP Business Warehouse (BW) implementation projects focused on analytics and planning. Design, configure, and implement SAC modules including Business Intelligence (BI), Planning, and Predictive Analytics, as well as BW data models and data flows. Customize SAC solutions to meet specific business process requirements. Develop and maintain comprehensive documentation, including business requirement documents, business blueprints, solution design documents, functional specifications, test scripts, and training materials. Utilize training content tools such as Enable Now. Manage all testing cycles, including Unit Testing, System Integration Testing (SIT), and User Acceptance Testing (UAT). Integrate SAC with SAP BW to leverage existing data models and ensure seamless data flow and reporting. Demonstrate sound knowledge of testing tools like HP ALM and SAP Solution Manager (SOLMAN). ### Job Requirements Bachelor’s degree or equivalent in a relevant subject such as finance, accounting, or engineering. SAP certification in SAP SAC Analytics and Planning and SAP BW. Minimum of 12+ years of relevant experience in SAP with consulting and/or system integrators. Experience with at least 10 end-to-end project implementations in SAP SAC Analytics & Planning and BW on the latest SAP S/4HANA versions in ERP-enabled finance transformation projects. Hands-on experience in SAP Analytics Cloud (SAC) and SAP Business Warehouse (BW) for data modeling, data integration, and reporting. Proficiency in designing and implementing SAC models, stories, dashboards, and predictive analytics. Strong knowledge of BW data extraction, transformation, and loading (ETL) processes. 
Experience integrating SAC with SAP BW to leverage existing data models and ensure seamless data flow. Familiarity with core SAP modules like MM, SD, PP, PS, PM, HCM, REFX, and BPC for comprehensive data integration. Ensure data accuracy and consistency by managing data integration processes between SAC and BW, including ETL activities. Sector experience in Oil & Gas, Manufacturing, Real Estate, Power and Utilities. Excellent documentation skills, with preference given to Big 4 experience. Established track record of business development, practice management, and team development. Strong working experience in SAP ASAP, Activate Methodologies. Strong technical skills and recognized cautious risk management ability. Deep understanding of the client's industry and marketplace. Flexibility and willingness to travel on short notice, as necessary. Track record of a strong consulting background. Experience with GCC-based clients. Extensive professional knowledge and sector knowledge. Commercially driven. Strong stakeholder management skills in understanding strategic direction and being able to convert into cohesive change management strategy and plans. Desire to build/develop a career in advising our clients with organizational-wide improvements and industry insight. Strong interest and commitment to understanding and developing leading-edge finance solutions to our industry clients. Ability to develop client opportunities and identify market growth opportunities. Ability to support the business development cycle in the system.

Posted 1 month ago

Apply