
389 Aggregations Jobs - Page 5

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

8.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Who We Are

At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.

The Role

As an ELK (Elasticsearch, Logstash, and Kibana) architect, you will be responsible for designing and implementing the data architecture and infrastructure for data analytics, log management, and visualization solutions using the ELK stack. You will collaborate with cross-functional teams, including data engineers, developers, system administrators, and stakeholders, to define data requirements, design data models, and ensure efficient data processing, storage, and retrieval. Your expertise in ELK and data architecture will be instrumental in building scalable and performant data solutions.

Responsibilities:

Data Architecture Design: Collaborate with stakeholders to understand business requirements and define the data architecture strategy for ELK-based solutions. Design scalable and robust data models, data flows, and data integration patterns.

ELK Stack Implementation: Lead the implementation and configuration of ELK stack infrastructure to support data ingestion, processing, indexing, and visualization. Ensure high availability, fault tolerance, and optimal performance of the ELK environment.

Data Ingestion and Integration: Design and implement efficient data ingestion pipelines using Logstash or other relevant technologies. Integrate data from various sources, such as databases, APIs, logs, AppDynamics, storage and streaming platforms, into ELK for real-time and batch processing.

Data Modeling and Indexing: Design and optimize Elasticsearch indices and mappings to enable fast and accurate search and analysis. Define index templates, shard configurations, and document structures to ensure efficient storage and retrieval of data.

Data Visualization and Reporting: Collaborate with stakeholders to understand data visualization and reporting requirements. Use Kibana to design and develop visually appealing and interactive dashboards, reports, and visualizations that enable data-driven decision-making.

Performance Optimization: Analyze and optimize the performance of data processing and retrieval in ELK. Tune Elasticsearch settings, queries, and aggregations to improve search speed and response time (an aggregation-query sketch follows this listing). Optimize data storage, caching, and memory management.

Data Security and Compliance: Implement security measures and access controls to protect sensitive data stored in ELK. Ensure compliance with data privacy regulations and industry standards by implementing appropriate encryption, access controls, and auditing mechanisms.

Documentation and Collaboration: Create and maintain documentation of data models, data flows, system configurations, and best practices. Collaborate with cross-functional teams, providing guidance and support on data architecture and ELK-related topics.

Who You Are

Candidates should have a minimum of 8 years of experience, and be able to apply architectural methods, design information system architecture, lead systems engineering management, and provide AD & AI leadership.

Being You

Diversity is a whole lot more than what we look like or where we come from; it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences.
But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.

What You Can Expect

With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate and to build new capabilities, relationships, processes, and value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees, and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.

Get Referred!

If you know someone who works at Kyndryl, when asked 'How Did You Hear About Us' during the application process, select 'Employee Referral' and enter your contact's Kyndryl email address.
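
The index-design and aggregation-tuning work described in this role can be pictured with a short query sketch. This is a minimal, hypothetical example assuming the elasticsearch-py 8.x client; the endpoint, index pattern, and field names are placeholders, not anything specified by the posting.

```python
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # placeholder endpoint

resp = es.search(
    index="logs-*",           # placeholder index pattern
    size=0,                   # aggregation-only: skip returning raw documents
    query={"range": {"@timestamp": {"gte": "now-24h"}}},
    aggs={
        "per_hour": {
            "date_histogram": {"field": "@timestamp", "fixed_interval": "1h"},
            "aggs": {"by_service": {"terms": {"field": "service.keyword", "size": 5}}},
        }
    },
)

for bucket in resp["aggregations"]["per_hour"]["buckets"]:
    print(bucket["key_as_string"], bucket["doc_count"])
```

Setting size=0 keeps the response aggregation-only, which is the usual shape behind a Kibana-style rollup panel.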

Posted 1 month ago

Apply

3.0 years

0 Lacs

Andhra Pradesh, India

On-site

At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. In SAP technology at PwC, you will specialise in utilising and managing SAP software and solutions within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of SAP products and technologies.

Focused on relationships, you are building meaningful client connections, and learning how to manage and inspire others. Navigating increasingly complex situations, you are growing your personal brand, deepening technical expertise and awareness of your strengths. You are expected to anticipate the needs of your teams and clients, and to deliver quality. Embracing increased ambiguity, you are comfortable when the path forward isn't clear, you ask questions, and you use these moments as opportunities to grow.

Skills

Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include, but are not limited to:

Respond effectively to the diverse perspectives, needs, and feelings of others.
Use a broad range of tools, methodologies and techniques to generate new ideas and solve problems.
Use critical thinking to break down complex concepts.
Understand the broader objectives of your project or role and how your work fits into the overall strategy.
Develop a deeper understanding of the business context and how it is changing.
Use reflection to develop self-awareness, enhance strengths and address development areas.
Interpret data to inform insights and recommendations.
Uphold and reinforce professional and technical standards (e.g. refer to specific PwC tax and audit guidance), the Firm's code of conduct, and independence requirements.

SAP Native HANA Developer – Technical Skills

Bachelor's or Master's degree in a relevant field (e.g., computer science, information systems, engineering).
Minimum of 3 years of experience in HANA native development and configuration, including at least 1 year with SAP BTP Cloud Foundry and HANA Cloud.
Demonstrated experience working with various data sources, both SAP (SAP ECC, SAP CRM, SAP S/4HANA) and non-SAP (Oracle, Salesforce, AWS).
Demonstrated expertise in designing and implementing solutions on the SAP BTP platform.
Solid understanding of BTP HANA Cloud and its service offerings.
Strong focus on building expertise in constructing calculation views within the HANA Cloud environment (BAS) and other supporting data artifacts (a consumption sketch follows this listing).
Experience with HANA XS Advanced and HANA 2.0 versions.
Ability to optimize queries and data models for performance in the SAP HANA development environment, with a sound understanding of indexing, partitioning and other performance-optimization techniques.
Proven experience applying SAP HANA Cloud development tools and technologies, including HDI containers, HANA OData services, HANA XSA, strong SQL scripting, SDI/SLT replication, Smart Data Access (SDA) and Cloud Foundry UPS services.
Experience with ETL processes and tools (SAP Data Services preferred).
Ability to debug and optimize existing queries and data models for performance.
Hands-on experience using Git within Business Application Studio and familiarity with GitHub features and repository management.
Familiarity with reporting tools and security-based concepts within the HANA development environment.
Understanding of the HANA Transport Management System, HANA Transport Container and CI/CD practices for object deployment.
Knowledge of monitoring and troubleshooting techniques for SAP HANA BW environments.
Familiarity with reporting tools like SAC/Power BI for building dashboards and consuming data models is a plus.

HANA CDS views (added advantage):

Understanding of associations, aggregations, and annotations in CDS views.
Ability to design and implement data models using CDS.
Certification in SAP HANA or related areas is a plus.
Functional knowledge of SAP business processes (FI/CO, MM, SD, HR).
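
As a rough illustration of consuming a calculation view of the kind this role builds, here is a minimal sketch using SAP's hdbcli Python driver. The host, credentials, schema, and view name are invented placeholders, and real HANA Cloud connections typically also need TLS settings.

```python
from hdbcli import dbapi  # SAP's Python driver for HANA

conn = dbapi.connect(
    address="hana.example.com",  # placeholder host
    port=443,
    user="DEV_USER",
    password="...",
)
cur = conn.cursor()
cur.execute(
    """
    SELECT "REGION", SUM("NET_AMOUNT") AS "TOTAL"
    FROM "MYSCHEMA"."pkg.views::CV_SALES"  -- a calculation view exposed as a SQL object
    GROUP BY "REGION"
    ORDER BY "TOTAL" DESC
    """
)
for region, total in cur.fetchall():
    print(region, total)
conn.close()
```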

Posted 1 month ago

Apply

5.0 years

0 Lacs

Greater Kolkata Area

On-site

About The Role

We are seeking a highly skilled and detail-oriented Test Engineer - Data Migration to join our QA and Engineering team. This role demands an expert in data testing processes who will be responsible for validating complex data migrations, ensuring data integrity, and supporting automated testing across cross-platform environments. The ideal candidate will bring hands-on experience in migration validation, reconciliation, and automation testing within diverse enterprise systems and cloud ecosystems.

Key Responsibilities

Participate actively in all phases of the Software Testing Life Cycle (STLC), ensuring comprehensive test coverage and traceability.
Design, develop, and execute test strategies and test cases for data migration and data transformation initiatives.
Conduct detailed data mapping validation, row-level and summary-level data validation, and data reconciliation between source and target environments (a reconciliation sketch follows this listing).
Collaborate with data architects, developers, and analysts to interpret ETL logic, verify transformation rules, and ensure successful migration outcomes.
Identify, log, and track defects in the migration pipeline using standard defect-tracking tools; drive issues to resolution through coordination with cross-functional teams.
Leverage automated testing tools such as Selenium to optimize functional and regression testing across backend systems and data flows.
Support integration of testing frameworks into CI/CD pipelines using Jenkins, Git, Azure DevOps, or similar platforms.
Participate in Agile ceremonies such as sprint planning, stand-ups, and retrospectives to contribute to user-story refinement and estimation.
Validate data lineage, auditability, and compliance throughout the data migration process, including transformation, masking, and retention policies.
Analyze test results, prepare test reports, and provide actionable feedback to stakeholders and leadership.

Required Technical Skills And Experience

5+ years of professional experience in software testing with a strong focus on data migration testing (3+ years mandatory).
Deep understanding of STLC, defect lifecycle, and both Agile and Waterfall methodologies.
Strong experience in data migration scenarios, such as: Oracle to Snowflake; SQL Server to BigQuery; flat files to cloud-based databases.
Extensive experience in: data mapping validation (source-to-target transformation logic); data reconciliation and integrity checks; schema validation and metadata comparison.
Proficiency in SQL for database querying, joins, aggregations, and validation logic.
Familiarity with ETL tools such as DataStage, Talend, or similar.
Proficient in automation scripting using Selenium with Java or Python.
Working knowledge of CI/CD tools including Jenkins, Git, Azure DevOps, and integration of test suites within pipelines.

Preferred (Nice-to-Have) Skills

Experience with API testing tools such as Postman, RestAssured, or similar REST clients.
Exposure to cloud platforms (AWS, Azure, or GCP), especially in the context of data migration, warehousing, and integration.
Understanding of data quality frameworks, profiling tools, and data governance standards.
Familiarity with data masking and anonymization techniques for testing sensitive datasets in lower environments.
Experience with big data testing frameworks and tools related to distributed data platforms. (ref:hirist.tech)
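
The row-count and summary-level reconciliation described above can be sketched as paired source/target checks. This is a minimal illustration assuming SQLAlchemy; the connection URLs, table, and column names are placeholders for an Oracle-to-Snowflake migration, not anything specified by the posting.

```python
from sqlalchemy import create_engine, text

# Placeholder connection URLs for an Oracle source and a Snowflake target
source = create_engine("oracle+oracledb://user:pwd@src-host/orcl")
target = create_engine("snowflake://user:pwd@account/db/schema")

CHECKS = {
    "row_count":  "SELECT COUNT(*) FROM {tbl}",     # row-level completeness
    "amount_sum": "SELECT SUM(amount) FROM {tbl}",  # summary-level integrity
}

def reconcile(table: str) -> None:
    for name, sql in CHECKS.items():
        with source.connect() as s, target.connect() as t:
            src_val = s.execute(text(sql.format(tbl=table))).scalar()
            tgt_val = t.execute(text(sql.format(tbl=table))).scalar()
        status = "PASS" if src_val == tgt_val else "FAIL"
        print(f"{table}.{name}: source={src_val} target={tgt_val} -> {status}")

reconcile("transactions")  # placeholder table name
```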

Posted 1 month ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.

ERP Analytics – Oracle EPM (PBCS) – Senior. Experience: 4 to 7/8 years.

Job Summary

We are looking for an experienced Oracle PBCS/EPBCS Senior Consultant with proven expertise in implementing Oracle Planning and Budgeting to lead and implement financial planning solutions. The ideal candidate should have strong hands-on skills in PBCS/EPBCS, dimension hierarchy management, data integration, Management Reports/Smart View reporting, web forms and dashboards, Task Manager, business rules, Groovy scripting, security, approval workflow, and financial process optimization. This role demands strong collaboration with business stakeholders, technical teams, and project managers to deliver effective solutions.

Key Responsibilities

A proven record, with a minimum of 4 years in customer-facing implementation projects, particularly in Oracle PBCS/EPBCS, is highly preferred.
Design and deploy financial reports, web forms, dashboards, business rules, smart lists, Management Reports, security, and approval workflow.
Work on data integration, including direct integration between various source systems, file-based integrations, pipelines, and data maps between cubes.
Develop and maintain complex business rules, aggregations, and dynamic calculations while ensuring alignment with best practices.
Collaborate with FP&A to gather requirements, prototype solutions, conduct system testing, and deliver end-user training to ensure smooth implementation and adoption.
Troubleshoot and resolve metadata and data load issues and perform data validations.
Work closely with clients to understand business objectives and deliver scalable, high-performance planning solutions that support strategic decision-making.
Define process automation using EPM Automate and batch scripting (see the sketch after this listing).
A degree or background in Finance or Accounting is a strong advantage to better align with business user needs and financial reporting requirements.

Preferred Skills

Experience in EPBCS, preferably implementing the Workforce, Financials, Capital, and Projects modules.
Experience in large corporate or multinational environments and involvement in finance transformation projects.
Exposure to project environments with agile or waterfall methodologies.
Passion for automation, process improvement, and modern finance practices.
Exposure to Groovy scripting and SQL for advanced customization.

EY | Building a better working world

EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
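
For the EPM Automate and batch-scripting responsibility above, here is a hypothetical sketch of a nightly load wrapped in Python. The command names follow Oracle's EPM Automate CLI (login, uploadfile, importdata, runbusinessrule, logout), but the service URL, credential file, and job names are invented placeholders.

```python
import subprocess

def epm(*args: str) -> None:
    subprocess.run(["epmautomate", *args], check=True)  # raise on non-zero exit

epm("login", "svc_user", "password.epw",              # encrypted-password file
    "https://planning-example.pbcs.oraclecloud.com")  # placeholder service URL
epm("uploadfile", "actuals_jan.csv")                  # stage the data file
epm("importdata", "ACTUALS_LOAD", "actuals_jan.csv")  # run a saved data-load job
epm("runbusinessrule", "AggregateAll")                # aggregate after the load
epm("logout")
```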

Posted 1 month ago

Apply

8.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.

ERP Analytics – Oracle EPM (PBCS) – Senior. Experience: 4 to 7/8 years.

Job Summary

We are looking for an experienced Oracle PBCS/EPBCS Senior Consultant with proven expertise in implementing Oracle Planning and Budgeting to lead and implement financial planning solutions. The ideal candidate should have strong hands-on skills in PBCS/EPBCS, dimension hierarchy management, data integration, Management Reports/Smart View reporting, web forms and dashboards, Task Manager, business rules, Groovy scripting, security, approval workflow, and financial process optimization. This role demands strong collaboration with business stakeholders, technical teams, and project managers to deliver effective solutions.

Key Responsibilities

A proven record, with a minimum of 4 years in customer-facing implementation projects, particularly in Oracle PBCS/EPBCS, is highly preferred.
Design and deploy financial reports, web forms, dashboards, business rules, smart lists, Management Reports, security, and approval workflow.
Work on data integration, including direct integration between various source systems, file-based integrations, pipelines, and data maps between cubes.
Develop and maintain complex business rules, aggregations, and dynamic calculations while ensuring alignment with best practices.
Collaborate with FP&A to gather requirements, prototype solutions, conduct system testing, and deliver end-user training to ensure smooth implementation and adoption.
Troubleshoot and resolve metadata and data load issues and perform data validations.
Work closely with clients to understand business objectives and deliver scalable, high-performance planning solutions that support strategic decision-making.
Define process automation using EPM Automate and batch scripting.
A degree or background in Finance or Accounting is a strong advantage to better align with business user needs and financial reporting requirements.

Preferred Skills

Experience in EPBCS, preferably implementing the Workforce, Financials, Capital, and Projects modules.
Experience in large corporate or multinational environments and involvement in finance transformation projects.
Exposure to project environments with agile or waterfall methodologies.
Passion for automation, process improvement, and modern finance practices.
Exposure to Groovy scripting and SQL for advanced customization.

EY | Building a better working world

EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.

Posted 1 month ago

Apply

8.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.

ERP Analytics – Oracle EPM (PBCS) – Senior. Experience: 4 to 7/8 years.

Job Summary

We are looking for an experienced Oracle PBCS/EPBCS Senior Consultant with proven expertise in implementing Oracle Planning and Budgeting to lead and implement financial planning solutions. The ideal candidate should have strong hands-on skills in PBCS/EPBCS, dimension hierarchy management, data integration, Management Reports/Smart View reporting, web forms and dashboards, Task Manager, business rules, Groovy scripting, security, approval workflow, and financial process optimization. This role demands strong collaboration with business stakeholders, technical teams, and project managers to deliver effective solutions.

Key Responsibilities

A proven record, with a minimum of 4 years in customer-facing implementation projects, particularly in Oracle PBCS/EPBCS, is highly preferred.
Design and deploy financial reports, web forms, dashboards, business rules, smart lists, Management Reports, security, and approval workflow.
Work on data integration, including direct integration between various source systems, file-based integrations, pipelines, and data maps between cubes.
Develop and maintain complex business rules, aggregations, and dynamic calculations while ensuring alignment with best practices.
Collaborate with FP&A to gather requirements, prototype solutions, conduct system testing, and deliver end-user training to ensure smooth implementation and adoption.
Troubleshoot and resolve metadata and data load issues and perform data validations.
Work closely with clients to understand business objectives and deliver scalable, high-performance planning solutions that support strategic decision-making.
Define process automation using EPM Automate and batch scripting.
A degree or background in Finance or Accounting is a strong advantage to better align with business user needs and financial reporting requirements.

Preferred Skills

Experience in EPBCS, preferably implementing the Workforce, Financials, Capital, and Projects modules.
Experience in large corporate or multinational environments and involvement in finance transformation projects.
Exposure to project environments with agile or waterfall methodologies.
Passion for automation, process improvement, and modern finance practices.
Exposure to Groovy scripting and SQL for advanced customization.

EY | Building a better working world

EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.

Posted 1 month ago

Apply

8.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.

ERP Analytics – Oracle EPM (PBCS) – Senior. Experience: 4 to 7/8 years.

Job Summary

We are looking for an experienced Oracle PBCS/EPBCS Senior Consultant with proven expertise in implementing Oracle Planning and Budgeting to lead and implement financial planning solutions. The ideal candidate should have strong hands-on skills in PBCS/EPBCS, dimension hierarchy management, data integration, Management Reports/Smart View reporting, web forms and dashboards, Task Manager, business rules, Groovy scripting, security, approval workflow, and financial process optimization. This role demands strong collaboration with business stakeholders, technical teams, and project managers to deliver effective solutions.

Key Responsibilities

A proven record, with a minimum of 4 years in customer-facing implementation projects, particularly in Oracle PBCS/EPBCS, is highly preferred.
Design and deploy financial reports, web forms, dashboards, business rules, smart lists, Management Reports, security, and approval workflow.
Work on data integration, including direct integration between various source systems, file-based integrations, pipelines, and data maps between cubes.
Develop and maintain complex business rules, aggregations, and dynamic calculations while ensuring alignment with best practices.
Collaborate with FP&A to gather requirements, prototype solutions, conduct system testing, and deliver end-user training to ensure smooth implementation and adoption.
Troubleshoot and resolve metadata and data load issues and perform data validations.
Work closely with clients to understand business objectives and deliver scalable, high-performance planning solutions that support strategic decision-making.
Define process automation using EPM Automate and batch scripting.
A degree or background in Finance or Accounting is a strong advantage to better align with business user needs and financial reporting requirements.

Preferred Skills

Experience in EPBCS, preferably implementing the Workforce, Financials, Capital, and Projects modules.
Experience in large corporate or multinational environments and involvement in finance transformation projects.
Exposure to project environments with agile or waterfall methodologies.
Passion for automation, process improvement, and modern finance practices.
Exposure to Groovy scripting and SQL for advanced customization.

EY | Building a better working world

EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.

Posted 1 month ago

Apply

8.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

At EY, we're all in to shape your future with confidence. We'll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.

ERP Analytics – Oracle EPM (PBCS) – Senior. Experience: 4 to 7/8 years.

Job Summary

We are looking for an experienced Oracle PBCS/EPBCS Senior Consultant with proven expertise in implementing Oracle Planning and Budgeting to lead and implement financial planning solutions. The ideal candidate should have strong hands-on skills in PBCS/EPBCS, dimension hierarchy management, data integration, Management Reports/Smart View reporting, web forms and dashboards, Task Manager, business rules, Groovy scripting, security, approval workflow, and financial process optimization. This role demands strong collaboration with business stakeholders, technical teams, and project managers to deliver effective solutions.

Key Responsibilities

A proven record, with a minimum of 4 years in customer-facing implementation projects, particularly in Oracle PBCS/EPBCS, is highly preferred.
Design and deploy financial reports, web forms, dashboards, business rules, smart lists, Management Reports, security, and approval workflow.
Work on data integration, including direct integration between various source systems, file-based integrations, pipelines, and data maps between cubes.
Develop and maintain complex business rules, aggregations, and dynamic calculations while ensuring alignment with best practices.
Collaborate with FP&A to gather requirements, prototype solutions, conduct system testing, and deliver end-user training to ensure smooth implementation and adoption.
Troubleshoot and resolve metadata and data load issues and perform data validations.
Work closely with clients to understand business objectives and deliver scalable, high-performance planning solutions that support strategic decision-making.
Define process automation using EPM Automate and batch scripting.
A degree or background in Finance or Accounting is a strong advantage to better align with business user needs and financial reporting requirements.

Preferred Skills

Experience in EPBCS, preferably implementing the Workforce, Financials, Capital, and Projects modules.
Experience in large corporate or multinational environments and involvement in finance transformation projects.
Exposure to project environments with agile or waterfall methodologies.
Passion for automation, process improvement, and modern finance practices.
Exposure to Groovy scripting and SQL for advanced customization.

EY | Building a better working world

EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.

Posted 1 month ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Summary

Job Location: Manikonda, Hyderabad. Job Type: Full-time. Shift Timings: 2 PM - 11 PM IST. Work Mode: Hybrid, 3 days a week mandatory. Experience: 3-5 years (4 positions) and 5-7 years (3 positions). Interview Process: 1-2 rounds only, but face-to-face (Hyderabad office). Notice: Immediate to 7 days.

We are seeking a highly motivated Python Developer with strong skills in SQL and Snowflake to join our growing data engineering and analytics team. The ideal candidate should have hands-on experience building and maintaining data pipelines, integrating databases, and optimizing queries, along with strong Python programming skills to automate workflows and enable scalable data solutions.

Skill Set (Core Must-Haves):

- Engineering mindset (adaptable, open to change, problem-solving).
- Strong SQL (2+ years) and understanding of data modelling.
- Programming experience (Python - not limited to Snowflake).
- 1+ years Snowflake (preferred, but not strictly required).

Nice-to-Have (but not limited to):

- CI/CD and pipeline tools.
- Airflow (or similar tools; ZB uses Dagster).
- AI/data science interest or background.

Attitude Requirements:

- Handle change with minimal resistance.
- Willingness to learn and move across technologies.
- Proactive, ownership mindset.

Required Skills & Qualifications:

- 3+ years of professional experience in Python programming with a strong understanding of object-oriented concepts and libraries such as Pandas, NumPy, etc.
- Solid hands-on experience with SQL (writing queries, joins, subqueries, aggregations, and performance tuning).
- 1+ years of working experience with the Snowflake cloud data warehouse (data loading, transformations, schema design, and query optimization) - see the sketch after this listing.
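
A minimal sketch of the Python-plus-Snowflake work this role describes, using the snowflake-connector-python package; the account, warehouse, database, and table names are placeholders, not anything specified by the posting.

```python
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

conn = snowflake.connector.connect(
    account="myorg-myaccount", user="dev_user", password="...",  # placeholders
    warehouse="DEV_WH", database="ANALYTICS", schema="STAGING",
)

df = pd.DataFrame({"ORDER_ID": [1, 2, 3], "AMOUNT": [10.0, 25.5, 7.25]})
write_pandas(conn, df, "ORDERS_STG", auto_create_table=True)  # bulk-load the frame

cur = conn.cursor()
cur.execute("SELECT COUNT(*), SUM(AMOUNT) FROM ORDERS_STG")  # quick aggregation check
print(cur.fetchone())
conn.close()
```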

Posted 1 month ago

Apply

0 years

0 Lacs

India

On-site

WHO WE ARE

Sapaad is a global leader in all-in-one unified commerce platforms, dedicated to delivering world-class software solutions. Its flagship product, Sapaad, has seen tremendous success in the last decade, with thousands of customers worldwide, and many more signing on. Driven by a team of passionate developers and designers, Sapaad is constantly innovating, introducing cutting-edge features that reshape the industry. Headquartered in Singapore, with offices across five countries, Sapaad is backed by technology veterans with deep expertise in web, mobility, and e-commerce, making it a key player in the tech landscape.

THE OPPORTUNITY

We are seeking a detail-oriented QA Engineer with expertise in Business Intelligence tools and data analysis to ensure the quality and accuracy of our dashboard solutions. The successful candidate will be responsible for testing BI dashboards, validating data integrity, and ensuring analytical reports meet business requirements.

ROLES & RESPONSIBILITIES

Testing & Quality Assurance: Design and execute comprehensive test plans for BI dashboards and analytical reports. Perform functional, regression, and user acceptance testing on dashboard components. Validate data accuracy, completeness, and consistency across multiple data sources. Test dashboard performance, load times, and responsiveness across different devices and browsers. Document and track defects using bug-tracking systems and collaborate with development teams for resolution.

Data Analysis & Validation: Conduct thorough data validation to ensure accuracy of metrics, KPIs, and calculations. Compare dashboard outputs against source systems and business requirements (a comparison sketch follows this listing). Analyze data trends and patterns to identify potential anomalies or data quality issues. Verify data transformations, aggregations, and filtering logic within dashboards. Perform end-to-end data lineage testing from source to presentation layer.

End-to-End BI Solution Verification: Ensure proper functionality of interactive dashboard features including filters, drill-downs, and parameter controls. Validate data refresh schedules and automated report-generation processes. Test dashboard security settings and user access permissions.

REQUIRED QUALIFICATIONS

Proficiency in at least one major BI tool (Tableau, Power BI, QlikView, Looker, or similar). Basic database concepts and SQL skills. Experience with data analysis and statistical concepts. Basic knowledge of ETL processes and data warehousing concepts. Familiarity with testing frameworks and methodologies. Strong analytical and problem-solving abilities. Experience with data profiling and data quality assessment. Understanding of business metrics, KPIs, and dashboard design principles. Ability to interpret complex data relationships and business logic. Excellent attention to detail and accuracy. Strong communication skills for collaborating with business stakeholders and technical teams. Ability to create clear test documentation and defect reports. Project management skills and ability to work with multiple priorities.
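
One way to picture the dashboard-versus-source validation above: recompute a KPI from the source system and diff it against an export from the BI tool. This is a hedged sketch; the query, connection URL, file name, and tolerance are illustrative assumptions, not Sapaad specifics.

```python
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql://qa_user:pwd@dwh-host/sales")  # placeholder source DB

# Recompute the KPI straight from the source system...
source = pd.read_sql(
    "SELECT store_id, SUM(total) AS revenue FROM orders GROUP BY store_id", engine
)
# ...and load the same figure as exported from the BI tool
dashboard = pd.read_csv("dashboard_export.csv")  # assumed columns: store_id, revenue

merged = source.merge(dashboard, on="store_id", suffixes=("_src", "_dash"))
merged["diff"] = (merged["revenue_src"] - merged["revenue_dash"]).abs()

bad = merged[merged["diff"] > 0.01]  # small tolerance for BI-layer rounding
assert bad.empty, f"{len(bad)} stores fail KPI reconciliation:\n{bad}"
```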

Posted 1 month ago

Apply

3.0 - 6.0 years

0 Lacs

Bangalore Urban, Karnataka, India

On-site

We are seeking a Data Engineer with strong expertise in SQL and ETL processes to support banking data pipelines, regulatory reporting, and data quality initiatives. The role involves building and optimizing data structures, implementing validation rules, and collaborating with governance and compliance teams. Experience in the banking domain and tools like Informatica and Azure Data Factory is essential.

Strong proficiency in SQL for writing complex queries, joins, data transformations, and aggregations. Proven experience in building tables, views, and data structures within enterprise data warehouses and data lakes. Strong understanding of data warehousing concepts, such as Slowly Changing Dimensions (SCDs), data normalization, and star/snowflake schemas. Practical experience in Azure Data Factory (ADF) for orchestrating data pipelines and managing ingestion workflows. Exposure to data cataloging, metadata management, and lineage tracking using Informatica EDC or Axon. Experience implementing data quality rules for banking use cases such as completeness, consistency, uniqueness, and validity (a sketch of such rules follows this listing). Familiarity with banking systems and data domains such as Flexcube, HRMS, CRM, Risk, Compliance, and IBG reporting. Understanding of regulatory and audit-readiness needs for the Central Bank and internal governance forums.

Write optimized SQL scripts to extract, transform, and load (ETL) data from multiple banking source systems. Design and implement staging and reporting layer structures, aligned to business requirements and regulatory frameworks. Apply data validation logic based on predefined business rules and data governance requirements. Collaborate with Data Governance, Risk, and Compliance teams to embed lineage, ownership, and metadata into datasets. Monitor scheduled jobs and resolve ETL failures to ensure SLA adherence for reporting and operational dashboards. Support production deployment, UAT sign-off, and issue resolution for data products across business units.

3 to 6 years in banking-focused data engineering roles with hands-on SQL, ETL, and DQ rule implementation. Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or related fields. Banking domain experience is mandatory, especially in areas related to regulatory reporting, compliance, and enterprise data governance.
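
The completeness, uniqueness, and validity rules mentioned above typically reduce to parameterized SQL checks. A minimal sketch under invented table and column names (loosely modeled on a staging customer table); the connection string is a placeholder.

```python
from sqlalchemy import create_engine, text

engine = create_engine("mssql+pyodbc://...")  # placeholder DWH connection

DQ_RULES = {
    "completeness_customer_id":
        "SELECT COUNT(*) FROM stg_customers WHERE customer_id IS NULL",
    "uniqueness_customer_id": """
        SELECT COUNT(*) FROM (
            SELECT customer_id FROM stg_customers
            GROUP BY customer_id HAVING COUNT(*) > 1
        ) dupes
    """,
    "validity_branch_code":
        "SELECT COUNT(*) FROM stg_customers WHERE branch_code NOT LIKE 'BR%'",
}

with engine.connect() as cx:
    for rule, sql in DQ_RULES.items():
        failures = cx.execute(text(sql)).scalar()  # each rule counts violating rows
        print(rule, "PASS" if failures == 0 else f"FAIL ({failures} rows)")
```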

Posted 1 month ago

Apply

7.0 years

0 Lacs

Haryana, India

On-site

ETL QA - Technical Lead

Experience: 7 to 11 years. Job Locations: Gurgaon (1 position).

Job Summary

We are looking for a highly skilled and detail-oriented ETL QA Technical Lead with strong experience in big data testing, the Hadoop ecosystem, and SQL validation. The ideal candidate should have hands-on experience in test planning, execution, and automation in a data warehouse/ETL environment. You'll work closely with cross-functional teams in an Agile environment to ensure the quality and integrity of large-scale data solutions.

Key Responsibilities

Lead end-to-end testing efforts for data/ETL pipelines across big data platforms. Design and implement test strategies for validating large datasets, transformations, and integrations. Perform hands-on testing of Hadoop-based data platforms (HDFS, Hive, Spark, etc.). Develop complex SQL queries for data validation and business-rule testing. Collaborate with developers, product owners, and business analysts in Agile ceremonies. Own test planning, test case design, defect tracking, and reporting for assigned modules. Identify areas of automation and build reusable QA assets. Drive QA best practices and mentor junior QA team members.

Required Skills

7-11 years of experience in software testing, with at least 3+ years in Big Data/Hadoop testing. Strong hands-on experience in testing Hadoop components like HDFS, Hive, Spark, Sqoop, etc. Proficient in SQL (complex joins, aggregations, data validation). Experience in ETL/data warehouse testing. Familiarity with data ingestion, transformation, and validation techniques (a PySpark validation sketch follows this listing).
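
A minimal PySpark sketch of the big-data validation this role leads: recompute an aggregate from the raw layer and diff it against the curated table. The table and column names are invented placeholders.

```python
from pyspark.sql import SparkSession, functions as F

spark = (SparkSession.builder
         .appName("etl-qa-checks")
         .enableHiveSupport()
         .getOrCreate())

# Recompute the expected aggregate straight from the raw layer
src = (spark.table("raw.transactions")
       .groupBy("account_id")
       .agg(F.sum("amount").alias("amount"),
            F.count("*").alias("txn_count")))

tgt = spark.table("curated.account_summary").select("account_id", "amount", "txn_count")

# Rows present on one side only indicate a load or transformation defect
mismatches = src.exceptAll(tgt).count()
assert mismatches == 0, f"{mismatches} aggregate rows differ between raw and curated"
```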

Posted 1 month ago

Apply

7.0 years

0 Lacs

Telangana, India

On-site

ETL QA - Technical Lead

Experience: 7 to 11 years. Job Locations: Gurgaon (1 position).

Job Summary

We are looking for a highly skilled and detail-oriented ETL QA Technical Lead with strong experience in big data testing, the Hadoop ecosystem, and SQL validation. The ideal candidate should have hands-on experience in test planning, execution, and automation in a data warehouse/ETL environment. You'll work closely with cross-functional teams in an Agile environment to ensure the quality and integrity of large-scale data solutions.

Key Responsibilities

Lead end-to-end testing efforts for data/ETL pipelines across big data platforms. Design and implement test strategies for validating large datasets, transformations, and integrations. Perform hands-on testing of Hadoop-based data platforms (HDFS, Hive, Spark, etc.). Develop complex SQL queries for data validation and business-rule testing. Collaborate with developers, product owners, and business analysts in Agile ceremonies. Own test planning, test case design, defect tracking, and reporting for assigned modules. Identify areas of automation and build reusable QA assets. Drive QA best practices and mentor junior QA team members.

Required Skills

7-11 years of experience in software testing, with at least 3+ years in Big Data/Hadoop testing. Strong hands-on experience in testing Hadoop components like HDFS, Hive, Spark, Sqoop, etc. Proficient in SQL (complex joins, aggregations, data validation). Experience in ETL/data warehouse testing. Familiarity with data ingestion, transformation, and validation techniques.

Posted 1 month ago

Apply

3.0 years

0 Lacs

Andhra Pradesh, India

On-site

At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. In SAP technology at PwC, you will specialise in utilising and managing SAP software and solutions within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of SAP products and technologies.

Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities.

Skills

Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include, but are not limited to:

Apply a learning mindset and take ownership for your own development.
Appreciate diverse perspectives, needs, and feelings of others.
Adopt habits to sustain high performance and develop your potential.
Actively listen, ask questions to check understanding, and clearly express ideas.
Seek, reflect, act on, and give feedback.
Gather information from a range of sources to analyse facts and discern patterns.
Commit to understanding how the business works and building commercial awareness.
Learn and apply professional and technical standards (e.g. refer to specific PwC tax and audit guidance), uphold the Firm's code of conduct and independence requirements.

SAP Native HANA Developer – Technical Skills

Bachelor's or Master's degree in a relevant field (e.g., computer science, information systems, engineering).
Minimum of 3 years of experience in HANA native development and configuration, including at least 1 year with SAP BTP Cloud Foundry and HANA Cloud.
Demonstrated experience working with various data sources, both SAP (SAP ECC, SAP CRM, SAP S/4HANA) and non-SAP (Oracle, Salesforce, AWS).
Demonstrated expertise in designing and implementing solutions on the SAP BTP platform.
Solid understanding of BTP HANA Cloud and its service offerings.
Strong focus on building expertise in constructing calculation views within the HANA Cloud environment (BAS) and other supporting data artifacts.
Experience with HANA XS Advanced and HANA 2.0 versions.
Ability to optimize queries and data models for performance in the SAP HANA development environment, with a sound understanding of indexing, partitioning and other performance-optimization techniques.
Proven experience applying SAP HANA Cloud development tools and technologies, including HDI containers, HANA OData services, HANA XSA, strong SQL scripting, SDI/SLT replication, Smart Data Access (SDA) and Cloud Foundry UPS services.
Experience with ETL processes and tools (SAP Data Services preferred).
Ability to debug and optimize existing queries and data models for performance.
Hands-on experience using Git within Business Application Studio and familiarity with GitHub features and repository management.
Familiarity with reporting tools and security-based concepts within the HANA development environment.
Understanding of the HANA Transport Management System, HANA Transport Container and CI/CD practices for object deployment.
Knowledge of monitoring and troubleshooting techniques for SAP HANA BW environments.
Familiarity with reporting tools like SAC/Power BI for building dashboards and consuming data models is a plus.

HANA CDS views (added advantage):

Understanding of associations, aggregations, and annotations in CDS views.
Ability to design and implement data models using CDS.
Certification in SAP HANA or related areas is a plus.
Functional knowledge of SAP business processes (FI/CO, MM, SD, HR).

Posted 1 month ago

Apply

6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Get to know Okta

Okta is The World's Identity Company. We free everyone to safely use any technology, anywhere, on any device or app. Our flexible and neutral products, Okta Platform and Auth0 Platform, provide secure access, authentication, and automation, placing identity at the core of business security and growth. At Okta, we celebrate a variety of perspectives and experiences. We are not looking for someone who checks every single box – we're looking for lifelong learners and people who can make us better with their unique experiences. Join our team! We're building a world where Identity belongs to you.

Sr Analytics Engineer

We are looking for an experienced Analytics Engineer to join Okta's enterprise data team. This analyst will have a strong background in SaaS subscription and product analytics, a passion for providing customer usage insights to internal stakeholders, and experience organizing complex data into consumable data assets. In this role, you will focus on subscription analytics and product utilization insights, and will partner with Product, Engineering, Customer Success, and Pricing to implement enhancements and build end-to-end customer subscription insights into new products.

Requirements

Experience in customer analytics, product analytics, and go-to-market analytics. Experience in SaaS business and the Product domain, as well as Salesforce. Proficiency in SQL, ETL tools, GitHub, and data integration technologies, including familiarity with data modeling techniques, database design, and query optimization. Experience in "data" languages like R and Python; knowledge of data processing frameworks like PySpark is also beneficial. Experience working with cloud-based data solutions like AWS or Google Cloud Platform and cloud-based data warehousing tools like Snowflake. Strong analytical and problem-solving skills to understand complex data problems and provide effective solutions. Experience building reports and visualizations to represent data in Tableau or Looker. Ability to communicate effectively with stakeholders, work cross-functionally, and communicate with technical and non-technical teams. Familiarity with the SCRUM operating model and tracking work via a tool such as Jira. 6+ years in data engineering, data warehousing, or business intelligence. BS in computer science, data science, statistics, mathematics, or a related field.

Responsibilities

Engage with Product and Engineering to implement product definitions into subscription and product analytics, building new insights and updates to existing key data products. Analyze a variety of data sources, structures, and metadata, and develop mapping, transformation rules, aggregations, and ETL specifications. Configure scalable and reliable data pipelines to consume, integrate, and analyze large volumes of complex data from different sources to support the growing needs of subscription and product analytics (a metric sketch follows this listing). Partner with internal stakeholders to understand user needs, implement user feedback, and develop reporting and dashboards focused on subscription analytics. Work closely with other Analytics team members to optimize data self-service, reusability, and performance, and ensure validity of the source of truth. Enhance reusable knowledge of the models and metrics through documentation and use of the data catalog. Ensure data security and compliance by implementing appropriate data access controls, encryption, and auditing mechanisms. Take ownership of successful completion of project activities.

Nice to Have

Experience in data science and AI/ML concepts and techniques.

What you can look forward to as a Full-Time Okta employee!

Amazing Benefits. Making Social Impact. Developing Talent and Fostering Connection + Community at Okta.

Okta cultivates a dynamic work environment, providing the best tools, technology and benefits to empower our employees to work productively in a setting that best and uniquely suits their needs. Each organization is unique in the degree of flexibility and mobility in which they work so that all employees are enabled to be their most creative and successful versions of themselves, regardless of where they live. Find your place at Okta today! https://www.okta.com/company/careers/. Some roles may require travel to one of our office locations for in-person onboarding.

Okta is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, ancestry, marital status, age, physical or mental disability, or status as a protected veteran. We also consider for employment qualified applicants with arrest and conviction records, consistent with applicable laws. If reasonable accommodation is needed to complete any part of the job application, interview process, or onboarding, please use this Form to request an accommodation. Okta is committed to complying with applicable data privacy and security laws and regulations. For more information, please see our Privacy Policy at https://www.okta.com/privacy-policy/.
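
The subscription-utilization metrics this role owns usually reduce to SQL of roughly this shape: seats sold versus monthly active users per customer. The schema here (subscriptions, product_usage_events) is entirely invented for illustration, not Okta's actual model.

```python
# Illustrative metric definition; in practice this would run via the warehouse
# client (e.g. a Snowflake cursor) and feed a Tableau/Looker dashboard.
UTILIZATION_SQL = """
SELECT
    s.customer_id,
    s.product_sku,
    s.seats_purchased,
    COUNT(DISTINCT u.user_id) AS active_users,
    COUNT(DISTINCT u.user_id) / NULLIF(s.seats_purchased, 0) AS utilization_rate
FROM subscriptions s
LEFT JOIN product_usage_events u
       ON u.customer_id = s.customer_id
      AND u.event_month = DATE_TRUNC('month', CURRENT_DATE)
GROUP BY s.customer_id, s.product_sku, s.seats_purchased
ORDER BY utilization_rate
"""
```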

Posted 1 month ago

Apply

3.0 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site

About QualityKiosk Technologies

QualityKiosk Technologies is one of the world's largest independent Quality Engineering (QE) providers and digital transformation enablers, helping companies build and manage applications for optimal performance and user experience. Founded in 2000, the company specializes in providing quality engineering, QA automation, performance assurance, intelligent automation (IA) and robotic process automation (RPA), customer experience management, site reliability engineering (SRE), digital testing as a service (DTaaS), cloud, and data analytics solutions and services. With operations spread across 25+ countries and a workforce of more than 4,000 employees, the organization enables some of the leading banking, e-commerce, automotive, telecom, insurance, OTT, entertainment, pharmaceuticals, and BFSI brands to achieve their business transformation goals. QualityKiosk Technologies has been featured in renowned global advisory firms' reports, including Forrester, Gartner, The Everest Group, and Hurun Report, for its innovative, IP-led quality assurance solutions and the positive impact it has created for its clients in the fast-changing digital landscape. QualityKiosk, which offers automated quality assurance solutions for clients across geographies and verticals, counts 50 of the Indian Fortune 100 companies and 18 of the global Fortune 500 companies as its key clients. The company is banking on its speed of execution and technology advancement as key factors to drive a 5X growth in the next five years, both in revenues and number of employees.

Key Responsibilities:

- Design, implement, and optimize Elasticsearch clusters and associated applications.
- Develop and maintain scalable and fault-tolerant search architectures to support large-scale data.
- Troubleshoot performance and reliability issues within Elasticsearch environments.
- Integrate Elasticsearch with other tools like Logstash, Kibana, Beats, etc.
- Implement search features such as auto-complete, aggregations, fuzziness, and advanced search functionalities (a query sketch follows this listing).
- Manage Elasticsearch data pipelines and work on data ingest, indexing, and transformation.
- Monitor, optimize, and ensure the health of Elasticsearch clusters and associated services.
- Conduct capacity planning and scalability testing for search infrastructure.
- Ensure high availability and disaster recovery strategies for Elasticsearch clusters.
- Collaborate with software engineers, data engineers, and DevOps teams to ensure smooth deployment and integration.
- Document solutions, configurations, and best practices.
- Stay updated on new features and functionalities of the Elastic Stack and apply them to enhance existing systems.
- Mentor freshers and junior team members to enable them to take up responsibilities.

Required Skills & Qualifications:

- Experience: 3 years of hands-on experience with Elasticsearch and its ecosystem (Elasticsearch, Kibana, Logstash, Fleet Server, Elastic Agents, Beats).
- Core Technologies: Strong experience with Elasticsearch, including cluster setup, configuration, and optimization.
- Search Architecture: Experience designing and maintaining scalable search architectures and handling large datasets.
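
A quick sketch of two of the search features listed above (fuzziness and auto-complete), assuming the elasticsearch-py 8.x client. The index and field names are placeholders, and the completion suggester presupposes a field mapped with type "completion".

```python
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # placeholder endpoint

# Fuzziness: tolerate small typos in the user's query
hits = es.search(
    index="products",
    query={"match": {"name": {"query": "blutooth speker", "fuzziness": "AUTO"}}},
)
print(hits["hits"]["total"])

# Auto-complete: a completion suggester against a completion-mapped field
suggest = es.search(
    index="products",
    suggest={"name_ac": {"prefix": "blu",
                         "completion": {"field": "name_suggest"}}},
)
print(suggest["suggest"]["name_ac"][0]["options"])
```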

Posted 1 month ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Company Description

MH Cockpit is a premium training academy founded by professionals from the aviation industry. We provide specialised courses in air hostess and ground staff training to help individuals prepare for careers in aviation. MH Cockpit is honoured to teach, motivate, and inspire individuals who wish to make successful careers in the aviation sector. Our commitment to excellence ensures that our students receive the highest quality training.

Role Description

This is a part-time, on-site role located in Chennai for a Microsoft Power BI Trainer. The Power BI Trainer will be responsible for designing and delivering training programs on Microsoft Power BI, including creating curriculum, teaching, and coaching students. Additional responsibilities include monitoring student progress, providing guidance, and adapting training materials to meet student needs. The role may also involve coordinating with the sales team to promote training programs.

Skills: DAX, Data Modelling, Power Query, ETL Tools, Snowflake, Data Lakes, SQL, Data Warehousing.

Qualifications And Skills

Extensive experience in data modelling techniques and practices, crucial for crafting scalable and optimised data models. Proficiency in Power BI development and administration to support robust reporting and analytics. Strong command of DAX (mandatory skill) for developing intricate calculations and aggregations within Power BI solutions. Advanced expertise in handling Snowflake (mandatory skill) for adept cloud-based data warehousing capabilities. Knowledge of data lakes (mandatory skill) for managing large volumes of structured and unstructured data efficiently. Solid understanding of ETL tools, necessary for the efficient extraction, transformation, and loading of data. Exceptional SQL skills to design, query, and manage complex databases for data-driven decisions. Experience in data warehousing concepts and architectures to support structured and systematic data storage.

Roles And Responsibilities

Architect and implement cutting-edge Power BI solutions that transform business requirements into insightful dashboards and reports. Collaborate with cross-functional teams to gather and analyse data requirements, ensuring alignment with business objectives. Design and optimise data models using advanced techniques for improved performance and scalability in Power BI. Leverage DAX to create complex calculations and custom metrics, enhancing the depth and quality of analytical outputs. Utilise Snowflake to manage and optimise cloud-based data warehousing solutions for seamless data integration. Implement and administer data lakes to efficiently handle and store large datasets, both structured and unstructured. Ensure data accuracy, validity, and security through meticulous data quality checks and validation processes. Stay updated with the latest industry trends and best practices to continuously improve BI solutions and strategies.

Desired Skills and Experience: DAX, Data Modelling, Power Query, ETL Tools, Snowflake, Data Lakes, SQL, Data Warehousing.

Package: open to negotiation.

Posted 1 month ago

Apply

7.0 years

0 Lacs

Telangana

On-site

ETL QA – Technical Lead

Experience: 7 to 11 years. Job Locations: Hyderabad (1 position) | Gurgaon (1 position).

Job Summary

We are looking for a highly skilled and detail-oriented ETL QA Technical Lead with strong experience in big data testing, the Hadoop ecosystem, and SQL validation. The ideal candidate should have hands-on experience in test planning, execution, and automation in a data warehouse/ETL environment. You'll work closely with cross-functional teams in an Agile environment to ensure the quality and integrity of large-scale data solutions.

Key Responsibilities

Lead end-to-end testing efforts for data/ETL pipelines across big data platforms. Design and implement test strategies for validating large datasets, transformations, and integrations. Perform hands-on testing of Hadoop-based data platforms (HDFS, Hive, Spark, etc.). Develop complex SQL queries for data validation and business-rule testing. Collaborate with developers, product owners, and business analysts in Agile ceremonies. Own test planning, test case design, defect tracking, and reporting for assigned modules. Identify areas of automation and build reusable QA assets. Drive QA best practices and mentor junior QA team members.

Required Skills

7–11 years of experience in software testing, with at least 3+ years in Big Data/Hadoop testing. Strong hands-on experience in testing Hadoop components like HDFS, Hive, Spark, Sqoop, etc. Proficient in SQL (complex joins, aggregations, data validation). Experience in ETL/data warehouse testing. Familiarity with data ingestion, transformation, and validation techniques.

Posted 1 month ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

TCS presents an excellent opportunity for Data Engineers

Job Location: Pan India
Experience required: 10-12 Yrs
Skills: GCP + BigQuery + DBT (Data Build Tool) + Airflow + Cloud Composer + Data Modeler

Detailed Job Description
- Google Cloud data engineering: BigQuery, Dataflow, Cloud Composer, Cloud Pub/Sub.
- Designing, building, and maintaining scalable data pipelines and architectures on Google Cloud Platform (a sketch follows this list).
- SQL, database/schema design, ETL; write queries to extract and transform data from multiple data sources.
- Write complex SQL/BigQuery/DBT queries for analysis and reporting.
- ETL: develop ETL solutions on Google Cloud Platform, develop the data model, and move data from various sources into the data warehouse (BigQuery).
- Data analytics platform: extract actionable insights from large, complex datasets and build data products such as dashboards to operationalize them, driving measurable improvements in Key Performance Indicators (KPIs).
- Build, test, and maintain database pipeline architectures; work with metadata across various data sources; query and display large data sets.
- Pseudocode interpretation: translate pseudocode into structured SQL queries; verify table relationships, filters, and aggregations.
- Own the process of gathering, extracting, and compiling data across sources.
- Trends and patterns: pre-process and cleanse structured and unstructured data; analyse large amounts of data to find patterns and develop solutions.
- AI & ML: design, develop, and deploy machine learning models; build data insights in the cybersecurity domain using Vertex AI.
- Automation of workflows and of data flow to and from various data sources; machine learning algorithms (decision trees, random forests) and using these algorithms to build predictive models.
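Since this role centres on pipeline orchestration with Airflow/Cloud Composer, here is a minimal, hedged sketch of an Airflow DAG, assuming Airflow 2.x; the dag_id, task, and table name are hypothetical. A real Composer deployment would likely use the Google provider's BigQuery operators rather than a plain PythonOperator.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def validate_load(**context):
    # Placeholder for a BigQuery row-count or freshness check, e.g. via
    # the google-cloud-bigquery client; the table name is hypothetical.
    print("validating analytics.daily_orders")

with DAG(
    dag_id="daily_orders_pipeline",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+ keyword; older versions use schedule_interval
    catchup=False,
) as dag:
    PythonOperator(task_id="validate_load", python_callable=validate_load)
```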

Posted 1 month ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Senior QA Engineer

Job Summary
We are seeking a Senior QA Engineer to join our Data Engineering team, responsible for ensuring the integrity, quality, and reliability of our enterprise-scale data pipelines and analytics solutions. You will play a critical role in shaping our data quality strategy, building automated testing frameworks, and leading QA efforts across complex data initiatives. This role requires deep experience with testing large-scale data processing systems, excellent SQL skills, and a proactive mindset toward identifying data quality issues before they impact the business.

Key Responsibilities
- Define and lead the end-to-end QA strategy for data engineering initiatives, ensuring high data accuracy, completeness, and reliability.
- Develop and maintain automated data validation frameworks to test batch and streaming data pipelines.
- Conduct in-depth testing of ETL/ELT processes, transformations, and data integration logic using SQL and scripting languages.
- Build reusable data quality rules and monitoring dashboards using tools like Great Expectations or similar (see the sketch at the end of this posting).
- Collaborate with cross-functional teams including data engineers, data scientists, DevOps, and business stakeholders to validate business logic and data lineage.
- Identify gaps in data QA practices and drive improvements in processes, tooling, and documentation.
- Own and enhance test suites integrated into CI/CD pipelines for data products.
- Provide mentorship to junior QA engineers and contribute to best practice sharing across teams.
- Participate in code reviews and offer recommendations on data testing approaches and design.

Required Qualifications
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
- 8+ years of experience in data-centric QA roles, with a focus on large-scale data platforms and pipelines.
- Expert-level SQL skills; capable of writing complex joins, aggregations, and validation logic.
- Strong hands-on experience with data integration tools like Talend, Informatica, Apache Airflow, or DBT.
- Proficiency in Python or other scripting languages used for data validation and test automation.
- Familiarity with modern cloud data platforms (e.g., Snowflake, Redshift, Databricks, BigQuery).
- Experience with CI/CD tools and integration of automated data tests into deployment workflows.
- Proven track record of leading QA efforts on cross-functional data initiatives.

Preferred Qualifications
- Experience with data observability and quality tools (e.g., Great Expectations, Monte Carlo, Soda).
- Exposure to big data ecosystems (Spark, Kafka, Hive, etc.).
- Knowledge of data governance, lineage, and cataloging tools (e.g., Collibra, Alation).
- ISTQB, DataOps, or similar QA certifications.
- Background in testing BI/reporting layers or semantic models (e.g., Tableau, Looker, Power BI).

Soft Skills
- Strong leadership and ownership mindset.
- Ability to communicate complex data quality issues to both technical and non-technical audiences.
- Comfortable working in a fast-paced, agile environment with evolving priorities.
- A passion for data excellence and continuous quality improvement.

About Convera
Convera is the largest non-bank B2B cross-border payments company in the world. Formerly Western Union Business Solutions, we leverage decades of industry expertise and technology-led payment solutions to deliver smarter money movements to our customers – helping them capture more value with every transaction.
Convera serves more than 30,000 customers ranging from small business owners to enterprise treasurers, educational institutions, financial institutions, law firms, and NGOs. Our teams care deeply about the value we bring to our customers, which makes Convera a rewarding place to work. This is an exciting time for our organization as we build our team with growth-minded, results-oriented people who are looking to move fast in an innovative environment.

As a truly global company with employees in over 20 countries, we are passionate about diversity; we seek and celebrate people from different backgrounds, lifestyles, and unique points of view. We want to work with the best people and ensure we foster a culture of inclusion and belonging.

We offer an abundance of competitive perks and benefits, including:
- Competitive salary
- Opportunity to earn an annual bonus
- Great career growth and development opportunities in a global organization
- A flexible approach to work

There are plenty of amazing opportunities at Convera for talented, creative problem solvers who never settle for good enough and are looking to transform Business to Business payments. Apply now if you're ready to unleash your potential.
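The "reusable data quality rules" responsibility above can be illustrated with a small plain-Python sketch; tools such as Great Expectations package the same idea with richer reporting and scheduling. The dataset and rules below are hypothetical examples, not Convera's actual checks.

```python
import pandas as pd

# Toy batch standing in for one pipeline output.
df = pd.DataFrame({
    "order_id": [1, 2, 3],
    "amount":   [10.5, 20.0, 15.25],
    "country":  ["IN", "US", "IN"],
})

# Reusable rule set: each rule returns True when the batch passes.
rules = {
    "order_id is unique":      lambda d: d["order_id"].is_unique,
    "amount is non-negative":  lambda d: (d["amount"] >= 0).all(),
    "country in allowed set":  lambda d: d["country"].isin({"IN", "US", "UK"}).all(),
    "no nulls in key columns": lambda d: d[["order_id", "amount"]].notna().all().all(),
}

failures = [name for name, check in rules.items() if not check(df)]
assert not failures, f"data quality failures: {failures}"
print("all checks passed")
```

In a CI/CD pipeline, the same rule dictionary can be run against every batch and the failure list published to a monitoring dashboard, which is the pattern the posting describes.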

Posted 1 month ago

Apply

0 years

0 Lacs

India

Remote

Job Details
Role: Solution Architect
Employment Type: FTE with Vdart Digital
Work Location: Remote

Job Description: Solution Architect - Analytics (Snowflake)

Role Summary: We are seeking a Solution Architect with a strong background in application modernization and enterprise data platform migrations.

Key Responsibilities:
- Provide solution architecture leadership and strategy development.
- Engage with BAs to translate functional and non-functional requirements into solution designs.
- Lead the overall design, including detailed configuration.
- Collaborate with Enterprise Architecture.
- Conduct thorough reviews of code and BRDs to ensure alignment with architectural standards, business needs, and technical feasibility.
- Evaluate system designs, ensuring scalability, security, and performance while adhering to best practices and organizational guidelines.
- Troubleshoot and resolve technical challenges encountered during coding, integration, and testing phases to maintain project timelines and quality.
- Strong expertise in data warehousing and data modelling.
- Excellent communication, collaboration, and presentation skills.
- Experience with ETL/ELT tools and processes, building complex pipelines and data ingestion.

SQL skill set needed: advanced SQL and complex joins; subqueries (correlated and non-correlated); CTEs; window functions; aggregations (GROUP BY, ROLLUP, CUBE, PIVOT). A sketch of these patterns follows the posting.

Snowflake skill set needed:
- Ability to understand and write UDFs and stored procedures in Snowflake.
- Good understanding of Snowflake architecture: clustering, micro-partitions, caching, virtual warehouses, stages, storage, and security (row- and column-level security).
- Knowledge of Snowflake features (Streams, Time Travel, zero-copy cloning, Snowpark, and Tasks).
- Provide expert recommendations on frameworks, tools, and methodologies to optimize development efficiency and system robustness.
- Performance tuning within Snowflake (performance bottlenecks, materialized views, search optimization).
- Solution design: ability to architect scalable, cost-effective Snowflake solutions.
- Cost management: monitor and optimize Snowflake credit usage and storage costs.
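As a rough illustration of the SQL skill set above (a CTE feeding a window function), here is a runnable sketch that uses SQLite as a lightweight stand-in for Snowflake; ROLLUP/CUBE/PIVOT and Snowflake-specific features such as Streams and Tasks are omitted, and the table is hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, month TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('North', '2024-01', 100), ('North', '2024-02', 120),
        ('South', '2024-01', 90),  ('South', '2024-02', 80);
""")

# CTE plus a window function: a running total per region, the kind of
# pattern the posting's SQL skill set (CTEs, window functions) describes.
query = """
WITH monthly AS (
    SELECT region, month, SUM(amount) AS amount
    FROM sales
    GROUP BY region, month
)
SELECT region, month, amount,
       SUM(amount) OVER (PARTITION BY region ORDER BY month) AS running_total
FROM monthly
ORDER BY region, month;
"""
for row in conn.execute(query):
    print(row)
```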

Posted 1 month ago

Apply

7.0 years

0 Lacs

Haryana, India

On-site

ETL QA - Technical Lead

Experience: 7 to 11 Years
Job Locations: Hyderabad (1 position) | Gurgaon (1 position)

Job Summary: We are looking for a highly skilled and detail-oriented ETL QA - Technical Lead with strong experience in Big Data Testing, the Hadoop ecosystem, and SQL validation. The ideal candidate should have hands-on experience in test planning, execution, and automation in a data warehouse/ETL environment. You'll work closely with cross-functional teams in an Agile environment to ensure the quality and integrity of large-scale data solutions.

Key Responsibilities:
- Lead end-to-end testing efforts for data/ETL pipelines across big data platforms
- Design and implement test strategies for validating large datasets, transformations, and integrations
- Perform hands-on testing of Hadoop-based data platforms (HDFS, Hive, Spark, etc.); a PySpark sketch follows this posting
- Develop complex SQL queries for data validation and business rule testing
- Collaborate with developers, product owners, and business analysts in Agile ceremonies
- Own test planning, test case design, defect tracking, and reporting for assigned modules
- Identify areas of automation and build reusable QA assets
- Drive QA best practices and mentor junior QA team members

Required Skills:
- 7-11 years of experience in Software Testing, with at least 3 years in Big Data/Hadoop testing
- Strong hands-on experience in testing Hadoop components like HDFS, Hive, Spark, Sqoop, etc.
- Proficient in SQL (complex joins, aggregations, data validation)
- Experience in ETL/Data Warehouse testing
- Familiarity with data ingestion, transformation, and validation techniques
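For the Hadoop/Spark testing skills above, a typical hands-on check compares a source dataset with its transformed target inside Spark itself. The sketch below assumes a local PySpark environment; the data and checks are illustrative only, not this employer's framework.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Minimal Spark-side validation; in practice the DataFrames would come
# from Hive tables or HDFS files rather than inline test data.
spark = SparkSession.builder.appName("etl-qa-checks").getOrCreate()

source = spark.createDataFrame([(1, 10.0), (2, 20.0)], ["id", "amount"])
target = spark.createDataFrame([(1, 10.0), (2, 20.0)], ["id", "amount"])

# Completeness: rows present in source but missing from target.
missing = source.join(target, on="id", how="left_anti")

# Aggregate parity on a business measure.
src_total = source.agg(F.sum("amount")).first()[0]
tgt_total = target.agg(F.sum("amount")).first()[0]

assert missing.count() == 0, "target is missing source rows"
assert src_total == tgt_total, f"totals differ: {src_total} vs {tgt_total}"
print("big data validation passed")
```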

Posted 1 month ago

Apply

4.0 - 7.0 years

3 - 6 Lacs

Hyderābād

On-site

Key areas for recruitment, performance management, and development related activities:

- Job Title: Business Analyst
- Functional Job Title: Business Analyst
- Department: Hyderabad Corporate
- Reports to: CFO
- Scope: PAN-India market research; supporting the Sales & Marketing, Plant, and Procurement teams; creating detailed business analysis outlining problems, opportunities, and solutions for the business; budgeting and forecasting; planning and monitoring; variance analysis; pricing and reporting.

Job Summary (Why does this job exist?)
Creating detailed business analysis, outlining problems, opportunities, and solutions for the business. Budgeting and forecasting. Planning and monitoring. Variance analysis. Pricing and reporting.

Minimum Qualification: B.Tech (Mechanical), preferably with an MBA in Finance from a good business school. Certification in Data Science/Business Analysis is preferred.

Experience: 4-7 years of relevant experience (preferably in the building materials industry).

Compensation: As per industry norms.

Minimum Competencies (Knowledge and Skills)
- Sound business acumen (market understanding/knowledge).
- Technical capability (analytical/software tools).
- Excellent knowledge of statistical packages (SPSS, SAS, or similar), databases, and MS Office (including excellent hands-on knowledge of and experience with Excel/spreadsheets).
- Excellent understanding of search engines, web analytics, and business research tools.
- Knowledge of the Python (or R) programming language (preferred).
- Ability to interpret and convert large amounts of data into meaningful analysis accurately.
- Conducting statistical analysis and presenting results using modern data visualization techniques.
- Developing insightful and interactive business intelligence reports and dashboards.
- Good interpersonal skills.
- Strong communication and presentation skills.
- Adherence to ethical conduct, personal effectiveness, and credibility.

Behavioral Competencies
- Ability to prioritize, multi-task, and deliver against tight deadlines.
- Ability to simplify complex information into a user-friendly format.
- Analytical thinker with strong theoretical and research proficiencies.
- Solid organizational skills and attention to detail.

Primary Roles
- Data collection and consolidation of information.
- Compiling and analyzing statistical data.
- Monitoring and forecasting marketing and product trends; good hands-on experience in trend analysis.
- Conversion of complex data and findings into understandable tables, graphs, and written reports.
- Preparation of reports for presentation to management.
- Experience in data acquisition and in performing data transformations and aggregations using SQL and Python (a sketch follows this posting).
- Expertise in performing in-depth data analysis using Microsoft Excel and its advanced functions.

The primary responsibility of the Business Analyst is to communicate with all stakeholders and to elicit, analyze, and validate the requirements for changes to business processes, information systems, and policies. He/she plays a big role in moving an organization toward efficiency, productivity, and profitability. Collect data on consumers, competitors, and the marketplace, and consolidate the information into actionable items, reports, and presentations. Understand business objectives and forecast probable market trends.

Primary Responsibilities / Accountabilities (What to do in this job?)
- Compile and analyze statistical data using modern and traditional methods.
- Perform valid and reliable market research and SWOT analysis.
- Interpret data, formulate reports, and make recommendations.
- Use online market research and contribute inputs to company databases.
- Provide competitive analysis of various companies' market offerings; identify market trends, pricing/business models, production, sales, and methods of operation.
- Evaluate program methodology and key data to ensure that data on releases are accurate.
- Remain fully informed on market trends and other parties' research, and implement best practices.
- Experience providing ad-hoc reports to answer specific business questions from business leaders.
- Experience conducting and delivering experiments and proofs of concept to validate business ideas and their potential value.
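To make the Python aggregation and variance-analysis duties concrete, here is a small, hedged pandas sketch of a budget-versus-actual variance report with a simple trend signal; all figures and column names are invented for illustration.

```python
import pandas as pd

# Hypothetical monthly budget-vs-actual figures for a variance analysis.
data = pd.DataFrame({
    "month":  ["Jan", "Feb", "Mar"],
    "budget": [100_000, 110_000, 120_000],
    "actual": [ 95_000, 118_000, 121_500],
})

# Absolute and percentage variance, the core of the report.
data["variance"] = data["actual"] - data["budget"]
data["variance_pct"] = (data["variance"] / data["budget"] * 100).round(1)

# Simple trend signal: rolling average of actuals.
data["actual_3m_avg"] = data["actual"].rolling(3, min_periods=1).mean().round(0)

print(data.to_string(index=False))
```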

Posted 1 month ago

Apply

0 years

3 - 7 Lacs

Chennai

On-site

The ABAP developer will coordinate the planning, design, development, testing, and implementation of SAP programs across an enterprise-wide SAP system instance. This person must be able to work with a small team of SAP professionals, be a proactive member of project development teams, and support specific best practices related to BorgWarner's SAP template. Development objects will include reports, interfaces, conversions, and enhancements. The ABAP developer must be able to brief project management on the status of development and any associated risk.

Key Roles & Responsibilities
- In-depth knowledge of general ABAP programming techniques (RICEFW - Report, Interface, Conversion, Enhancement, Form and Workflow), including programming Function Modules, Object-Oriented ABAP, User Exits, Enhancement Spots (implicit and explicit), and Dialog Programming.
- Expert in data conversion using LSMW (BDC, BAPI, IDoc, Direct/Batch Input) and in developing standalone programs with correct techniques for data conversions.
- Intensive knowledge of RF development using the RF framework supported by MDE configuration.
- Module pool programming experience using custom controls (ALV/Tree/Image), OLE embedding, etc.
- Expertise in report programming: ALV, classical, drill-down, and interactive reports using ALV events.
- Proficient in developing ABAP queries and QuickViewer queries.
- Experience in code optimization, performance tuning, and runtime analysis.
- Expertise in using Code Inspector, Single Transaction Analysis, SQL Performance Trace, and Runtime Analysis tools.
- Good knowledge of and hands-on experience with SAP interfacing technologies (ALE/IDocs, RFCs, BAPIs, OData, flat-file interfaces); a sketch of an RFC call follows this posting.
- Experience with SAP Scripts, SAP Smart Forms, and Adobe Forms.
- Knowledge of label design using NiceLabel/BarTender is preferred.
- Knowledge of the usage of RF guns and Zebra label printers.
- Experience with SPAU/SPDD activities for system upgrades.
- Ability to help resolve complex technical issues and independently manage critical/complex situations.
- Perform break/fix analysis and recommend solutions.
- Estimate development costs for associated programs.
- Create technical design specifications to ensure compliance with the functional teams and IT management.
- Advise on new technologies and keep abreast of SAP releases, enhancements, and new functionality.
- Ensure compliance with BorgWarner policies and design standards on implementation projects.

Nice-to-have skills (preferred skill set), including but not limited to:
- ABAP on HANA, HANA modelling, OO ABAP, Gateway for OData service building, XML, REST web services.
- Experience with SAP Fiori, Cloud Platform, OData technologies, HANA DB, and SAPUI5, and with implementing and extending standard SAP Fiori apps, is a plus.
- Expertise in native HANA development and ABAP CDS views; experience in creating complex HANA views with aggregations and joins; AMDP and code push-down techniques are a plus.
- Expertise in designing and modeling OData services using the Gateway Service Builder.
- Experience in using SolMan 7.2 with ChaRM and SolDoc.

Global Terms of Use and Privacy Statement
Carefully read the BorgWarner Privacy Policy before using this website. Your ability to access and use this website and apply for a job at BorgWarner are conditioned on your acceptance of and compliance with these terms. Please access the linked document by clicking here, select the geographical area where you are applying for employment, and review. Before submitting your application you will be asked to confirm your agreement with the terms.

Career Scam Disclaimer: BorgWarner makes no representations or guarantees regarding employment opportunities listed on any third-party website. To protect against career scams, job applicants should take the necessary precautions when interviewing for and accepting employment positions allegedly offered by BorgWarner. Applicants should never provide their national ID numbers, birth dates, credit card numbers, bank account information or other private information when communicating with prospective employers or responding to employment opportunities online. Job applicants are invited to contact BorgWarner through BorgWarner's website to verify the authenticity of any employment opportunities.
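On the SAP interfacing side (RFCs/BAPIs), the sketch below shows one way to invoke a function module over RFC from Python using the open-source pyrfc library, assuming the SAP NetWeaver RFC SDK is installed. The host, credentials, and client are placeholders, not details of BorgWarner's landscape.

```python
from pyrfc import Connection

# Placeholder connection parameters for a hypothetical SAP system.
conn = Connection(
    ashost="sap-host.example.com",  # hypothetical application server
    sysnr="00",
    client="100",
    user="RFC_USER",
    passwd="secret",
)

# STFC_CONNECTION is SAP's standard RFC connectivity-test function module;
# it echoes the request text back, confirming the RFC round trip works.
result = conn.call("STFC_CONNECTION", REQUTEXT="ping")
print(result["ECHOTEXT"], result["RESPTEXT"])

conn.close()
```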

Posted 1 month ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Where Data Does More. Join the Snowflake team.

Snowflake's Support team is expanding! We are looking for a Senior Cloud Support Engineer who likes working with data and solving a wide variety of issues, drawing on technical experience across operating systems, database technologies, big data, data integration, connectors, and networking.

Snowflake Support is committed to providing high-quality resolutions to help deliver data-driven business insights and results. We are a team of subject matter experts collectively working toward our customers' success. We form partnerships with customers by listening, learning, and building connections. Snowflake's values are key to our approach and success in delivering world-class Support. Putting customers first, acting with integrity, owning initiative and accountability, and getting it done are Snowflake's core values, which are reflected in everything we do.

As a Senior Cloud Support Engineer, your role is to delight our customers with your passion and knowledge of Snowflake Data Warehouse. Customers will look to you for technical guidance and expert advice with regard to their effective and optimal use of Snowflake. You will be the voice of the customer regarding product feedback and improvements for Snowflake's product and engineering teams. You will play an integral role in building knowledge within the team and be part of strategic initiatives for organizational and process improvements.

Based on business needs, you may be assigned to work with one or more Snowflake Priority Support customers. You will develop a strong understanding of the customer's use case and how they leverage the Snowflake platform. You will deliver exceptional service, enabling them to achieve the highest levels of continuity and performance from their Snowflake implementation.

Ideally, you have worked in a 24x7 environment, handled technical case escalations and incident management, worked in technical support for an RDBMS, been on-call during weekends, and are familiar with database release management.

AS A SENIOR CLOUD SUPPORT ENGINEER AT SNOWFLAKE, YOU WILL:
- Drive technical solutions to complex problems, providing in-depth analysis and guidance to Snowflake customers and partners via email, web, and phone
- Adhere to response and resolution SLAs and escalation processes to ensure fast resolution of customer issues that exceeds expectations
- Demonstrate good problem-solving skills and be process-oriented
- Utilize the Snowflake environment, connectors, third-party partner software, and tools to investigate issues
- Document known solutions to the internal and external knowledge base
- Report well-documented bugs and feature requests arising from customer-submitted requests
- Partner with engineering teams in prioritizing and resolving customer requests
- Participate in a variety of Support initiatives
- Provide support coverage during holidays and weekends based on business needs

OUR IDEAL SENIOR CLOUD SUPPORT ENGINEER WILL HAVE:
- Bachelor's or Master's degree in Computer Science or an equivalent discipline
- 5+ years of experience in a Technical Support environment or a similar technical function in a customer-facing role
- Solid knowledge of at least one major RDBMS
- In-depth understanding of SQL data types, aggregations, and advanced functions, including analytical/window functions
- A deep understanding of resource locks and experience with managing concurrent transactions
- Proven experience with the query lifecycle, profiles, and execution/explain plans (see the sketch after this posting)
- Demonstrated ability to analyze and tune query performance and provide detailed recommendations for performance improvement
- Advanced skills in interpreting SQL queries and execution workflow logic
- Proven ability to rewrite joins for optimization while maintaining logical consistency
- In-depth knowledge of various caching mechanisms and the ability to take advantage of caching strategies to enhance performance
- Ability to interpret systems performance metrics (CPU, I/O, RAM, network stats)
- Proficiency with JSON, XML, and other semi-structured data formats
- Proficiency in database patch and release management

NICE TO HAVES:
- Knowledge of distributed computing principles and frameworks (e.g., Hadoop, Spark)
- Scripting/coding experience in any programming language
- Database migration and ETL experience
- Ability to monitor and optimize cloud spending using cost management tools and strategies

SPECIAL REQUIREMENTS:
- Participate in pager duty rotations during nights, weekends, and holidays
- Ability to work the 4th/night shift, which typically starts at 10 pm IST
- Applicants should be flexible with schedule changes to meet business needs

Snowflake is growing fast, and we're scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake. How do you want to make your impact?

For jobs located in the United States, please visit the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com
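For the query-lifecycle and execution-plan skills referenced above, here is a hedged sketch using the Snowflake Python connector; the account, credentials, warehouse, and table are placeholders. EXPLAIN USING TEXT and the INFORMATION_SCHEMA.QUERY_HISTORY table function are standard Snowflake features of the kind a support engineer would lean on when tuning a customer query.

```python
import snowflake.connector

# Placeholder credentials for a hypothetical Snowflake account.
conn = snowflake.connector.connect(
    account="myorg-myaccount",
    user="SUPPORT_USER",
    password="secret",
    warehouse="ANALYTICS_WH",
)
cur = conn.cursor()

# Text execution plan for a candidate query under tuning; the table
# name is hypothetical.
for row in cur.execute("EXPLAIN USING TEXT SELECT * FROM demo_db.public.orders"):
    print(row[0])

# Recent query runtimes, useful when hunting a performance regression.
cur.execute("""
    SELECT query_id, total_elapsed_time
    FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY(RESULT_LIMIT => 10))
""")
for qid, elapsed in cur.fetchall():
    print(qid, elapsed)

conn.close()
```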

Posted 1 month ago

Apply