7.0 - 10.0 years
25 - 32 Lacs
bengaluru
Work from Office
Position Overview: We seek a highly skilled and experienced Data Engineering Lead to join our team. This role demands deep technical expertise in Apache Spark, Hive, Trino (formerly Presto), Python, AWS Glue, and the broader AWS ecosystem. The ideal candidate will possess strong hands-on skills and the ability to design and implement scalable data solutions, optimize performance, and lead a high-performing team to deliver data-driven insights.

Key Responsibilities:

Technical Leadership
- Lead and mentor a team of data engineers, fostering best practices in coding, design, and delivery.
- Drive the adoption of modern data engineering frameworks, tools, and methodologies to ensure high-quality and scalable solutions.
- Translate complex business requirements into effective data pipelines, architectures, and workflows.

Data Pipeline Development
- Architect, develop, and optimize scalable ETL/ELT pipelines using Apache Spark, Hive, AWS Glue, and Trino.
- Handle complex data workflows across structured and unstructured data sources, ensuring performance and cost-efficiency.
- Develop real-time and batch processing systems to support business intelligence, analytics, and machine learning applications.

Cloud & Infrastructure Management
- Build and maintain cloud-based data solutions using AWS services such as S3, Athena, Redshift, EMR, DynamoDB, and Lambda.
- Design and implement federated query capabilities using Trino for diverse data sources.
- Manage the Hive Metastore for schema and metadata management in data lakes.

Performance Optimization
- Optimize Apache Spark jobs and Hive queries for performance, ensuring efficient resource utilization and minimal latency.
- Implement caching and indexing strategies to accelerate query performance in Trino.
- Continuously monitor and improve system performance through diagnostics and tuning.

Collaboration & Stakeholder Engagement
- Work closely with data scientists, analysts, and business teams to understand requirements and deliver actionable insights.
- Ensure that data infrastructure aligns with organizational goals and compliance standards.

Data Governance & Quality
- Establish and enforce data quality standards, governance practices, and monitoring processes.
- Ensure data security, privacy, and compliance with regulatory frameworks.

Innovation & Continuous Learning
- Stay ahead of industry trends, emerging technologies, and best practices in data engineering.
- Proactively identify and implement improvements in data architecture and processes.

Qualifications:

Required Technical Expertise
- Advanced proficiency with Apache Spark (core, SQL, streaming) for large-scale data processing.
- Strong expertise in Hive for querying and managing structured data in data lakes.
- In-depth knowledge of Trino (formerly Presto) for federated querying and high-performance SQL execution.
- Solid programming skills in Python with libraries such as PySpark and Pandas.
- Hands-on experience with AWS Glue, including Glue ETL jobs, the Glue Data Catalog, and Glue Crawlers.
- Deep understanding of data formats such as Parquet, ORC, and Avro, and their use cases.

Cloud Proficiency
- Expertise in AWS services, including S3, Redshift, Athena, EMR, DynamoDB, and IAM.
- Experience designing scalable and cost-efficient cloud-based data solutions.

Performance Tuning
- Strong ability to optimize Apache Spark jobs, Hive queries, and Trino workloads for distributed environments.
- Experience with advanced techniques such as partitioning, bucketing, and query plan optimization.

Leadership & Collaboration
- Proven experience leading and mentoring data engineering teams.
- Strong communication skills, with the ability to interact effectively with technical and non-technical stakeholders.

Education & Experience
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- 8+ years of experience in data engineering, with a minimum of 2 years in a leadership role.
Additional Qualifications:
- 8+ years of experience building data pipelines from scratch in large-data-volume environments.
- AWS certifications, such as AWS Certified Data Analytics or AWS Certified Solutions Architect.
- Experience with Kafka or Kinesis for real-time data streaming would be a plus.
- Familiarity with containerization tools like Docker and orchestration platforms like Kubernetes.
- Knowledge of CI/CD pipelines and DevOps practices for data engineering.
- Prior experience with data lake architectures and integrating ML workflows.

Mandatory Key Skills: CI/CD, DevOps, data engineering, Apache Spark jobs, Hive queries, Performance Tuning, AWS Glue, Data Governance, AWS, Spark, Python, Hive, ETL
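The Glue Crawler and Data Catalog work mentioned above can be made concrete with a small, hedged sketch: a toy version of crawler-style schema inference over a CSV sample. The column names, sample data, and type labels are illustrative only; a real Glue Crawler classifies many formats and samples far more data.

```python
import csv
import io

def infer_type(values):
    """Infer a column type from sample string values - a toy version of
    what a Glue Crawler does when it catalogs raw files."""
    def all_parse(cast):
        try:
            for v in values:
                cast(v)
            return True
        except ValueError:
            return False
    if all_parse(int):
        return "bigint"
    if all_parse(float):
        return "double"
    return "string"

# A made-up CSV sample standing in for a file landed in S3.
sample = io.StringIO("id,amount,city\n1,9.50,Pune\n2,12,Mumbai\n")
reader = csv.DictReader(sample)
rows = list(reader)

# Build the inferred schema column by column, as a catalog entry would.
schema = {col: infer_type([r[col] for r in rows]) for col in reader.fieldnames}
print(schema)
```

A real catalog entry would also record partitions, SerDe, and location; this only shows the type-inference idea.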
Posted 11 hours ago
1.0 - 4.0 years
4 - 8 Lacs
bengaluru
Work from Office
We are currently seeking a Java Backend Developer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Key Skills: Core Java, J2EE, Spring Boot, microservices, REST, database skills, cloud technologies.

Key Responsibilities:
- Strong fundamentals in Core Java, JDBC, and J2EE.
- Hands-on experience in Java (version 8 or higher), Spring, Spring Boot, and cloud technologies.
- Good knowledge of microservices architecture and REST web services/JSON.
- Strong knowledge of at least one database, such as MongoDB, MySQL, SQL Server, or PostgreSQL.
- Knowledge of design patterns and unit testing using the Mockito framework.
- Guide/mentor developers and help them with technical aspects as needed.
- Good to have: knowledge of DevOps and AWS.
- Creative ideas and a problem-solving mindset.
Posted 14 hours ago
0.0 - 4.0 years
0 Lacs
noida, uttar pradesh
On-site
As an AI Engineer at our company, you will design, develop, and deploy an in-house AI assistant powered by LLaMA 3 and integrated with our MS SQL-based ERP system (4QT ERP). You will be responsible for setting up the LLM infrastructure, implementing voice input (Whisper), translating natural language to SQL, and ensuring accurate, context-aware responses to ERP-related queries.

Key Responsibilities:
- Set up and deploy LLaMA 3 (8B/FP16) models using llama-cpp-python or Hugging Face
- Integrate the AI model with FastAPI to create secure REST endpoints
- Implement prompt engineering or fine-tuning (LoRA) to enhance SQL generation accuracy
- Develop a user-facing interface (React or a basic web UI) for text or voice interactions
- Integrate Whisper (OpenAI) or any STT system to support voice commands
- Ensure that model responses are secure, efficient, and auditable (only SELECT queries allowed)
- Supervise or perform supervised fine-tuning with custom ERP datasets
- Optimize for performance (GPU usage) and accuracy (prompt/RAG tuning)

Qualifications Required:
- Strong experience with LLM deployment (LLaMA 3, Mistral, GPT-type models)
- Solid Python development experience using FastAPI or Flask
- Proficiency in SQL, especially with MS SQL Server, and the ability to write and validate queries
- Experience with llama-cpp-python, Hugging Face Transformers, and LoRA fine-tuning
- Familiarity with LangChain or similar LLM frameworks
- Understanding of Whisper (STT) or equivalent speech-to-text tools
- Experience working with GPU inference (NVIDIA 4070/5090, etc.)

This job is suitable for freshers, and the work location is in person.
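The "only SELECT queries allowed" requirement above can be sketched as a small guard that sits between the LLM's generated SQL and the database. This is a minimal illustration, not the company's actual implementation: the function name is hypothetical, and production use would warrant a real SQL parser and parameterized execution.

```python
import re

# Keywords that indicate a write or DDL statement (illustrative list).
FORBIDDEN = re.compile(
    r"\b(insert|update|delete|drop|alter|create|truncate|merge|exec|grant)\b",
    re.IGNORECASE,
)

def is_safe_select(sql: str) -> bool:
    """Allow only a single read-only SELECT (or WITH...SELECT) statement."""
    stmt = sql.strip().rstrip(";")
    if ";" in stmt:  # reject multi-statement payloads
        return False
    if not re.match(r"(?is)^\s*(select|with)\b", stmt):
        return False
    return not FORBIDDEN.search(stmt)
```

Before executing any model-generated query, the endpoint would call `is_safe_select` and refuse (and log for audit) anything that fails the check.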
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
ahmedabad, gujarat
On-site
As an experienced ServiceNow Administrator with domain separation expertise, your role will involve supporting a multi-tenant ServiceNow environment. Your responsibilities will include incident management and resolution, system administration and maintenance, reporting and analytics, contact and configuration management, as well as providing support for MSP and NOC operations. Key Responsibilities: - Incident Management & Resolution - Monitor, triage, and resolve ServiceNow platform incidents across multiple domains - Perform root cause analysis and implement corrective actions - Collaborate with cross-functional teams for timely incident resolution - Maintain incident documentation and knowledge base articles - System Administration & Maintenance - Administer ServiceNow instances with domain separation configurations - Assist with system upgrade cycles and release management - Develop and execute fix scripts for system-wide records - Perform regular system health checks and preventive maintenance - Manage user access, roles, and permissions across separated domains - Reporting & Analytics - Create and maintain operational dashboards and reports - Generate KPI reports for MSP and NOC operations - Develop custom reports to meet business requirements - Monitor system performance metrics and usage analytics - Contact & Configuration Management - Maintain accurate contact databases and user records - Update and validate CMDB data integrity across domains - Manage asset lifecycle processes and documentation - Ensure data consistency and quality across all modules - MSP & NOC Support - Provide technical support for Managed Service Provider operations - Support Network Operations Center activities and workflows - Configure and customize applications to meet operational requirements - Assist with service catalog management and workflow automation Qualifications Required: - Proven experience in ServiceNow domain separation implementation and management - Strong 
understanding of ServiceNow platform architecture and modules - Proficiency in JavaScript, HTML, CSS, and ServiceNow scripting - Knowledge of ITIL processes and best practices - Experience supporting MSP and NOC environments - Strong analytical and problem-solving skills - Proficiency in navigating Now UI and Workspace views - Excellent communication and documentation skills Preferred Certifications: - ServiceNow Certified System Administrator (CSA) - ServiceNow Certified Implementation Specialist - Service Provider (CIS-SP) - ServiceNow Certified Implementation Specialist - Customer Service Management (CIS-CSM) Technical Skills: - ServiceNow platform administration (minimum 3 years) - Domain separation configuration and management - CMDB and Asset Management - Incident and Problem Management - Service Portal configuration - Integration experience (REST/SOAP APIs) - Database management and SQL knowledge Please note, domain separation experience is mandatory for this position. Candidates without this specific experience will not be considered.,
Posted 4 days ago
5.0 - 10.0 years
8 - 12 Lacs
pune
Work from Office
5+ years of overall experience, with a minimum of 2 years of experience in Regulatory Reporting and exposure to Axiom. Implementation experience, preferably in the banking domain. Able to transform business requirements into technical needs and implementation procedures. Ability to work independently. Good communication skills. Experience in software design and development, working primarily in the Data Warehousing and Regulatory Reporting area of the Banking and Finance domain.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Good experience writing simple and complex SQL queries using all types of joins, analytic functions, WITH clauses, subqueries, and set operators.
- Well versed in, and with good exposure to, Oracle SQL*Loader, partitioning, and performance tuning.
- Good hands-on experience with Oracle's procedural language (PL/SQL): procedures, functions, packages, views, temporary tables, collections, etc.
- Knowledge of UNIX, shell scripting, and Autosys jobs preferred.

Preferred technical and professional experience:
- Experience with the Axiom Controller View regulatory reporting tool, version 9x/10x.
- Experience with Axiom implementations for different regulations.
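As a rough illustration of the analytic-function and WITH-clause skills above, here is a runnable sketch using Python's built-in sqlite3 as a stand-in for Oracle. The table and figures are made up; the `SUM() OVER (PARTITION BY ... ORDER BY ...)` pattern is standard SQL and carries over to Oracle.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE txn (account TEXT, amount REAL);
INSERT INTO txn VALUES ('A', 100), ('A', 250), ('B', 80), ('B', 40);
""")

# WITH clause + analytic function: a running total per account,
# the kind of aggregate regulatory-reporting extracts often need.
rows = conn.execute("""
    WITH ordered AS (SELECT account, amount FROM txn)
    SELECT account,
           amount,
           SUM(amount) OVER (PARTITION BY account
                             ORDER BY amount) AS running_total
    FROM ordered
""").fetchall()

for account, amount, running in rows:
    print(account, amount, running)
```

Oracle adds further analytic functions (RATIO_TO_REPORT, LAG/LEAD with defaults, etc.), but the windowing syntax shown is portable.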
Posted 5 days ago
1.0 - 2.0 years
17 - 20 Lacs
bengaluru
Work from Office
About The Role

Job Title: Transformation Office Value Support (Operations)
Management Level: 11 (Analyst)
Location: Bangalore / Gurgaon / Mumbai
Must-have skills: Excel, Power BI, Figma, PowerPoint (experience is a must); Python and SQL knowledge is good to have
Good-to-have skills: English, Transformation PMO, culture fit and relevant skills

Job Summary: We are looking for a detail-oriented Value Support Specialist to assist in driving value delivery across transformation initiatives. The role involves supporting teams with impactful presentations, design assets, and data tracking using PowerPoint, Figma, and Excel. Strong collaboration, visual storytelling, and execution skills are key to success in this role.

Roles & Responsibilities:
- Support regular value updates and ensure accurate reflection in the Momentum tool.
- Maintain and update initiative data, baselines, and targets in coordination with initiative owners.
- Connect with stakeholders to gather input, resolve data queries, and ensure timely updates.
- Provide ongoing support to initiative teams for value tracking and reporting requirements.
- Create and update dashboards and reports using Excel and PowerPoint for performance reviews.

Professional & Technical Skills:
- Minimum of 1-2 years of experience in a related field
- Proficiency in value realization tools
- Familiarity with Python and SQL
- Proficiency in PowerPoint, Excel, and basic design tools like Figma
- Strong attention to detail and data accuracy
- Good communication and stakeholder coordination skills
- Exposure to dashboarding or automation tools
- Ability to manage multiple tasks and deliverables in a fast-paced environment

Additional Information:
- Experience in a transformation office or similar role
- Advanced degree in business, finance, or a related field
- Ability to collaborate cross-functionally with initiative teams and senior stakeholders

About Our Company | Accenture

Qualification
Experience: Minimum of 2 years of experience is required
Educational Qualification: MBA
Posted 5 days ago
0.0 - 1.0 years
0 Lacs
gurugram
Work from Office
Role & Responsibilities:
- Assist in extracting and analyzing data using SQL queries.
- Create and maintain reports and dashboards in Excel.
- Help identify trends and provide insights to support business decisions.
- Work with team members to ensure data accuracy and consistency.
- Participate in data cleaning and validation processes.

Preferred Candidate Profile:
- Basic knowledge of SQL.
- Strong skills in Microsoft Excel (pivot tables, formulas, charts, etc.).
- Currently pursuing or recently completed a degree in Data Science, Statistics, Computer Science, Engineering, Business, or a related field.
- Strong analytical thinking and attention to detail.
- Good communication skills and ability to work in a team environment.
- Willingness to learn and take initiative.
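The Excel pivot-table skill asked for above boils down to grouping and aggregating rows; as a hedged sketch with made-up sales data, the same summarization can be expressed in a few lines of plain Python:

```python
from collections import defaultdict

# Illustrative rows: (region, product, revenue) - the kind of extract
# a SQL query might return before it is summarized in an Excel pivot.
rows = [
    ("North", "Widget", 120.0),
    ("North", "Gadget", 80.0),
    ("South", "Widget", 200.0),
    ("South", "Widget", 50.0),
]

# Pivot: total revenue per (region, product), like a two-field pivot table
# with revenue as the summed value.
pivot = defaultdict(float)
for region, product, revenue in rows:
    pivot[(region, product)] += revenue

for key in sorted(pivot):
    print(key, pivot[key])
```

In Excel the same result comes from dragging region and product to Rows and revenue to Values (Sum); in SQL it is a `GROUP BY region, product`.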
Posted 6 days ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
Join us as an ETL Tester at Barclays, where you will spearhead the evolution of our infrastructure and deployment pipelines, driving innovation and operational excellence. You will harness cutting-edge technology to build and manage robust, scalable, and secure infrastructure, ensuring seamless delivery of our digital solutions.

To be successful as an ETL Tester, you should have experience with ETL, SQL, Unix, cloud exposure, Agile ways of working, analytical ability, Jira, and documentation preparation. Additionally, highly valued skills may include Scala, PySpark, Python, AWS experience, and UI/ETL automation. You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. The role is based out of Pune.

Purpose of the role: To design, develop, and execute testing strategies to validate functionality, performance, and user experience, while collaborating with cross-functional teams to identify and resolve defects. You will continuously improve testing processes and methodologies to ensure software quality and reliability.

Accountabilities include:
- Development and implementation of comprehensive test plans and strategies to validate software functionality and ensure compliance with established quality standards.
- Creation and execution of automated test scripts, leveraging testing frameworks and tools to facilitate early detection of defects and quality issues.
- Collaboration with cross-functional teams to analyze requirements, participate in design discussions, and contribute to the development of acceptance criteria.
- Root cause analysis for identified defects, working closely with developers to provide detailed information and support defect resolution.
- Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing.
- Staying informed of industry technology trends and innovations, and actively contributing to the organization's technology communities.

Analyst Expectations: Perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement. The role requires in-depth technical knowledge and experience in the assigned area of expertise: leading and supervising a team, guiding and supporting professional development, allocating work requirements, and coordinating team resources. People Leaders are expected to demonstrate a clear set of leadership behaviours: Listen and be authentic, Energize and inspire, Align across the enterprise, Develop others. Individual contributors develop technical expertise in the work area, acting as advisors where appropriate. They will have an impact on the work of related teams within the area, partner with other functions and business areas, take responsibility for the end results of a team's operational processing and activities, escalate breaches of policies/procedures appropriately, advise and influence decision-making within their area of expertise, take ownership of managing risk and strengthening controls, and deliver work in line with relevant rules, regulations, and codes of conduct.

All colleagues are expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, our moral compass, and the Barclays Mindset to Empower, Challenge, and Drive, the operating manual for how we behave.
Posted 6 days ago
0.0 - 3.0 years
0 Lacs
gandhinagar, gujarat
On-site
As a QA Tester, you will be responsible for testing mobile applications (Android & iOS) and web-based admin panels. Your role will involve preparing, executing, and updating test cases, as well as identifying, documenting, and tracking bugs accurately. Conducting functional, regression, and UI testing will be part of your daily tasks. Collaborating with developers is essential to ensure product quality and performance and to provide valuable feedback for product improvements.

To excel in this role, you should have between zero (fresher) and 1 year of QA/testing experience and basic knowledge of software testing concepts. Familiarity with mobile apps and web applications is required. Strong analytical and problem-solving skills, along with attention to detail and a user-focused mindset, are key attributes for success. Knowledge of bug-tracking tools like JIRA or Trello is preferred but not mandatory. An understanding of automation tools such as Selenium, Appium, etc. is beneficial, and basic SQL knowledge for database testing and exposure to Agile methodology are good-to-have skills.

This is a full-time, on-site position based in Gandhinagar, Gujarat. The salary offered will be as per industry standards, based on your skills and experience level.
Posted 6 days ago
6.0 - 10.0 years
0 Lacs
chennai, tamil nadu
On-site
You are an IAM Consultant with expertise in IAM product implementation and consulting, with 6+ years of experience in the field. You excel at creating an environment that promotes high-performing teams and are adept at collaborating with multiple stakeholders. Your skill set includes the ability to facilitate workshops, conduct activity and story mapping exercises, and deliver solutions effectively. A solution- and delivery-oriented approach coupled with a strong problem-resolution mindset defines your work ethic. Additionally, you prioritize team dynamics and emphasize a "team before me" philosophy.

As a vital member of the Technology and Innovation team at Falaina, you will collaborate with a dedicated group of professionals, including document writers, designers, developers, and architects. Together, your primary objective is to develop cutting-edge identity and data security technologies tailored to diverse industry use cases. Your performance will be gauged on your ability to meet deadlines and deliver top-notch products and solutions aligned with the company's strategic direction.

Key responsibilities include providing specialized knowledge in Identity and Access Management, Privileged Access Management, Data Access Governance, Web Single Sign-On, and MFA; crafting technical solution write-ups; offering insights based on industry best practices; generating various documentation; gathering IAM business and functional requirements; training pre-sales and project implementation teams; conducting R&D to enhance expertise; and delivering expert recommendations for product development.

To excel in this role, you must hold a Bachelor's/Master's degree in Computer Science, Information Technology, Engineering, or equivalent and possess a minimum of 6 years of experience in IAM implementation, consultation, or architecture. Additional certifications such as CISSP, CISA, or similar are advantageous. You should demonstrate proficiency in cyber security domains; programming (preferably .NET); software development processes such as SDLC, Agile, Scrum, and DevOps; regulatory requirements related to IAM, PAM, SSO, and DAG; software deployment models; SQL; and troubleshooting and problem-solving. Your comprehensive expertise will be instrumental in driving the success of IAM initiatives and contributing to the company's overall objectives.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
Join us as a Technical Specialist Practitioner at Barclays, responsible for supporting the successful delivery of Location Strategy projects to plan, budget, and agreed quality and governance standards. You'll spearhead the evolution of our digital landscape, driving innovation and excellence, and harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences.

To be successful as a Technical Specialist Practitioner you should have proficiency in Tableau, including creating dashboards, reports, and visual analytics, and SQL knowledge to write and optimize queries for data extraction and manipulation. An understanding of data warehousing concepts such as data quality management, data analysis, data blending, and integration from multiple sources within Tableau is crucial. Other highly valued skills include knowledge of records and data governance, excellent communication skills for presenting data findings effectively to stakeholders, and familiarity with project management practices, including agile methodologies. This role is based in Pune.

Purpose of the role: To build and maintain infrastructure platforms and products that support applications and data systems, using hardware, software, networks, and cloud computing platforms to ensure reliability, scalability, and security. Ensure the reliability, availability, and scalability of systems, platforms, and technology through software engineering techniques, automation, and incident response best practices.

Accountabilities include:
- Build Engineering: Developing, delivering, and maintaining high-quality infrastructure solutions to meet business requirements.
- Incident Management: Monitoring IT infrastructure and system performance to identify and resolve potential issues.
- Automation: Implementing automated tasks and processes to improve efficiency.
- Security: Implementing secure configurations and measures to protect infrastructure against cyber threats.
- Teamwork: Collaborating with product managers, architects, and engineers to align IT infrastructure with business objectives.
- Learning: Staying informed of industry trends and contributing to the organization's technology communities.

As an Assistant Vice President, you are expected to advise on decision-making, contribute to policy development, and lead a team in delivering impactful work. People Leaders are required to demonstrate leadership behaviours: Listen and be authentic, Energise and inspire, Align across the enterprise, Develop others. Individual contributors lead collaborative assignments, guide team members, and identify new directions for projects; they consult on complex issues, identify ways to mitigate risks, and engage in complex data analysis from multiple sources.

All colleagues are expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, and to showcase the Barclays Mindset to Empower, Challenge, and Drive in all aspects of work.
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
hyderabad, telangana
On-site
You will be responsible for designing, developing, and publishing Power BI reports, as well as developing Power BI data models and recommending best practices for report creation and visualizations.

Responsibilities:
- Implement row-level security on data, understand application security layer models in Power BI, and integrate dashboards into applications using embedded analytics.
- Collaborate with product owners to gather business requirements for analytics and reporting.
- Provide guidance to BI developers in the team, troubleshoot reporting issues, and tune report performance.
- Build and modify tables and views, and optimize SQL queries and indexes as needed.
- Implement security on developed dashboards in alignment with security and access requirements.
- Ensure agile/scrum practices are followed during development; evaluate self-service reporting and dashboard tools; develop standard reports, custom reports, and DAX expressions as needed.

Requirements:
- A minimum of 8+ years of experience in Power BI report development.
- Proficiency in writing DAX expressions and troubleshooting performance issues in reports.
- Knowledge of query performance tuning, indexes, and database structure.
- Familiarity with data modeling, data warehousing, and databases such as MS SQL Server, PostgreSQL, and Oracle.
- SQL expertise in developing complex SQL queries and stored procedures.
- A passion for complex data structures and problem-solving, and the ability to quickly grasp new data tools and concepts.
- Ideally, a degree in Computer Science, Software Engineering, or related software engineering experience.
Posted 1 week ago
5.0 - 10.0 years
7 - 11 Lacs
bengaluru
Work from Office
We are looking for a Senior Software Engineer (Data Engineer) to join our team in India. This is an amazing opportunity to work on different projects with our data engineering team, whose members bring different competencies and learn together. The team has a strong skill set in PySpark, AWS, Apache Spark, and Databricks.

About you (experience, education, skills, and accomplishments):
- Bachelor's or master's degree in engineering (BE, ME, B.Tech, M.Tech, MCA, MS)
- 5+ years of experience working in software development
- 3+ years of experience building massively scalable distributed data processing solutions
- 3+ years of experience in database design and development
- Strong data analysis, analytics, and problem-solving skills
- Solid development experience in a commercial IT environment
- Experience designing cloud-based data pipelines and solutions
- Passionate about code and software architecture
- Effective communicator at all levels, excellent interpersonal skills, strong business focus
- Highly self-motivated, confident working on projects alone as well as in a team
- An enthusiastic approach to extending knowledge and learning new skills
- Strong commitment to the quality of work and good attention to detail
- Experience with Agile/Scrum software development methodologies

It would be great if you also had:
- Proficiency in Python and Apache Spark, including PySpark, DLT, DBT, and Databricks
- Strong technical knowledge of relational database management systems, including Oracle, Postgres, etc.
- Proficiency in writing complex SQL queries and PL/SQL procedures
- Good understanding of XML and JSON
- Version control software: GitHub, Bitbucket
- Continuous integration and deployment concepts and tools
- Knowledge of any of these technologies/tools: Cassandra, Hadoop, Apache Hive, Snowflake, Jupyter Notebook, the Databricks stack, and AWS services such as EC2, ECS, RDS, EMR, S3, AWS Glue, and Airflow

What will you be doing in this role?
- Write clean, efficient, and maintainable code in accordance with coding standards.
- Work closely with higher-level engineers to increase functional knowledge.
- Learn and apply best practices in software development.
- Develop and apply an understanding of the software development lifecycle and delivery methodology.
- Suggest alternative methodologies or techniques for achieving desired results.
- Maintain awareness of technical advances.
- Manually test and unit test all assigned applications.
- Participate as a team member on various engineering projects.
- Write application technical documentation.
- Follow team policies, procedures, and work instructions.

About the team: We are a team located in India, the US, and Europe.

Hours of work: Regular working hours in India.
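The scalable distributed processing mentioned above usually rests on partitioned storage. As a hedged, standard-library-only sketch (made-up records and file names), this writes rows into the Hive-style `key=value` directory layout that Spark, Hive, Athena, and AWS Glue all recognize for partition pruning:

```python
import csv
import tempfile
from pathlib import Path

# Made-up clickstream records to partition by date.
records = [
    {"date": "2024-01-01", "country": "IN", "clicks": 10},
    {"date": "2024-01-01", "country": "US", "clicks": 7},
    {"date": "2024-01-02", "country": "IN", "clicks": 4},
]

base = Path(tempfile.mkdtemp())

# Write each record under a date=YYYY-MM-DD/ directory - the layout
# engines use to skip partitions that a query's filter rules out.
for rec in records:
    part_dir = base / f"date={rec['date']}"
    part_dir.mkdir(exist_ok=True)
    out = part_dir / "part-0000.csv"
    write_header = not out.exists()
    with out.open("a", newline="") as f:
        writer = csv.writer(f)
        if write_header:
            writer.writerow(["country", "clicks"])
        writer.writerow([rec["country"], rec["clicks"]])

partitions = sorted(p.name for p in base.iterdir())
print(partitions)
```

In practice Spark's `df.write.partitionBy("date").parquet(path)` produces this layout automatically; the sketch just makes the directory convention visible.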
Posted 1 week ago
7.0 - 8.0 years
12 - 16 Lacs
bengaluru
Work from Office
Role & Responsibilities

Required Skills:
- Core expertise: SAP Analytics Cloud (SAC)
- Additional skills (any of the following): SAP Universe, Lumira Designer, SAP BusinessObjects (BO) / Web Intelligence (WebI), Analysis for Office
- Data skills: SAP Datasphere or strong SQL knowledge

Preferred Skills (good to have any one of the following):
- Hybrid Planning (SAP BPC)
- BW on HANA and/or BW/4HANA
- Native HANA

Job Responsibilities

Data Analysis & Visualization
- Design intuitive dashboards and reports using enterprise templates
- Strong understanding of SAC, SAP BO, BW, and BW-IP architecture and reporting strategies
- Connect to various data sources (live/import): SAP Universe, BW, Native HANA, S/4HANA, HANA Cloud, Concur, SuccessFactors, Ariba, IBP, and other SAP/non-SAP systems
- Perform data modeling, wrangling, merging, custom calculations, and formatting of dimensions/measures
- Develop analytic, planning, and predictive models with interactive visuals (charts, tables, maps, filters, dropdowns, etc.)
- Utilize SAC Augmented Analytics: Smart Insights, Search to Insight, Smart Discovery, Smart Predict
- Implement security models (e.g., rebuild from BW)
- Enable data export/write-back to SAP systems via standard methods, OData, and APIs
- Build SAC stories using canvas, responsive, and grid layouts with advanced features like data blending, cross calculations, input controls, and linked analysis
- Develop Digital Boardroom and Analytics Designer applications

Analytics Designer
- Build complex analytic and planning applications
- Proficient in HTML, CSS, and JavaScript
- Bonus: Python or R knowledge

Planning
- Design and implement SAC planning, forecasting, and budgeting models
- Experience with EPM tools like SAP BPC and IP
- Configure planning dimensions and models (standard/new) and map security to BW setups
- Support collaborative planning scenarios (top-down/bottom-up), versioning, and currency conversions
- Implement allocations, value driver trees, what-if simulations, and Smart Predict
- Familiarity with SAC/BW/BW-IP functions: Data Actions, Multi Actions, Planning Data Connections

SAP Datasphere
- Data integration: Connect and ingest data from SAP/non-SAP systems and flat files; support batch and real-time integrations
- Data modeling: Create schemas using star/snowflake models; build complex structures, relationships, and hierarchies using graphical and SQL modeling
- Data quality & governance: Apply data profiling, cleansing, and validation; analyze governance and compliance via lineage and impact tools
- Integrations: Implement integrated scenarios with SAC and other SAP/non-SAP applications

General Requirements
- 3-4 years of experience in SAP Analytics Cloud, including SAC Planning
- 7-8 years of experience in SAP BO/BI/BW/HANA
- Strong stakeholder management and presentation skills
- Ability to lead workshops, support UAT, and adhere to regulated environments and processes
- Familiarity with ALM test cases and transition to support teams
- Collaborative and professional team engagement
- Capable of working independently and within teams
- Skilled in wireframing/prototyping SAC solutions to demonstrate possibilities
- Translate business requirements into functional and technical dashboard specifications
- Deliver configurations per wireframes and approved requirements
- Build and execute unit test plans; support product integration testing and issue resolution
- Lead blueprinting sessions and act as a techno-functional developer for SAC implementations, including Enterprise HANA

Preferred candidate profile:
- Experience: 7-8 years
- Notice period: immediate to serving notice period
- Location: Bangalore
- Client: Capgemini; payroll: Esoft Labs

If interested, please share profiles at HR@TECHXLNC.AI
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
You are a Solution Designer/Analyst responsible for bridging the gap between IT and the business. Your primary tasks include gathering and shaping technical and functional requirements from program stakeholders such as Business, IT, and Transformation programs. You will drive a comprehensive requirements management process through workshops with stakeholders, prioritization, fit-gap and dependency analysis, user story elaboration, estimation, and implementation risk assessment. Your role involves defining an end-to-end telco solution that meets stakeholder requirements and mapping these requirements to design and code, ensuring proper documentation and adherence to telco standards (ODA/eTOM/SID). In addition, you will support Development and QA teams throughout the Software Development Life Cycle (SDLC). To excel in this role, you should have at least 5 years of software development experience in Java/J2EE, .NET, or similar technologies. A strong understanding of object-oriented programming concepts, experience with BPMN tools, and the ability to prepare High-Level Design (HLD) and Low-Level Design (LLD) documents based on user requirements are essential. You should also be proficient in tracing requirements through design to code and in database model design, and possess the technical skills required to understand the final solution. Exposure to ARIS, SQL knowledge, a good command of the English language, and excellent written and verbal communication skills would be advantageous, as would experience in team collaboration using Agile frameworks. Your business knowledge should include expertise in telco solutions analysis based on TM Forum standards (ODA/eTOM/SID) and an understanding of telco business processes such as Customer Engagement, Customer Care, Customer Sales, Customer Order Management, Service Order Management, Telecom Catalogs, Telecom Inventories, Risk Management, and Number Portability.
Knowledge of telecom application architectures will be an asset in performing your duties effectively.
Posted 1 week ago
10.0 - 15.0 years
4 - 8 Lacs
mumbai
Work from Office
The candidate must possess in-depth functional knowledge of the process area and apply it to operational scenarios to provide effective solutions. He/she must be able to identify discrepancies and propose optimal solutions using a logical, systematic, and sequential methodology. It is vital to be open-minded towards inputs and views from team members and to effectively lead, control, and motivate groups towards company objectives. Additionally, he/she must be self-directed, proactive, and seize every opportunity to meet internal and external customer needs and achieve customer satisfaction by effectively auditing processes, implementing best practices and process improvements, and utilizing the frameworks and tools available. Goals and thoughts must be clearly and concisely articulated and conveyed, verbally and in writing, to clients, colleagues, subordinates, and supervisors.

Process Manager role and responsibilities:
- Understand the business model and why things are the way they are; ask relevant questions and get them clarified
- Break down complex problems into small solvable components to identify problem areas in each component
- Conduct cost/benefit analysis and feasibility studies for proposed projects to aid decision-making
- Facilitate the implementation of new or improved business processes and systems
- Coordinate with business stakeholders to identify gaps in data and processes, and suggest process improvements
- Understand and follow the project roadmap, plan data availability, and coordinate with the execution team to ensure successful execution of projects
- Prescribe suitable solutions with an understanding of the limitations of toolsets and available data
- Fetch and analyze data from disparate sources and drive meaningful insights
- Provide recommendations on business rules for effective campaign targeting
- Interpret analytical results and provide insights; present key findings and recommended next steps to clients
- Develop tangible analytical projects; communicate project details to clients and the internal delivery team via written documents and presentations, in the form of specifications, diagrams, and data/process models
- Audit deliverables, ensuring accuracy by critically examining the data and reports against requirements
- Collaborate on regional/global analytic initiatives and localize inputs for country campaign practices
- Actively work on audience targeting insights, optimize campaigns, and improve communications governance

Technical and functional skills:
- B.Tech / BE / BSc / MBA or equivalent professional degree with 10+ years of experience in marketing analytics or performance marketing; professional experience in advanced analytics for a Fortune 500-scale company or a prominent consulting organization
- Reporting: data extraction tools, advanced Excel, CRM analytics, campaign performance marketing, and analytics knowledge
- Strong numerical and analytical skills; strong in advanced Excel (prior experience with Google Sheets is a plus)
- 10 years of relevant experience in analytical and storytelling skills; ability to derive relevant insights from large reports and piles of disparate data
- Very strong in SQL execution
- Advanced knowledge of the Power BI reporting tool
- Experience in marketing analytics or performance marketing; clear on campaign operations concepts such as how paid media marketing works, campaign acquisition, reactivations, cross-selling, brand awareness, and how to measure and optimize campaigns
- Comfortable working autonomously with broad guidelines
- Passion for data and analytics for marketing and eagerness to learn
- Excellent communication skills, both written and spoken; ability to explain complex technical concepts in plain English
- Ability to manage multiple priorities and projects, aligning teams to project timelines and ensuring quality of deliverables
- Work with business teams to identify business use cases and develop solutions to meet those needs using analytical approaches
- Manage regular reporting and ad-hoc data extracts for other departments
- Knowledge of analyzing digital campaigns and the tools/technologies of performance marketing
- Experience with Google Sheets/Excel
- Hands-on experience in digital marketing and/or 1:1 marketing in any channel; expert-level knowledge of campaign performance marketing and CRM; MMM, A/B testing, and attribution modeling are a plus
- Working knowledge of analytical/statistical techniques
- Experience in a Hadoop environment (Hive, Presto) is a plus
- Experience in Python/R
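The skill list above mentions A/B testing for campaign measurement. As a minimal, stdlib-only sketch (not any specific team's methodology), here is the classic two-proportion z-test for comparing conversion rates between a control and a variant; the sample numbers are invented for illustration.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates (A/B test)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both arms convert equally.
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Illustrative numbers: 120/2400 control vs 156/2400 variant conversions.
z, p = two_proportion_z(120, 2400, 156, 2400)
print(f"z={z:.2f}, p={p:.4f}")
```

A p-value below the chosen significance level (commonly 0.05) would support rolling out the variant; in practice a library such as SciPy or statsmodels would be used instead of hand-rolling the CDF.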
Posted 1 week ago
5.0 - 8.0 years
10 - 14 Lacs
bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Palantir Foundry
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Project Role: Lead Data Engineer
Project Role Description: Design, build, and enhance applications to meet business processes and requirements in Palantir Foundry.
Work experience: Minimum 6 years
Must-have skills: Palantir Foundry, PySpark
Good-to-have skills: Experience in PySpark, Python, and SQL; knowledge of Big Data tools and technologies; organizational and project management experience.

Job Requirements & Key Responsibilities:
- Design, develop, test, and support data pipelines and applications on Palantir Foundry.
- Configure and customize Workshop to design and implement workflows and ontologies.
- Collaborate with data engineers and stakeholders to ensure successful deployment and operation of Palantir Foundry applications.
- Work with stakeholders, including the product owner, data, and design teams, to assist with data-related technical issues, understand the requirements, and design the data pipeline.
- Work independently, troubleshoot issues, and optimize performance.
- Communicate design processes, ideas, and solutions clearly and effectively to the team and client.
- Assist junior team members in improving efficiency and productivity.

Technical Experience:
- Proficiency in PySpark, Python, and SQL, with a demonstrable ability to write and optimize SQL and Spark jobs.
- Hands-on experience with Palantir Foundry services such as Data Connection, Code Repository, Contour, Data Lineage, and Health Checks.
- Good to have working experience with Workshop, Ontology, and Slate.
- Hands-on experience in data engineering and building data pipelines (code/no-code) for ELT/ETL data migration, data refinement, and data quality checks on Palantir Foundry.
- Experience in ingesting data from different external source systems using data connections and syncs.
- Good knowledge of Spark architecture and hands-on experience with performance tuning and code optimization.
- Proficient in managing both structured and unstructured data, with expertise in handling various file formats such as CSV, JSON, Parquet, and ORC.
- Experience in developing and managing scalable architectures and large data sets.
- Good understanding of data loading mechanisms; able to implement strategies for capturing CDC.
- Nice to have: test-driven development and CI/CD workflows.
- Experience with version control software such as Git and major hosting services (e.g., Azure DevOps, GitHub, Bitbucket, GitLab).
- Adhere to code best practices and guidelines that enhance code readability, maintainability, and overall quality.

Educational Qualification: 15 years of full-time education
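The technical-experience list above mentions implementing strategies for capturing CDC (change data capture). One simple strategy, sketched here in plain Python under assumed data shapes (lists of dicts keyed by an `id` column; nothing Foundry-specific), is to diff two table snapshots and classify rows as inserts, deletes, or updates.

```python
def capture_cdc(previous, current, key="id"):
    """Compare two table snapshots (lists of dicts) and emit change records.

    A minimal snapshot-diff approach to change data capture: rows present
    only in `current` are inserts, rows present only in `previous` are
    deletes, and rows whose non-key fields differ are updates.
    """
    prev = {row[key]: row for row in previous}
    curr = {row[key]: row for row in current}
    inserts = [curr[k] for k in curr.keys() - prev.keys()]
    deletes = [prev[k] for k in prev.keys() - curr.keys()]
    updates = [curr[k] for k in curr.keys() & prev.keys() if curr[k] != prev[k]]
    return {"insert": inserts, "delete": deletes, "update": updates}

# Hypothetical before/after snapshots of a small table.
before = [{"id": 1, "qty": 5}, {"id": 2, "qty": 7}]
after  = [{"id": 2, "qty": 9}, {"id": 3, "qty": 1}]
changes = capture_cdc(before, after)
print(changes)
```

Production pipelines usually prefer log-based CDC or high-watermark columns over full-snapshot diffs, which scale poorly, but the classification of changes is the same.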
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
hyderabad, telangana
On-site
One of our esteemed clients is a Japanese multinational information technology (IT) service and consulting company headquartered in Tokyo, Japan, which acquired the Italy-based Value Team S.p.A. and launched Global One Teams. This dynamic, high-impact firm is currently seeking a Data Quality Engineer specializing in the Informatica Data Quality (IDQ) tool.

The ideal candidate should possess 8+ years of experience with IDQ and demonstrate proficiency in using it for data profiling to identify data anomalies and patterns. Strong knowledge of SQL for querying databases is essential, along with the ability to design and implement data cleansing routines using IDQ. Experience with database systems such as lakehouse platforms, PostgreSQL, Teradata, and SQL Server is preferred.

Key Responsibilities:
- Analyze data quality metrics and generate reports
- Standardize, validate, and enrich data
- Create and manage data quality rules and workflows in IDQ
- Automate data quality checks and validations
- Integrate IDQ with other data management tools and platforms
- Manage data flows and ensure data consistency
- Utilize data manipulation libraries like Pandas and NumPy
- Use PySpark for big data processing and analytics
- Write complex SQL queries for data extraction and transformation

Interested candidates should possess relevant experience with the Informatica Data Quality tool and share their updated resume along with details such as total experience, current location, current CTC, expected CTC, and notice period. The company assures strict confidentiality in handling all profiles. If you are ready to take your career to new heights and be part of this incredible journey, apply now to join this innovative team.

Thank you,
Syed Mohammad
syed.m@anlage.co.in
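The posting above centers on data profiling, the first step before cleansing rules are designed. As a rough, stdlib-only sketch of what a profiling pass computes (not how IDQ itself is implemented), here is a function that reports null counts, distinct counts, and the most frequent value for one column; the sample column is invented.

```python
from collections import Counter

def profile_column(values):
    """Basic data-quality profile for one column: row count, null count,
    distinct count, and the most frequent value -- the kind of metrics a
    profiling tool surfaces before cleansing rules are written."""
    # Treat both None and empty strings as nulls for profiling purposes.
    non_null = [v for v in values if v is not None and v != ""]
    freq = Counter(non_null)
    return {
        "rows": len(values),
        "nulls": len(values) - len(non_null),
        "distinct": len(freq),
        "top_value": freq.most_common(1)[0][0] if freq else None,
    }

# Hypothetical 'country' column containing a None and an empty string.
countries = ["IN", "IN", "JP", None, "", "IN", "IT"]
stats = profile_column(countries)
print(stats)
```

A null rate of 2/7 here would typically trigger a cleansing or enrichment rule for the column.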
Posted 2 weeks ago
5.0 - 8.0 years
10 - 14 Lacs
bengaluru
Work from Office
About The Role
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Palantir Foundry
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Project Role: Lead Data Engineer
Project Role Description: Design, build, and enhance applications to meet business processes and requirements in Palantir Foundry.
Work experience: Minimum 6 years
Must-have skills: Palantir Foundry, PySpark, TypeScript (for customizing Workshop Forms & UI)
Good-to-have skills: Experience in PySpark, Python, and SQL; knowledge of Big Data tools and technologies; organizational and project management experience.

Job Requirements & Key Responsibilities:
- Design, develop, test, and support data pipelines and applications on Palantir Foundry.
- Configure and customize Workshop applications to design and implement workflows and ontologies, including Forms, workflows, and Ontology-based interactions.
- Write TypeScript to create dynamic and interactive Forms in Workshop for user-driven data entry and validation.
- Collaborate with data engineers and stakeholders to ensure successful deployment and operation of Palantir Foundry applications.
- Work with stakeholders, including the product owner, data, and design teams, to assist with data-related technical issues, understand the requirements, and design the data pipeline.
- Work independently, troubleshoot issues, and optimize performance.
- Communicate design processes, ideas, and solutions clearly and effectively to the team and client.
- Assist junior team members in improving efficiency and productivity.

Technical Experience:
- Proficiency in PySpark, Python, and SQL, with a demonstrable ability to write and optimize SQL and Spark jobs.
- Hands-on experience with Palantir Foundry services such as Data Connection, Code Repository, Contour, Data Lineage, and Health Checks.
- Good to have working experience with Workshop, Ontology, and Slate.
- Hands-on experience in data engineering and building data pipelines (code/no-code) for ELT/ETL data migration, data refinement, and data quality checks on Palantir Foundry.
- Experience using TypeScript to create and customize Forms in Workshop, including form validation, user interactions, and data binding with Ontology.
- Experience in ingesting data from different external source systems using data connections and syncs.
- Good knowledge of Spark architecture and hands-on experience with performance tuning and code optimization.
- Proficient in managing both structured and unstructured data, with expertise in handling various file formats such as CSV, JSON, Parquet, and ORC.
- Experience in developing and managing scalable architectures and large data sets.
- Good understanding of data loading mechanisms; able to implement strategies for capturing CDC.
- Nice to have: test-driven development and CI/CD workflows.
- Experience with version control software such as Git and major hosting services (e.g., Azure DevOps, GitHub, Bitbucket, GitLab).
- Adhere to code best practices and guidelines that enhance code readability, maintainability, and overall quality.

Educational Qualification: 15 years of full-time education
Posted 2 weeks ago
2.0 - 6.0 years
8 - 12 Lacs
pune
Work from Office
5+ years of overall experience, with a minimum of 2 years of experience in Regulatory Reporting with exposure to Axiom. Implementation experience, preferably in the banking domain. Transformation of business requirements into technical needs and implementation procedures. Ability to work independently. Good communication skills. Experience in software design and development, working primarily in the Data Warehousing and Regulatory Reporting area in the Banking and Finance domain.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Good experience writing simple and complex SQL queries using all types of joins, analytical functions, the WITH clause, subqueries, and set operators.
- Well versed in, and with good exposure to, Oracle SQL*Loader, partitioning, and performance tuning.
- Good hands-on experience with Oracle procedural language (PL/SQL): procedures, functions, packages, views, temporary tables, collections, etc.
- Preferred: knowledge of UNIX, shell scripting, and AutoSys jobs.

Preferred technical and professional experience:
- Experience with the Axiom Controller View regulatory reporting tool, version 9x/10x.
- Experience with Axiom implementations for different regulations.
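Two of the SQL skills listed above, the WITH clause and analytical (window) functions, can be sketched together in a few lines. The example below uses Python's built-in sqlite3 on an invented trades table (nothing here reflects an actual Axiom schema): a CTE ranks trades by notional within each desk, then the outer query keeps the largest trade per desk.

```python
import sqlite3

# Toy trades table; all names are illustrative.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE trades (desk TEXT, trade_id INTEGER, notional REAL);
INSERT INTO trades VALUES
  ('rates', 1, 500.0), ('rates', 2, 900.0),
  ('fx',    3, 300.0), ('fx',    4, 250.0);
""")

# WITH clause (CTE) + analytic RANK() to find the largest trade per desk.
rows = con.execute("""
    WITH ranked AS (
        SELECT desk, trade_id, notional,
               RANK() OVER (PARTITION BY desk ORDER BY notional DESC) AS rnk
        FROM trades
    )
    SELECT desk, trade_id FROM ranked WHERE rnk = 1 ORDER BY desk
""").fetchall()
print(rows)
```

The same PARTITION BY / ORDER BY pattern carries over to Oracle's analytic functions, which the posting targets.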
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
noida, uttar pradesh
On-site
As a Mid ETL/DB Tester at GlobalLogic, you will be responsible for ETL/DWH testing, demonstrating proficiency in SQL and knowledge of Sybase, SQL Server, and Oracle. Your expertise in tools such as HP ALM, JIRA, and Rally, along with experience in Agile and OOP methodologies and Core Java, will be crucial to the successful execution of your responsibilities.

At GlobalLogic, we prioritize a culture of caring, where we put people first and foster an inclusive environment of acceptance and belonging. You will have the opportunity to build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders from day one.

We are committed to your continuous learning and development. With numerous opportunities to try new things, sharpen your skills, and advance your career, you will grow daily in an environment that values personal and professional growth. Our Career Navigator tool and various training programs will support your journey towards success.

As part of our team, you will engage in interesting and meaningful work, collaborating with clients worldwide to engineer impactful solutions. Your creative problem-solving skills will be put to the test as you help clients reimagine possibilities and bring new solutions to market, contributing to cutting-edge projects shaping the world today.

We believe in the importance of balance and flexibility, offering various career areas, roles, and work arrangements to help you achieve a harmonious work-life balance. With our support, you can integrate and balance the best of both worlds, ensuring that you have fun along the way.

GlobalLogic is a high-trust organization where integrity is paramount. By joining us, you are entering a safe, reliable, and ethical global company that values truthfulness, candor, and integrity in every aspect of its operations. Your trust in us is reciprocated with a commitment to maintaining a trustworthy environment for both employees and clients.

About GlobalLogic: GlobalLogic, a Hitachi Group Company, is a leading digital engineering partner to the world's largest companies, driving digital transformation through innovative products and experiences. Since 2000, we have been at the forefront of the digital revolution, collaborating with clients to redefine industries and create intelligent solutions that shape the future.
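A core ETL/DWH testing task implied by the role above is reconciling a source extract against the loaded target. One common technique, sketched here in stdlib Python as an assumption-laden illustration (the data is invented), is an order-insensitive fingerprint: hash each row and XOR the digests, so two result sets match regardless of row order.

```python
import hashlib

def table_fingerprint(rows):
    """Order-insensitive fingerprint of a result set: hash each row,
    then XOR the digests so row order does not matter. Returns the row
    count alongside the fingerprint so both checks run together."""
    acc = 0
    for row in rows:
        digest = hashlib.sha256(repr(tuple(row)).encode()).digest()
        acc ^= int.from_bytes(digest, "big")
    return len(rows), acc

# Hypothetical source vs. target result sets; the target came back in a
# different order, which a reconciliation check must tolerate.
source = [(1, "alice"), (2, "bob"), (3, "carol")]
target = [(3, "carol"), (1, "alice"), (2, "bob")]
print(table_fingerprint(source) == table_fingerprint(target))
```

In practice the two row sets would come from SQL queries against the source and warehouse systems; the comparison logic stays the same.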
Posted 2 weeks ago
15.0 - 20.0 years
10 - 14 Lacs
bengaluru
Work from Office
About The Role
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Palantir Foundry
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Project Role: Lead Data Engineer
Project Role Description: Design, build, and enhance applications to meet business processes and requirements in Palantir Foundry.
Work experience: Minimum 6 years
Must-have skills: Palantir Foundry, PySpark
Good-to-have skills: Experience in PySpark, Python, and SQL; knowledge of Big Data tools and technologies; organizational and project management experience.

Job Requirements & Key Responsibilities:
- Design, develop, test, and support data pipelines and applications on Palantir Foundry.
- Configure and customize Workshop to design and implement workflows and ontologies.
- Collaborate with data engineers and stakeholders to ensure successful deployment and operation of Palantir Foundry applications.
- Work with stakeholders, including the product owner, data, and design teams, to assist with data-related technical issues, understand the requirements, and design the data pipeline.
- Work independently, troubleshoot issues, and optimize performance.
- Communicate design processes, ideas, and solutions clearly and effectively to the team and client.
- Assist junior team members in improving efficiency and productivity.

Technical Experience:
- Proficiency in PySpark, Python, and SQL, with a demonstrable ability to write and optimize SQL and Spark jobs.
- Hands-on experience with Palantir Foundry services such as Data Connection, Code Repository, Contour, Data Lineage, and Health Checks.
- Good to have working experience with Workshop, Ontology, and Slate.
- Hands-on experience in data engineering and building data pipelines (code/no-code) for ELT/ETL data migration, data refinement, and data quality checks on Palantir Foundry.
- Experience in ingesting data from different external source systems using data connections and syncs.
- Good knowledge of Spark architecture and hands-on experience with performance tuning and code optimization.
- Proficient in managing both structured and unstructured data, with expertise in handling various file formats such as CSV, JSON, Parquet, and ORC.
- Experience in developing and managing scalable architectures and large data sets.
- Good understanding of data loading mechanisms; able to implement strategies for capturing CDC.
- Nice to have: test-driven development and CI/CD workflows.
- Experience with version control software such as Git and major hosting services (e.g., Azure DevOps, GitHub, Bitbucket, GitLab).
- Adhere to code best practices and guidelines that enhance code readability, maintainability, and overall quality.

Educational Qualification: 15 years of full-time education
Posted 2 weeks ago
5.0 - 10.0 years
8 - 12 Lacs
pune
Work from Office
5+ years of overall experience, with a minimum of 2 years of experience in Regulatory Reporting with exposure to Axiom. Implementation experience, preferably in the banking domain. Transformation of business requirements into technical needs and implementation procedures. Ability to work independently. Good communication skills. Experience in software design and development, working primarily in the Data Warehousing and Regulatory Reporting area in the Banking and Finance domain.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Good experience writing simple and complex SQL queries using all types of joins, analytical functions, the WITH clause, subqueries, and set operators.
- Well versed in, and with good exposure to, Oracle SQL*Loader, partitioning, and performance tuning.
- Good hands-on experience with Oracle procedural language (PL/SQL): procedures, functions, packages, views, temporary tables, collections, etc.
- Preferred: knowledge of UNIX, shell scripting, and AutoSys jobs.

Preferred technical and professional experience:
- Experience with the Axiom Controller View regulatory reporting tool, version 9x/10x.
- Experience with Axiom implementations for different regulations.
Posted 2 weeks ago