
16581 Query Jobs - Page 45

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About the role

We're looking for a Senior Engineering Manager to lead our Data/AI Platform and MLOps teams at slice. In this role, you'll be responsible for building and scaling a high-performing team that powers data infrastructure, real-time streaming, ML enablement, and data accessibility across the company. You'll partner closely with ML, product, platform, and analytics stakeholders to build robust systems that deliver high-quality, reliable data at scale. You will drive AI initiatives to centrally build an AI platform and apps that can be leveraged by various functions such as legal, CX, and product in a secure manner. This is a hands-on leadership role, perfect for someone who enjoys solving deep technical problems while growing people and teams.

What You Will Do
- Lead and grow the data platform pod focused on all aspects of data (batch + real-time processing, ML platform, AI tooling, business reporting, and data products that enable product experiences through data)
- Maintain hands-on technical leadership: lead by example through code reviews, architecture decisions, and direct technical contribution
- Partner closely with product and business stakeholders to identify data-driven opportunities and translate business requirements into scalable data solutions
- Own the technical roadmap for our data platform, including infra modernization, performance, scalability, and cost efficiency
- Drive the development of internal data products like self-serve data access, centralized query layers, and feature stores
- Build and scale ML infrastructure with MLOps best practices, including automated pipelines, model monitoring, and real-time inference systems
- Lead AI platform development for hosting LLMs, building secure AI applications, and enabling self-service AI capabilities across the organization
- Implement enterprise AI governance, including model security, access controls, and compliance frameworks for internal AI applications
- Collaborate with engineering leaders across backend, ML, and security to align on long-term data architecture
- Establish and enforce best practices around data governance, access controls, and data quality
- Ensure regulatory compliance with GDPR, PCI-DSS, and SOX through automated compliance monitoring and secure data pipelines
- Implement real-time data processing for fraud detection and risk management with end-to-end encryption and audit trails (see the sketch following this listing)
- Coach engineers and team leads through regular 1:1s, feedback, and performance conversations

What You Will Need
- 10+ years of engineering experience, including 2+ years managing data or infra teams with proven hands-on technical leadership
- Strong stakeholder management skills, with experience translating business requirements into data solutions and identifying product enhancement opportunities
- Strong technical background in data platforms, cloud infrastructure (preferably AWS), and distributed systems
- Experience with tools like Apache Spark, Flink, EMR, Airflow, Trino/Presto, Kafka, and Kubeflow/Ray, plus the modern stack: dbt, Databricks, Snowflake, Terraform
- Hands-on experience building AI/ML platforms, including MLOps tools, LLM hosting, model serving, and secure AI application development
- Proven experience improving performance, cost, and observability in large-scale data systems
- Expert-level cloud platform knowledge with container orchestration (Kubernetes, Docker) and Infrastructure-as-Code
- Experience with real-time streaming architectures (Kafka, Redpanda, Kinesis)
- Understanding of AI/ML frameworks (TensorFlow, PyTorch), LLM hosting platforms, and secure AI application development patterns
- Comfort working in fast-paced, product-led environments, with the ability to balance innovation and regulatory constraints
- Bonus: experience with data security and compliance (PII/PCI handling), LLM infrastructure, and fintech regulations

Life at slice

Life so good, you'd think we're kidding:
- Competitive salaries. Period.
- Extensive medical insurance that looks out for our employees and their dependents. We'll love you and take care of you, our promise.
- Flexible working hours. Just don't call us at 3 AM, we like our sleep schedule.
- Tailored vacation and leave policies so that you enjoy every important moment in your life.
- A reward system that celebrates hard work and milestones throughout the year. Expect a gift coming your way anytime you kill it here.
- Learning and upskilling opportunities. Seriously, not kidding.
- Good food, games, and a cool office to make you feel at home.
- An environment so good, you'll forget the term "colleagues can't be your friends".
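To make the real-time fraud-detection responsibility concrete, here is a minimal streaming-consumer sketch in Python using kafka-python. The topic name, broker address, and scoring rule are illustrative assumptions, not details from the listing; a production system would score events with a served model and write flags to an audited alerts topic.

```python
# Minimal real-time stream-processing sketch (assumed topic/broker names).
# pip install kafka-python
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "transactions",                      # hypothetical topic name
    bootstrap_servers="localhost:9092",  # hypothetical broker address
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="latest",
)

def looks_fraudulent(txn: dict) -> bool:
    # Placeholder rule; a real system would call a model-serving endpoint.
    return txn.get("amount", 0) > 100_000

for message in consumer:
    txn = message.value
    if looks_fraudulent(txn):
        # In production this would publish to an alerts topic with an audit trail.
        print(f"flagging transaction {txn.get('id')} for review")
```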

Posted 1 week ago

Apply

4.0 years

0 Lacs

India

Remote

Job Summary:

Job Title: Golang Developer
Location: Remote
Job Type: Contract
Duration: 1.5 months

Job Description: We are seeking a skilled Go (Golang) Developer with strong backend development experience to join our team on a short-term (1.5-month) remote contract. The ideal candidate will be responsible for building and optimizing backend services, managing MongoDB databases, and ensuring performance at scale.

Key Responsibilities:
- Design, develop, and maintain robust and scalable backend services using Go (Golang)
- Work with MongoDB to build efficient data models and handle data operations
- Optimize queries and database interactions for performance and scalability (see the sketch following this listing)
- Troubleshoot, debug, and resolve performance bottlenecks in backend systems
- Collaborate with front-end developers and other team members to integrate APIs and services
- Ensure code quality through best practices, testing, and documentation

Required Skills:
- 4+ years of hands-on experience with Go (Golang)
- Strong working knowledge of MongoDB, including data modeling and query optimization
- Experience in creating and consuming RESTful APIs and microservices
- Deep understanding of backend architecture and system performance tuning
- Ability to write clean, maintainable, and well-documented code
- Familiarity with version control systems like Git and Bitbucket
- Self-motivated and able to work independently in a remote environment

Nice to Have:
- Exposure to cloud platforms (Azure, AWS, or GCP)
- Experience with containerization tools like Docker
- Familiarity with CI/CD pipelines
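The role targets Go, but MongoDB query optimization is driver-agnostic, so here is a minimal sketch using Python's pymongo to show the idea: create an index matching the query shape, then confirm via the explain plan. The connection string, collection, and field names are assumptions for illustration; the same calls exist in the official Go driver.

```python
# Minimal MongoDB query-optimization sketch (assumed collection/field names).
# pip install pymongo
from pymongo import ASCENDING, MongoClient

client = MongoClient("mongodb://localhost:27017")  # hypothetical connection string
orders = client["shop"]["orders"]

# A compound index that matches the query's filter + sort shape.
orders.create_index([("customer_id", ASCENDING), ("created_at", ASCENDING)])

cursor = orders.find({"customer_id": 42}).sort("created_at", ASCENDING)
plan = cursor.explain()
# An IXSCAN stage here means the index is used; COLLSCAN would mean a full scan.
print(plan["queryPlanner"]["winningPlan"])
```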

Posted 1 week ago

Apply

4.0 - 7.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Intuitive Apps Inc. is one of the fastest-growing consulting companies, on a mission to provide the best digital transformation and intuitive experience for our customers.

The Role

Key Responsibilities
- Administer and maintain Atlassian JIRA, JIRA Service Management, and Confluence platforms
- Design and implement custom JIRA workflows, screens, schemes, custom fields, and dashboards
- Develop Confluence spaces, templates, macros, and knowledge management structures for teams
- Collaborate with business users to gather requirements and implement JIRA configurations to meet Agile/DevOps delivery models
- Manage user permissions, groups, and roles in JIRA and Confluence
- Perform regular system audits, clean-up, upgrades, and security patches
- Integrate JIRA with third-party tools
- Develop and maintain documentation on configuration, processes, and best practices
- Provide day-to-day support and training to users on JIRA and Confluence usage and capabilities
- Generate reports and analytics using JIRA Query Language (JQL), filters, and dashboards (see the sketch following this listing)
- Work with IT and Security teams to ensure compliance and data integrity

Required Skills and Ideal Profile
- Bachelor's degree in Computer Science, Information Technology, or a related field
- 4-7 years of experience in JIRA administration in enterprise environments
- Strong expertise in JIRA workflow configuration, automation rules (Automation for JIRA), and JIRA Service Management (JSM)
- Proficiency in Confluence administration and integration with JIRA
- Hands-on experience with scripting and automation using Groovy (ScriptRunner), REST APIs, or similar tools
- Good understanding of Agile and ITIL frameworks
- Experience in user onboarding, access control, and group management
- Strong troubleshooting and problem-solving skills
- Experience with Atlassian Marketplace plugins and licensing management

Preferred Qualifications
- Atlassian Certification (ACP-610/620/1000) is a plus
- Experience with Atlassian Cloud and Data Center migrations
- Familiarity with other Atlassian tools like Bitbucket, Bamboo, and Trello
- Basic knowledge of Linux and databases (PostgreSQL, MySQL) for backend support
- Ability to work independently and in a collaborative team environment
- Strong organizational and documentation skills
- Customer-focused and proactive in identifying and solving issues

What's on Offer?
- An opening within a company with a solid track record of success
- A role that offers a breadth of learning opportunities
- Great work culture
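As an illustration of the JQL reporting responsibility, here is a minimal sketch using the community jira Python client. The server URL, credentials, and project key are placeholders, not details from the listing.

```python
# Minimal JQL reporting sketch (assumed server URL, credentials, project key).
# pip install jira
from jira import JIRA

jira = JIRA(
    server="https://example.atlassian.net",        # hypothetical site
    basic_auth=("user@example.com", "api-token"),  # hypothetical credentials
)

# JQL: open bugs in a project, newest first.
issues = jira.search_issues(
    'project = OPS AND issuetype = Bug AND status != Done ORDER BY created DESC',
    maxResults=50,
)
for issue in issues:
    print(issue.key, issue.fields.summary)
```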

Posted 1 week ago

Apply

4.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Eazy Naukri is currently hiring for our client for the position of Meta Ads Lead Specialist: a dynamic specialist to own and optimize our lead generation engine through Meta (Facebook & Instagram) ad campaigns. If you're a passionate, motivated, and enthusiastic individual, we'd love to hear from you!

Job Title: Meta Ads Lead Specialist
Required Experience: 2+ years
Location: Sector 49, Gurgaon
Expected Joining: Immediate to 15 days
Budget: Up to 4.8 LPA

Key Responsibilities:
- Plan, launch, and optimize Meta lead generation campaigns for online degree programs
- Drive high-quality leads within CPL goals across multiple student personas
- Develop creatives (in collaboration with design) for carousel, video, and static ads
- Regularly analyze campaign performance, run A/B tests, and fine-tune targeting
- Work closely with the counseling and admissions team to qualify and refine lead quality
- Stay ahead of algorithm changes, trends, and best practices in Meta advertising

Required Skills & Qualifications:
- 2–4 years of hands-on Meta Ads experience (lead gen specifically)
- Deep understanding of Meta Business Manager, pixel, custom audiences, and lead forms
- Ability to segment audiences based on behavior, interest, and funnel stage
- Strong grasp of performance metrics like CPL, CTR, CVR, and ROAS (see the sketch following this listing)
- Familiarity with higher education, ed-tech, or similar B2C campaigns is a bonus

Interested? Share your resume at eazynaukri@gmail.com, or for any job-related query, feel free to connect on +91-9950685712.

Regards,
Eazy Naukri
https://www.linkedin.com/company/eazynaukri/
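For readers unfamiliar with the metrics named above, here is a small worked example of how CPL, CTR, CVR, and ROAS are conventionally computed. All figures are made up for illustration, not from the listing.

```python
# Standard campaign-metric formulas with made-up example numbers.
spend = 48_000.0      # ad spend (INR)
impressions = 400_000
clicks = 8_000
leads = 320
revenue = 96_000.0    # attributed revenue (INR)

cpl = spend / leads          # cost per lead
ctr = clicks / impressions   # click-through rate
cvr = leads / clicks         # click-to-lead conversion rate
roas = revenue / spend       # return on ad spend

print(f"CPL={cpl:.2f} INR, CTR={ctr:.2%}, CVR={cvr:.2%}, ROAS={roas:.2f}x")
```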

Posted 1 week ago

Apply

6.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Title: Senior Google Cloud Platform (GCP) Data Engineer
Location: Hybrid (Bengaluru, India)
Job Type: Full-Time
Experience Required: Minimum 6 years
Joining: Immediate or within 1 week

About the Company: Tech T7 Innovations is a global IT solutions provider known for delivering cutting-edge technology services to enterprises across various domains. With a team of seasoned professionals, we specialize in software development, cloud computing, data engineering, machine learning, and cybersecurity. Our focus is on leveraging the latest technologies and best practices to create scalable, reliable, and secure solutions for our clients.

Job Summary: We are seeking a highly skilled Senior GCP Data Engineer with over 6 years of experience in data engineering and extensive hands-on expertise in Google Cloud Platform (GCP). The ideal candidate must have a strong foundation in GCS, BigQuery, Apache Airflow/Composer, and Python, with a demonstrated ability to design and implement robust, scalable data pipelines in a cloud environment.

Roles and Responsibilities:
- Design, develop, and deploy scalable and secure data pipelines using Google Cloud Platform components, including GCS, BigQuery, and Airflow (see the sketch following this listing)
- Develop and manage robust ETL/ELT workflows using Python and integrate with orchestration tools such as Apache Airflow or Cloud Composer
- Collaborate with data scientists, analysts, and business stakeholders to gather requirements and deliver reliable and efficient data solutions
- Optimize BigQuery performance using best practices such as partitioning, clustering, schema design, and query tuning
- Manage, monitor, and maintain data lake and data warehouse environments with high availability and integrity
- Automate pipeline monitoring, error handling, and alerting mechanisms to ensure seamless and reliable data delivery
- Contribute to architecture decisions involving data modeling, data flow, and integration strategies in a cloud-native environment
- Ensure compliance with data governance, privacy, and security policies as per enterprise and regulatory standards
- Mentor junior engineers and drive best practices in cloud engineering and data operations

Mandatory Skills:
- Google Cloud Platform (GCP): in-depth hands-on experience with GCS, BigQuery, IAM, and Cloud Functions
- BigQuery (BQ): expertise in large-scale analytics, schema optimization, and data modeling
- Google Cloud Storage (GCS): strong understanding of data lifecycle management, access controls, and best practices
- Apache Airflow / Cloud Composer: proficiency in writing and managing complex DAGs for data orchestration
- Python programming: advanced skills in automation, API integration, and data processing using libraries like Pandas, PySpark, etc.

Preferred Qualifications:
- Experience with CI/CD pipelines for data infrastructure and workflows
- Exposure to other GCP services like Dataflow, Pub/Sub, and Cloud Functions
- Familiarity with Infrastructure as Code (IaC) tools such as Terraform
- Strong communication and analytical skills for problem-solving and stakeholder engagement
- GCP certifications (e.g., Professional Data Engineer) will be a significant advantage
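To illustrate the Airflow-plus-BigQuery pipeline pattern the role describes, here is a minimal DAG sketch. The DAG id, project, dataset, and table names are assumptions; it requires Airflow 2.x and the Google provider package (use schedule_interval instead of schedule on versions before 2.4).

```python
# Minimal Airflow DAG sketch: daily BigQuery transform (assumed object names).
# Requires apache-airflow and apache-airflow-providers-google.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_orders_rollup",   # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_orders",
        configuration={
            "query": {
                # Filtering on the partition column lets BigQuery prune partitions.
                "query": """
                    SELECT customer_id, SUM(amount) AS total
                    FROM `my-project.sales.orders`
                    WHERE order_date = '{{ ds }}'
                    GROUP BY customer_id
                """,
                "useLegacySql": False,
            }
        },
    )
```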

Posted 1 week ago

Apply

3.0 - 5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job title: AM - IPD Billing & TPA (Deduction Recovery)

To manage TPA/insurance agreements and tariff updates, and to coordinate with TPAs/insurers for payment follow-up, working with Max internal teams on day-to-day work.

Role & responsibilities
- Ensure customers' TPA outstanding recovery within defined timelines
- Achieve assigned collection targets
- Maintain the Insurance Tracker and update case status in E Prapti on a weekly basis
- Recover top-up deduction cases
- Track TPA status of outstanding cases older than 30 days (inclusive of amounts under Rs. 10,000)
- Track bill docket receiving status older than 30 days (real-time status based on TPA records)
- Share outstanding details with TPAs by the 7th of every month
- Build relationships with TPA key persons and arrange value-added services across pan-Max units
- Work with TPA networking/operations managers to identify payout delay reasons and mitigate any issues seen at the TPA/MHC end
- Provide the NEFT dump to the On Account team on a fortnightly/monthly basis
- Resolve On Account team concerns/requirements within a TAT of 72 hours
- Capture the correct status, TIN/CIN, and insurance company name in the Insurance Tracker
- Resolve post-discharge queries and update E Prapti
- Identify reasons for wrong settlement cases and arrange corrections with help from the On Account and Finance teams
- Send a weekly report on actions taken against cheque re-issue cases and cases paid by the TPA where the payout was not received
- Send a weekly TPA visit call report (format already shared)
- Settle On Account cases older than 90 days
- Cashless troubleshooting: assist unit front office/billing teams with day-to-day issues faced during patient hospitalization
- Maintain data/records of support extended to unit TPA teams for cashless troubleshooting cases
- Maintain relationships and make regular visits to pan-Max units

Preferred candidate profile
Qualifications: Graduate
Experience: 3 to 5 years, preferably with 2 years of healthcare experience

Please share your CV at deen.dayal@maxhealthcare.com

Posted 1 week ago

Apply

4.0 - 7.0 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site

Intuitive Apps Inc. is one of the fastest-growing consulting companies, on a mission to provide the best digital transformation and intuitive experience for our customers.

The Role

Key Responsibilities
- Administer and maintain Atlassian JIRA, JIRA Service Management, and Confluence platforms
- Design and implement custom JIRA workflows, screens, schemes, custom fields, and dashboards
- Develop Confluence spaces, templates, macros, and knowledge management structures for teams
- Collaborate with business users to gather requirements and implement JIRA configurations to meet Agile/DevOps delivery models
- Manage user permissions, groups, and roles in JIRA and Confluence
- Perform regular system audits, clean-up, upgrades, and security patches
- Integrate JIRA with third-party tools
- Develop and maintain documentation on configuration, processes, and best practices
- Provide day-to-day support and training to users on JIRA and Confluence usage and capabilities
- Generate reports and analytics using JIRA Query Language (JQL), filters, and dashboards
- Work with IT and Security teams to ensure compliance and data integrity

Required Skills and Ideal Profile
- Bachelor's degree in Computer Science, Information Technology, or a related field
- 4-7 years of experience in JIRA administration in enterprise environments
- Strong expertise in JIRA workflow configuration, automation rules (Automation for JIRA), and JIRA Service Management (JSM)
- Proficiency in Confluence administration and integration with JIRA
- Hands-on experience with scripting and automation using Groovy (ScriptRunner), REST APIs, or similar tools
- Good understanding of Agile and ITIL frameworks
- Experience in user onboarding, access control, and group management
- Strong troubleshooting and problem-solving skills
- Experience with Atlassian Marketplace plugins and licensing management

Preferred Qualifications
- Atlassian Certification (ACP-610/620/1000) is a plus
- Experience with Atlassian Cloud and Data Center migrations
- Familiarity with other Atlassian tools like Bitbucket, Bamboo, and Trello
- Basic knowledge of Linux and databases (PostgreSQL, MySQL) for backend support
- Ability to work independently and in a collaborative team environment
- Strong organizational and documentation skills
- Customer-focused and proactive in identifying and solving issues

What's on Offer?
- An opening within a company with a solid track record of success
- A role that offers a breadth of learning opportunities
- Great work culture

Posted 1 week ago

Apply

0.0 - 1.0 years

0 - 0 Lacs

Ballupur, Dehradun, Uttarakhand

On-site

Communication skills, cash handling, working with a card swipe machine, and good knowledge of computers. Preference will be given to candidates living in Dehradun with at least 1 year of experience in a front office or reception role. For any query, call/WhatsApp: 9897331525

Location: https://maps.app.goo.gl/rNUTYC55rLyvGeVR6
https://careddn.com
info@careddn.com

Job Types: Full-time, Permanent
Pay: ₹9,500.00 - ₹15,000.00 per month
Schedule: Fixed shift
Supplemental Pay: Overtime pay, performance bonus
Ability to commute/relocate: Ballupur, Dehradun, Uttarakhand: Reliably commute or planning to relocate before starting work (Required)
Education: Bachelor's (Preferred)
Experience: Front desk: 1 year (Required)
Work Location: In person

Posted 1 week ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title: Junior .NET Developer
Location: Chennai, IN (5 days onsite)
Shift Timing: 2 PM – 11 PM (UK shift)
Work Type: Onsite | Full-Time | Permanent
Minimum Experience: 3 years

Roles and responsibilities:
- 3+ years of experience with .NET Core, C#, and SQL Server
- Experience building Web APIs and MVC applications using ASP.NET Core
- Strong knowledge of Windows desktop applications using WinForms or WPF
- Familiarity with Entity Framework Core and writing LINQ queries
- Good understanding of SQL Server: stored procedures, indexing, and query tuning
- Experience with Git and Bitbucket version control tools
- Experience with JIRA and Confluence
- Basic knowledge of HTML, CSS, JavaScript, or Razor Pages
- Understanding of RESTful APIs, WCF SOAP services, and integration with external systems
- Familiarity with unit testing frameworks like xUnit or NUnit
- Good grasp of OOP, SOLID principles, and common design patterns
- Optional: exposure to Angular, Azure, CI/CD tools

Dexian is a leading provider of staffing, IT, and workforce solutions with over 12,000 employees and 70 locations worldwide. As one of the largest IT staffing companies and the 2nd largest minority-owned staffing company in the U.S., Dexian was formed in 2023 through the merger of DISYS and Signature Consultants. Combining the best elements of its core companies, Dexian's platform connects talent, technology, and organizations to produce game-changing results that help everyone achieve their ambitions and goals. Dexian's brands include Dexian DISYS, Dexian Signature Consultants, Dexian Government Solutions, Dexian Talent Development and Dexian IT Solutions. Visit https://dexian.com/ to learn more.

Dexian is an Equal Opportunity Employer that recruits and hires qualified candidates without regard to race, religion, sex, sexual orientation, gender identity, age, national origin, ancestry, citizenship, disability, or veteran status.

Posted 1 week ago

Apply

5.0 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site

Job Title: Senior Software Developer

As a Senior Software Developer (.NET), you will play a critical role in designing, developing, and deploying scalable, secure enterprise applications aligned with Inevia's business objectives. This role ensures high performance, maintainability, and reliability of applications built using .NET technologies, microservices, and SQL Server. You will work closely with cross-functional teams, including product owners and stakeholders, to deliver innovative solutions while mentoring team members in a collaborative Agile environment. The candidate must be detail-oriented, organized, and capable of managing multiple priorities in a fast-paced environment.

Responsibilities:

Application Design & Development:
- Design, develop, and maintain robust web applications using C#, .NET Core, and .NET 6/7/8
- Develop reusable components, services, and libraries following clean coding practices
- Build, consume, and secure RESTful APIs and microservices
- Integrate with Angular-based frontend applications for seamless backend-frontend communication
- Ensure adherence to architecture principles and coding standards

System Optimization, Monitoring, and Quality:
- Perform application performance tuning and optimization
- Conduct unit and integration testing, and participate in code reviews
- Ensure high availability, scalability, and reliability of applications
- Implement robust logging and monitoring mechanisms
- Maintain observability and troubleshooting capabilities across environments

Database and Integration:
- Write optimized SQL queries, stored procedures, and functions in SQL Server
- Collaborate on schema design and query performance tuning
- Use ORM tools like Entity Framework Core and Dapper for data access

CI/CD and DevOps:
- Participate in Agile ceremonies and sprint activities
- Support CI/CD pipeline setup using Azure DevOps
- Participate in containerization using Docker and deployment on cloud platforms
- Manage source code repositories and branching strategies

Troubleshooting and Support:
- Investigate and resolve issues across development, staging, and production environments
- Analyze logs and telemetry data to identify root causes and implement fixes

Collaboration and Communication:
- Collaborate with development teams and stakeholders to gather and clarify requirements
- Mentor developers by providing guidance and technical support

Qualifications:

Education and Experience:
- Bachelor's degree in Computer Science, IT, Engineering, or a related field
- 5+ years of professional experience in .NET development
- Proven experience in building enterprise-grade web applications and APIs

Knowledge and Skills:
- Expertise in C#, .NET Core, .NET 6/7/8
- Strong knowledge of microservices architecture, RESTful APIs, asynchronous programming, and authentication mechanisms (JWT, OAuth2)
- Hands-on experience with SQL Server and complex query writing
- Familiarity with Entity Framework Core, LINQ, and clean architecture principles
- Experience with version control systems such as Azure DevOps and Git
- Knowledge of cloud technologies, preferably Azure
- Exposure to unit testing and test-driven development (TDD)
- Knowledge of Angular frontend is a plus

Benefits:
- Opportunity to work on scalable enterprise applications and backend architecture
- Room for professional growth and learning
- Competitive compensation package

Additional Information: This is a full-time position located in Navi Mumbai.
Inevia is an equal opportunity employer and encourages applications from candidates of all backgrounds and experiences.

Posted 1 week ago

Apply

0 years

0 Lacs

Ajmer, Rajasthan, India

On-site

Roles and Responsibilities:
- Strengthen operational delivery to maximize agency business/customer acquisition
- Deliver growth through revenue retention and generation initiatives
- Collaborate with the Branch Operations team to generate customer leads
- Achieve targets by collaborating with the Branch Operations team
- Drive new initiatives through the existing customer base and new untapped markets to bring in new sales
- Leverage vectors to achieve targets
- Manage customer parameters (persistency) for sales done
- Manage product mix as agreed from time to time
- Track competition on products, structure, and initiatives
- Compliance: ensure and function as per the guidelines laid down by the Compliance team
- Build sustainable relationships and trust with existing customers through open and interactive communication
- Determine clients' particular needs and financial situations by scheduling fact-finding appointments and determining the extent of present coverage and investments
- Ensure segmented product-based campaigns
- Follow communication procedures, guidelines, and policies
- Keep records of field sales calls and home visits
- Use appropriate solutions and up-selling methods
- Perform follow-ups to ensure customer satisfaction and query resolution; take references
- Provide accurate, valid, and complete information by using the right methods/tools

Posted 1 week ago

Apply

6.0 years

0 Lacs

India

On-site

We are seeking a skilled and proactive Platform Lead with strong Snowflake expertise and AWS cloud exposure to lead the implementation and operational excellence of a scalable, multitenant modern data platform for a leading US-based marketing agency serving nonprofit clients. This role requires hands-on experience in managing Snowflake environments, supporting data pipeline orchestration, enforcing platform-level standards, and ensuring observability, performance, and security across environments. You will collaborate with architects, engineers, and DevOps teams to operationalize the platform's design and drive its long-term stability and scalability in a cloud-native ecosystem.

Job Specific Duties & Responsibilities:
- Lead the technical implementation and stability of the multitenant Snowflake data platform across dev, QA, and prod environments
- Design and manage schema isolation, role-based access control (RBAC), masking policies, and cost-optimized Snowflake architecture for multiple nonprofit tenants (see the sketch following this listing)
- Implement and maintain CI/CD pipelines for dbt, Snowflake objects, and metadata-driven ingestion processes using GitHub Actions or similar tools
- Develop and maintain automation accelerators for data ingestion, schema validation, error handling, and onboarding new clients at scale
- Collaborate with architects and data engineers to ensure seamless integration with source CRMs, ByteSpree connectors, and downstream BI/reporting layers
- Monitor and optimize performance of Snowflake workloads (e.g., query tuning, warehouse sizing, caching strategy) to ensure reliability and scalability
- Establish and maintain observability and monitoring practices across data pipelines, ingestion jobs, and platform components (e.g., error tracking, data freshness, job status dashboards)
- Manage infrastructure-as-code (IaC), configuration templates, and version control practices across the data stack
- Ensure robust data validation, quality checks, and observability mechanisms are in place across all platform services
- Support incident response, pipeline failures, and technical escalations in production, coordinating across engineering and client teams
- Contribute to data governance compliance by implementing platform-level policies for PII, lineage tracking, and tenant-specific metadata tagging

Required Skills, Experience & Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related technical field
- 6+ years of experience in data engineering or platform delivery, including 3+ years of hands-on Snowflake experience in production environments
- Proven expertise in building and managing multitenant data platforms, including schema isolation, RBAC, and masking policies
- Solid knowledge of CI/CD practices for data projects, with experience guiding pipeline implementations using tools like GitHub Actions
- Hands-on experience with dbt, SQL, and metadata-driven pipeline design for large-scale ingestion and transformation workloads
- Strong understanding of AWS cloud services relevant to data platforms (e.g., S3, IAM, Lambda, CloudWatch, Secrets Manager)
- Experience optimizing Snowflake performance, including warehouse sizing, caching, and cost control strategies
- Familiarity with setting up observability frameworks, monitoring tools, and data quality checks across complex pipeline ecosystems
- Proficiency in infrastructure-as-code (IaC) concepts and managing configuration/versioning across environments
- Awareness of data governance principles, including lineage, PII handling, and tenant-specific metadata tagging
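To make the tenant-isolation and masking responsibilities concrete, here is a minimal sketch that issues standard Snowflake DDL through the snowflake-connector-python driver. The account, role, schema, and table names are assumptions for illustration.

```python
# Minimal Snowflake RBAC + masking-policy sketch (assumed account/role/object
# names). pip install snowflake-connector-python
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-myaccount",  # hypothetical account identifier
    user="platform_admin",
    password="...",             # use a secrets manager in practice
)
cur = conn.cursor()

# Tenant isolation: one schema per tenant, readable only by that tenant's role.
cur.execute("CREATE ROLE IF NOT EXISTS TENANT_A_READER")
cur.execute("GRANT USAGE ON SCHEMA ANALYTICS.TENANT_A TO ROLE TENANT_A_READER")
cur.execute("GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.TENANT_A TO ROLE TENANT_A_READER")

# Column masking: only a privileged role sees raw email addresses.
cur.execute("""
    CREATE OR REPLACE MASKING POLICY ANALYTICS.TENANT_A.EMAIL_MASK
    AS (val STRING) RETURNS STRING ->
    CASE WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val ELSE '***MASKED***' END
""")
cur.execute("""
    ALTER TABLE ANALYTICS.TENANT_A.DONORS
    MODIFY COLUMN EMAIL SET MASKING POLICY ANALYTICS.TENANT_A.EMAIL_MASK
""")
```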

Posted 1 week ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

Remote

JOB PURPOSE

As an HRIS Analyst, you will provide system support, configuration, troubleshooting, reporting, project coordination, and integrations with other systems for UKG Workforce Management to support our global HR function. Systems outside of UKG Workforce Management may include SAP, SuccessFactors, TeamSense, bSwift, and Jira. You will actively identify process improvements, document business and technical requirements, and ensure delivery to specifications, working closely with our systems support vendors and collaborating across Sloan's HR, IT, Operations, and Payroll departments.

JOB DUTIES AND RESPONSIBILITIES
- Serve as the primary HRIS contact for issue tracking, system updates, and end-user support for UKG Workforce Management
- UKG Pro Workforce Management (Kronos), techno-functional: administer and support UKG Pro WFM, including time clocks, timekeeping, scheduling, and accruals
- Troubleshoot time clock issues and maintain time and attendance configurations
- Design and implement complex accrual rules, shift differentials, and attendance point systems
- Develop and maintain attendance policies, warning templates, and user documentation
- Ensure system data integrity through regular audits, testing, and updates
- Partner with IT and payroll to ensure data flow and compliance with labor laws
- Act as liaison with customer support and/or consultants for HRIS technology cases
- Create and maintain Standard Operating Procedures (SOPs), training documents, and workflows
- Draft and execute detailed test scenarios for system changes and upgrades
- Support HRIS projects, including new module rollouts, upgrades, and integrations
- Develop timelines, monitor progress, and ensure timely delivery of quality solutions
- Identify and escalate issues, track risks, and ensure follow-through on corrective actions
- Deliver training sessions to staff, managers, and end-users on new system features and best practices
- Other duties and responsibilities as required

REQUIRED QUALIFICATIONS
- Bachelor's degree in Human Resources, Information Technology, Business Administration, or a related field
- 3+ years working as an HRIS Analyst with functional and technical experience in UKG Pro Workforce Management (Kronos)
- Ability to use discretion when working with confidential information
- Actively seeks information to understand customers' circumstances, problems, expectations, and needs
- Advanced experience working with the Microsoft Office Suite (Word, Excel, PowerPoint, etc.)
- Excellent written and verbal communication skills in English
- Experience supporting U.S.-based teams and navigating time zone overlap requirements
- Experience working independently in a global, remote HR or shared services environment
- Familiarity with U.S. labor law compliance in timekeeping systems (e.g., FLSA, California overtime rules)
- Must be available during core U.S. working hours (full or partial overlap as agreed)
- Strong attention to detail, documentation, and stakeholder management
- Strong reporting skills (Excel, UKG reports); familiarity with query tools or SQL is a plus

PREFERRED QUALIFICATIONS
- UKG Ready New Administrator Training, UKG Pro Workforce Management Training, and Kronos Workforce Dimensions Training

US Shift: 7:00 PM IST to 3:30 AM IST

Posted 1 week ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

We are looking for a skilled and hands-on AI Engineer to design, develop, and deploy an in-house AI assistant powered by LLaMA 3 and integrated with our MS SQL-based ERP system (4QT ERP). This role includes responsibility for setting up LLM infrastructure, voice input (Whisper), natural language to SQL translation, and delivering accurate, context-aware responses to ERP-related queries.

Key Responsibilities:
- Set up and deploy LLaMA 3 (8B/FP16) models using llama-cpp-python or Hugging Face
- Integrate the AI model with FastAPI to create secure REST endpoints (see the sketch following this listing)
- Connect with the MS SQL database and design query logic for ERP modules (Sales, Payments, Units, etc.)
- Implement prompt engineering or fine-tuning (LoRA) to improve SQL generation accuracy
- Build a user-facing interface (React or a basic web UI) for interacting via text or voice
- Integrate Whisper (OpenAI) or another STT system to support voice commands
- Ensure model responses are secure, efficient, and auditable (only SELECT queries allowed)
- Supervise or perform supervised fine-tuning with custom ERP datasets
- Optimize for performance (GPU usage) and accuracy (prompt/RAG tuning)

Must-Have Skills:
- Strong experience with LLM deployment (LLaMA 3, Mistral, GPT-type models)
- Solid Python development experience using FastAPI or Flask
- SQL knowledge (especially MS SQL Server); must know how to write and validate queries
- Experience with llama-cpp-python, Hugging Face Transformers, and LoRA fine-tuning
- Familiarity with LangChain or similar LLM frameworks
- Understanding of Whisper (STT) or equivalent speech-to-text tools
- Experience working with GPU inference (NVIDIA 4070/5090, etc.)
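Here is a minimal sketch of the NL-to-SQL architecture the listing describes: llama-cpp-python for inference, FastAPI for the endpoint, pyodbc for MS SQL, and a SELECT-only guard. The model path, connection string, prompt format, and endpoint name are assumptions; a production system would need far stronger SQL validation and auditing than this prefix check.

```python
# Minimal NL-to-SQL endpoint sketch (assumed model path, DSN, prompt format).
# pip install fastapi uvicorn llama-cpp-python pyodbc
import pyodbc
from fastapi import FastAPI, HTTPException
from llama_cpp import Llama

app = FastAPI()
llm = Llama(model_path="models/llama-3-8b-instruct.Q8_0.gguf")  # hypothetical file

DSN = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=erp-db;DATABASE=ERP;Trusted_Connection=yes"

@app.post("/ask")
def ask(question: str):
    prompt = (
        "Translate the question into a single T-SQL SELECT statement.\n"
        f"Question: {question}\nSQL:"
    )
    out = llm(prompt, max_tokens=256, stop=[";"])
    sql = out["choices"][0]["text"].strip()

    # Guardrail from the listing: only read-only SELECT queries are executed.
    if not sql.lower().startswith("select"):
        raise HTTPException(status_code=400, detail="Only SELECT queries are allowed")

    with pyodbc.connect(DSN) as conn:
        rows = conn.cursor().execute(sql).fetchall()
    return {"sql": sql, "rows": [list(r) for r in rows]}
```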

Posted 1 week ago

Apply

8.0 years

0 Lacs

Lucknow, Uttar Pradesh, India

On-site

Introduction

Db2 is a world-class relational database! 100% of the Fortune 100 companies and more than 80% of the Fortune 500 have one or more members of the Db2 family installed, helping to run the business. IBM is continuing to modernize Db2 to be cloud-native, bringing new features and capabilities while delivering mission-critical features that the world depends on. Db2 is supported across several hyperscalers such as IBM Cloud, AWS, and Azure, as well as in a number of deployment models, including self-managed and fully managed SaaS, along with tight integration with cloud-native services. The Db2 engine specifically is blazingly fast and is written in C/C++ with deep OS and IO subsystem integrations. It powers low-latency transactions and real-time analytics at scale for the world's most complex workloads. Seeking new possibilities and always staying curious, we are a team dedicated to creating the world's leading AI-powered, cloud-native software solutions for our customers. Our renowned, legendary solutions create endless global opportunities for our IBMers, so the door is always open for those who want to grow their career.

Your Role And Responsibilities

As a key member of our dynamic team, you will play a vital role in crafting exceptional software experiences. Your responsibilities will encompass the design and implementation of innovative features, fine-tuning and sustaining existing code for optimal performance, and guaranteeing top-notch quality through rigorous testing and debugging. Collaboration is at the heart of what we do, and you'll be working closely with fellow developers, designers, and product managers to ensure our software aligns seamlessly with user expectations. The role calls for good personal organisation and the ability to work well within a distributed global team in a fast-paced and exciting environment. You will be office based, working with senior software engineers who will help you integrate into the team, the department, and wider IBM. You will be joining a development squad following Design Thinking and Agile principles, where you are expected to collaboratively develop creative solutions. The work can be varied; flexibility to learn new technologies and skills is key as we look to help grow your career within IBM. A positive attitude and a passion to succeed are essential for joining a high-performing software development team at IBM.

Preferred Education: Master's Degree

Required Technical And Professional Expertise
- A minimum of 8+ years of experience in software development
- A minimum of 6+ years of experience in Golang, Python, or C/C++ and API development
- A minimum of 1 year of experience programming in Python
- Experience with operating system concepts (serialization, concurrency, multi-threading) and data structures (arrays, pointers, hash buckets)
- Experience with SQL databases (Db2, Oracle, SQL Server, PostgreSQL, MySQL, etc.)
- Experience with software development best practices, including coding standards, code reviews, source control management, build processes, and testing
- Demonstrated communication, teamwork, and problem-solving skills
- Experience with cloud-based technologies, showcasing familiarity with modern cloud ecosystems and tools (AWS/Azure/IBM Cloud)
- 5+ years of experience with cloud/container skills: familiarity with cloud and container technologies, including Docker, Kubernetes, Red Hat OpenShift, etc.

Preferred Technical And Professional Experience
- Knowledge of and/or experience with database design and query optimization
- Knowledge of and/or experience with optimization problems and the algorithms to solve them, such as dynamic programming (see the sketch following this listing)
- Knowledge of serverless and stateless computing services like Lambda or Code Engine
- Experience using Linux operating systems
- Security domain expertise
- Knowledge of version control systems such as GitHub
- Demonstrated analytical and problem-solving skills
- Familiarity with distributed filesystems and data storage techniques
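Since the preferred qualifications call out dynamic programming, here is a small illustrative example: the classic 0/1 knapsack solved bottom-up. It is a textbook sketch, not anything Db2-specific.

```python
# Classic 0/1 knapsack via bottom-up dynamic programming (illustrative only).
def knapsack(values: list[int], weights: list[int], capacity: int) -> int:
    # best[w] = max value achievable with total weight <= w
    best = [0] * (capacity + 1)
    for value, weight in zip(values, weights):
        # Iterate weights downward so each item is used at most once.
        for w in range(capacity, weight - 1, -1):
            best[w] = max(best[w], best[w - weight] + value)
    return best[capacity]

# Items 2 and 3 (value 100 + 120, weight 20 + 30) fit exactly in capacity 50.
assert knapsack(values=[60, 100, 120], weights=[10, 20, 30], capacity=50) == 220
```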

Posted 1 week ago

Apply

0.0 - 4.0 years

0 - 0 Lacs

Delhi, Delhi

On-site

BNC has been mandated to recruit an experienced Accounts & Taxation Executive (non-CA, not pursuing CA) to join a CA firm based in West Delhi. The ideal candidate must have prior experience working in a CA firm and should be well-versed in day-to-day accounting and compliance work.

Job Title: Accounts & Taxation Executive (Non-CA)
Location: West Delhi (nearby candidates preferred)
Firm Type: Chartered Accountancy Firm
Salary: Up to ₹35,000 per month (as per CA firm norms)

Key Responsibilities:
- Preparation and filing of Income Tax Returns (ITR)
- Handling tax audits and statutory audits
- Working knowledge of TDS filing and compliance
- Experience in ROC compliances and MCA-related filings
- Preparing and filing GST returns, reconciliations, and handling notices
- Finalization of balance sheets and profit & loss statements
- Managing Digital Signature Certificates (DSC): procurement, renewal, usage
- Coordination with clients for data collection and query resolution

Candidate Requirements:
- Non-CA candidate with a minimum of 2–4 years of experience in a CA firm (not a CA / not pursuing CA)
- Strong knowledge of taxation laws, ROC, and GST compliance
- Proficiency in accounting software like Tally, MS Excel, and utility tools
- Familiarity with the Income Tax Portal, GST Portal, MCA Portal, and TDS Portal
- Good communication skills and the ability to handle work independently
- Candidates residing in West Delhi or nearby areas preferred

If interested, please share your resume at info@bncglobal.in

Job Types: Full-time, Permanent
Pay: ₹30,000.00 - ₹35,000.00 per month

Application Questions:
- Are you a non-CA with a minimum of 2–4 years of experience in a CA firm?
- Do you have strong knowledge of taxation laws, ROC, and GST compliance, and proficiency in accounting software like Tally, MS Excel, and utility tools?
- Are you familiar with the Income Tax Portal, GST Portal, MCA Portal, and TDS Portal, and do you have good communication skills and the ability to handle work independently?

Work Location: In person

Posted 1 week ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

Remote

JOB PURPOSE

As a Sr. HRIS Analyst, you are responsible for implementing, maintaining, and optimizing the SuccessFactors platform at Sloan, which includes modules such as Employee Central (core HR), Recruiting, Onboarding, Performance & Goals, Learning, Compensation, Succession & Development, and Analytics. The role involves coordinating data and system integrations between SuccessFactors and other platforms, including ADP, Bswift, SAP, TeamSense, Jira, and UKG Pro Workforce Management. You will identify potential system and process enhancements, document both business and technical requirements, and ensure delivery according to specifications while working with system support vendors and collaborating with Sloan's HR, IT, Operations, and Payroll departments.

JOB DUTIES AND RESPONSIBILITIES
- Serve as the primary HRIS contact for SuccessFactors, including configuration, troubleshooting issues, ensuring the data in the system is compliant with HR processes and laws, reporting, and end-user support
- SuccessFactors, techno-functional: provide Tier 1 and Tier 2 technical support for SuccessFactors modules (Employee Central (core HR), Recruiting, Onboarding, Performance & Goals, Learning, Compensation, Succession & Development, and Analytics)
- Optimize system functionality and processes
- Maintain employee data accuracy and system compliance with HR and legal standards
- Assist with imports, data loads, and system integrations with third-party tools
- Conduct user testing, implement enhancements, and support configuration needs
- Generate cyclical and ad hoc reports; create dashboards and analytics
- Act as liaison with customer support and/or consultants for HRIS technology cases
- Create and maintain Standard Operating Procedures (SOPs), training documents, and workflows
- Draft and execute detailed test scenarios for system changes and upgrades
- Lead HRIS projects, including new module rollouts, upgrades, and integrations
- Develop timelines, monitor progress, and ensure timely delivery of quality solutions
- Identify and escalate issues, track risks, and ensure follow-through on corrective actions
- Deliver training sessions to staff, managers, and end-users on new system features and best practices
- Other duties and responsibilities as required

REQUIRED QUALIFICATIONS
- Bachelor's degree in Human Resources, Information Technology, Business Administration, or another related field
- 5+ years working as a techno-functional systems analyst in SuccessFactors Employee Central (core HR), Recruiting, Onboarding, Performance & Goals, Learning, Compensation, Succession & Development, and Analytics modules
- SAP Certified Associate: Employee Central, Recruiting, Onboarding, Performance & Goals, Succession & Development, Learning, and Compensation
- Ability to use discretion when working with confidential information
- Actively seeks information to understand customers' circumstances, problems, expectations, and needs
- Advanced experience working with the Microsoft Office Suite (Word, Excel, PowerPoint, etc.)
- Excellent written and verbal communication skills in English
- Experience supporting U.S.-based teams and navigating time zone overlap requirements
- Experience working independently in a global, remote HR or shared services environment
- Familiarity with US laws relating to Human Resource processes and operations
- Must be available during core U.S. working hours (full or partial overlap as agreed)
- Strong attention to detail, documentation, and stakeholder management
- Strong reporting skills; familiarity with query tools or SQL is a plus

PREFERRED QUALIFICATIONS
- SuccessFactors Expert (SFX) Accreditation

US Shift: 7:00 PM IST to 3:30 AM IST

Posted 1 week ago

Apply

5.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Description

GPP Database Link: https://cummins365.sharepoint.com/sites/CS38534/

Leads projects for design, development, and maintenance of a data and analytics platform. Effectively and efficiently processes, stores, and makes data available to analysts and other consumers. Works with key business stakeholders, IT experts, and subject-matter experts to plan, design, and deliver optimal analytics and data science solutions. Works on one or many product teams at a time.

Key Responsibilities
- Designs and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured)
- Designs and implements a framework to continuously monitor and troubleshoot data quality and data integrity issues
- Implements data governance processes and methods for managing metadata, access, and retention for internal and external users
- Designs and provides guidance on building reliable, efficient, scalable, and quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages (see the sketch following this listing)
- Designs and implements physical data models to define the database structure; optimizes database performance through efficient indexing and table relationships
- Participates in optimizing, testing, and troubleshooting of data pipelines
- Designs, develops, and operates large-scale data storage and processing solutions using different distributed and cloud-based platforms for storing data (e.g., Data Lakes, Hadoop, HBase, Cassandra, MongoDB, Accumulo, DynamoDB, and others)
- Uses innovative and modern tools, techniques, and architectures to partially or completely automate the most common, repeatable, and tedious data preparation and integration tasks in order to minimize manual and error-prone processes and improve productivity
- Assists with renovating the data management infrastructure to drive automation in data integration and management
- Ensures the timeliness and success of critical analytics initiatives by using agile development technologies such as DevOps, Scrum, and Kanban
- Coaches and develops less experienced team members

Responsibilities

Competencies:
- System Requirements Engineering: uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation, and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule, and resources; creates and maintains information linkages to related artifacts
- Collaborates: building partnerships and working collaboratively with others to meet shared objectives
- Communicates effectively: developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences
- Customer focus: building strong customer relationships and delivering customer-centric solutions
- Decision quality: making good and timely decisions that keep the organization moving forward
- Data Extraction: performs data extract-transform-load (ETL) activities from a variety of sources and transforms them for consumption by various downstream applications and users using appropriate tools and technologies
- Programming: creates, writes, and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance, and compliance requirements
- Quality Assurance Metrics: applies the science of measurement to assess whether a solution meets its intended outcomes using the IT Operating Model (ITOM), including the SDLC standards, tools, metrics, and key performance indicators, to deliver a quality product
- Solution Documentation: documents information and solutions based on knowledge gained as part of product development activities; communicates to stakeholders with the goal of enabling improved productivity and effective knowledge transfer to others who were not originally part of the initial learning
- Solution Validation Testing: validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools, and metrics, to ensure that it works as designed and meets customer requirements
- Data Quality: identifies, understands, and corrects flaws in data that support effective information governance across operational business processes and decision making
- Problem Solving: solves problems and may mentor others on effective problem solving by using a systematic analysis process, leveraging industry-standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies the systemic root causes and ensures actions to prevent problem reoccurrence are implemented
- Values differences: recognizing the value that different perspectives and cultures bring to an organization

Education, Licenses, Certifications
College, university, or equivalent degree in a relevant technical discipline, or relevant equivalent experience required. This position may require licensing for compliance with export controls or sanctions regulations.

Experience
Intermediate experience in a relevant discipline area is required. Knowledge of the latest technologies and trends in data engineering is highly preferred and includes:
- 5-8 years of experience
- Familiarity analyzing complex business systems, industry requirements, and/or data regulations
- Background in processing and managing large data sets
- Design and development for a big data platform using open source and third-party tools
- Spark, Scala/Java, MapReduce, Hive, HBase, and Kafka, or equivalent college coursework
- SQL query language
- Clustered compute cloud-based implementation experience
- Experience developing applications requiring large file movement for a cloud-based environment, and other data extraction tools and methods from a variety of sources
- Experience in building analytical solutions

Intermediate Experience In The Following Is Preferred
- Experience with IoT technology
- Experience in Agile software development

Qualifications
- Work closely with the business Product Owner to understand the product vision
- Play a key role across DBU Data & Analytics Power Cells to define and develop data pipelines for efficient data transport into the Cummins Digital Core (Azure Data Lake, Snowflake)
- Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment with DBU project data pipeline design standards
- Independently design, develop, test, and implement complex data pipelines from transactional systems (ERP, CRM) to data warehouses and the data lake
- Responsible for the creation, maintenance, and management of DBU Data & Analytics data engineering documentation and standard operating procedures (SOPs)
- Take part in evaluations of new data tools and POCs, and provide suggestions
- Take full ownership of the developed data pipelines, providing ongoing support for enhancements and performance optimization
- Proactively address and resolve issues that compromise data accuracy and usability

Preferred Skills
- Programming languages: proficiency in languages such as Python, Java, and/or Scala
- Database management: expertise in SQL and NoSQL databases
- Big data technologies: experience with Hadoop, Spark, Kafka, and other big data frameworks
- Cloud services: experience with Azure, Databricks, and AWS cloud platforms
- ETL processes: strong understanding of Extract, Transform, Load (ETL) processes
- Data replication: working knowledge of replication technologies like Qlik Replicate is a plus
- API: working knowledge of APIs to consume data from ERP and CRM systems
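To illustrate the pipeline pattern described above, here is a minimal PySpark extract-transform-load sketch. The file paths, column names, and bucket layout are assumptions for illustration, not details from the listing.

```python
# Minimal PySpark ETL sketch (assumed file paths and column names).
# pip install pyspark
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read a raw relational export (hypothetical path and schema).
orders = spark.read.option("header", True).csv("s3a://raw-zone/orders/*.csv")

# Transform: basic data-quality filter plus a daily aggregate.
clean = orders.filter(F.col("amount").cast("double").isNotNull())
daily = clean.groupBy("order_date").agg(
    F.sum(F.col("amount").cast("double")).alias("total_amount")
)

# Load: write partitioned Parquet to the curated zone.
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3a://curated-zone/daily_orders/"
)

spark.stop()
```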

Posted 1 week ago

Apply

4.0 - 5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Description
GPP Database Link (https://cummins365.sharepoint.com/sites/CS38534/)

Supports, develops and maintains a data and analytics platform. Effectively and efficiently processes, stores and makes data available to analysts and other consumers. Works with business and IT teams to understand requirements and to best leverage the technologies for agile data delivery at scale.

Key Responsibilities
- Implements and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured).
- Implements methods to continuously monitor and troubleshoot data quality and data integrity issues.
- Implements data governance processes and methods for managing metadata, access, and retention for internal and external users.
- Develops reliable, efficient, scalable, quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages (a minimal sketch follows this listing).
- Develops physical data models and implements data storage architectures per design guidelines.
- Analyzes complex data elements and systems, data flows, dependencies, and relationships to contribute to conceptual, physical, and logical data models.
- Participates in testing and troubleshooting of data pipelines.
- Develops and operates large-scale data storage and processing solutions using distributed and cloud-based platforms (e.g. data lakes, Hadoop, HBase, Cassandra, MongoDB, Accumulo, DynamoDB).
- Uses agile development practices, such as DevOps, Scrum, Kanban and continuous improvement cycles, for data-driven applications.

Competencies
- System Requirements Engineering: Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule, and resources; creates and maintains information linkages to related artifacts.
- Collaborates: Builds partnerships and works collaboratively with others to meet shared objectives.
- Communicates effectively: Develops and delivers multi-mode communications that convey a clear understanding of the unique needs of different audiences.
- Customer focus: Builds strong customer relationships and delivers customer-centric solutions.
- Decision quality: Makes good and timely decisions that keep the organization moving forward.
- Data Extraction: Performs extract-transform-load (ETL) activities from a variety of sources and transforms the data for consumption by downstream applications and users, using appropriate tools and technologies.
- Programming: Creates, writes and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements.
- Quality Assurance Metrics: Applies the science of measurement to assess whether a solution meets its intended outcomes, using the IT Operating Model (ITOM), including SDLC standards, tools, metrics and key performance indicators, to deliver a quality product.
- Solution Documentation: Documents information and solutions based on knowledge gained during product development; communicates to stakeholders to enable improved productivity and effective knowledge transfer to others who were not part of the initial learning.
- Solution Validation Testing: Validates a configuration item change or solution using the function's defined best practices, including Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure it works as designed and meets customer requirements.
- Data Quality: Identifies, understands and corrects flaws in data to support effective information governance across operational business processes and decision making.
- Problem Solving: Solves problems, and may mentor others on effective problem solving, using a systematic analysis process built on industry-standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies systemic root causes and ensures actions to prevent recurrence are implemented.
- Values differences: Recognizes the value that different perspectives and cultures bring to an organization.

Education, Licenses, Certifications
College, university, or equivalent degree in a relevant technical discipline, or relevant equivalent experience, required. This position may require licensing for compliance with export controls or sanctions regulations.

Experience
4-5 years of experience. Relevant experience preferred, such as temporary student employment, internships, co-ops, or other extracurricular team activities. Knowledge of the latest data engineering technologies is highly preferred, including:
- Exposure to open-source big data technologies (Spark, Scala/Java, MapReduce, Hive, HBase, Kafka) or equivalent college coursework
- SQL query language
- Clustered-compute, cloud-based implementation experience
- Familiarity developing applications requiring large file movement in a cloud-based environment
- Exposure to agile software development
- Exposure to building analytical solutions
- Exposure to IoT technology

Qualifications
- Work closely with the business Product Owner to understand the product vision.
- Participate in DBU Data & Analytics Power Cells to define and develop data pipelines for efficient data transport into the Cummins Digital Core (Azure Data Lake, Snowflake).
- Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment with DBU project data pipeline design standards.
- Work under limited supervision to design, develop, test, and implement complex data pipelines from transactional systems (ERP, CRM) to data warehouses and data lakes.
- Create DBU Data & Analytics data engineering documentation and standard operating procedures (SOPs) with guidance from senior data engineers.
- Take part in evaluations of new data tools and POCs with guidance from senior data engineers.
- Take ownership of developed data pipelines, providing ongoing support for enhancements and performance optimization under limited supervision.
- Assist in resolving issues that compromise data accuracy and usability.
- Programming languages: proficiency in Python, Java, and/or Scala.
- Database management: intermediate expertise in SQL and NoSQL databases.
- Big data technologies: experience with Hadoop, Spark, Kafka, and other big data frameworks.
- Cloud services: experience with Azure, Databricks and AWS cloud platforms.
- ETL processes: strong understanding of Extract, Transform, Load (ETL) processes.
- APIs: working knowledge of APIs to consume data from ERP and CRM systems.
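The pipeline work this listing describes is easiest to picture in code. Below is a minimal, hypothetical Python sketch of one ETL step with a data-quality gate and an alert hook, of the kind the responsibilities mention; the table names, connection URLs, and checks are illustrative assumptions, not Cummins systems.

```python
"""Minimal ETL sketch: extract, quality-gate, load, alert on failure.
All source/target names are hypothetical."""
import logging

import pandas as pd
from sqlalchemy import create_engine

log = logging.getLogger("pipeline")

def extract(engine) -> pd.DataFrame:
    # Pull today's rows from a hypothetical transactional source table.
    return pd.read_sql(
        "SELECT * FROM erp.invoices WHERE load_date = CURRENT_DATE", engine
    )

def quality_check(df: pd.DataFrame) -> list[str]:
    # Basic integrity checks: null keys and duplicate business keys.
    issues = []
    if df["invoice_id"].isna().any():
        issues.append("null invoice_id values")
    if df["invoice_id"].duplicated().any():
        issues.append("duplicate invoice_id values")
    return issues

def run(source_url: str, target_url: str) -> None:
    src, tgt = create_engine(source_url), create_engine(target_url)
    df = extract(src)
    issues = quality_check(df)
    if issues:
        # A real pipeline would page an on-call channel here.
        log.error("data-quality check failed: %s", "; ".join(issues))
        raise ValueError(issues)
    df.to_sql("invoices", tgt, schema="staging", if_exists="append", index=False)
    log.info("loaded %d rows", len(df))
```

In practice the same structure would run inside an orchestrator such as Airflow, with the quality gate as its own task so failures stop the load before bad data lands.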

Posted 1 week ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Job Description: Automation Tester

As a Test Lead / Architect, you are passionate about seeing customers succeed and ensuring best-in-class product quality. The objective of the role is end-to-end ownership of testing of the product, acting as the custodian of product quality. With the Digital Engineering team, you will join a fast-growing team embarking on a multi-year implementation as part of an ongoing digital modernization effort. As the project team ramps up, you will help define and shape the vision of how the solution is maintained and monitored to meet the business' needs.

Experience Level: 5+ years

Roles and Responsibilities:
- Experience with telecom industry applications is a MUST.
- Experience in testing CRM, web applications, billing, and order management.
- Deep experience with system-level debugging (including customer issues) and a good understanding of managing and triaging production-level issues.
- Experience with a database query language such as SQL, to query and validate production data for analysis; exposure to database/ETL testing.
- Understanding of the Microsoft Azure cloud environment.
- Responsible for maintaining QE health through facilitated defect management using standardized triage, defect management, and communication processes.
- Monitor trends and improvement opportunities using mean time to resolution, defect RCAs, and environment downtime as key performance indicators.
- Conduct daily defect reviews with key program stakeholders, including dev, test, product, and support teams, to ensure a path forward and reduce the defect burn-down for all prioritized defects.
- Monitor and ensure that overall defect processes follow a single standard defect process across all test phases and teams.
- Provide test estimation to leads for intake planning.
- Good understanding of API testing, with the ability to perform it using tools such as Postman, REST Assured, or SoapUI (see the sketch after this listing).
- Experience with test/defect management tools (preferably JIRA/Zephyr).
- Partner with other leads and architects for test planning, assignment, and reporting.
- Monitor chats and host working sessions with impacted teams to ensure a path forward and active investigation on defects until disposition.
- Act as the escalation point for disputed defects and their resolution.
- Conduct root cause analysis for defects and ensure each defect is closed under the right root-cause category.
- Mitigate impediments and foster a work environment that supports high-performing team dynamics, continuous improvement of the team's workflow, and relentless improvement.
- Design holistic test architecture and test solutions in alignment with business requirements and solution specifications.
- Work with the Test Data team on test data requirements and fulfillment.

Primary / Mandatory skills:
- 5+ years' experience in product testing, with a minimum of 3+ years in defect management / production validation testing.
- Proven experience in testing/defect management and triaging in a fast-paced software development environment.
- Experience using defect-tracking tools such as JIRA and creating reports using Power BI.
- Experience with system-level debugging (including customer issues) and a good understanding of managing and triaging production-level issues.
- Familiarity with a database query language such as SQL, to query and validate production data for analysis.
- Exposure to test automation (Selenium).
- Good understanding of API testing using tools such as Postman, REST Assured, or SoapUI.
- Experience with test/defect management tools (preferably JIRA/Zephyr).
- Expertise in preparing daily status reports/dashboards for management.
- Decision-maker for entry and exit of internal/development testing stages.
- Able to coordinate proactively with the testing team, dev leads, and other members.
- Expertise in risk identification and analysis.
- Proven expertise in agile software development, especially Scrum and Kanban.
- Experience in the telecom industry is an added advantage.
- Strong written and verbal communication skills.

Technical Skills: Selenium, Java, JavaScript, JMeter, REST Assured, SQL, Maven, Eclipse/VS Code, Bitbucket, JIRA, Jenkins, Git/GitHub, DevOps, Postman

Additional information: Willing to work shift duties. Willingness to learn is very important, as AT&T offers an excellent environment to learn digital transformation skills such as cloud.

Weekly Hours: 40
Time Type: Regular
Location: Bangalore, Karnataka, India

It is the policy of AT&T to provide equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state or local law. In addition, AT&T will provide reasonable accommodations for qualified individuals with disabilities. AT&T is a fair chance employer and does not initiate a background check until an offer is made.
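This listing pairs API testing (Postman, REST Assured, SoapUI) with SQL validation of production data. As a rough illustration only, here is a small Python sketch that exercises the same idea with the requests and pyodbc libraries; the endpoint, response fields, and query are hypothetical.

```python
"""Sketch: assert an API response agrees with the billing database.
Endpoint, schema, and connection string are invented for illustration."""
import requests
import pyodbc

BASE_URL = "https://api.example.com"  # hypothetical service under test

def test_order_api_matches_billing_db(order_id: str, conn_str: str) -> None:
    # 1. Hit the order-management API and check the contract.
    resp = requests.get(f"{BASE_URL}/orders/{order_id}", timeout=10)
    assert resp.status_code == 200, f"unexpected status {resp.status_code}"
    body = resp.json()
    assert body["orderId"] == order_id

    # 2. Cross-validate against production billing data via SQL.
    with pyodbc.connect(conn_str) as conn:
        row = conn.cursor().execute(
            "SELECT status FROM billing.orders WHERE order_id = ?", order_id
        ).fetchone()
    assert row is not None and row.status == body["status"], "API/DB status mismatch"
```

The same shape works in REST Assured with Java, which this role's technical stack suggests; the point is the two-step pattern of contract check followed by database cross-validation.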

Posted 1 week ago

Apply

0.0 - 10.0 years

0 Lacs

Tamil Nadu

On-site

Aditya Birla Money Limited
Senior Manager - Accounts Payable
Location: Chennai-HO-Guindy IE, Tamil Nadu

Position / Job Title (Proposed): Section Head - Accounts Payable
Designation: Manager
Function: Accounts
Department: Accounts
Reporting To (Title): HOD - Accounts
Superior's Superior (Title): CFO

1) Job Purpose
Responsible for monitoring and authorizing the entire payment process of the company and ensuring company funds are used only for the specific approved purposes. Responsible for data security and the confidentiality of the company's sensitive information. Responsible for meeting all statutory commitments: payments, return filing, and certificate submission to the statutory bodies. Coordinates end-to-end on all audit deliverables to assure smooth completion of audits, and ensures expense accounting is reflected accurately in the company's financial statements.

2) Dimensions (budgets/volumes/no. of products/geography/markets/customers or other quantitative parameters)
- Verify and authorize vendor payments, employee reimbursements, and business payout payments, and ensure accounting entries in the books of accounts.
- Responsible for BRS across 14 banks.
- Maintain information security and confidentiality of sensitive data, with a documented process in place for the same.
- Release statutory payments (PF, ESI, LWF, GST, TDS) on time and maintain evidence for documentation purposes.
- Ensure tax compliance for all payment-related entries; no payment is released without deducting TDS, with the appropriate withholding rate applied (a small worked example follows this listing).
- Expense provisions for monthly, quarterly and yearly book closure.
- End-to-end responsibility for data collation to meet auditors' requirements.
- Verification of sales team incentive workings and expense booking.
- Split the quarterly LR audit plan into monthly activities and collect data from other departments.
- Quarterly vendor ageing analysis and GL review.
- Drive automation initiatives as a regular process and implement them once approved.
- Handle queries diplomatically; no inappropriate message is to be communicated in replies.
- Document every process by way of an SOP approved by the HOD.
- Fund management and arrangement for payment releases.

3) Job Context & Major Challenges
The job holder validates payment processing initiated by the maker and releases payments: vendor payments, business payouts, and employee reimbursements. The next major responsibility is audit coordination; being a listed entity, ABML is subject to a quarterly LR audit. The audit plan is drawn up in discussion with functional owners to ensure smooth completion by providing three months of data within the limited audit window. Periodic MIS goes to internal and external stakeholders, along with related query handling, and certificates and reports in the prescribed form are submitted to exchanges and other regulators. The role also covers reconciliations, ledger reviews, initiating automation requests, and preparing data dumps to meet MIS requirements. The major challenges are distributing time evenly across the various payment requests that come up for release simultaneously; explaining data requirements, consolidating them in the required form, and providing them to auditors within the timeline; getting branch managers and executives to engage and adhere to the process; executing planned activities on schedule; and making automation initiatives a continuous, implemented process.

4) Principal Accountabilities
- Audit coordination: Draw up the audit plan for every quarterly LR and internal audit and the yearly statutory audit, and execute it on the timeline set. Call discussions with other departments, explain the audit plan, and get data delivered to auditors to their standards.
- Payment release: Authorize payment for approved expenses and ensure no double or excess payments, with strict adherence to the process. Monitor and verify vendor payment requests processed by the maker and the related accounting entries. Act as checker for business payouts and incentive calculations per approved schemes, released accurately and on time.
- Accuracy: Periodically review and scrutinize the ledgers (verification, DoA check, tax compliance, budget, book entry, actual payment release, bank instruction); investigate abnormal ageing balances and initiate corrective action. Maintain the agreed TAT. Complete accounting and payment activities for timely monthly closure of the books. Approve the accounting and release of all payments per the DOA, and review the DOA periodically with input from all concerned department heads. Monitor JVs, provision entries and capital expenditures; perform GL reconciliation; circulate MIS and dashboards on BP payouts to stakeholders; check the operations accounting entries for BP activities.
- Statutory compliance: Handle exchange inspections and provide data. Provide the NW certificate to regulators on time. Facilitate the PMS audit certificate for PMS clients. Handle queries from all internal and external stakeholders.
- Automation initiatives: Constantly drive automation plans and coordinate with IT to take them live.

5) Job Purpose of Direct Reports
Calculate payouts for franchisee partners, direct selling agents and branch sub-brokers, get them approved by the reporting manager, and process payment after accounting in the books. Collect, verify, check approval for, account, and process payment for employee reimbursements (mobile, travel, conveyance, business promotion expenses). Head office and branch expense management: verify approval, budget and correctness, and release payments on a timely basis. Business partner operational accounting: full and final settlements and exceptional payments. Submit statutory certificates to regulators and auditors. Scrutinize books of accounts and ageing analysis reports.

Minimum Experience Level: 6 - 10 years
Job Qualifications: Post Graduate
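One concrete rule above is that no vendor payment is released without withholding TDS at the applicable rate. A tiny worked example in Python, with an assumed illustrative 10% rate; actual rates depend on the payment category under the Income Tax Act.

```python
"""Illustrative TDS withholding on a vendor payment. The 10% rate is an
assumption for the example, not a statement of the applicable rate."""
from decimal import Decimal, ROUND_HALF_UP

def net_payable(gross: Decimal, tds_rate: Decimal) -> tuple[Decimal, Decimal]:
    """Return (tds_withheld, net_amount_to_release) for a vendor invoice."""
    tds = (gross * tds_rate).quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)
    return tds, gross - tds

tds, net = net_payable(Decimal("100000.00"), Decimal("0.10"))  # assumed 10% rate
print(f"withhold {tds}, release {net}")  # withhold 10000.00, release 90000.00
```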

Posted 1 week ago

Apply

0.0 years

0 Lacs

Hyderabad, Telangana

Remote

Data Scientist II
Hyderabad, Telangana, India

Date posted: Aug 01, 2025
Job number: 1854865
Work site: Up to 50% work from home
Travel: 0-25%
Role type: Individual Contributor
Profession: Research, Applied, & Data Sciences
Discipline: Data Science
Employment type: Full-Time

Overview
Security represents the most critical priority for our customers in a world awash in digital threats, regulatory scrutiny, and estate complexity. Microsoft Security aspires to make the world a safer place for all. We want to reshape security and empower every user, customer, and developer with a security cloud that protects them with end-to-end, simplified solutions. The Microsoft Security organization accelerates Microsoft's mission and bold ambitions to ensure that our company and industry are securing digital technology platforms, devices, and clouds in our customers' heterogeneous environments, as well as ensuring the security of our own internal estate. Our culture is centered on embracing a growth mindset, a theme of inspiring excellence, and encouraging teams and leaders to bring their best each day. In doing so, we create life-changing innovations that impact billions of lives around the world.

The Defender Experts (DEX) Research team is at the forefront of Microsoft's threat protection strategy, combining world-class hunting expertise with AI-driven analytics to protect customers from advanced cyberattacks. Our mission is to move protection left: disrupting threats early, before damage occurs, by transforming raw signals into intelligence that powers detection, disruption, and customer trust. We're looking for a passionate and curious Data Scientist to join this high-impact team. In this role, you'll partner with researchers, hunters, and detection engineers to explore attacker behavior, operationalize entity graphs, and develop statistical and ML-driven models that enhance DEX's detection efficacy. Your work will directly feed into real-time protections used by thousands of enterprises and shape the future of Microsoft Security. This is an opportunity to work on problems that matter, with cutting-edge data, a highly collaborative team, and the scale of Microsoft behind you.

Microsoft's mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.
Qualifications
- Bachelor's or Master's degree in Computer Science, Statistics, Applied Mathematics, Data Science, or a related quantitative field
- 3+ years of experience applying data science or machine learning in a real-world setting, preferably in security, fraud, risk, or anomaly detection
- Proficiency in Python and/or R, with hands-on experience in data manipulation (e.g., Pandas, NumPy), modeling (e.g., scikit-learn, XGBoost), and visualization (e.g., matplotlib, seaborn)
- Strong foundation in statistics, probability, and applied machine learning techniques
- Experience working with large-scale datasets, telemetry, or graph-structured data
- Ability to clearly communicate technical insights and influence cross-disciplinary teams
- Demonstrated ability to work independently, take ownership of problems, and drive solutions end-to-end

Responsibilities
- Understand complex cybersecurity and business problems, translate them into well-defined data science problems, and build scalable solutions.
- Design and build robust, large-scale graph structures to model security entities, behaviors, and relationships.
- Develop and deploy scalable, production-grade AI/ML systems and intelligent agents for real-time threat detection, classification, and response (a minimal anomaly-detection sketch follows this listing).
- Collaborate closely with Security Research teams to integrate domain knowledge into data science workflows and enrich model development.
- Drive the end-to-end ML lifecycle: from data ingestion and feature engineering to model development, evaluation, and deployment.
- Work with large-scale graph data: create, query, and process it efficiently to extract insights and power models.
- Lead initiatives involving graph ML, generative AI, and agent-based systems, driving innovation across threat detection, risk propagation, and incident response.
- Collaborate closely with engineering and product teams to integrate solutions into production platforms.
- Mentor junior team members and contribute to strategic decisions around model architecture, evaluation, and deployment.

Benefits/perks listed below may vary depending on the nature of your employment with Microsoft and the country where you work.
- Industry-leading healthcare
- Educational resources
- Discounts on products and services
- Savings and investments
- Maternity and paternity leave
- Generous time away
- Giving programs
- Opportunities to network and connect

Microsoft is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
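As a flavor of the anomaly-detection work these responsibilities describe, here is a minimal scikit-learn sketch that flags outlier sign-in telemetry with an Isolation Forest. The features and data are invented for illustration and are not Microsoft telemetry.

```python
"""Sketch: unsupervised anomaly detection on per-account sign-in features.
Feature definitions and data are hypothetical."""
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Hypothetical per-account features: [logins_per_hour, distinct_ips, failed_auth_ratio]
normal = rng.normal(loc=[5, 2, 0.05], scale=[2, 1, 0.02], size=(1000, 3))
suspicious = np.array([[120, 40, 0.9]])  # burst of logins from many IPs

model = IsolationForest(contamination=0.01, random_state=0).fit(normal)
print("anomaly score:", model.decision_function(suspicious))  # lower = more anomalous
print("flagged:", model.predict(suspicious))  # -1 means anomaly
```

In a production detection system this scoring step would sit behind feature pipelines over entity graphs, with the flagged accounts routed to hunters for triage rather than blocked outright.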

Posted 1 week ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka

On-site

General Information
Req #: WD00086226
Career area: Data Management and Analytics
Country/Region: India
State: Karnataka
City: Bangalore
Date: Friday, August 1, 2025
Working time: Full-time
Additional Locations: India - Karnātaka - Bangalore

Why Work at Lenovo
We are Lenovo. We do what we say. We own what we do. We WOW our customers. Lenovo is a US$57 billion revenue global technology powerhouse, ranked #248 in the Fortune Global 500, and serving millions of customers every day in 180 markets. Focused on a bold vision to deliver Smarter Technology for All, Lenovo has built on its success as the world's largest PC company with a full-stack portfolio of AI-enabled, AI-ready, and AI-optimized devices (PCs, workstations, smartphones, tablets), infrastructure (server, storage, edge, high-performance computing and software-defined infrastructure), software, solutions, and services. Lenovo's continued investment in world-changing innovation is building a more equitable, trustworthy, and smarter future for everyone, everywhere. Lenovo is listed on the Hong Kong stock exchange under Lenovo Group Limited (HKSE: 992) (ADR: LNVGY). This transformation, together with Lenovo's world-changing innovation, is building a more inclusive, trustworthy, and smarter future for everyone, everywhere. To find out more visit www.lenovo.com, and read about the latest news via our StoryHub.

Description and Requirements
- BS/BA in Computer Science, Mathematics, Statistics, MIS, or a related field
- At least 5 years' experience in the data warehouse space
- At least 5 years' experience in custom ETL/ELT design, implementation and maintenance
- At least 5 years' experience in writing SQL statements
- At least 3 years' experience with cloud-based data platform technologies such as Google BigQuery, or an Azure/Snowflake data platform equivalent (a minimal query sketch follows this listing)
- Ability to manage and communicate data warehouse plans to internal clients

NOTICE FOR PUBLIC
At Lenovo, we follow strict policies and legal compliance for our recruitment process, which includes role alignment, employment terms discussion, final selection and offer approval, and recording transactions in our internal system. Interviews may be conducted via audio, video, or in person depending on the role, and you will always meet with an official Lenovo representative. Please beware of fraudulent recruiters posing as Lenovo representatives. They may request cash deposits or personal information. Always apply through official Lenovo channels and never share sensitive information. Lenovo does not solicit money or sensitive information from applicants and will not request payments for training or equipment. Kindly verify job offers through the official Lenovo careers page or contact IndiaTA@lenovo.com. Stay informed and cautious to protect yourself from recruitment fraud. Report any suspicious activity to local authorities.
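For the cloud data-platform requirement, here is a minimal sketch of querying Google BigQuery from Python. The project, dataset, and table names are assumptions, and the snippet presumes the google-cloud-bigquery package is installed with credentials configured.

```python
"""Sketch: a warehouse aggregation query against BigQuery.
Project/dataset/table names are hypothetical."""
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # assumed project id
sql = """
    SELECT order_date, SUM(amount) AS revenue
    FROM `example-project.sales_dw.fact_orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY order_date
    ORDER BY order_date
"""
for row in client.query(sql).result():
    print(row.order_date, row.revenue)
```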

Posted 1 week ago

Apply

3.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

Job Information
Target Date: 15/08/2025
Date Opened: 01/08/2025
Industry: IT Services
Job Type: Full time
Work Experience: 3 years
City: Ahmedabad
Province: Gujarat
Country: India
Postal Code: 380051

Job Description
Key Responsibilities:
- Develop and maintain interactive dashboards, reports, and visualizations using Power BI and/or other BI tools such as Tableau.
- Design, build, and optimize data models and ETL pipelines to ensure data integrity, scalability, and high performance.
- Write complex SQL queries and use the M language (Power Query) for data transformation and cleansing (see the sketch after this listing).
- Work across the Microsoft Power Platform, including Power Automate, Copilot Studio, and Microsoft Fabric, to deliver integrated business solutions.
- Collaborate with business stakeholders, especially in the multifamily real estate domain, to gather requirements and translate them into scalable technical solutions.
- Conduct user training sessions and provide ongoing support to enhance adoption and effective use of BI tools.
- Partner with cross-functional teams to support data-driven decision-making and strategic initiatives.
- Document technical processes, data models, and dashboards; ensure adherence to data quality standards and governance practices.

Requirements
Required Qualifications:
- Minimum 3 years of hands-on experience in Business Intelligence development with Power BI, SQL, and data modeling.
- Proficiency in Power BI, SQL, and Microsoft Power Platform tools (Power Automate, Copilot Studio, Microsoft Fabric).
- Experience with Tableau is a plus.
- Experience with, or a strong understanding of, the multifamily real estate sector is highly desirable.
- Excellent analytical, problem-solving, and communication skills to translate data insights into business actions.
- Self-driven, adaptable, and committed to continuous learning and upskilling.

Preferred Qualifications:
- Familiarity with DAX and advanced data visualization techniques.
- Experience conducting BI tool training for end users.
- Knowledge of cloud data platforms (e.g., Azure, Snowflake) is a plus.
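Power Query's M language is not shown here; as a stand-in, this pandas sketch performs the same kind of transformation-and-cleansing step the responsibilities describe before data reaches a Power BI model. The CSV source and column names are hypothetical.

```python
"""Sketch: cleanse and shape a raw leasing extract into a tidy model table.
File and columns are invented for illustration."""
import pandas as pd

raw = pd.read_csv("leases.csv")  # hypothetical multifamily leasing extract

clean = (
    raw.rename(columns=str.strip)                       # trim header whitespace
       .assign(move_in=lambda d: pd.to_datetime(d["move_in"], errors="coerce"))
       .dropna(subset=["unit_id", "move_in"])           # drop unusable rows
       .assign(rent=lambda d: d["rent"].clip(lower=0))  # no negative rents
)

# A tidy summary table a Power BI dataset could consume.
move_ins = clean.groupby(clean["move_in"].dt.to_period("M")).size().rename("move_ins")
print(move_ins.head())
```

The equivalent M script would chain Table.TransformColumns and Table.SelectRows steps; the logic (trim, type, filter, clamp, aggregate) carries over directly.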

Posted 1 week ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

Job Information
Date Opened: 08/01/2025
Job Type: Permanent
RSD NO: 11580
Industry: IT Services
Min Experience: 3
Max Experience: 8
City: Chennai
State/Province: Tamil Nadu
Country: India
Zip/Postal Code: 600086

Job Description
Responsibilities
- Review and analyze software requirements and technical specifications.
- Design, implement, and maintain database structures.
- Develop and update stored procedures and functions using T-SQL (see the sketch after this listing).
- Analyze data and develop complex SQL queries.
- Develop procedures and scripts for data migration.
- Ensure performance, security, and availability of databases.
- Collaborate with other team members and stakeholders.
- Prepare documentation and specifications related to database design and architecture.

Required Skills and Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent practical experience).
- Minimum of 3 years of experience as a SQL Developer.
- Proven work experience as a Senior SQL Developer or in a similar role.
- Excellent understanding of T-SQL programming.
- Knowledge of SQL Server Management Studio (SSMS), SQL Server Reporting Services (SSRS), and SQL Server Integration Services (SSIS).
- Proficient understanding of indexes, views, handling large data volumes, and complex joins.
- Experience with performance tuning and query optimization, using Performance Monitor and other related monitoring and troubleshooting tools.
- Understanding of the health insurance landscape, including contribution accounting, billing and payments, claims processing, and system integrations.
- Excellent verbal and written communication skills.
- Strong organizational skills and keen attention to detail.
- Ability to work independently and collaboratively in a fast-paced environment.

At Indium, diversity, equity, and inclusion (DEI) are the cornerstones of our values. We champion DEI through a dedicated council, expert sessions, and tailored training programs, ensuring an inclusive workplace for all. Our initiatives, including the WE@IN women empowerment program and our DEI calendar, foster a culture of respect and belonging. Recognized with the Human Capital Award, we are committed to creating an environment where every individual thrives. Join us in building a workplace that values diversity and drives innovation.
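As an illustration of the T-SQL work described, here is a minimal Python sketch that calls a parameterized stored procedure on SQL Server via pyodbc; the procedure name, parameters, and connection string are hypothetical.

```python
"""Sketch: invoke a hypothetical claims stored procedure with parameters.
Parameterized calls avoid SQL injection and let the plan cache work."""
import pyodbc

conn_str = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=example-sql;DATABASE=claims;Trusted_Connection=yes;"
)

with pyodbc.connect(conn_str) as conn:
    cur = conn.cursor()
    cur.execute(
        "EXEC dbo.usp_GetClaimsByMember @MemberId = ?, @FromDate = ?",
        ("M12345", "2025-01-01"),
    )
    for claim_id, status, amount in cur.fetchall():
        print(claim_id, status, amount)
```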

Posted 1 week ago

Apply