Jobs
Interviews

7 Audit Logging Jobs

Set up a Job Alert
JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

As a skilled professional in software/application development, you will lead the creation of scalable, high-performance application architecture while also designing and developing enterprise-grade applications. Your responsibilities will include managing Azure DevOps processes, optimizing performance, conducting solution design, documenting RCAs, and collaborating with cross-functional teams. The ideal candidate should hold a graduate degree in Computers/Electronics or a post-graduate degree in Computer Science, along with 3-5 years of experience in software/application development.

Mandatory technical skills:
- Core Technologies: Python, FastAPI, React/TypeScript, LangChain, LangGraph, AI Agents, Docker, Azure OpenAI, Prompt Engineering
- Cloud & Infrastructure: AWS (Secrets Manager, IAM, ECS/EC2), Azure AD, Azure DevOps, GitHub
- Database & Performance: MongoDB (Motor, Beanie ODM), Redis, caching strategies
- Security: OAuth2/SAML, JWT, Azure AD integration, audit logging
- Soft Skills: strong problem-solving abilities, mentoring skills, effective technical communication, and the ability to work independently with a high ownership mindset
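The audit logging named in the security requirements above is commonly implemented as an append-only, tamper-evident trail. A minimal framework-agnostic sketch in Python (the record fields and hash-chaining scheme here are illustrative assumptions, not this employer's design):

```python
import hashlib
import json
from datetime import datetime, timezone

class AuditLog:
    """Append-only audit trail; each entry hashes the previous one,
    so altering any historical entry breaks the chain."""

    def __init__(self):
        self.entries = []
        self._last_hash = "0" * 64  # genesis hash for the first entry

    def record(self, actor: str, action: str, resource: str) -> dict:
        entry = {
            "actor": actor,
            "action": action,
            "resource": resource,
            "ts": datetime.now(timezone.utc).isoformat(),
            "prev_hash": self._last_hash,
        }
        # Hash a canonical JSON serialization of the entry body.
        payload = json.dumps(entry, sort_keys=True).encode()
        entry["hash"] = hashlib.sha256(payload).hexdigest()
        self._last_hash = entry["hash"]
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        """Re-walk the chain and confirm no entry was altered."""
        prev = "0" * 64
        for e in self.entries:
            if e["prev_hash"] != prev:
                return False
            body = {k: v for k, v in e.items() if k != "hash"}
            payload = json.dumps(body, sort_keys=True).encode()
            if hashlib.sha256(payload).hexdigest() != e["hash"]:
                return False
            prev = e["hash"]
        return True

log = AuditLog()
log.record("alice", "LOGIN", "portal")
log.record("alice", "UPDATE", "customer/42")
print(log.verify())  # True for an untampered log
```

In production the same idea is usually delegated to a database or a managed service; the chaining is what makes the trail useful for traceability and compliance review.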

Posted 22 hours ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

You will be working with a large-scale global company located in Viman Nagar, Pune, requiring 5-8 years of experience in software/application development. As part of your role, you will lead scalable, high-performance application architecture, develop enterprise-grade applications, manage Azure DevOps processes, and conduct solution design and RCA documentation while interacting with cross-functional teams. The ideal candidate should hold a graduate degree in Computers/Electronics or a post-graduate degree in Computer Science with 5-8 years of experience in software/application development.

Mandatory technical skills include proficiency in Python, FastAPI, React/TypeScript, LangChain, LangGraph, AI Agents, Docker, Azure OpenAI, and Prompt Engineering. Knowledge of AWS (Secrets Manager, IAM, ECS/EC2), Azure AD, Azure DevOps, GitHub, MongoDB (Motor, Beanie ODM), Redis, OAuth2/SAML, JWT, Azure AD integration, and audit logging is required. Soft skills such as problem-solving, mentoring, and technical communication are also essential for this role.

Joining this company will expose you to a rewarding work culture where your contributions define the success of the organization. Bajaj Finance Limited is a leading non-banking financial company in India and ranks among Asia's top 10 large workplaces. With over 500 locations across India, this is an opportunity for individuals driven to excel in their careers.

Posted 1 week ago

Apply

9.0 - 12.0 years

14 - 24 Lacs

Gurugram

Remote

We are looking for an experienced Senior Data Engineer to lead the development of scalable AWS-native data lake pipelines with a strong focus on time series forecasting and upsert-ready architectures. This role requires end-to-end ownership of the data lifecycle, from ingestion to partitioning, versioning, and BI delivery. The ideal candidate must be highly proficient in AWS data services, PySpark, and versioned storage formats like Apache Hudi/Iceberg, and must understand the nuances of data quality and observability in large-scale analytics systems.

Role & responsibilities:
- Design and implement data lake zoning (Raw, Clean, Modeled) using Amazon S3, AWS Glue, and Athena.
- Ingest structured and unstructured datasets including POS, USDA, Circana, and internal sales data.
- Build versioned and upsert-friendly ETL pipelines using Apache Hudi or Iceberg.
- Create forecast-ready datasets with lagged, rolling, and trend features for revenue and occupancy modelling.
- Optimize Athena datasets with partitioning, CTAS queries, and metadata tagging.
- Implement S3 lifecycle policies, intelligent file partitioning, and audit logging.
- Build reusable transformation logic using dbt-core or PySpark to support KPIs and time series outputs.
- Integrate robust data quality checks using custom logs, AWS CloudWatch, or other DQ tooling.
- Design and manage a forecast feature registry with metrics versioning and traceability.
- Collaborate with BI and business teams to finalize schema design and deliverables for dashboard consumption.

Preferred candidate profile:
- 9-12 years of experience in data engineering.
- Deep hands-on experience with AWS Glue, Athena, S3, Step Functions, and Glue Data Catalog.
- Strong command of PySpark, dbt-core, CTAS query optimization, and partition strategies.
- Working knowledge of Apache Hudi, Iceberg, or Delta Lake for versioned ingestion.
- Experience in S3 metadata tagging and scalable data lake design patterns.
- Expertise in feature engineering and forecasting dataset preparation (lags, trends, windows).
- Proficiency in Git-based workflows (Bitbucket), CI/CD, and deployment automation.
- Strong understanding of time series KPIs such as revenue forecasts, occupancy trends, and demand volatility.
- Data observability best practices, including field-level logging, anomaly alerts, and classification tagging.
- Experience with statistical forecasting frameworks such as Prophet, GluonTS, or related libraries.
- Familiarity with Superset or Streamlit for QA visualization and UAT reporting.
- Understanding of macroeconomic datasets (USDA, Circana) and third-party data ingestion.
- Independent, critical thinker with the ability to design for scale and evolving business logic.
- Strong communication and collaboration with BI, QA, and business stakeholders.
- High attention to detail in ensuring data accuracy, quality, and documentation.
- Comfortable interpreting business-level KPIs and transforming them into technical pipelines.
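The "lagged, rolling, and trend features" this role calls for can be illustrated with a small, dependency-free sketch. In a real pipeline these would be PySpark window functions or dbt models; the function name and column names below are hypothetical:

```python
def build_features(series, lag=1, window=3):
    """Given a numeric time series (e.g. revenue per period), emit per-period
    lag, rolling-mean, and period-over-period trend features.
    Returns None for features where there is not enough history."""
    rows = []
    for i, value in enumerate(series):
        lagged = series[i - lag] if i >= lag else None
        window_vals = series[max(0, i - window + 1): i + 1]
        rolling_mean = sum(window_vals) / len(window_vals)
        trend = value - series[i - 1] if i >= 1 else None
        rows.append({
            "value": value,
            f"lag_{lag}": lagged,
            f"rolling_mean_{window}": rolling_mean,  # partial window at start
            "trend": trend,
        })
    return rows

features = build_features([100, 110, 105, 120])
print(features[3])  # lag_1 = 105, trend = 15, rolling mean over last 3 periods
```

The same shape (one feature row per period, with None padding at the series head) is what downstream forecasting frameworks such as Prophet or GluonTS typically consume after conversion to their native formats.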

Posted 2 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Senior Cloud Data Integration Consultant, you will be responsible for leading a complex data integration project that involves API frameworks, a data lakehouse architecture, and middleware solutions. The project focuses on technologies such as AWS, Snowflake, Oracle ERP, and Salesforce, with a high-transaction-volume POS system. Your role will involve building reusable and scalable API frameworks, optimizing middleware, and ensuring security and compliance in a multi-cloud environment.

Your expertise in API development and integration will be crucial for this project. You should have deep experience in managing APIs across multiple systems, building reusable components, and ensuring bidirectional data flow for real-time data synchronization. Additionally, your skills in middleware solutions and custom API adapters will be essential for integrating various systems seamlessly.

In terms of cloud infrastructure and data processing, strong experience with AWS services like S3, Lambda, Fargate, and Glue will be required for data processing, storage, and integration. You should also have hands-on experience in optimizing Snowflake for querying and reporting, as well as knowledge of Terraform for automating the provisioning and management of AWS resources.

Security and compliance are critical aspects of the project, and your deep understanding of cloud security protocols, API security, and compliance enforcement will be invaluable. You should be able to set up audit logs, ensure traceability, and enforce compliance across cloud services.

Handling high-volume transaction systems and real-time data processing requirements will be part of your responsibilities. You should be familiar with optimizing AWS Lambda and Fargate for efficient data processing and be skilled in operational monitoring and error handling mechanisms.

Collaboration and support are essential for the success of the project. You will need to provide post-go-live support, collaborate with internal teams and external stakeholders, and ensure seamless integration between systems.

To qualify for this role, you should have at least 10 years of experience in enterprise API integration, cloud architecture, and data management. Deep expertise in AWS services, Snowflake, Oracle ERP, and Salesforce integrations is required, along with a proven track record of delivering scalable API frameworks and handling complex middleware systems. Strong problem-solving skills, familiarity with containerization technologies, and experience in retail or e-commerce industries are also desirable.

Your key responsibilities will include leading the design and implementation of reusable API frameworks, optimizing data flow through middleware systems, building robust security frameworks, and collaborating with the in-house team for seamless integration between systems. Ongoing support, monitoring, and optimization post-go-live will also be part of your role.
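The operational monitoring and error handling this role mentions usually includes retry logic around flaky upstream API calls. A minimal exponential-backoff sketch (the function and parameters below are illustrative, not tied to any particular middleware):

```python
import time

def with_retries(call, max_attempts=4, base_delay=0.5, sleep=time.sleep):
    """Invoke `call`, retrying on exception with exponential backoff.
    `sleep` is injectable so tests can run without real delays."""
    for attempt in range(1, max_attempts + 1):
        try:
            return call()
        except Exception:
            if attempt == max_attempts:
                raise  # out of retries: surface the error to the caller
            sleep(base_delay * 2 ** (attempt - 1))  # 0.5s, 1s, 2s, ...

# Example: a call that fails twice with a transient error, then succeeds.
attempts = {"n": 0}
def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("transient upstream failure")
    return "ok"

print(with_retries(flaky, sleep=lambda s: None))  # prints: ok
```

In a real integration you would typically also cap the backoff, add jitter to avoid thundering herds, and retry only on error classes known to be transient.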

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a highly experienced and motivated Backend Solution Architect, you will be responsible for leading the design and implementation of robust, scalable, and secure backend systems. Your expertise in Node.js and exposure to Python will be crucial in architecting end-to-end backend solutions using microservices and serverless frameworks. You will play a key role in ensuring scalability, maintainability, and security, while also driving innovation through the integration of emerging technologies like AI/ML.

Your primary responsibilities will include designing and optimizing backend architecture, managing AWS-based cloud solutions, integrating AI/ML components, containerizing applications, setting up CI/CD pipelines, designing and optimizing databases, implementing security best practices, developing APIs, monitoring system performance, and providing technical leadership in collaboration with cross-functional teams.

To be successful in this role, you should have at least 8 years of backend development experience, with a minimum of 4 years as a Solution/Technical Architect. Your expertise in Node.js, AWS services, microservices, event-driven architectures, Docker, Kubernetes, CI/CD pipelines, authentication/authorization mechanisms, and API development will be critical. Additionally, hands-on experience with AI/ML workflows, React, Next.js, and Angular, as well as an AWS Solution Architect Certification, will be advantageous.

At TechAhead, a global digital transformation company, you will have the opportunity to work on cutting-edge AI-first product design thinking and bespoke development solutions. By joining our team, you will contribute to shaping the future of digital innovation worldwide and driving impactful results with advanced AI tools and strategies.
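The authentication/authorization mechanisms this role expects are often JWT-based. A sign/verify sketch for the HS256 scheme using only the standard library (illustrative only; production code would use a maintained library such as PyJWT, and this listing's stack is Node.js, where jsonwebtoken plays the same role):

```python
import base64
import hashlib
import hmac
import json

def _b64url(data: bytes) -> str:
    # JWT uses URL-safe base64 with padding stripped.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload: dict, secret: bytes) -> str:
    header = {"alg": "HS256", "typ": "JWT"}
    signing_input = ".".join(
        _b64url(json.dumps(part, separators=(",", ":")).encode())
        for part in (header, payload)
    )
    sig = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    return f"{signing_input}.{_b64url(sig)}"

def verify_jwt(token: str, secret: bytes) -> dict:
    """Return the payload if the HMAC signature checks out, else raise."""
    signing_input, _, sig = token.rpartition(".")
    expected = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    if not hmac.compare_digest(_b64url(expected), sig):
        raise ValueError("bad signature")
    payload_b64 = signing_input.split(".")[1]
    padded = payload_b64 + "=" * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(padded))

token = sign_jwt({"sub": "user-123", "role": "admin"}, b"secret")
print(verify_jwt(token, b"secret"))  # {'sub': 'user-123', 'role': 'admin'}
```

Note the constant-time comparison (`hmac.compare_digest`) when checking the signature; a naive `==` would leak timing information.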

Posted 1 month ago

Apply

8.0 - 13.0 years

85 - 90 Lacs

Noida

Work from Office

About the Role
We are seeking a highly skilled Staff Engineer to lead the architecture, development, and scaling of our Marketplace platform, including portals and core services such as Identity & Access Management (IAM), Audit, and Tenant Management. This is a hands-on technical leadership role where you will drive engineering excellence, mentor teams, and ensure our platforms are secure, compliant, and built for scale.

A Day in the Life
- Design and implement scalable, high-performance backend systems for all the platform capabilities.
- Lead the development and integration of IAM, audit logging, and compliance frameworks, ensuring secure access, traceability, and regulatory adherence.
- Champion best practices for reliability, availability, and performance across all marketplace and core service components.
- Mentor engineers, conduct code/design reviews, and establish engineering standards and best practices.
- Work closely with product, security, compliance, and platform teams to translate business and regulatory requirements into technical solutions.
- Evaluate and integrate new technologies, tools, and processes to enhance platform efficiency, developer experience, and compliance posture.
- Take end-to-end responsibility for the full software development lifecycle, from requirements and design through deployment, monitoring, and operational health.

What You Need
- 8+ years of experience in backend or infrastructure engineering, with a focus on distributed systems, cloud platforms, and security.
- Proven expertise in building and scaling marketplace platforms and developer/admin/API portals.
- Deep hands-on experience with IAM, audit logging, and compliance tooling.
- Strong programming skills in languages such as Python or Go.
- Experience with cloud infrastructure (AWS, Azure), containerization (Docker, Kubernetes), and service mesh architectures.
- Understanding of security protocols (OAuth, SAML, TLS), authentication/authorization, and regulatory compliance.
- Demonstrated ability to lead technical projects and mentor engineering teams, with excellent problem-solving, communication, and collaboration skills.
- Proficiency in observability tools such as Prometheus, Grafana, and OpenTelemetry.
- Prior experience with marketplaces and portals.
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
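The IAM responsibilities above ultimately reduce to answering "may this principal perform this action on this resource?". A toy role-based policy check (the roles, actions, and resource names are made up for illustration):

```python
# Hypothetical role -> allowed (action, resource-prefix) policy table.
POLICIES = {
    "viewer": {("read", "marketplace/")},
    "admin": {("read", "marketplace/"),
              ("write", "marketplace/"),
              ("read", "audit/")},
}

def is_allowed(role: str, action: str, resource: str) -> bool:
    """Grant if any policy entry matches the action and is a prefix
    of the requested resource; unknown roles get nothing."""
    return any(
        action == allowed_action and resource.startswith(prefix)
        for allowed_action, prefix in POLICIES.get(role, set())
    )

print(is_allowed("viewer", "read", "marketplace/listings"))   # True
print(is_allowed("viewer", "write", "marketplace/listings"))  # False
```

Real IAM systems (AWS IAM, Azure AD roles) layer conditions, deny rules, and policy inheritance on top of this shape, but deny-by-default prefix matching is the common core.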

Posted 1 month ago

Apply

8.0 - 13.0 years

50 - 85 Lacs

Noida

Work from Office

About the Role
We are seeking a highly skilled Staff Engineer to lead the architecture, development, and scaling of our Marketplace platform, including portals and core services such as Identity & Access Management (IAM), Audit, and Tenant Management. This is a hands-on technical leadership role where you will drive engineering excellence, mentor teams, and ensure our platforms are secure, compliant, and built for scale.

A Day in the Life
- Design and implement scalable, high-performance backend systems for all the platform capabilities.
- Lead the development and integration of IAM, audit logging, and compliance frameworks, ensuring secure access, traceability, and regulatory adherence.
- Champion best practices for reliability, availability, and performance across all marketplace and core service components.
- Mentor engineers, conduct code/design reviews, and establish engineering standards and best practices.
- Work closely with product, security, compliance, and platform teams to translate business and regulatory requirements into technical solutions.
- Evaluate and integrate new technologies, tools, and processes to enhance platform efficiency, developer experience, and compliance posture.
- Take end-to-end responsibility for the full software development lifecycle, from requirements and design through deployment, monitoring, and operational health.

What You Need
- 8+ years of experience in backend or infrastructure engineering, with a focus on distributed systems, cloud platforms, and security.
- Proven expertise in building and scaling marketplace platforms and developer/admin/API portals.
- Deep hands-on experience with IAM, audit logging, and compliance tooling.
- Strong programming skills in languages such as Python or Go.
- Experience with cloud infrastructure (AWS, Azure), containerization (Docker, Kubernetes), and service mesh architectures.
- Understanding of security protocols (OAuth, SAML, TLS), authentication/authorization, and regulatory compliance.
- Demonstrated ability to lead technical projects and mentor engineering teams, with excellent problem-solving, communication, and collaboration skills.
- Proficiency in observability tools such as Prometheus, Grafana, and OpenTelemetry.
- Prior experience with marketplaces and portals.
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies