Senior .NET Developer

2 - 5 years

3 - 7 Lacs

Posted: 1 month ago | Platform: Naukri


Work Mode: Work from Office

Job Type: Full Time

Job Description


We are seeking a highly skilled .NET Backend Developer with core expertise in C#, SQL Server, MongoDB, MySQL, and large-scale data processing. This role focuses on efficient data ingestion, structured data integration, and high-speed processing of large datasets while ensuring optimal memory and resource utilization. The ideal candidate should have deep experience in handling structured and unstructured data, multi-threaded processing, efficient database optimization, and real-time data synchronization to support a scalable, performance-driven backend architecture.

Key Focus Areas

  • Efficient Data Ingestion & Processing: Developing scalable pipelines to process large structured/unstructured data files.
  • Data Integration & Alignment: Merging datasets from multiple sources with consistency.
  • Database Expertise & Performance Optimization: Designing high-speed relational database structures for efficient storage and retrieval.
  • High-Performance API Development: Developing low-latency RESTful APIs to handle large data exchanges efficiently.
  • Multi-Threaded Processing & Parallel Execution: Implementing concurrent data processing techniques to optimize system performance.
  • Caching Strategies & Load Optimization: Utilizing in-memory caching & indexing to reduce I/O overhead.
  • Real-Time Data Processing & Streaming: Using message queues and data streaming for optimized data distribution.

Required Skills & Technologies

  • Backend Development:

    C#, .NET Core, ASP.NET Core Web API

  • Data Processing & Integration:

    Efficient Data Handling, Multi-Source Data Processing

  • Database Expertise:

    SQL Server, MongoDB, MySQL (Schema Optimization, Indexing, Query Optimization, Partitioning, Bulk Processing)

  • Performance Optimization:

    Multi-threading, Parallel Processing, High-Throughput Computing

  • Caching & Memory Management:

    Redis, Memcached, IndexedDB, Database Query Caching

  • Real-Time Data Processing:

    Kafka, RabbitMQ, WebSockets, SignalR

  • File Processing & ETL Pipelines:

    Efficient Data Extraction, Transformation, and Storage Pipelines

  • Logging & Monitoring:

    Serilog, Application Insights, ELK Stack

  • CI/CD & Cloud Deployments:

    Azure DevOps, Kubernetes, Docker

Key Responsibilities

1. Data Ingestion & Processing

  • Develop scalable data pipelines to handle high-throughput structured and unstructured data ingestion.
  • Implement multi-threaded data processing mechanisms to optimize efficiency.
  • Optimize memory management techniques to handle large-scale data operations.
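
The multi-threaded ingestion bullet above can be sketched in miniature. This is a hypothetical example, not the employer's actual pipeline: raw lines are parsed in parallel with `Parallel.ForEach`, and results are collected in a thread-safe `ConcurrentBag`. The record shape (a two-field CSV line) is purely illustrative.

```csharp
using System.Collections.Concurrent;
using System.Linq;
using System.Threading.Tasks;

public static class Ingestion
{
    // Parse raw CSV-like lines in parallel; the TPL partitions the input
    // across worker threads, and the ConcurrentBag absorbs results safely.
    public static int[] ParseAll(string[] rawLines)
    {
        var parsed = new ConcurrentBag<int>();
        Parallel.ForEach(rawLines, line =>
        {
            // Illustrative "processing": extract the second, numeric field.
            var fields = line.Split(',');
            if (fields.Length > 1 && int.TryParse(fields[1], out var value))
                parsed.Add(value);
        });
        // Order for a deterministic result; parallel insertion order is arbitrary.
        return parsed.OrderBy(v => v).ToArray();
    }
}
```

For CPU-bound parsing of already-loaded lines this is usually enough; streaming ingestion from disk would pair it with buffered readers or `System.Threading.Channels`.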

2. Data Integration & Alignment

  • Implement high-speed algorithms to merge and integrate datasets efficiently.
  • Ensure data consistency and accuracy across multiple sources.
  • Optimize data buffering & streaming techniques to prevent processing bottlenecks.
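
One common "high-speed merge" technique is a hash join: index one dataset by key, then stream the other past it in a single pass, giving O(n + m) instead of a nested-loop comparison. The tuple shapes and key type below are illustrative assumptions, not from the posting.

```csharp
using System.Collections.Generic;
using System.Linq;

public static class Merger
{
    // Hash-join two keyed datasets: build a dictionary over the left side,
    // then probe it once per right-side record.
    public static Dictionary<string, (int Left, int Right)> Merge(
        IEnumerable<(string Key, int Value)> left,
        IEnumerable<(string Key, int Value)> right)
    {
        var index = left.ToDictionary(x => x.Key, x => x.Value);
        var merged = new Dictionary<string, (int, int)>();
        foreach (var (key, value) in right)
            if (index.TryGetValue(key, out var l))
                merged[key] = (l, value);   // inner join: keep keys present in both
        return merged;
    }
}
```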

3. High-Performance API Development

  • Design and develop high-speed APIs for efficient data retrieval and updates.
  • Implement batch processing & streaming capabilities to manage large data payloads.
  • Optimize API response times and query execution plans.
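
The batching bullet can be illustrated with a small helper: rather than materializing a large payload in one response, an API can page it out in fixed-size chunks. The generic `Batch` helper below is a hypothetical sketch (newer .NET ships a similar `Enumerable.Chunk`).

```csharp
using System.Collections.Generic;

public static class Batcher
{
    // Split a large sequence into fixed-size batches so an endpoint can
    // stream or page results instead of buffering everything at once.
    public static IEnumerable<List<T>> Batch<T>(IEnumerable<T> source, int size)
    {
        var batch = new List<T>(size);
        foreach (var item in source)
        {
            batch.Add(item);
            if (batch.Count == size)
            {
                yield return batch;          // hand off a full batch lazily
                batch = new List<T>(size);
            }
        }
        if (batch.Count > 0) yield return batch; // trailing partial batch
    }
}
```

Because it is lazy (`yield return`), a controller can wrap it in `IAsyncEnumerable` or a streamed response without holding the whole dataset in memory.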

4. Database Expertise & Optimization (SQL Server, MongoDB, MySQL)

  • Design efficient database schema structures to support large-scale data transactions.
  • Implement bulk data operations, indexing, and partitioning for high-speed retrieval.
  • Optimize stored procedures and concurrency controls to support high-frequency transactions.
  • Use sharding and distributed database techniques for enhanced scalability.
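
The sharding bullet hinges on one small mechanism: routing each key to the same shard every time. A hash-based router is one common approach; the FNV-1a hash and fixed shard count below are illustrative choices, not anything specified by the posting (production systems often prefer consistent hashing so shards can be added without mass re-routing).

```csharp
public static class Sharding
{
    // Route a record key to a shard with a stable hash, so the same key
    // always lands on the same shard (assumes a fixed shard count).
    public static int ShardFor(string key, int shardCount)
    {
        unchecked
        {
            uint hash = 2166136261;     // FNV-1a offset basis
            foreach (char c in key)
            {
                hash ^= c;
                hash *= 16777619;       // FNV-1a prime
            }
            return (int)(hash % (uint)shardCount);
        }
    }
}
```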

5. Caching & Load Balancing

  • Deploy Redis / Memcached / IndexedDB caching to improve database query performance.
  • Implement data pre-fetching & cache invalidation strategies for real-time accuracy.
  • Optimize load balancing techniques for efficient request distribution.
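
The cache-invalidation idea can be shown with a tiny in-process TTL cache: entries expire after a fixed lifetime, and `Invalidate` evicts a key eagerly when the underlying data changes. This is a minimal sketch standing in for Redis/Memcached, not a substitute for them; the class and its API are hypothetical.

```csharp
using System;
using System.Collections.Concurrent;

public sealed class TtlCache<TKey, TValue> where TKey : notnull
{
    private readonly ConcurrentDictionary<TKey, (TValue Value, DateTime Expires)> _store = new();
    private readonly TimeSpan _ttl;

    public TtlCache(TimeSpan ttl) => _ttl = ttl;

    // Return a fresh cached value, or reload via the supplied loader
    // on a miss or a stale entry.
    public TValue GetOrAdd(TKey key, Func<TKey, TValue> load)
    {
        if (_store.TryGetValue(key, out var entry) && entry.Expires > DateTime.UtcNow)
            return entry.Value;                        // fresh hit
        var value = load(key);                         // miss / stale: reload
        _store[key] = (value, DateTime.UtcNow + _ttl);
        return value;
    }

    // Eager invalidation when the source of truth changes.
    public void Invalidate(TKey key) => _store.TryRemove(key, out _);
}
```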

6. Real-Time Data Synchronization & Streaming

  • Implement event-driven architectures using message queues (Kafka, RabbitMQ, etc.).
  • Utilize WebSockets / SignalR for real-time data synchronization.
  • Optimize incremental updates instead of full data reloads for better resource efficiency.
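
The producer/consumer shape behind these bullets can be sketched in-process with `System.Threading.Channels`: a bounded channel gives natural backpressure, and the consumer applies each event incrementally rather than reloading everything. This is a stand-in for a real broker; a deployment would swap the channel for Kafka or RabbitMQ.

```csharp
using System.Threading.Channels;
using System.Threading.Tasks;

public static class EventPipe
{
    // Producer writes events into a bounded channel; consumer folds them
    // incrementally. The bound (16) throttles a fast producer.
    public static async Task<int> SumEventsAsync(int[] events)
    {
        var channel = Channel.CreateBounded<int>(16);

        var producer = Task.Run(async () =>
        {
            foreach (var e in events)
                await channel.Writer.WriteAsync(e);
            channel.Writer.Complete();       // signal end-of-stream
        });

        var total = 0;
        await foreach (var e in channel.Reader.ReadAllAsync())
            total += e;                      // incremental update per event

        await producer;
        return total;
    }
}
```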

Preferred Additional Experience

  • Experience handling large-scale databases and high-throughput data environments.
  • Expertise in distributed database architectures for large-scale structured data storage.
  • Hands-on experience with query profiling & performance tuning tools.
