Posted: 3 days ago
Job Title: Data Engineer
Employment Type: Full Time, Permanent
Location: Noida/Hyderabad
Qualifications: BE/B.Tech/MCA degree in Computer Science, Engineering, or a similar relevant field
Total Experience: 3+ years
Working Model: Work from Office
We are seeking a skilled Data Engineer / Senior Data Engineer to design, build, and maintain robust data infrastructure and pipelines that enable our organization to leverage data for strategic decision-making. The ideal candidate will have strong technical expertise in data engineering, cloud technologies, and data architecture, and a passion for building scalable, efficient data solutions.
Key Responsibilities:
• Translate business requirements into technical specifications for data solutions
• Develop and maintain data models and schema designs that support analytical and operational needs
• Implement and manage data warehousing solutions on platforms like Amazon Redshift or Snowflake
• Design and implement middleware solutions to enable seamless data flows between applications and systems
• Design, develop, and maintain scalable ETL/ELT pipelines to ingest, process, and transform data from various sources
• Build automated data workflows using orchestration frameworks like Apache Airflow to ensure reliable and timely data delivery
• Process large-scale datasets using distributed computing frameworks, particularly Apache Spark
• Perform complex data wrangling and transformation tasks using Python libraries (Pandas, NumPy) and Spark DataFrames
• Work with geospatial (GIS) data, understanding spatial data types, coordinate systems, and GIS-specific processing requirements
• Optimize existing pipelines for performance, cost-efficiency, and maintainability
• Optimize cloud resource utilization to balance performance and cost
• Create and maintain documentation for data pipelines, data models, and technical processes
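Purely by way of illustration, the kind of pipeline transformation work described above might look like the following minimal pandas sketch (all column names and sample values here are hypothetical, not taken from any actual pipeline):

```python
import pandas as pd

def transform(raw: pd.DataFrame) -> pd.DataFrame:
    """Clean raw order records and aggregate them for a warehouse table.

    A toy example of the wrangling/transformation step in an ETL pipeline:
    drop incomplete rows, normalize types, then aggregate per customer per day.
    """
    out = raw.dropna(subset=["order_id"]).copy()      # drop incomplete rows
    out["order_date"] = pd.to_datetime(out["order_date"])  # normalize types
    out["amount"] = out["amount"].astype(float)
    # One row per customer per day, as an analytical model might require
    return (
        out.groupby(["customer_id", out["order_date"].dt.date])["amount"]
           .sum()
           .reset_index(name="daily_total")
    )

if __name__ == "__main__":
    raw = pd.DataFrame({
        "order_id": [1, 2, None],
        "customer_id": ["a", "a", "b"],
        "order_date": ["2024-01-01", "2024-01-01", "2024-01-02"],
        "amount": ["10.5", "4.5", "3.0"],
    })
    print(transform(raw))
```

In practice, a step like this would typically run as a task inside an orchestrator such as Apache Airflow, or be rewritten against Spark DataFrames for larger datasets.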
Required Skills & Experience:
• Advanced proficiency in Python for data engineering tasks, including libraries such as Pandas and NumPy
• Expert-level SQL skills with ability to write complex queries, optimize query performance, and design efficient database schemas
• Experience with version control systems (Git) and collaborative development practices
• Hands-on experience in building and maintaining ETL/ELT pipelines
• Proficiency with Apache Spark for distributed data processing
• Experience with workflow orchestration tools, particularly Apache Airflow
• Strong understanding of API design and integration patterns (REST, GraphQL)
• Knowledge of message queuing systems (Kafka, RabbitMQ) is a plus
• Experience with SQL/NoSQL databases and understanding of when to use each type
• Hands-on experience with data warehousing platforms such as Amazon Redshift or Snowflake
• Knowledge of or experience working with geospatial (GIS) data
• Understanding of data modeling techniques (dimensional modeling, normalization, denormalization)
• Experience with AWS cloud platform and managed data services (S3, Redshift, Glue, Lambda, RDS)
• Experience with distributed computing frameworks and big data technologies
• Understanding of cloud cost optimization and resource management
• Strong expertise in data cleaning, validation, and quality assurance techniques
• Experience with data profiling tools and methodologies
• Understanding of data governance principles and best practices
• Ability to implement data lineage and metadata management solutions
• Experience working with BI tools, particularly Power BI
• Ability to design data models that support efficient reporting and visualization
• Understanding of dimensional modeling for analytical use cases
• Strong problem-solving abilities with analytical and critical thinking skills
• Excellent communication skills with ability to explain technical concepts to non-technical stakeholders
• Proven ability to work collaboratively in cross-functional teams
• Experience in client handling, including gathering requirements, managing expectations, and providing regular status updates
• Ability to build and maintain strong client relationships through responsive communication and proactive problem-solving
Preferred Qualifications:
• Bachelor's degree in Computer Science, Information Systems, Data Engineering, or a related field
• Experience with real-time data streaming and event-driven architectures
• AWS certifications, such as:
o AWS Certified Solutions Architect – Associate
o AWS Certified Developer – Associate
o AWS Certified Solutions Architect – Professional
• Experience with CI/CD practices for data pipelines
• Understanding of data security, encryption, and compliance requirements (GDPR, CCPA)
RDSolutions India