Big Data Engineer

11 - 16 years

30 - 37 Lacs

Bengaluru

Posted: 2 days ago | Platform: Naukri

Skills Required

Performance tuning, Data modeling, Agile, Wellness, Workflow, Healthcare, Unit testing, Apache, SQL, Python

Work Mode

Work from Office

Job Type

Full Time

Job Description

We enable #HumanFirstDigital

Job Summary:

We are looking for a highly experienced and strategic Data Engineer to drive the design, development, and optimization of our enterprise data platform. This role requires deep technical expertise in AWS, StreamSets, and Snowflake, along with solid experience in Kubernetes, Apache Airflow, and unit testing. The ideal candidate will lead a team of data engineers and play a key role in delivering scalable, secure, and high-performance data solutions for both historical and incremental data loads.

Key Responsibilities:

• Lead the architecture, design, and implementation of end-to-end data pipelines using StreamSets and Snowflake.
• Oversee the development of scalable ETL/ELT processes for historical data migration and incremental data ingestion.
• Guide the team in leveraging AWS services (S3, Lambda, Glue, IAM, etc.) to build cloud-native data solutions.
• Provide technical leadership in deploying and managing containerized applications using Kubernetes.
• Define and implement workflow orchestration strategies using Apache Airflow (an illustrative pipeline sketch follows this description).
• Establish best practices for unit testing, code quality, and data validation.
• Collaborate with data architects, analysts, and business stakeholders to align data solutions with business goals.
• Mentor junior engineers and foster a culture of continuous improvement and innovation.
• Monitor and optimize data workflows for performance, scalability, and cost-efficiency.

Required Skills & Qualifications:

• High proficiency in AWS, including hands-on experience with core services (S3, Lambda, Glue, IAM, CloudWatch).
• Expert-level experience with StreamSets, including Data Collector, Transformer, and Control Hub.
• Strong Snowflake expertise, including data modeling, SnowSQL, and performance tuning.
• Medium-level experience with Kubernetes, including container orchestration and deployment.
• Working knowledge of Apache Airflow for workflow scheduling and monitoring.
• Experience with unit testing frameworks and practices in data engineering.
• Proven experience in building and managing ETL pipelines for both batch and real-time data.
• Strong command of SQL and scripting languages such as Python or Shell.
• Experience with CI/CD pipelines and version control tools (e.g., Git, Jenkins).

Preferred Qualifications:

• AWS certification (e.g., AWS Certified Data Analytics, Solutions Architect).
• Experience with data governance, security, and compliance frameworks.
• Familiarity with Agile methodologies and tools like Jira and Confluence.
• Prior experience in a leadership or mentoring role within a data engineering team.

Our Commitment to Diversity & Inclusion:

Our Perks and Benefits:

Our benefits and rewards program has been thoughtfully designed to recognize your skills and contributions, elevate your learning/upskilling experience, and provide care and support for you and your loved ones. As an Apexon Associate, you get continuous skill-based development, opportunities for career advancement, and access to comprehensive health and well-being benefits and assistance. We also offer:

• Group Health Insurance covering a family of 4
• Term Insurance and Accident Insurance
• Paid Holidays & Earned Leaves
• Paid Parental Leave
• Learning & Career Development
• Employee Wellness

Job Location: Bengaluru, India
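Pipeline sketch: the Airflow-based incremental ingestion mentioned in the responsibilities above could look roughly like the following. This is a minimal, hypothetical example, not Apexon's actual pipeline; the DAG id, connection id, stage, and table names are placeholders, and it assumes the apache-airflow-providers-snowflake package is installed.

# A minimal, hypothetical sketch of the kind of pipeline described above:
# an Airflow DAG that runs an hourly incremental COPY INTO load from an
# S3-backed stage into Snowflake. All identifiers are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="incremental_s3_to_snowflake",       # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@hourly",                 # incremental ingestion cadence
    catchup=False,
) as dag:
    # COPY INTO only loads files the stage has not already ingested,
    # so the task stays idempotent across hourly runs.
    load_incremental = SnowflakeOperator(
        task_id="load_incremental",
        snowflake_conn_id="snowflake_default",   # assumed Airflow connection
        sql="""
            COPY INTO analytics.events            -- hypothetical target table
            FROM @analytics.s3_events_stage       -- hypothetical external stage
            FILE_FORMAT = (TYPE = PARQUET)
            ON_ERROR = 'ABORT_STATEMENT';
        """,
    )

A one-time historical migration would typically be handled as a separate bulk load rather than as part of this scheduled DAG.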

Apexon

Technology / Digital Services

Jacksonville

501-1000 Employees

89 Jobs

    Key People

  • Shivendra Singh, CEO
  • Ravi K. N. Murthy, CTO
