Senior Data Engineer


Work Mode: Work from Office

Job Type: Full Time

Job Description

SUMMARY

Wissen Technology is hiring for Senior Data Engineer

 

About Wissen Technology:   

At Wissen Technology, we deliver niche, custom-built products that solve complex business challenges across industries worldwide. Founded in 2015, we build every engagement around a strong product engineering mindset, ensuring every solution is architected and delivered right the first time.

Today, Wissen Technology has a global footprint with 2,000+ employees across offices in the US, UK, UAE, India, and Australia. Our commitment to excellence translates into delivering 2X impact compared to traditional service providers. How do we achieve this? Through a combination of deep domain knowledge, cutting-edge technology expertise, and a relentless focus on quality. We don't just meet expectations; we exceed them by ensuring faster time-to-market, reduced rework, and greater alignment with client objectives.

We have a proven track record of building mission-critical systems across industries, including financial services, healthcare, retail, manufacturing, and more. Wissen stands apart through its unique delivery models: our outcome-based projects ensure predictable costs and timelines, while our agile pods give clients the flexibility to adapt to their evolving business needs. Wissen leverages its thought leadership and technology prowess to drive superior business outcomes.

Our success is powered by top-tier talent. Our mission is clear: to be the partner of choice for building world-class custom products that deliver exceptional impact the first time, every time.

 

Job Summary: We are seeking an experienced and passionate Senior Data Engineer to join our dynamic data engineering team. The ideal candidate will have deep expertise in building scalable data pipelines and distributed data processing systems using Python, Apache Spark, Kafka, and Snowflake/Databricks. This role involves designing, developing, and optimizing high-performance data solutions that enable advanced analytics and business intelligence across the organization.

 

Experience: 7–12 years

Location: Bangalore

Mode of Work: Full-time, hybrid

 

Key Responsibilities:  

  • Design, build, and maintain robust, scalable data pipelines using Python and Apache Spark. 
  • Integrate and process large volumes of data from diverse sources using Kafka and other streaming technologies.  
  • Develop and optimize ETL/ELT workflows for structured and unstructured data. 
  • Work with Snowflake and Databricks to enable efficient data storage, transformation, and analysis. 
  • Implement data quality, validation, and monitoring frameworks to ensure accuracy and reliability. 
  • Collaborate with data scientists, analysts, and business stakeholders to translate requirements into scalable data solutions. 
  • Optimize data workflows for performance, scalability, and cost efficiency in cloud environments (AWS/Azure). 
  • Stay current with emerging technologies in data engineering and contribute to continuous improvement initiatives. 
  • Ensure compliance with data governance, security, and privacy standards. 

 

Requirements:  

  • 7–8 years of hands-on experience in data engineering or a related technical field. 
  • Strong programming proficiency in Python for data manipulation, automation, and pipeline development. 
  • Proven experience with Apache Spark or other distributed data processing frameworks (e.g., Hadoop). 
  • Hands-on experience with Snowflake and/or Databricks for cloud-based data warehousing and analytics. 
  • Experience with Kafka or similar message queues/streaming platforms. 
  • Familiarity with cloud data platforms such as AWS or Azure and their associated data services (e.g., S3, Glue, Data Factory). 
  • Solid understanding of data warehousing concepts, ETL design patterns, and data modeling. 
  • Strong problem-solving, analytical, and debugging skills. 
  • Excellent communication, collaboration, and stakeholder management skills. 
  • Ability to work effectively both independently and in geographically distributed teams. 

Good-to-have skills: 

  • Experience with workflow orchestration tools such as Airflow or Prefect. 
  • Knowledge of CI/CD pipelines for data engineering and DevOps practices. 
  • Familiarity with containerization (Docker, Kubernetes). 
  • Exposure to real-time analytics and data lakehouse architectures.  
  • Experience in financial services, e-commerce, or large-scale enterprise data ecosystems. 

 

Wissen Sites:  

Website: www.wissen.com

LinkedIn: https://www.linkedin.com/company/wissen-technology

Wissen Leadership: https://www.wissen.com/company/leadership-team/

Wissen Live: https://www.linkedin.com/company/wissen-technology/posts/?feedView=All
Wissen Thought Leadership: https://www.wissen.com/articles/

 

 

 

