Wissen Technology is Hiring for Data Engineer
About Wissen Technology:
At Wissen Technology, we deliver niche, custom-built products that solve complex business challenges across industries worldwide. Founded in 2015, our core philosophy is built around a strong product engineering mindset—ensuring every solution is architected and delivered right the first time. Today, Wissen Technology has a global footprint with 2000+ employees across offices in the US, UK, UAE, India, and Australia.

Our commitment to excellence translates into delivering 2X impact compared to traditional service providers. How do we achieve this? Through a combination of deep domain knowledge, cutting-edge technology expertise, and a relentless focus on quality. We don’t just meet expectations—we exceed them by ensuring faster time-to-market, reduced rework, and greater alignment with client objectives. We have a proven track record of building mission-critical systems across industries, including financial services, healthcare, retail, manufacturing, and more.

Wissen stands apart through its unique delivery models. Our outcome-based projects ensure predictable costs and timelines, while our agile pods give clients the flexibility to adapt to their evolving business needs. Wissen leverages its thought leadership and technology prowess to drive superior business outcomes. Our success is powered by top-tier talent.

Our mission is clear: to be the partner of choice for building world-class custom products that deliver exceptional impact—the first time, every time.
Job Summary:
As a Data Development Engineer, you will be a core member of an elite team responsible for designing, developing, and scaling high-performance, data-intensive applications. This role demands deep technical expertise, particularly in Python and big data ecosystems such as Apache Spark, along with a strong understanding of modern data pipelines and cloud platforms. You will also have the opportunity to evolve into a technical leadership position within the Data Initiative.
Experience:
5-8 Years
Location:
Bangalore
Mode of work:
Full-time
Key Responsibilities:
- Collaborate with the team to define and implement high-level technical architecture for backend services and data monetization components.
- Design, develop, and enhance features in scalable data applications and services.
- Develop technical documentation, data flow diagrams, and architectural designs.
- Partner with QA, DevOps, Data Engineering, and Product teams for deployment, testing, training, and production support.
- Build and maintain robust integrations with enterprise data platforms and tools (e.g., Databricks, Kafka).
- Write clean, efficient, and testable Python and PySpark code.
- Ensure compliance with development, coding, security, and privacy standards.
- Proactively learn and adapt to new technologies based on evolving business needs.
- Mentor junior developers and contribute to establishing best practices.
Qualifications:
- 5+ years of hands-on Python development experience, specifically in data-intensive environments.
- Strong expertise in Apache Spark and PySpark for distributed data processing.
- Proficient in SQL and query optimization, with experience in relational databases (e.g., Oracle) and Spark SQL.
- Solid understanding of software development lifecycle (SDLC) and agile methodologies.
- Proven experience in writing unit, integration, and performance tests for data pipelines.
- Hands-on experience with Databricks and large-scale data environments.
- Deep understanding of data pipelines, including data engineering workflows, data lineage, transformation, and quality frameworks.
- Familiarity with AWS (or other cloud providers) for deploying and managing data infrastructure.
- Excellent communication skills and a strong sense of ownership and accountability.
Good-to-Have Skills:
- Experience in foreign exchange (FX) or capital markets is highly desirable.
- Knowledge of modern data serialization formats (e.g., AVRO, Parquet).
- Experience with Apache Kafka and real-time data streaming.
- Familiarity with Apache Airflow or other orchestration tools.
- Comfort working in Linux environments and scripting.
- Exposure to data governance, compliance, and data security best practices.
Wissen Sites:
Website: www.wissen.com
LinkedIn: https://www.linkedin.com/company/wissen-technology
Wissen Leadership: https://www.wissen.com/company/leadership-team/