Technogen
Technogen India Pvt. Ltd. is a boutique Talent & IT Solutions company. Founded in 2008, it has been serving global customers for nearly two decades. Talent Solutions: We assist several GCCs, global MNCs, and IT majors with their critical and unique IT talent needs through services spanning Recruitment Process Outsourcing (RPO), contract staffing, permanent hiring, Hire-Train-Deploy (HTD), Build-Operate-Transfer (BOT), and offshore staffing.
Job Title : Data Engineer and Problem Manager
Required Experience : 8+ years
Work Mode : WFO (4 days from office)
Shift Time : UK shift, 12:00 PM to 09:00 PM IST
Location : Hyderabad, India
- We are seeking a Data Engineer and Problem Manager, based out of our Technology & Innovation Center in Hyderabad, India, reporting to the IT Director for Enterprise Data and Analytics.
- The person in this role will be responsible for managing, monitoring, and maintaining scalable data integration and analytics pipelines to support enterprise reporting and data-driven decision-making.
- This role requires close collaboration with cross-functional teams to integrate data from various source systems into a centralized, cloud-based data warehouse, primarily leveraging tools such as Google BigQuery, Python, SQL, DBT, and Cloud Composer (Airflow).
- The Data Engineer will also be responsible for implementing data quality checks, managing orchestration workflows, and delivering business-ready datasets aligned with the enterprise data strategy.
What Your Impact Will Be:-
- Experience in Incident Management, Problem Management, and root cause analysis (RCA) activities.
- ITIL certification is preferred.
- Experience with O2C and R2P business processes.
- Monitor and analyze data integration pipelines that ingest structured and semi-structured data from enterprise systems (e.g., ERP, CRM, e-commerce, order management) into a centralized cloud data warehouse using Google BigQuery.
- Build analytics-ready pipelines that transform raw data into trusted, curated datasets for reporting, dashboards, and advanced analytics.
- Implement transformation logic using DBT to create modular, maintainable, and reusable data models that evolve with business needs.
- Apply BigQuery best practices, including partitioning, clustering, and query optimization, to ensure high performance and scalability (see the table-creation sketch after this list).
- Automate data workflows using Cloud Composer (Airflow), ensuring reliable execution, task dependency management, and timely data delivery (see the DAG sketch after this list).
- Develop efficient, reusable Python and SQL code for data ingestion, transformation, validation, and performance tuning across the pipeline lifecycle.
- Establish robust data quality checks and testing strategies to validate both technical accuracy and alignment with business logic (see the quality-gate sketch after this list).
- Collaborate with cross-functional teams, including data analysts, BI developers, and product owners, to understand integration needs and deliver impactful, business-aligned data solutions.
- Leverage modern ETL platforms such as Ascend.io, Databricks, Dataflow, or Fivetran to accelerate development and improve observability and orchestration.
- Contribute to technical documentation, CI/CD workflows, and monitoring processes to drive transparency, reliability, and continuous improvement across the data engineering ecosystem.
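To make the orchestration expectations concrete, here is a minimal sketch of the kind of Cloud Composer (Airflow) DAG this role would build and monitor. The DAG id, dataset names, schedule, and the staging stored procedure are illustrative assumptions, not details of any actual pipeline.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

default_args = {
    "owner": "data-engineering",      # hypothetical team name
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="erp_orders_daily_load",   # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 6 * * *",    # daily at 06:00 UTC
    catchup=False,
    default_args=default_args,
) as dag:
    # Land raw ERP order data in a staging dataset (assumed stored procedure).
    load_raw = BigQueryInsertJobOperator(
        task_id="load_raw_orders",
        configuration={
            "query": {
                "query": "CALL staging.sp_load_raw_orders()",
                "useLegacySql": False,
            }
        },
    )

    # Rebuild the curated table from staging once the raw load succeeds.
    build_curated = BigQueryInsertJobOperator(
        task_id="build_curated_orders",
        configuration={
            "query": {
                "query": (
                    "CREATE OR REPLACE TABLE curated.orders AS "
                    "SELECT * FROM staging.orders WHERE order_id IS NOT NULL"
                ),
                "useLegacySql": False,
            }
        },
    )

    # Task dependency management: curation runs only after ingestion succeeds.
    load_raw >> build_curated
```

The retries and the explicit task dependency are what make the schedule reliable; in practice each task would also feed monitoring so failures surface quickly.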
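The partitioning and clustering practices mentioned above can be illustrated with the official google-cloud-bigquery client. Project, dataset, and column names here are assumptions made for the sake of the example.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")  # hypothetical project

table = bigquery.Table(
    "my-analytics-project.curated.orders",   # hypothetical table
    schema=[
        bigquery.SchemaField("order_id", "STRING", mode="REQUIRED"),
        bigquery.SchemaField("customer_id", "STRING"),
        bigquery.SchemaField("order_date", "DATE", mode="REQUIRED"),
        bigquery.SchemaField("amount", "NUMERIC"),
    ],
)

# Partition by day on order_date so date-filtered queries scan only the
# partitions they need instead of the whole table.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="order_date",
)

# Cluster on customer_id to co-locate rows that are frequently filtered or
# joined together, cutting bytes scanned further.
table.clustering_fields = ["customer_id"]

client.create_table(table, exists_ok=True)
```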
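For the data quality checks, a minimal sketch of a SQL-based quality gate is below, reusing the hypothetical curated.orders table from the previous sketches. Raising on failure is what lets an orchestrator fail the task and block downstream loads.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")  # hypothetical project

# Each check maps a name to a query returning the count of violating rows.
CHECKS = {
    "orders_have_ids": "SELECT COUNT(*) FROM curated.orders WHERE order_id IS NULL",
    "no_negative_amounts": "SELECT COUNT(*) FROM curated.orders WHERE amount < 0",
}

def run_checks() -> None:
    failures = []
    for name, sql in CHECKS.items():
        # result() yields Row objects; the single column is the violation count.
        violations = list(client.query(sql).result())[0][0]
        if violations:
            failures.append(f"{name}: {violations} violating rows")
    if failures:
        # Raising fails the calling Airflow task and alerts the team.
        raise ValueError("Data quality checks failed: " + "; ".join(failures))

if __name__ == "__main__":
    run_checks()
```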
What We’re Looking For:-
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related technical field.
- 8+ years of hands-on experience in data engineering with a focus on data integrations, warehousing, and analytics pipelines.
- Hands-on experience troubleshooting and diagnosing problems, identifying root causes, and communicating findings to the development team.
- Techno-functional knowledge of ERP application integration in the O2C and R2P areas.
Hands-on experience with:-
- Google BigQuery as a centralized data warehousing and analytics platform.
- Python for scripting, data processing, and integration logic.
- SQL for data transformation, complex querying, and performance tuning.
- DBT for building modular, maintainable, and reusable transformation models.
- Airflow / Cloud Composer for orchestration, dependency management, and job scheduling.
- Solid understanding of ITIL Incident Management and Problem Management.
- Strong knowledge of data testing frameworks, validation methods, and best practices.
Preferred Skills (Optional):-
- Experience with Ascend.io or comparable ETL platforms such as Databricks, Dataflow, or Fivetran.
- Familiarity with data cataloging and governance tools like Collibra.
- Knowledge of CI/CD practices, Git-based workflows, and infrastructure automation tools.
- Exposure to event-driven or real-time streaming pipelines using tools like Pub/Sub or Kafka.
- Strong problem-solving and analytical mindset, with the ability to think broadly, identify innovative solutions, and quickly learn new technologies, programming languages, and frameworks.
- Excellent communication skills, both written and verbal.
- Ability to work in a fast-paced and collaborative environment.
- Good experience with Agile methodologies such as Scrum and Kanban, and with managing IT backlogs.
What It’s Like to Work Here:-
- We are a purpose-driven company aiming to empower the next generation to explore the wonder of childhood and reach their full potential. We live up to our purpose through the following behaviors:
- We collaborate: Being a part of our organization means being part of one team with shared values and common goals.
- Every person counts and working closely together always brings better results.
- Partnership is our process, and our collective capabilities are our superpowers.
- We innovate: We always aim to find new and better ways to create innovative products and experiences.
- No matter where you work in the organization, you can always make a difference and have a real impact.
- We welcome new ideas and value new initiatives that challenge conventional thinking.
- We execute: We are a performance-driven company. We strive for excellence and are focused on pursuing best-in-class outcomes.
- We believe in accountability and ownership and know that our people are at their best when they are empowered to create and deliver results.