Job Description
About The Role
Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Google BigQuery
Good-to-have skills: Microsoft SQL Server
Minimum experience: 5 years
Educational qualification: 15 years of full-time education
Summary: Seeking a forward-thinking professional with an AI-first mindset to design, develop, and deploy enterprise-grade solutions using Generative and Agentic AI frameworks that drive innovation, efficiency, and business transformation. As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in enhancing the overall data infrastructure and ensuring that data is accessible and reliable for decision-making.
Project Role: Analytics and Modeler
Project Role Description: Analyze and model client, market, and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making.
Must-have skills: Google BigQuery
Good-to-have skills: No technology specialization
Roles & Responsibilities:
- Lead AI-driven solution design and delivery by applying GenAI and Agentic AI to address complex business challenges, automate processes, and integrate intelligent insights into enterprise workflows for measurable impact.
1. Dataproc, Pub/Sub, Dataflow, Kafka streaming, Looker, SQL (No FLEX)
2. Proven track record of delivering data integration and data warehousing solutions
3. Strong hands-on SQL skills (No FLEX)
4. Experience with data integration and migration projects
5. Proficiency in the BigQuery SQL language (No FLEX)
6. Understanding of cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, and Kubernetes
7. Experience with cloud solutions, mainly data platform services; GCP certifications
8. Experience in shell scripting, Python (No FLEX), Oracle, and SQL
Professional & Technical Skills:
- Strong grasp of Generative and Agentic AI, prompt engineering, and AI evaluation frameworks; ability to align AI capabilities with business objectives while ensuring scalability, responsible use, and tangible value realization.
1. Expert in Python (No FLEX); strong hands-on knowledge of SQL (No FLEX) and Python programming with Pandas and NumPy; deep understanding of data structures such as dictionaries, arrays, lists, and trees; experience with pytest and code coverage preferred
2. Strong hands-on experience building solutions with cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, and Kubernetes (No FLEX)
3. Proficiency with tools to automate Azure DevOps (AZDO) CI/CD pipelines, such as Control-M, GitHub, JIRA, and Confluence
4. Open mindset and ability to quickly adopt new technologies
5. Performance tuning of BigQuery SQL scripts
6. GCP certification preferred
7. Experience working in an agile environment
Professional Attributes:
1. Good communication skills
2. Ability to collaborate with different teams and suggest solutions
3. Ability to work independently with little supervision or as part of a team
4. Good analytical and problem-solving skills
5. Good team-handling skills
Educational Qualification: 15 years of full-time education
Additional Information: Candidate should be ready for Shift B and to work as an individual contributor.