Job Description
About The Role
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 12 years of experience is required.
Educational Qualification: 15 years of full-time education
Summary: Seeking a forward-thinking professional with an AI-first mindset to design, develop, and deploy enterprise-grade solutions using Generative and Agentic AI frameworks that drive innovation, efficiency, and business transformation. As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in shaping the data platform components.
Roles and Responsibilities:
- Lead AI-driven solution design and delivery by applying GenAI and Agentic AI to address complex business challenges, automate processes, and integrate intelligent insights into enterprise workflows for measurable impact.
- Lead the design, development, and optimization of complex data pipelines, ensuring high performance and scalability using PySpark, Spark, and Big Data technologies.
- Architect data engineering solutions across major cloud platforms such as AWS, Azure, or GCP, enabling smooth and secure data ingestion, transformation, and storage.
- Lead efforts to design and build modular, reusable ETL/ELT pipelines that integrate data from various sources into cloud-based data lakes and data warehouses.
- Collaborate closely with cross-functional teams, including data scientists, analysts, and other engineering teams, to develop and maintain large-scale data processing systems.
- Guide and mentor junior engineers in best practices for data engineering, cloud technologies, and big data solutions.
- Optimize and tune Spark and big data processing jobs to handle high-volume data efficiently and at scale.
- Ensure data security, privacy, and governance by enforcing best practices and policies across cloud platforms.
- Implement automated data testing and monitoring solutions to ensure data quality and pipeline reliability.
- Drive initiatives for real-time data processing, batch jobs, and analytics pipelines to support both operational and analytical needs.
- Advocate for continuous improvement, researching new tools and technologies to drive data engineering excellence.

Required Skills & Qualifications:
- Strong grasp of Generative and Agentic AI, prompt engineering, and AI evaluation frameworks, with the ability to align AI capabilities with business objectives while ensuring scalability, responsible use, and tangible value realization.
- A minimum of 12 years of experience with the Databricks Unified Data Analytics Platform.
- Proven experience with AWS, Azure, or GCP in designing data engineering workflows and solutions.
- Expert-level knowledge of PySpark, Spark, and Big Data technologies.
- Extensive experience in data pipeline architecture, design, and optimization.
- Strong hands-on experience with large-scale ETL/ELT processes and integrating data into cloud-based platforms such as S3, Blob Storage, BigQuery, and Redshift.
- Proficiency in Python, SQL, and scripting for data pipeline development.
- Ability to work with real-time data streaming tools such as Kafka, as well as batch processing tools.
- Experience leading cross-functional teams in implementing cloud-based data platforms and workflows.
- Solid understanding of data security, compliance, and governance in a cloud environment.

Nice to Have:
- Familiarity with dbt (Data Build Tool) for managing data transformation and data quality.
- Exposure to CI/CD pipelines, DevOps practices, and containerization technologies (Docker, Kubernetes).
- Certifications or experience with data tools such as Apache Airflow, Fivetran, Informatica, or Talend.
- Experience with analytics and BI tools such as Power BI, Tableau, or Looker for visualization over cloud-based data platforms.