Experience: 10+ Years

The role is to design and implement scalable data platform solutions for analytics and modern data use cases. It involves end-to-end pipeline building, data modeling, and performance optimization on Snowflake, working closely with cloud platforms and cross-functional engineering teams on the Pacific Data Platform. The engineer develops data pipelines, optimizes database performance, and ensures data integrity and security, collaborating with other data engineers, application engineers, and analysts to support data-driven decision-making and to leverage advanced technologies that enhance data processing capabilities. Key responsibilities include data modeling and database design, ETL processes, and implementing best practices for data management; strong programming skills, experience with SQL, and proficiency in cloud platforms are essential. The candidate must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products, and should be excited by the prospect of automating data processes and optimizing, or even re-designing, the company's data architecture to support the next generation of products and data initiatives.

Job Title: Senior Snowflake Data Engineer
Job Location: Hyderabad
Start Date: As soon as possible

Key Responsibilities
‒ Design and implement scalable data architectures to support data storage, processing, and analytics.
‒ Build scalable Snowflake-based data pipelines.
‒ Integrate diverse structured and unstructured data sources.
‒ Optimize Snowflake performance through micro-partitioning, clustering, and caching (see the sketch after this list).
‒ Develop, maintain, and optimize ETL (Extract, Transform, Load) processes for the Pacific Data Platform.
‒ Manage and optimize database and data warehouse systems such as Snowflake, ensuring high availability and performance.
‒ Work closely with senior Snowflake data engineers to create and maintain data models that accurately represent business requirements and facilitate efficient data storage and retrieval.
‒ Analyze and tune database performance, identify bottlenecks, and implement improvements that enhance query performance.
‒ Ensure data integrity, consistency, and accuracy through rigorous data quality checks and validations.
‒ Document data processes, architectures, and workflows while establishing best practices for data management and engineering.
‒ Set up monitoring solutions to track data pipelines and database performance, ensuring timely maintenance and fault resolution.
‒ Quickly analyze existing SQL code and improve it to enhance performance, take advantage of new SQL features, close security gaps, and increase the robustness and maintainability of the code.
‒ Implement data security measures and ensure compliance with relevant regulations on data protection and privacy.
‒ Provide guidance and mentorship to junior data engineers, fostering a culture of learning and continuous improvement.
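To illustrate the performance work above: a minimal sketch in Snowflake SQL, assuming a hypothetical EVENTS table that is queried mostly by date range. The table and column names are illustrative assumptions, not part of this posting.

    -- Define a clustering key so Snowflake co-locates rows by date,
    -- letting date-range queries prune micro-partitions instead of
    -- scanning the whole table (EVENTS and EVENT_DATE are hypothetical).
    ALTER TABLE EVENTS CLUSTER BY (EVENT_DATE);

    -- Inspect clustering quality on that key; a high average depth means
    -- many overlapping micro-partitions and therefore weaker pruning.
    SELECT SYSTEM$CLUSTERING_INFORMATION('EVENTS', '(EVENT_DATE)');

Whether an explicit clustering key pays off depends on table size and query patterns; for small or uniformly queried tables, Snowflake's natural micro-partitioning combined with result caching is often sufficient.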
Key Qualifications
Experience: 10+ years preferred.
‒ Bachelor's or master's degree in Computer Science, Information Technology, or a related field, with at least 8+ years of software development experience.
‒ Expert knowledge of Snowflake with strong programming experience in SQL.
‒ Knowledge of Oracle, PostgreSQL, or any other database is a plus.
‒ Competence in data preparation and/or ETL tools such as SnapLogic or Azure Data Factory to build and maintain data pipelines and flows; a transformation tool like dbt is a plus.
‒ Proficiency in cloud platforms for data engineering.
‒ Programming experience in Python and shell scripting (bash/zsh, grep/sed/awk, etc.).
‒ Deep knowledge of databases, stored procedures, and optimization over very large data sets.
‒ In-depth knowledge of ingestion techniques, data cleaning, de-duplication, and partitioning (see the sketch at the end of this posting).
‒ Experience building the infrastructure required for data ingestion and analytics.
‒ Solid understanding of normalization and denormalization of data, database exception handling, transactions, query profiling, performance counters, debugging, and database and query optimization techniques.
‒ Familiarity with SQL security techniques such as column-level data encryption, Transparent Data Encryption (TDE), signed stored procedures, and assignment of user permissions.
‒ Experience understanding source data from various platforms and mapping it into Entity Relationship (ER) models for data integration and reporting.
‒ Knowledge of data visualization tools (e.g., Tableau, Power BI) is a plus.
‒ Exposure to source control tools such as Git and Azure DevOps.
‒ Understanding of Agile methodologies (Scrum, Kanban).
‒ Experience with CI/CD automation tools; Terraform is a plus.

Personal Strengths
‒ Very good communication skills.
‒ Ability to fit easily into a distributed development team.
‒ Ability to manage timelines for multiple initiatives.
‒ Ability to articulate insights from data and help business teams make decisions.
‒ Ability to work with ambiguous requirements, seek clarity around uncertainty, and manage risks.
‒ Ability to communicate complex concepts to non-data audiences.
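As referenced in the qualifications above, here is a minimal sketch of de-duplication during data cleaning in Snowflake SQL. It assumes a hypothetical RAW_ORDERS staging table; the table name, business key, and timestamp column are illustrative assumptions, not part of this posting.

    -- Keep only the most recently loaded row per business key, using
    -- Snowflake's QUALIFY clause to filter on a window function
    -- (RAW_ORDERS, ORDER_ID, and LOADED_AT are hypothetical names).
    CREATE OR REPLACE TABLE ORDERS_CLEAN AS
    SELECT *
    FROM RAW_ORDERS
    QUALIFY ROW_NUMBER() OVER (
        PARTITION BY ORDER_ID        -- key that defines a duplicate
        ORDER BY LOADED_AT DESC      -- prefer the latest load
    ) = 1;

A pattern like this can also be embedded in a dbt model so that de-duplication runs as part of the transformation layer.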