
1123 Snowflake Jobs - Page 21

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

4.0 - 7.0 years

6 - 10 Lacs

Gurugram

Work from Office


Public Services Industry Strategist

Join our Strategy team for an exciting career opportunity working on the industry strategy agenda of our most strategic clients across the globe!

Practice: Industry Strategy, Global Network (GN)
Areas of Work: Strategy experience in the Public Services industry - Operating Model and Organization Design, Strategic Roadmap Design, Citizen Experience, Business Case Development (incl. Financial Modelling), Transformation Office, Sustainability, Digital Strategy, Data Strategy, Gen AI, Cloud Strategy, Cost Optimization Strategy
Domain: Public Services - Social Services, Education, Global Critical Infrastructure Services, Revenue, Post & Parcel
Level: Consultant
Location: Gurgaon, Mumbai, Bengaluru, Chennai, Kolkata, Hyderabad & Pune
Years of Exp: 4-7 years of strategy experience post MBA from a Tier 1 institute

Explore an Exciting Career at Accenture
Do you believe in creating an impact? Are you a problem solver who enjoys working on transformative strategies for global clients? Are you passionate about being part of an inclusive, diverse, and collaborative culture? Then this is the right place for you! Welcome to a host of exciting global opportunities in Accenture Strategy.

The Practice - A Brief Sketch:
The GN Strategy Industry Group is part of Accenture Strategy and focuses on CXOs' most strategic priorities. We help clients with strategies that sit at the intersection of business and technology, drive value and impact, shape new businesses, and design operating models for the future.

As a part of this high-performing team, you will:
- Apply advanced corporate finance to drive value using financial levers, value case shaping, and feasibility studies to evaluate new business opportunities
- Analyze competitive benchmarking to advise the C-suite on 360-degree value opportunities, use scenario planning to solve complex C-suite questions, and lead and enable strategic conversations
- Identify strategic cost take-out opportunities, drive business transformation, and suggest value-based decisions based on insights from data
- Apply advanced data analyses to unlock client value aligned with the client's business strategy
- Build future-focused points of view and develop strategic ecosystem partners
- Build client strategy definitions leveraging disruptive technology solutions such as Data & AI (including Gen AI) and Cloud
- Build relationships with C-suite executives and be a trusted advisor enabling clients to realize the value of human-centered change
- Create thought leadership in industry/functional areas, reinvention agendas, solution tablets, and assets for value definition, and use it, along with your understanding of the industry value chain and macroeconomic analyses, to inform clients' strategy
- Partner with CXOs to architect future-proof operating models embracing the Future of Work, Workforce, and Workplace powered by transformational technology, ecosystems, and analytics
- Work with our ecosystem partners to help clients reach their sustainability goals through digital transformation
- Prepare and deliver presentations to clients to communicate strategic plans and recommendations on Public Services domains such as Digital Citizen, Public Infrastructure, Smart Buildings, and Net Zero
- Monitor industry trends and keep clients informed of potential opportunities and threats

Public Service Experience: The candidate must have exposure to core-strategy projects in the Public Services domain, with a focus on at least one of the following sub-industries:
- Social Services (Employment, Pensions, Education, Child Welfare, Government as a Platform, Digital Citizen Services)
- Education
- Global Critical Infrastructure Services (Urban & City Planning, Smart Cities, High-Performing City Operating Model)
- Admin (Citizen Experience, Federal Funds Strategy, Workforce Strategy, Intelligent Back Office, Revenue Industry Strategy, Post & Parcel)

Strategy Skills and Mindsets Expected:
- A strategic mindset to shape innovative, fact-based strategies and operating models
- Communication and presentation skills to hold influential C-suite dialogues, narratives, and conversations, and to share ideas
- Ability to solve problems in unstructured scenarios, and to decode and solve complex, unstructured business questions
- An analytical and outcome-driven approach to performing data analyses, generating insights, and applying those insights for strategic outcomes
- Value-driven business acumen to drive actionable outcomes for clients using the latest industry trends, innovations, disruptions, metrics, and value drivers
- Financial acumen and value-creation skills to develop relevant financial models that back up a business case
- Articulation of strategic and future vision
- Ability to identify technology disruptions in the Public Services industry

What's in it for you?
- An opportunity to work on transformative projects with key G20 clients and CxOs
- Potential to co-create with leaders in strategy, industry experts, enterprise function practitioners, and business intelligence professionals to shape and recommend innovative solutions that leverage emerging technologies
- Ability to embed responsible business into everything - from how you serve your clients to how you operate as a responsible professional
- Personalized training modules to develop your strategy and consulting acumen and grow your skills, industry knowledge, and capabilities
- Opportunity to thrive in a culture committed to accelerating equality for all
- Boundaryless collaboration across the entire organization

About Accenture:
Accenture is a leading global professional services company, providing a broad range of services and solutions in strategy, consulting, digital, technology, and operations. Combining unmatched experience and specialized skills across more than 40 industries and all business functions, underpinned by the world's largest delivery network, Accenture works at the intersection of business and technology to help clients improve their performance and create sustainable value for their stakeholders. With more than 732,000 people serving clients in more than 120 countries, Accenture drives innovation to improve the way the world works and lives.

About Accenture Strategy & Consulting:
Accenture Strategy shapes our clients' future, combining deep business insight with an understanding of how technology will impact industry and business models. Our focus on issues such as digital disruption, redefining competitiveness, operating and business models, and the workforce of the future helps our clients find future value and growth in a digital world. Today, digital is changing the way organizations engage with their employees, business partners, customers, and communities. This is our unique differentiator.

To bring this global perspective to our clients, Accenture Strategy's services include those provided by our Global Network, a distributed management consulting organization that provides management consulting and strategy expertise across the client lifecycle. Our Global Network teams complement our in-country teams to deliver cutting-edge expertise and measurable value to clients all around the world.

Accenture Global Network | Accenture in One Word
At the heart of every great change is a great human. If you have ideas, ingenuity, and a passion for making a difference, come and be a part of our team.

Posted 2 weeks ago


3.0 - 7.0 years

17 - 20 Lacs

Bengaluru

Work from Office


Job Title: Industry & Function AI Data Engineer - S&C GN
Management Level: 09 - Consultant
Location: Primary - Bengaluru; Secondary - Gurugram
Must-Have Skills: Data engineering expertise; cloud platforms (AWS, Azure, GCP); proficiency in Python, SQL, PySpark, and ETL frameworks
Good-to-Have Skills: LLM architecture; containerization tools (Docker, Kubernetes); real-time data processing tools (Kafka, Flink); certifications such as AWS Certified Data Analytics - Specialty, Google Professional Data Engineer, Snowflake, DBT, etc.

Job Summary:
As a Data Engineer, you will play a critical role in designing, implementing, and optimizing data infrastructure to power analytics, machine learning, and enterprise decision-making. Your work will ensure that high-quality, reliable data is accessible for actionable insights. This involves leveraging technical expertise, collaborating with stakeholders, and staying updated with the latest tools and technologies to deliver scalable and efficient data solutions.

Roles & Responsibilities:
- Build and maintain data infrastructure: Design, implement, and optimize scalable data pipelines and systems for seamless ingestion, transformation, and storage of data
- Collaborate with stakeholders: Work closely with business teams, data analysts, and data scientists to understand data requirements and deliver actionable solutions
- Leverage tools and technologies: Use Python, SQL, PySpark, and ETL frameworks to manage large datasets efficiently
- Cloud integration: Develop secure, scalable, and cost-efficient solutions using cloud platforms such as Azure, AWS, and GCP
- Ensure data quality: Focus on data reliability, consistency, and quality using automation and monitoring techniques
- Document and share best practices: Create detailed documentation, share best practices, and mentor team members to promote a strong data culture
- Continuous learning: Stay updated with the latest tools and technologies in data engineering through professional development opportunities

Professional & Technical Skills:
- Strong proficiency in programming languages such as Python, SQL, and PySpark
- Experience with cloud platforms (AWS, Azure, GCP) and their data services
- Familiarity with ETL frameworks and data pipeline design
- Strong knowledge of traditional statistical methods and basic machine learning techniques
- Knowledge of containerization tools (Docker, Kubernetes)
- Familiarity with LLM, RAG, and agentic AI architectures
- Certification in data science or related fields (e.g., AWS Certified Data Analytics - Specialty, Google Professional Data Engineer)

Additional Information:
The ideal candidate has a robust educational background in data engineering or a related field and a proven track record of building scalable, high-quality data solutions in the Consumer Goods sector. This position offers opportunities to design and implement cutting-edge data systems that drive business transformation, to collaborate with global teams to solve complex data challenges and deliver measurable business outcomes, and to enhance your expertise by working on innovative projects using the latest technologies in cloud, data engineering, and AI.

Qualifications:
- Experience: Minimum 3-7 years in data engineering or related fields, with a focus on the Consumer Goods industry
- Educational Qualification: Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field
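The "ensure data quality" responsibility above typically means pairing transformation logic with validation rules. A minimal, hedged sketch of an ETL step with a quality gate; all field names, records, and rules here are invented for illustration, not taken from the posting:

```python
# Minimal ETL sketch: extract raw records, transform them, and quarantine
# records that fail validation. Field names and rules are illustrative.
def extract():
    # Stand-in for reading from object storage, Kafka, or a database.
    return [
        {"order_id": "1", "amount": "19.99", "country": "IN"},
        {"order_id": "2", "amount": "5.00", "country": "in"},
        {"order_id": "2", "amount": "5.00", "country": "in"},  # duplicate key
        {"order_id": "3", "amount": "oops", "country": "US"},  # malformed amount
    ]

def transform(rows):
    clean, rejected, seen = [], [], set()
    for row in rows:
        try:
            amount = float(row["amount"])
        except ValueError:
            rejected.append(row)          # quarantine malformed records
            continue
        if row["order_id"] in seen:       # deduplicate on the primary key
            continue
        seen.add(row["order_id"])
        clean.append({"order_id": row["order_id"],
                      "amount": round(amount, 2),
                      "country": row["country"].upper()})
    return clean, rejected

clean, rejected = transform(extract())
```

In a production pipeline the rejected records would be written to a dead-letter table and surfaced through monitoring rather than silently dropped.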

Posted 2 weeks ago


6.0 - 10.0 years

15 - 25 Lacs

Chennai

Work from Office


Job Title: Lead Data Scientist
Location: Chennai
Reports To: CEO

Job Summary:
We are seeking a highly skilled and experienced Data Scientist with 6+ years of hands-on experience in data science, analytics, and stakeholder engagement. The ideal candidate should have strong expertise in Python, Tableau, Snowflake, machine learning, and statistical testing, and should be comfortable driving business insights through storytelling and daily interactions with stakeholders.

Key Responsibilities:
- Design, build, and deploy scalable machine learning models to solve complex business problems
- Write and optimize complex SQL queries, particularly on the Snowflake platform
- Develop insightful dashboards and visualizations using Tableau
- Conduct data exploration, cleaning, and transformation using Python and R and relevant libraries (Pandas, NumPy, scikit-learn, etc.)
- Perform A/B testing and other statistical techniques
- Translate analytical insights into clear, compelling stories and recommendations for stakeholders
- Collaborate cross-functionally with product, engineering, and business teams to understand data needs and deliver solutions
- Present findings and recommendations to both technical and non-technical stakeholders regularly

Requirements:
- 6+ years of professional experience in data science or advanced analytics
- Strong proficiency in Python for data manipulation, analysis, and modelling
- Proven experience in building dashboards and reports using Tableau
- Expertise in writing complex SQL queries, especially on Snowflake
- Solid understanding of machine learning techniques, statistical tests, and model deployment best practices
- Excellent communication and storytelling skills to convey data-driven insights
- Comfortable working closely with stakeholders daily to gather requirements and present findings
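One of the statistical techniques this role names, A/B testing, is commonly run as a two-proportion z-test. A hedged sketch using only the standard library; the conversion counts are made-up example numbers:

```python
# Two-sided, two-proportion z-test for an A/B experiment.
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)            # pooled conversion rate
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))          # two-sided normal tail
    return z, p_value

# Control: 200/2000 converted (10%); variant: 260/2000 converted (13%).
z, p = two_proportion_ztest(200, 2000, 260, 2000)
significant = p < 0.05
```

In practice a library routine (e.g., from statsmodels or SciPy) would be used, but the arithmetic is exactly this.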

Posted 2 weeks ago


3.0 - 8.0 years

16 - 17 Lacs

Gurugram

Work from Office


Role: AI/ML Engineer
Location: Gurugram

We are seeking a highly skilled and innovative AI Data Engineer to join our Development team. In this role, you will design, develop, and deploy AI systems that can generate content, reason autonomously, and act as intelligent agents in dynamic environments.

Key Responsibilities:
- Design and implement generative AI models (e.g., LLMs, diffusion models) for text, image, audio, or multimodal content generation
- Develop agentic AI systems capable of autonomous decision-making, planning, and tool use in complex environments
- Integrate AI agents with APIs, databases, and external tools to enable real-world task execution
- Fine-tune foundation models for domain-specific applications using techniques like RLHF, prompt engineering, and retrieval-augmented generation (RAG)
- Collaborate with cross-functional teams including product, design, and engineering to bring AI-powered features to production
- Conduct research and stay up to date with the latest advancements in generative and agentic AI
- Ensure ethical, safe, and responsible AI development practices

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, AI, Machine Learning, or a related field
- 3+ years of experience in machine learning, with a focus on generative models or autonomous agents
- Proficiency in Python and ML frameworks such as PyTorch
- Experience with LLMs (e.g., GPT, Claude, LLaMA, Cortex), transformers, and diffusion models
- Familiarity with agent frameworks (e.g., LangChain, AutoGPT, ReAct, OpenAgents)
- Experience with AWS and Snowflake services
- Prior healthcare experience
- Strong understanding of reinforcement learning, planning algorithms, and multi-agent systems
- Excellent problem-solving and communication skills
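The RAG technique mentioned above pairs retrieval with generation: fetch the most relevant documents for a query, then condition the LLM's prompt on them. A toy sketch of the retrieval step using bag-of-words vectors and cosine similarity; real systems use learned embeddings and a vector store, and every document and name here is invented:

```python
# Minimal RAG retrieval sketch: rank toy documents against a query by
# cosine similarity of bag-of-words vectors, then build a prompt.
import math
from collections import Counter

DOCS = [
    "snowflake stores data in micro partitions",
    "pytorch is a framework for training neural networks",
    "kubernetes schedules containers across a cluster",
]

def embed(text):
    # Toy "embedding": word-count vector. Real RAG uses a trained encoder.
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=1):
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

context = retrieve("how does snowflake store data", DOCS)
prompt = f"Answer using this context: {context[0]}"
```

The generation step would then pass `prompt` to an LLM; only the retrieval half is shown here.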

Posted 2 weeks ago


1.0 - 3.0 years

4 - 8 Lacs

Bengaluru

Hybrid


Working Mode: Hybrid
Payroll: IDESLABS
Location: Pan India
PF detection is mandatory.

Job Description:

Snowflake Administration Experience:
- Managing user access, roles, and security protocols
- Setting up and maintaining database replication and failover procedures
- Setting up programmatic access

OpenSearch Experience:
- Deploying and scaling OpenSearch domains
- Managing security and access controls
- Setting up monitoring and alerting

General AWS Skills:
- Infrastructure as Code (CloudFormation); experience building cloud-native infrastructure, applications, and services on AWS and Azure
- Hands-on experience managing Kubernetes clusters (administrative knowledge), ideally AWS EKS and/or Azure AKS
- Experience with Istio or other service mesh technologies
- Experience with container technology and best practices, including container and supply chain security
- Experience with declarative infrastructure-as-code tools such as Terraform and Crossplane
- Experience with GitOps tools such as ArgoCD
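The first Snowflake administration duty, managing user access and roles, follows Snowflake's role-based access control pattern: grant privileges to a role, then grant the role to users. A sketch that generates the usual statements as strings; the role, database, warehouse, and user names are all invented:

```python
# Generate the standard Snowflake RBAC setup statements for a read-only
# analyst role. All object names are illustrative.
def rbac_statements(role, database, schema, warehouse, users):
    stmts = [
        f"CREATE ROLE IF NOT EXISTS {role};",
        f"GRANT USAGE ON DATABASE {database} TO ROLE {role};",
        f"GRANT USAGE ON SCHEMA {database}.{schema} TO ROLE {role};",
        f"GRANT SELECT ON ALL TABLES IN SCHEMA {database}.{schema} TO ROLE {role};",
        f"GRANT USAGE ON WAREHOUSE {warehouse} TO ROLE {role};",
    ]
    # Finally, grant the role to each user.
    stmts += [f"GRANT ROLE {role} TO USER {u};" for u in users]
    return stmts

ddl = rbac_statements("ANALYST_RO", "SALES_DB", "PUBLIC", "QUERY_WH",
                      ["alice", "bob"])
```

An administrator would run these in a worksheet or via the Snowflake connector; generating them programmatically also covers the posting's "setting up programmatic access" point.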

Posted 2 weeks ago


4.0 - 9.0 years

8 - 11 Lacs

Hyderabad, Bengaluru, Delhi / NCR

Hybrid


Job Title: DBT Admin
Location: Pan India

Job Summary:
DBT Administrator with a minimum of 8+ years of experience in IT and 4+ years of relevant experience in DBT. The ideal candidate will be responsible for managing and optimizing the DBT environment and deployment activities, ensuring efficient data transformation processes, and supporting data analytics initiatives.

Required Skills:
- Experience: DBT Administrator with a minimum of 8+ years of experience in IT and 4+ years of relevant experience in DBT
- Technical Skills: Proficiency in SQL, DBT Cloud, DBT Core, Jinja templating, CI/CD tools, and Git
- Analytical Skills: Strong analytical and problem-solving skills, with the ability to interpret complex data sets
- Communication: Excellent written and verbal communication skills, with the ability to collaborate effectively with cross-functional teams
- Attention to Detail: High level of accuracy and attention to detail in managing data and processes
- Certifications: Relevant certifications in DBT, SQL, or cloud platforms

Responsibilities:
- DBT Cloud and Core environment management: Install, configure, and maintain DBT environments, ensuring optimal performance and reliability at an enterprise level
- Data transformation: Develop, test, and deploy DBT models to transform raw data into actionable insights
- Performance tuning: Monitor and optimize DBT processes to improve performance and reduce execution time
- CI/CD pipeline management: Configure and manage CI/CD pipelines for seamless code deployment, particularly for DBT models and related scripts
- Version control: Enforce Git best practices, including branching strategies, version control, and merge request processes
- Upgrades and rollouts: Manage upgrades and rollouts of the DBT platform, ensuring minimal disruption and seamless integration of new features
- Collaboration: Work closely with data engineers, analysts, and other stakeholders to understand data requirements and deliver solutions
- Documentation: Maintain comprehensive documentation of DBT models, configurations, and processes
- Troubleshooting: Identify and resolve issues related to DBT processes and data transformations
- Security: Implement and manage security protocols to protect data integrity and privacy
- Training: Provide training and support to team members on DBT best practices and usage
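The Jinja templating skill listed above is central to dbt: models are SQL files where `{{ ref('...') }}` is compiled into a fully qualified table name (and into the dependency graph). A toy illustration of just that substitution step; the model, database, and schema names are invented, and real dbt does far more:

```python
# Toy illustration of dbt compiling {{ ref('...') }} in a model file into
# a fully qualified table name. Names are illustrative.
import re

MODEL_SQL = """
select order_id, sum(amount) as revenue
from {{ ref('stg_orders') }}
group by order_id
"""

def render(sql, database, schema):
    # Replace each ref('model') with database.schema.model.
    return re.sub(
        r"\{\{\s*ref\('([^']+)'\)\s*\}\}",
        lambda m: f"{database}.{schema}.{m.group(1)}",
        sql,
    )

compiled = render(MODEL_SQL, "ANALYTICS_DB", "DBT_PROD")
```

Because the target database and schema are injected at compile time, the same model deploys unchanged across dev, CI, and production environments, which is what the environment-management duties above manage.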

Posted 2 weeks ago


3.0 - 7.0 years

5 - 9 Lacs

Bengaluru

Work from Office


WHAT YOU WILL WORK ON
- Serve as a liaison between product, engineering, and data consumers by analyzing the data, finding gaps, and helping drive the roadmap
- Support and troubleshoot issues (data and process), identify root causes, and proactively recommend sustainable corrective actions by collaborating with engineering/product teams
- Communicate actionable insights using data, often for stakeholders and non-technical audiences
- Write technical specifications describing requirements for data movement, transformation, and quality checks

WHAT YOU BRING
- Bachelor's degree in Computer Science, MIS, other quantitative disciplines, or related fields
- 3-7 years of relevant analytical experience translating strategic vision into requirements and working with the best engineers, product managers, and data scientists
- Ability to conduct data analysis, develop and test hypotheses, and deliver insights with minimal supervision
- Experience identifying and defining KPIs using data for business areas such as Sales, Consumer Behaviour, and Supply Chain
- Exceptional SQL skills
- Experience with a modern visualization tool stack, such as Tableau, Power BI, or Domo
- Knowledge of open-source, big data, and cloud infrastructure such as AWS, Hive, Snowflake, and Presto
- Incredible attention to detail, with a structured problem-solving approach
- Excellent communication skills (written and verbal)
- Experience with agile development methodologies
- Experience with retail or ecommerce domains is a plus
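"Identifying and defining KPIs" means pinning down each metric as an unambiguous computation over the data. A hedged sketch with three common sales KPIs over made-up order records; the field names and numbers are invented for illustration:

```python
# Illustrative KPI definitions over toy sales records: total revenue,
# average order value (AOV), and repeat-customer rate.
orders = [
    {"customer": "c1", "amount": 120.0},
    {"customer": "c2", "amount": 80.0},
    {"customer": "c1", "amount": 50.0},
    {"customer": "c3", "amount": 150.0},
]

revenue = sum(o["amount"] for o in orders)
aov = revenue / len(orders)

# Repeat rate: share of customers with more than one order.
per_customer = {}
for o in orders:
    per_customer[o["customer"]] = per_customer.get(o["customer"], 0) + 1
repeat_rate = sum(1 for n in per_customer.values() if n > 1) / len(per_customer)
```

Writing the definition down this precisely (denominator included) is exactly what the technical specifications mentioned above capture before the metric is implemented in SQL or a dashboard.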

Posted 2 weeks ago


5.0 - 10.0 years

7 - 12 Lacs

Hyderabad

Hybrid


Immediate opening: Qlik Developer (Pan India, Contract)
Experience: 5+ years
Skill: Qlik developer
Notice Period: Immediate
Employment Type: Contract

Key Responsibilities:
- Design, develop, and maintain QlikView applications and dashboards
- Collaborate with business stakeholders to gather requirements and translate them into technical specifications
- Perform data analysis and create data models to support business intelligence initiatives
- Optimize QlikView applications for performance and scalability
- Provide technical support and troubleshooting for QlikView applications
- Ensure data accuracy and integrity in all QlikView applications
- Integrate Snowflake with QlikView to enhance data processing and analytics capabilities
- Stay updated with the latest QlikView features and best practices
- Conduct training sessions for end-users to maximize the utilization of QlikView applications

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field
- Proven experience of 2-5 years as a QlikView developer
- Strong knowledge of QlikView architecture, data modeling, and scripting
- Proficiency in SQL and database management
- Knowledge of Snowflake and its integration with QlikView
- Excellent analytical and problem-solving skills
- Ability to work independently and as part of a team
- Strong communication and interpersonal skills

Posted 2 weeks ago


8.0 - 13.0 years

5 - 8 Lacs

Hyderabad

Hybrid


Immediate opening: Snowflake (Pan India, Contract)
Experience: 8+ years
Skill: Snowflake
Notice Period: Immediate
Employment Type: Contract
Work Mode: WFO/Hybrid

Job Description: Snowflake Data Warehouse Lead (India; 8 to 10 years' experience):
- Lead the technical design and architecture of Snowflake platforms, ensuring alignment with customer requirements, industry best practices, and project objectives
- Conduct code reviews, ensure adherence to coding standards and best practices, and drive continuous improvement in code quality and performance
- Provide technical support, troubleshoot problems, and provide timely resolution of incidents, service requests, and minor enhancements for Snowflake platforms and related services
- Data lake and storage management: adding, updating, or deleting datasets in Snowflake; monitoring storage usage and handling capacity planning
- Strong communication and presentation skills
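The storage-monitoring and capacity-planning duty above boils down to aggregating per-table storage and flagging anything near a limit. A toy sketch; in Snowflake itself the input rows would come from a view such as ACCOUNT_USAGE.TABLE_STORAGE_METRICS, and the schemas, tables, sizes, and threshold below are all made up:

```python
# Toy storage-monitoring sketch: aggregate bytes per schema and flag
# schemas over a capacity threshold. All data is illustrative.
tables = [
    {"schema": "RAW", "table": "EVENTS", "bytes": 800 * 1024**3},
    {"schema": "RAW", "table": "LOGS", "bytes": 300 * 1024**3},
    {"schema": "MART", "table": "SALES", "bytes": 50 * 1024**3},
]

def storage_report(tables, limit_gb=1024):
    usage = {}
    for t in tables:
        usage[t["schema"]] = usage.get(t["schema"], 0) + t["bytes"]
    return {
        s: {"gb": b / 1024**3, "over_limit": b / 1024**3 > limit_gb}
        for s, b in usage.items()
    }

report = storage_report(tables)
```

A flagged schema would feed capacity planning: archiving cold data, tightening retention, or raising the budget before growth becomes an incident.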

Posted 2 weeks ago


5.0 - 10.0 years

15 - 19 Lacs

Pune

Work from Office


ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you'll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers and consumers, worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning, bold ideas, courage and passion to drive life-changing impact to ZS.

At ZS we honor the visible and invisible elements of our identities, personal experiences and belief systems - the ones that comprise us as individuals, shape who we are and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about.

What you'll do:
- Lead end-to-end projects using cloud technologies to solve complex business problems
- Provide technology expertise to maximize value for clients and project teams
- Drive a strong delivery methodology to ensure projects are delivered on time, within budget, and to the client's satisfaction
- Ensure technology solutions are scalable, resilient, and optimized for performance and cost
- Guide, coach, and mentor project team members for continuous learning and professional growth
- Demonstrate expertise, facilitation, and strong interpersonal skills in internal and client interactions
- Collaborate with ZS experts to drive innovation and minimize project risks
- Work globally with team members to ensure smooth project delivery
- Bring structure to unstructured work when developing business cases with clients
- Assist ZS leadership with business case development, innovation, thought leadership, and team initiatives

What you'll bring:
- Candidates must either be in their junior year of a Bachelor's degree or in their first year of a Master's degree specializing in Business Analytics, Computer Science, MIS, MBA, or a related field with academic excellence
- 5+ years of consulting experience in leading large-scale technology implementations
- Strong communication skills to convey technical concepts to diverse audiences
- Significant supervisory, coaching, and hands-on project management skills
- Extensive experience with major cloud platforms like AWS, Azure, and GCP
- Deep knowledge of enterprise data management, advanced analytics, process automation, and application development
- Familiarity with industry-standard products and platforms such as Snowflake, Databricks, Redshift, Salesforce, Power BI, and Cloud
- Experience in delivering projects using agile methodologies

Additional skills:
- Capable of managing a virtual global team for the timely delivery of multiple projects

Posted 2 weeks ago


7.0 - 12.0 years

15 - 19 Lacs

Pune

Work from Office


What You'll Do:
- Design and implement an enterprise data management strategy aligned with business processes, focusing on data model design, database development standards, and data management frameworks
- Develop and maintain data management and governance frameworks to ensure data quality, consistency, and compliance for different Discovery domains such as multi-omics, in vivo, ex vivo, and in vitro datasets
- Design and develop scalable cloud-based (AWS or Azure) solutions following enterprise standards
- Design robust data models for semi-structured/structured datasets using various modelling techniques
- Design and implement complex ETL data pipelines to handle various semi-structured/structured datasets coming from labs and scientific platforms
- Work with lab ecosystems (ELNs, LIMS, CDS, etc.) to build integration and data solutions around them
- Collaborate with various stakeholders, including data scientists, researchers, and IT, to optimize data utilization and align data strategies with organizational goals
- Stay abreast of the latest trends in data management technologies and introduce innovative approaches to data analysis and pipeline development
- Lead projects from conception to completion, ensuring alignment with enterprise goals and standards
- Communicate complex technical details effectively to both technical and non-technical stakeholders

What You'll Bring:
- Minimum of 7+ years of hands-on experience developing data management solutions that solve problems in the Discovery/Research domain
- Advanced knowledge of data management tools and frameworks, such as SQL/NoSQL, ETL/ELT tools, and data visualization tools across various private clouds
- Strong experience with cloud-based DBMS/data warehouse offerings: AWS Redshift, AWS RDS/Aurora, Snowflake, Databricks
- ETL tools: cloud-based tools
- Well versed in the cloud computing offerings of AWS and Azure
- Well aware of industry data security and governance norms
- Experience building API integration layers between multiple systems
- Hands-on experience with data platform technologies like Databricks, AWS, Snowflake, and HPC (certifications a plus)
- Strong programming skills in languages such as Python and R
- Strong organizational and leadership skills
- Bachelor's or Master's degree in Computational Biology, Computer Science, or a related field; a Ph.D. is a plus

Preferred/Good to Have:
- MLOps expertise leveraging ML platforms like Dataiku, Databricks, and SageMaker
- Experience with other technologies like data sharing (e.g., Starburst), data virtualization (Denodo), and API management (MuleSoft, etc.)
- Cloud Solution Architect certification (such as AWS SA Professional or others)

Perks & Benefits:
ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths, and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.

Travel:
Travel is a requirement at ZS for client-facing ZSers; the business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.

Considering applying?
At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.

To Complete Your Application:
Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE.

Find out more at www.zs.com

Posted 2 weeks ago

Apply

3.0 - 8.0 years

10 - 14 Lacs

Hyderabad, Chennai

Work from Office


Are you ready to make an impact at DTCC? Pay and Benefits: Competitive compensation, including base pay and annual incentive. Comprehensive health and life insurance and well-being benefits, based on location. Pension / Retirement benefits. Paid Time Off and Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being. DTCC offers a flexible/hybrid model of 3 days onsite and 2 days remote (Tuesdays, Wednesdays, and a day unique to each team or employee). The impact you will have in this role: The Data Quality and Integration role is a highly technical position considered a technical expert in system implementation, with an emphasis on providing design, ETL, data quality and warehouse modeling expertise. This role will be accountable for knowledge of capital development efforts. Performs at an experienced level the technical design of application components, builds applications and interfaces between applications, and understands data security, retention, and recovery. Can research technologies independently and recommend appropriate solutions. Contributes to technology-specific best practices and standards; contributes to success criteria from design through deployment, including reliability, cost-effectiveness, performance, data integrity, maintainability, reuse, extensibility, usability and scalability; provides expertise on significant application components, vendor products, program languages, databases, operating systems, etc.; completes the plan by building components, testing, configuring, tuning, and deploying solutions. The Software Engineer (SE) for Data Quality and Integration applies specific technical knowledge of data quality and data integration in order to assist in the design and construction of critical systems. The SE works as part of an AD project squad and may interact with the business, Functional Architects, and domain experts on related integrating systems.
The SE will contribute to the design of components or individual programs and participates fully in the construction and testing. This involves working with the Senior Application Architect and other technical contributors at all levels. This position contributes expertise to project teams through all phases, including post-deployment support. This means researching specific technologies and applications, and contributing to the solution design, supporting development teams, testing, troubleshooting, and production support. The SE must possess experience in integrating large volumes of data, efficiently and in a timely manner. This position requires working closely with the functional and governance functions, and more senior technical resources, reviewing technical designs and specifications, and contributing to cost estimates and schedules. What You'll Do: Technology Expertise: is a domain expert on one or more of programming languages, vendor products (specifically Informatica Data Quality and Informatica Data Integration Hub), DTCC applications, data structures, and business lines.
Platforms: works with Infrastructure partners to stand up development, testing, and production environments
Elaboration: works with the Functional Architect to ensure designs satisfy functional requirements
Data Modeling: reviews and extends data models
Data Quality Concepts: experience in data profiling, scorecards, monitoring, matching, cleansing; is aware of frameworks that promote concepts of isolation, extensibility, and extendibility
System Performance: contributes to solutions that satisfy performance requirements; constructs test cases and strategies that account for performance requirements; tunes application performance issues
Security: implements solutions and completes test plans, mentoring other team members in standard processes
Standards: is aware of technology standards and understands technical solutions need to be consistent with them
Documentation: develops and maintains system documentation
Is familiar with different software development methodologies (Waterfall, Agile, Scrum, Kanban)
Aligns risk and control processes into day-to-day responsibilities to monitor and mitigate risk; escalates appropriately
Educational background and work experience that includes mathematics and conversion of expressions into run-time executable code
Ensures own and team's practices support success across all geographic locations
Mitigates risk by following established procedures and monitoring controls, spotting key errors and demonstrating strong ethical behavior; helps roll out standards and policies to other team members
Financial industry experience, including trades, clearing and settlement
Education: Bachelor's degree or equivalent experience.
Talents Needed for Success: Minimum of 3+ years in Data Quality and Integration.
Basic understanding of Logical Data Modeling and Database design is a plus. Technical experience with multiple database platforms: Sybase, Oracle, DB2 and distributed databases like Teradata/Greenplum/Redshift/Snowflake containing high volumes of data. Knowledge of data management processes and standard methodologies preferred. Proficiency with Microsoft Office tools required. Supports team in managing client expectations and resolving issues on time. Technical skills highly preferred along with strong analytical skills.
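The Data Quality concepts this role names (profiling, matching, cleansing) reduce to a few mechanical steps that are easy to show in plain Python. The record layout and field names below are hypothetical, chosen only to illustrate the three steps.

```python
from collections import Counter

# Hypothetical customer records with typical quality issues.
records = [
    {"id": 1, "email": "a@x.com"},
    {"id": 2, "email": None},          # missing value
    {"id": 3, "email": " A@X.COM "},   # needs cleansing before matching
    {"id": 4, "email": "b@y.com"},
]

# Profiling: measure completeness of a field.
null_rate = sum(r["email"] is None for r in records) / len(records)

# Cleansing: normalise values before any matching (trim + lowercase).
def clean(email):
    return email.strip().lower() if email else None

# Matching: detect duplicates on the cleansed key.
counts = Counter(clean(r["email"]) for r in records if r["email"])
duplicates = [value for value, n in counts.items() if n > 1]
```

Tools like Informatica Data Quality package these same steps as configurable rules, scorecards and match strategies rather than hand-written code.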

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Hyderabad

Work from Office


Job Description: Must Have: T-SQL, SSIS, SSRS or Informatica PC, and data warehousing. Good to have: Snowflake. Good knowledge of T-SQL, including the ability to write stored procedures, views, functions etc. Good experience in designing, developing, unit testing and implementing data integration solutions using ETL in SSIS and the SSRS reporting platform. Experience with data warehousing concepts and enterprise data modeling techniques. Good knowledge of relational and dimensional database structures, theories, principles and best practices. Conduct thorough analysis of existing MSBI (Microsoft Business Intelligence) legacy applications and Informatica PC. Identify and document the functionalities, workflows, and dependencies of legacy systems. Create detailed mapping specifications for data integration and transformation processes. Collaborate with business stakeholders/architects and data modelers to understand their needs and translate them into technical documentation. Ensure accurate documentation of data sources, targets, and transformation rules. Perform data validation, cleansing, and analysis to ensure data accuracy and integrity. Update the design documents after successful code changes and testing. Provide deployment support. Possess good knowledge of Agile and Waterfall methodologies. Requirements: Bachelor's degree in Computer Science, Engineering, or a related field. Highly skilled at handling complex technical situations, with exceptional verbal and written communication skills. 5+ years' experience with understanding of data lifecycle, governance, and migration processes. 5+ years' experience with SSIS, SSRS (or Informatica PC) and MS SQL Server, T-SQL. 5+ years' experience with data warehouse technologies. 3+ years' experience with Agile methodologies (Scrum, Kanban, JIRA). Nice to have: experience in the wealth management domain.
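Migration work like the MSBI-to-target analysis described above usually ends with reconciliation: proving that source and migrated tables agree. A minimal row-count-plus-checksum reconciliation can be sketched as follows; SQLite stands in for SQL Server, and the table names are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Hypothetical legacy source table and migrated target table.
conn.execute("CREATE TABLE src_trades (id INTEGER, qty INTEGER)")
conn.execute("CREATE TABLE tgt_trades (id INTEGER, qty INTEGER)")
rows = [(1, 10), (2, 20), (3, 30)]
conn.executemany("INSERT INTO src_trades VALUES (?, ?)", rows)
conn.executemany("INSERT INTO tgt_trades VALUES (?, ?)", rows)

def profile(table):
    # Row count plus a simple aggregate "checksum" for the table.
    return conn.execute(
        f"SELECT COUNT(*), COALESCE(SUM(qty), 0) FROM {table}"
    ).fetchone()

tables_match = profile("src_trades") == profile("tgt_trades")
```

Real reconciliations extend the same idea with per-column aggregates or hash totals, but the pass/fail structure is identical.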

Posted 2 weeks ago

Apply

8.0 - 13.0 years

6 - 9 Lacs

Hyderabad

Work from Office


Data Engineering Team As a Lead Data Engineer for India, you will be accountable for leading the technical aspects of product engineering by being hands-on, working on the enhancement, maintenance and support of the product on which your team is working, within your technology area. You will be responsible for your own hands-on coding, provide the design thinking and design solutions, ensure the quality of your team's output, represent your team in product-level technical forums and ensure your team provides technical input to and aligns with the overall product road-map. How will you make an impact? You will work with Engineers in other technology areas to define the overall technical direction for the product in alignment with the Group's technology roadmap, standards and frameworks, with product owners and business stakeholders to shape the product's delivery roadmap and with support teams to ensure its smooth operation. You will be accountable for the overall technical quality of the work produced by India that is in line with the expectations of the stakeholders, clients and Group. You will also be responsible for line management of your team of Engineers, ensuring that they perform to the expected levels and that their career development is fully supported.
Key responsibilities
o Produce Quality Code
o Code follows team standards, is structured to ensure readability and maintainability and goes through review smoothly, even for complex changes
o Designs respect best practices and are favourably reviewed by peers
o Critical paths through code are covered by appropriate tests
o High-level designs / architectures align to wider technical strategy, presenting reusable APIs where possible and minimizing system dependencies
o Data updates are monitored and complete within SLA
o Technical designs follow team and group standards and frameworks, are structured to ensure reusability, extensibility and maintainability and go through review smoothly, even for complex changes
o Designs respect best practices and are favourably reviewed by peers
o High-level designs / architectures align to wider technical strategy, presenting reusable APIs where possible and minimizing system dependencies
o Estimates are consistently challenging, but realistic
o Most tasks are delivered within estimate
o Complex or larger tasks are delivered autonomously
o Sprint goals are consistently achieved
o Demonstrate commitment to continuous improvement of squad activities
o The product backlog is consistently well-groomed, with a responsible balance of new features and technical debt mitigation
o Other Engineers in the Squad feel supported in their development
o Direct reports have meaningful objectives recorded in Quantium's Performance Portal, and understand how those objectives relate to business strategy
o Direct reports' career aspirations are understood / documented, with action plans in place to move towards those goals
o Direct reports have regular catch-ups to discuss performance, career development and their ongoing happiness / engagement in their role
o Any performance issues are identified, documented and agreed, with realistic remedial plans in place
o Squad Collaboration
o People Management
o Produce Quality Technical Design
o Operate at a high level of productivity
Key activities
Build technical product/application engineering capability in the team that is in line with the Group's technical roadmap, standards and frameworks
Write polished code, aligned to team standards, including appropriate unit / integration tests
Review code and test cases produced by others, to ensure changes satisfy the associated business requirement, follow best practices, and integrate with the existing code-base
Provide constructive feedback to other team members on quality of code and test cases
Collaborate with other Lead / Senior Engineers to produce high-level designs for larger pieces of work
Validate technical designs and estimates produced by other team members
Merge reviewed code into release branches, resolving any conflicts that arise, and periodically deploy updates to production and non-production environments
Troubleshoot production problems and raise / prioritize bug tickets to resolve any issues
Proactively monitor system health and act to report / resolve any issues
Provide out-of-hours support for periodic ETL processes, ensuring SLAs are met
Work with business stakeholders and other leads to define and estimate new epics
Contribute to backlog refinement sessions, helping to break down each epic into a collection of smaller user stories that will deliver the overall feature
Work closely with Product Owners to ensure the product backlog is prioritized to maximize business value and manage technical debt
Lead work breakdown sessions to define the technical tasks required to implement each user story
Contribute to sprint planning sessions, ensuring the team takes a realistic but challenging amount of work into each sprint and each team member will be productively occupied
Contribute to the team's daily stand-up, highlighting any delays or impediments to progress and proposing mitigation for those issues
Contribute to sprint review and sprint retro sessions, to maintain a culture of continuous improvement within the team
Coach / mentor more junior Engineers to support their continuing development
Set and periodically review delivery and development objectives for direct reports
Identify each direct report's longer-term career objectives and, as far as possible, factor this into work assignments
Hold fortnightly catch-ups with direct reports to review progress against objectives, assess engagement and give them the opportunity to raise concerns about the product or team
Work through the annual performance review process for all team members
Conduct technical interviews as necessary to recruit new Engineers
The superpowers you'll be bringing to the team:
1. 8+ years of experience in designing, developing, and implementing end-to-end data solutions (storage, integration, processing, access) in Google Cloud Platform (GCP) or similar cloud platforms
2. Strong experience with SQL
3. Values delivering high-quality, peer-reviewed, well-tested code
4. Create ETL/ELT pipelines that transform and process terabytes of structured and unstructured data in real-time
5. Knowledge of DevOps functions and ability to contribute to CI/CD pipelines
6. Strong knowledge of data warehousing and data modelling techniques like dimensional modelling
7. Strong hands-on experience with BigQuery/Snowflake, Airflow/Argo, Dataflow, Data Catalog, Vertex AI, Pub/Sub etc. or equivalent products in other cloud platforms
8. Solid grip on programming languages like Python or Scala
9. Hands-on experience in manipulating Spark at scale with true in-depth knowledge of the Spark API
10. Experience working with stakeholders and mentoring experience for juniors in the team is good to have
11. Recognized as a go-to person for high-level designs and estimations
12. Experience working with source control tools (Git preferred) with good understanding of branching / merging strategies
13. Experience in Kubernetes and Azure will be an advantage
14. Understanding of GNU/Linux systems and Bash/scripting
15. Bachelor's degree in Computer Science, Information Technology or a related discipline
16. Comfortable working in a fast-moving, agile development environment
17. Excellent problem solving / analytical skills
18. Good written / verbal communication skills
19. Commercially aware, with the ability to work with a diverse range of stakeholders
20. Enthusiasm for coaching and mentoring junior engineers
21. Experience in leading teams, including line management responsibilities
What could your Quantium Experience look like? Working at Quantium will allow you to challenge your imagination. You will get to solve complex problems using rigor, precision and by asking great questions, but it also means you can think big, outside the box and push your problem-solving skills to the max. By joining the Quantium team, you'll get to:
Forge your path: So many of our team have moved around different teams or offices. You'll be in the driver's seat, and we empower you to make your career your own.
Find your kind: Embrace diversity and connect with your tribe (think foodies, dog lovers, readers, or runners).
Make an impact: Leave your mark. Your contributions resonate, regardless of your role or rank.
On top of the Quantium Experience, you will enjoy a range of great benefits that go beyond the ordinary. Some of these include:
Flexible work arrangements: Achieve work-life balance at your own pace with hybrid and flexible work arrangements.
Continuous learning: Our vision is empowering analytics talent to thrive. The Analytics Community fosters the development of individuals, thought leadership and technical excellence at Quantium through building strong connections, fostering collaboration, and co-creation of best practice.
Remote working: Embrace the opportunity to work outside of your assigned home location for up to 2 months every year.
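The dimensional modelling this role asks for centres on the star schema: facts (measures) keyed to dimensions (descriptive attributes). A minimal sketch, with SQLite standing in for BigQuery/Snowflake and an invented product/sales schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Dimension table: descriptive attributes keyed by a surrogate key.
conn.execute(
    "CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT)"
)
# Fact table: measures plus foreign keys into the dimensions.
conn.execute(
    "CREATE TABLE fact_sales (product_key INTEGER, units INTEGER, revenue REAL)"
)
conn.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                 [(1, "Widget", "hardware"), (2, "Gadget", "hardware")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                 [(1, 3, 30.0), (2, 1, 25.0), (1, 2, 20.0)])

# The typical analytical query: roll fact measures up to a dimension attribute.
category_revenue = conn.execute("""
    SELECT d.category, SUM(f.revenue)
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.category
""").fetchall()
```

The payoff of this shape is that every report is a join from one wide fact table to small dimensions, which columnar warehouses optimize well.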

Posted 2 weeks ago

Apply

2.0 - 5.0 years

7 - 11 Lacs

Pune

Work from Office


Provide expertise in analysis, requirements gathering, design, coordination, customization, testing and support of reports in the client's environment. Develop and maintain a strong working relationship with business and technical members of the team. Relentless focus on quality and continuous improvement. Perform root cause analysis of report issues. Development / evolutionary maintenance of the environment, performance, capability and availability. Assisting in defining technical requirements and developing solutions. Effective content and source-code management, troubleshooting and debugging. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Tableau Desktop Specialist; SQL - strong understanding of SQL for querying databases. Good to have: Python; Snowflake, statistics, ETL experience. Extensive knowledge of creating impactful visualizations using Tableau. Must have a thorough understanding of SQL and advanced SQL (joins and relationships). Must have experience in working with different databases and how to blend and create relationships in Tableau. Must have extensive knowledge of creating Custom SQL to pull desired data from databases. Troubleshooting capabilities to debug data controls. Preferred technical and professional experience: Capable of converting business requirements into a workable model. Good communication skills, willingness to learn new technologies, team player, self-motivated, positive attitude.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

9 - 13 Lacs

Gurugram

Work from Office


About the Role: Grade Level (for internal use): 10 The Team: As a member of the Data Transformation team you will work on building ML-powered products and capabilities to power natural language understanding, data extraction, information retrieval and data sourcing solutions for S&P Global Market Intelligence and our clients. You will spearhead development of production-ready AI products and pipelines while leading by example in a highly engaging work environment. You will work in a (truly) global team and are encouraged toward thoughtful risk-taking and self-initiative. The Impact: The Data Transformation team has already delivered breakthrough products and significant business value over the last 3 years. In this role you will be developing our next generation of new products while enhancing existing ones, aiming at solving high-impact business problems. What's in it for you: Be a part of a global company and build solutions at enterprise scale. Collaborate with a highly skilled and technically strong team. Contribute to solving high-complexity, high-impact problems. Key Responsibilities: Build production-ready data acquisition and transformation pipelines from ideation to deployment. Be a hands-on problem solver and developer, helping to extend and manage the data platforms. Apply best practices in data modeling and building ETL pipelines (streaming and batch) using cloud-native solutions. What We're Looking For: 3-5 years of professional software work experience. Expertise in Python and Apache Spark. OOP design patterns, test-driven development and enterprise system design. Experience building data processing workflows and APIs using frameworks such as FastAPI, Flask etc. Proficiency in API integration, experience working with REST APIs and integrating external and internal data sources. SQL (any variant, bonus if this is a big data variant). Linux OS (e.g. bash toolset and other utilities). Version control system experience with Git, GitHub, or Azure DevOps.
Problem-solving and debugging skills. Software craftsmanship, adherence to Agile principles and taking pride in writing good code. Techniques to communicate change to non-technical people. Nice to have: Core Java 17+, preferably Java 21+, and associated toolchain. DevOps with a keen interest in automation. Apache Avro. Apache Kafka. Kubernetes. Cloud expertise (AWS and GCP preferably). Other JVM-based languages, e.g. Kotlin, Scala. C#, in particular .NET Core. Data warehouses (e.g., Redshift, Snowflake, BigQuery). What's In It For You: Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets.
Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you, and your career, need to thrive at S&P Global.
Health & Wellness: Health care coverage designed for the mind and body.
Flexible Downtime: Generous time off helps keep you energized for your time on.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit https://spgbenefits.com/benefit-summaries
Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf
20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning)

Posted 2 weeks ago

Apply

7.0 - 10.0 years

25 - 30 Lacs

Bengaluru

Work from Office


Role & responsibilities: 3+ years of experience with Snowflake (Snowpipe, Streams, Tasks). Strong proficiency in SQL for high-performance data transformations. Hands-on experience building ELT pipelines using cloud-native tools. Proficiency in dbt for data modeling and workflow automation. Python skills (Pandas, PySpark, SQLAlchemy) for data processing. Experience with orchestration tools like Airflow or Prefect. Preferred candidate profile: Hands-on with Python, including libraries like Pandas, PySpark, or SQLAlchemy. Experience with data cataloging, metadata management, and column-level lineage. Exposure to BI tools like Tableau or Power BI. Certifications: Snowflake SnowPro Core Certification preferred. Contact details: Sindhu@iflowonline.com or 9154984810
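Snowflake Streams, which this role calls out, track changes on a table so a downstream Task processes only rows added since the last consumption. The mechanics can be sketched in plain Python with a stored offset; this is an analogy for the concept, not the Snowflake API, and the table shape is invented.

```python
# Append-only "table" with a consumer offset, mimicking a stream + task pair.
source = [
    {"id": 1, "amount": 10},
    {"id": 2, "amount": 20},
]
offset = 0          # position the consumer has already processed up to
processed = []

def consume():
    # Process only rows added since the stored offset, then advance it,
    # the way reading a Snowflake stream advances its offset.
    global offset
    new_rows = source[offset:]
    processed.extend(r["amount"] for r in new_rows)
    offset = len(source)
    return len(new_rows)

first = consume()                         # picks up both initial rows
source.append({"id": 3, "amount": 30})    # new data arrives
second = consume()                        # picks up only the new row
```

The key property, shared with Snowflake streams, is idempotent incremental processing: each row is handled exactly once regardless of how often the consumer runs.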

Posted 2 weeks ago

Apply

7.0 - 10.0 years

10 - 20 Lacs

Bengaluru

Work from Office


Job location: Bangalore. Experience: 7-10 years. Job Description: Must have hands-on experience (min 6-8 years) in SnapLogic pipeline development with good debugging skills. ETL jobs migration experience into SnapLogic, platform moderation and cloud exposure on AWS. Good to have: SnapLogic developer certification, hands-on experience in Snowflake. Should be strong in SQL, PL/SQL and RDBMS. Should be strong in ETL tools like DataStage, Informatica etc. with data quality. Proficiency in configuring SnapLogic components, including snaps, pipelines, and transformations. Designing and developing data integration pipelines using the SnapLogic platform to connect various systems, applications, and data sources. Building and configuring SnapLogic components such as snaps, pipelines, and transformations to handle data transformation, cleansing, and mapping. Experience in designing, developing and deploying reliable solutions. Ability to work with business partners and provide long-lasting solutions. SnapLogic integration - pipeline development. Staying updated with the latest SnapLogic features, enhancements, and best practices to leverage the platform effectively.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Mumbai

Hybrid


Job type: Contract to Hire. Job Description: Python data-side developers (Pandas, NumPy, SQL, data pipelines, etc.). 5-7 years of experience as a Python Developer with the below skills: Snowflake exposure. Building APIs using Python. Microservices, API Gateway, authentication (OAuth2 & mTLS). Web service development. Unit testing, test-driven development. Multi-tier web or desktop application development experience. Application containers (Docker). Linux experience, Python virtual environments. Tools: Eclipse IDE/IntelliJ, Git, Jira.
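The unit-testing / test-driven-development skill this listing asks for has a simple canonical shape: write the tests first, then make them pass. A minimal sketch with the standard library's unittest; the function and its parsing rule are invented for illustration.

```python
import unittest

# A small function developed test-first; the parsing rule is invented here.
def normalise_amount(value):
    """Parse a currency-style string like '1,234.50' into a float."""
    return float(str(value).replace(",", "").strip())

class TestNormaliseAmount(unittest.TestCase):
    # In TDD these cases are written before (or alongside) the implementation.
    def test_plain_number(self):
        self.assertEqual(normalise_amount("42"), 42.0)

    def test_thousands_separator(self):
        self.assertEqual(normalise_amount("1,234.50"), 1234.5)

    def test_whitespace(self):
        self.assertEqual(normalise_amount(" 7.5 "), 7.5)

# Run the suite programmatically and keep the result for inspection.
suite = unittest.TestLoader().loadTestsFromTestCase(TestNormaliseAmount)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

pytest is the common alternative runner, but the red-green cycle is the same either way.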

Posted 2 weeks ago

Apply

4.0 - 9.0 years

8 - 16 Lacs

Hyderabad

Work from Office


Snowflake Developer _ Walk-in _ Hyderabad _ 7th June. Role: Snowflake Developer. Experience: 4-10 years. Walk-in Date: 7th June 25. Location: TCS Deccan Park, Plot No.1, Hitech City Main Rd, Software Units Layout, HUDA Techno Enclave, Madhapur, Hyderabad, Telangana 500081. Desired Competencies (Technical/Behavioral Competency): Proficient in SQL programming (stored procedures, user-defined functions, CTEs, window functions). Design and implement Snowflake data warehousing solutions, including data modelling and schema design in Snowflake. Able to source data from APIs, data lakes and on-premise systems to Snowflake. Process semi-structured data using Snowflake-specific features like VARIANT and LATERAL FLATTEN. Experience in using Snowpipe to load micro-batch data. Good knowledge of caching layers, micro-partitions, clustering keys, clustering depth, materialized views, and scale in/out vs scale up/down of warehouses. Ability to implement data pipelines to handle data retention and data redaction use cases. Proficient in designing and implementing complex data models, ETL processes, and data governance frameworks. Strong hands-on experience in migration projects to Snowflake. Deep understanding of cloud-based data platforms and data integration techniques. Skilled in writing efficient SQL queries and optimizing database performance. Ability to develop and implement a real-time data streaming solution using Snowflake.
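Snowflake's VARIANT column type holds nested semi-structured data, and LATERAL FLATTEN explodes each nested element into its own row with the parent fields repeated. The same reshaping in plain Python shows what the SQL is doing; the payload below is illustrative, not a real schema.

```python
import json

# Illustrative semi-structured payload, as might land in a VARIANT column.
payload = json.loads("""
{"order_id": 7,
 "items": [{"sku": "A1", "qty": 2},
           {"sku": "B2", "qty": 1}]}
""")

# The reshaping LATERAL FLATTEN performs: one output row per nested
# element, with the parent's fields carried alongside each one.
rows = [
    {"order_id": payload["order_id"], "sku": item["sku"], "qty": item["qty"]}
    for item in payload["items"]
]
```

In Snowflake SQL the equivalent is roughly `SELECT v:order_id, f.value:sku, f.value:qty FROM t, LATERAL FLATTEN(input => v:items) f`, producing the same one-row-per-item result.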

Posted 2 weeks ago

Apply

6.0 - 8.0 years

8 - 10 Lacs

Bengaluru

Work from Office


Primary Skills - Snowflake, DBT, AWS; Good to have Skills - Fivetran (HVR), Python Responsibilities: Design, develop, and maintain data pipelines using Snowflake, DBT, and AWS. Collaborate with cross-functional teams to understand data requirements and deliver solutions. Optimize and troubleshoot existing data workflows to ensure efficiency and reliability. Implement best practices for data management and governance. Stay updated with the latest industry trends and technologies to continuously improve our data infrastructure. Required Skills: Proficiency in Snowflake, DBT, and AWS. Experience with data modeling, ETL processes, and data warehousing. Strong problem-solving skills and attention to detail. Excellent communication and teamwork abilities. Preferred Skills: Knowledge of Fivetran (HVR) and Python. Familiarity with data integration tools and techniques. Ability to work in a fast-paced and agile environment. Education: Bachelor's degree in Computer Science, Information Technology, or a related field.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

12 - 19 Lacs

Pune, Chennai, Coimbatore

Work from Office


Responsibilities: Translate project requirements into effective and comprehensive test cases. Define clear testing objectives that align with overall project goals. Establish the testing scope, prioritizing critical features and functionalities. Document expected deliverables, such as detailed test plans, scripts, and reports. Use dbt to build tests that ensure the ETL process is working as intended. Automate the common manual testing done by QA by creating macros in dbt. Build and monitor automated system health checks. Collaborate with Enterprise Data Engineers to investigate root causes of issues and suggest resolutions. Orchestrate data testing solutions using Airflow. Be able to support the team in doing releases. Develop and maintain test automation frameworks, integrating them with CI/CD pipelines. Collaborate effectively with developers to implement testing strategies at lower levels, facilitating a "shift left" approach and promoting early defect detection. Take ownership of application quality from requirements gathering through development and testing, ensuring a high standard of product excellence. Qualifications: Strong SQL and data transformation skills. Experience in programming or scripting languages such as Python, C#, Java, JavaScript/TypeScript. Understanding of ETL/ELT process fundamentals. Experience in designing, developing, and maintaining robust and scalable test automation frameworks such as Playwright, Selenium or Cypress. Experience in testing data with tools such as Power BI, dbt and Snowflake (nice to have). Experience with GitHub Actions or similar platforms for automating and managing test workflows. Technical Skills: You are a proactive advocate for "shifting left", aiming to identify and address defects earlier in the development lifecycle. You are passionate about test automation and committed to continuously improving testing processes.
You enjoy collaborating with your team members to build solutions which improves the data quality within the data warehouse You have a drive to automate processes that mainly focuses on ensuring data quality and process rigidity You have experience within data engineering. You have experience working with relational databases and investigating root cause of issues. Your SQL and data transformation skills are a key skill of yours You have an understanding of database management systems and scripting.
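The dbt tests mentioned in this listing boil down to SQL assertions that return zero rows when the data is healthy. A minimal sketch of that idea in plain Python, using an in-memory SQLite table as a stand-in for a warehouse table (the `orders` table and its columns are hypothetical, not from the listing):

```python
import sqlite3

def rows_failing_not_null(conn, table, column):
    """Return rows violating a NOT NULL expectation; an empty result means the test passes."""
    cur = conn.execute(f"SELECT * FROM {table} WHERE {column} IS NULL")
    return cur.fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, 10), (2, None), (3, 12)])

failures = rows_failing_not_null(conn, "orders", "customer_id")
print(len(failures))  # prints 1: the row with a NULL customer_id
```

A dbt generic test follows the same contract: the test query selects the failing rows, and the run fails if any come back.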

Posted 2 weeks ago

8.0 - 12.0 years

35 - 45 Lacs

Chennai

Work from Office

STAFF ENGINEER (Accounts Payable)

Toast is a technology company that specializes in providing a comprehensive all-in-one SaaS product and financial technology solutions tailored for the restaurant industry. Toast offers a suite of tools to help restaurants manage their operations, including point of sale, payment processing, supplier management, digital ordering and delivery, marketing and loyalty, and employee scheduling and team management. The platform is designed to streamline operations, enhance customer experiences, and improve overall efficiency for restaurants.

Are you bready* for a change? As a Staff Engineer on the Accounts Payable team, you will be responsible for developing and maintaining back-end systems that support AP operations, automating processes, enhancing user interfaces, and integrating various systems. In this role, you will architect, develop, and maintain backend systems and services that support our business and technical goals. You will collaborate closely with product managers, frontend engineers, and other stakeholders to deliver high-quality, scalable, and reliable backend solutions. Join us to improve our platform and build the next generation of products.

About this roll* (Responsibilities)
  • Be part of a team working collaboratively with UX, PM, QA, and other engineers to design, build, and maintain high-performance, flexible, and highly scalable SaaS applications.
  • Lead technical initiatives, mentor junior engineers, and provide guidance on best practices for backend development.
  • Champion design reviews and help drive the technical direction of the team.
  • Develop automated workflows for invoice processing, payment approvals, and vendor management.
  • Optimize query performance and ensure data integrity within large datasets.
  • Implement machine learning or Optical Character Recognition (OCR) to streamline data extraction from invoices and minimize manual intervention.
  • Lead, mentor, and coach engineers on industry-standard development practices.
  • Collaborate with other engineering teams to ensure that developed solutions are scalable, reliable, and secure.
  • Use cutting-edge technologies and best practices to optimize for performance and usability, ultimately enhancing the overall restaurant management experience.
  • Advocate best coding practices to raise the bar for yourself, your team, and the company.
  • Build a high-quality, reliable, and high-performing framework for reporting, analytics, and insights on the Toast platform.
  • Document solution designs; write and review code; test and roll out solutions to production; work with PM to capture and act on customer feedback to iteratively enhance the customer experience.
  • Propose and implement improvements to enhance system efficiency, scalability, and user experience.
  • Present findings and insights to senior leadership and stakeholders.
  • Be passionate about making users happy and seeing people use your product in the wild.

Do you have the right ingredients*? (Requirements)
  • 8+ years of hands-on experience delivering high-quality, reliable software and services using C#, Java, Kotlin, or other object-oriented languages.
  • Build and maintain RESTful APIs, GraphQL endpoints, or other integrations with internal and external services.
  • Design, optimize, and maintain relational (SQL) and NoSQL databases (SQL Server, Postgres, DynamoDB); work on data modeling, query optimization, and performance tuning.
  • Identify bottlenecks, optimize application performance, and scale backend systems to handle high traffic and large data volumes.
  • Strong experience with automated testing (unit, integration, and end-to-end tests) and test-driven development (TDD).
  • Proficiency with data warehousing solutions such as Snowflake, Redshift, or BigQuery.
  • Experience working in a team with Agile/Scrum methodology.
  • Experience supporting and debugging large distributed applications.
  • Experience monitoring, troubleshooting, and improving system performance through logging and metrics.
  • Familiarity with data platforms for scalable processing of large datasets is a plus.
  • Strong problem-solving skills, with the ability to identify, diagnose, and resolve complex technical issues.
  • Excellent communication skills for working with both technical and non-technical stakeholders.
  • Self-motivation, with a passion for learning and staying current with new technologies.
  • A bachelor's degree is required.

This role follows a hybrid work model, requiring a minimum of two days per week in the office.
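The listing calls for test-driven development: write a failing test first, then the smallest code that passes it. A hedged sketch of that workflow using Python's standard `unittest` (the `invoice_total` helper is hypothetical and not part of any Toast API):

```python
import unittest

def invoice_total(line_items, tax_rate=0.0):
    """Sum quantity * unit_price over line items, then apply a flat tax rate.
    Hypothetical AP helper written to satisfy the tests below."""
    subtotal = sum(qty * price for qty, price in line_items)
    return round(subtotal * (1 + tax_rate), 2)

class InvoiceTotalTest(unittest.TestCase):
    # In TDD these tests exist before the implementation and drive its design.
    def test_empty_invoice(self):
        self.assertEqual(invoice_total([]), 0.0)

    def test_with_tax(self):
        self.assertEqual(invoice_total([(2, 10.0), (1, 5.0)], tax_rate=0.1), 27.5)

# Run the suite programmatically; in practice you would use `python -m unittest`.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(InvoiceTotalTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

The same red-green-refactor loop scales up to the integration and end-to-end layers the listing mentions.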

Posted 2 weeks ago

5.0 - 10.0 years

4 - 7 Lacs

Bengaluru

Work from Office

We're seeking a Senior Software Engineer or Lead Software Engineer to join one of our Data Layer teams. As the name implies, the Data Layer is at the core of all things data at Zeta. Our responsibilities include:
  • Developing and maintaining the Zeta Identity Graph platform, which collects billions of behavioral, demographic, location, and transactional signals to power people-based marketing.
  • Ingesting vast amounts of identity and event data from our customers and partners.
  • Facilitating data transfers across systems.
  • Ensuring the integrity and health of our datasets.
  • And much more.

As a member of this team, the data engineer will be responsible for designing and expanding our existing data infrastructure, enabling easy access to data, supporting complex data analyses, and automating optimization workflows for business and marketing operations.

Essential Responsibilities:
  • Building, refining, tuning, and maintaining our real-time and batch data infrastructure.
  • Using technologies such as Spark, Airflow, Snowflake, Hive, Scylla, Django, and FastAPI daily.
  • Maintaining data quality and accuracy across production data systems.
  • Working with Data Engineers to optimize data models and workflows.
  • Working with Data Analysts to develop ETL processes for analysis and reporting.
  • Working with Product Managers to design and build data products.
  • Working with our DevOps team to scale and optimize our data infrastructure.
  • Participating in architecture discussions, influencing the roadmap, and taking ownership of and responsibility for new projects.
  • Participating in the on-call rotation in their respective time zone (being available by phone or email in case something goes wrong).

Desired Characteristics:
  • A minimum of 5-10 years of software engineering experience.
  • Proven long-term experience with, and enthusiasm for, distributed data processing at scale; eagerness to learn new things.
  • Expertise in designing and architecting distributed, low-latency, scalable solutions in both cloud and on-premises environments.
  • Exposure to the whole software development lifecycle, from inception to production and monitoring.
  • Fluency in Python, or solid experience in Scala or Java.
  • Proficiency with relational databases and advanced SQL.
  • Expert-level use of services like Spark and Hive.
  • Experience with web frameworks such as Flask and Django.
  • Experience with schedulers such as Apache Airflow, Luigi, or Chronos.
  • Experience with Kafka or other stream-processing solutions.
  • Experience using cloud services (AWS) at scale.
  • Experience with agile software development processes.
  • Excellent interpersonal and communication skills.

Nice to have:
  • Experience with large-scale, multi-tenant distributed systems.
  • Experience with columnar/NoSQL databases: Vertica, Snowflake, HBase, Scylla, Couchbase.
  • Experience with real-time streaming frameworks: Flink, Storm.
  • Experience with open table formats such as Iceberg, Hudi, or Delta Lake.
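Schedulers like Airflow, Luigi, and Chronos all reduce to the same core idea: run tasks in dependency order over a directed acyclic graph. A toy illustration using Python's standard `graphlib` (task names are made up for this sketch; this is not the Airflow API):

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical pipeline: each key depends on the tasks in its set.
deps = {
    "transform_events": {"extract_identity", "extract_events"},
    "load_snowflake": {"transform_events"},
}

# static_order() yields tasks so every task comes after its dependencies.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

Real schedulers add retries, backfills, and parallel execution of tasks that become ready at the same time, but the ordering contract is the same.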

Posted 2 weeks ago

6.0 - 11.0 years

15 - 20 Lacs

Hyderabad, Pune, Chennai

Work from Office

Hiring for a top IT company.
Designation: ETL Tester
Skills: ETL Testing + Data Warehouse + Snowflake + Azure
Location: Bangalore/Hyderabad/Pune/Chennai
Experience: 5-10 years
Call: Nisha: 8875876654 | Afreen: 9610352987 | Garima: 8875813216 | Kajal: 8875831472
Team Converse

Posted 2 weeks ago

Exploring Snowflake Jobs in India

Snowflake has become one of the most sought-after skills in the tech industry, with a growing demand for professionals who are proficient in handling data warehousing and analytics using this cloud-based platform. In India, the job market for Snowflake roles is flourishing, offering numerous opportunities for job seekers with the right skill set.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Chennai

These cities are known for their thriving tech industries and have a high demand for Snowflake professionals.

Average Salary Range

The average salary range for Snowflake professionals in India varies based on experience level:
  • Entry-level: INR 6-8 lakhs per annum
  • Mid-level: INR 10-15 lakhs per annum
  • Experienced: INR 18-25 lakhs per annum

Career Path

A typical career path in Snowflake may include roles such as:
  • Junior Snowflake Developer
  • Snowflake Developer
  • Senior Snowflake Developer
  • Snowflake Architect
  • Snowflake Consultant
  • Snowflake Administrator

Related Skills

In addition to expertise in Snowflake, professionals in this field are often expected to have knowledge of:
  • SQL
  • Data warehousing concepts
  • ETL tools
  • Cloud platforms (AWS, Azure, GCP)
  • Database management

Interview Questions

  • What is Snowflake and how does it differ from traditional data warehousing solutions? (basic)
  • Explain how Snowflake handles data storage and compute resources in the cloud. (medium)
  • How do you optimize query performance in Snowflake? (medium)
  • Can you explain how data sharing works in Snowflake? (medium)
  • What are the different stages in the Snowflake architecture? (advanced)
  • How do you handle data encryption in Snowflake? (medium)
  • Describe a challenging project you worked on using Snowflake and how you overcame obstacles. (advanced)
  • How does Snowflake ensure data security and compliance? (medium)
  • What are the benefits of using Snowflake over traditional data warehouses? (basic)
  • Explain the concept of virtual warehouses in Snowflake. (medium)
  • How do you monitor and troubleshoot performance issues in Snowflake? (medium)
  • Can you discuss your experience with Snowflake's semi-structured data handling capabilities? (advanced)
  • What are Snowflake's data loading options and best practices? (medium)
  • How do you manage access control and permissions in Snowflake? (medium)
  • Describe a scenario where you had to optimize a Snowflake data pipeline for efficiency. (advanced)
  • How do you handle versioning and change management in Snowflake? (medium)
  • What are the limitations of Snowflake and how would you work around them? (advanced)
  • Explain how Snowflake supports semi-structured data formats like JSON and XML. (medium)
  • What are the considerations for scaling Snowflake for large datasets and high concurrency? (advanced)
  • How do you approach data modeling in Snowflake compared to traditional databases? (medium)
  • Discuss your experience with Snowflake's time travel and data retention features. (medium)
  • How would you migrate an on-premise data warehouse to Snowflake in a production environment? (advanced)
  • What are the best practices for data governance and metadata management in Snowflake? (medium)
  • How do you ensure data quality and integrity in Snowflake pipelines? (medium)
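Several of the questions above concern Snowflake's semi-structured data handling, where a VARIANT column is unnested with LATERAL FLATTEN in SQL. The flattening operation itself is easy to picture in plain Python (illustrative only; the JSON shape is invented for this sketch):

```python
import json

raw = json.loads(
    '{"order_id": 7, "items": [{"sku": "A1", "qty": 2}, {"sku": "B2", "qty": 1}]}'
)

# Roughly what FLATTEN(input => items) does: emit one row per array element,
# repeating the parent fields alongside each element's fields.
rows = [(raw["order_id"], item["sku"], item["qty"]) for item in raw["items"]]
print(rows)  # [(7, 'A1', 2), (7, 'B2', 1)]
```

In Snowflake the equivalent query stays in SQL and the VARIANT data never needs a predefined schema, which is the usual talking point for these interview questions.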

Closing Remark

As you explore opportunities in the Snowflake job market in India, remember to showcase your expertise in handling data analytics and warehousing using this powerful platform. Prepare thoroughly for interviews, demonstrate your skills confidently, and keep abreast of the latest developments in Snowflake to stay competitive in the tech industry. Good luck with your job search!
