6.0 - 11.0 years
15 - 30 Lacs
Noida, Pune, Bengaluru
Hybrid
We are looking for a Snowflake Data Engineer with deep expertise in Snowflake and DBT to help us build and scale our modern data platform.

Key Responsibilities:
- Design and build scalable ELT pipelines in Snowflake using DBT (an illustrative model sketch follows this posting).
- Develop efficient, well-tested DBT models (staging, intermediate, and marts layers).
- Implement data quality, testing, and monitoring frameworks to ensure data reliability and accuracy.
- Optimize Snowflake queries, storage, and compute resources for performance and cost-efficiency.
- Collaborate with cross-functional teams to gather data requirements and deliver data solutions.

Required Qualifications:
- 5+ years of experience as a Data Engineer, with at least 4 years working with Snowflake.
- Proficient with DBT (Data Build Tool), including Jinja templating, macros, and model dependency management.
- Strong understanding of ELT patterns and modern data stack principles.
- Advanced SQL skills and experience with performance tuning in Snowflake.

Interested candidates, share your CV at himani.girnar@alikethoughts.com with the details below:
- Candidate's name
- Email and alternate email ID
- Contact and alternate contact no.
- Total experience
- Relevant experience
- Current organisation
- Notice period
- CCTC
- ECTC
- Current location
- Preferred location
- PAN card no.
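Purely as an illustration of the layered DBT modelling this posting describes, a minimal staging-model sketch on Snowflake; the source, table, and column names are hypothetical and not taken from the posting:

```sql
-- models/staging/stg_orders.sql
-- Hypothetical DBT staging model: light cleanup over a raw source table.
{{ config(materialized='view') }}

with source as (

    -- 'raw' / 'orders' is an assumed source declared in a sources.yml file
    select * from {{ source('raw', 'orders') }}

),

renamed as (

    select
        order_id,
        customer_id,
        cast(order_ts as timestamp_ntz) as ordered_at,
        lower(status)                   as order_status,
        amount_usd
    from source

)

select * from renamed
```

Downstream intermediate and mart models would then reference it with `{{ ref('stg_orders') }}`, which is how DBT builds its dependency graph.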
Posted 2 weeks ago
6.0 - 11.0 years
5 - 15 Lacs
Ahmedabad, Mumbai (All Areas)
Work from Office
6 years of experience with AWS, Snowflake, Microsoft SQL Server, SSMS, Visual Studio, and Data Warehouse ETL processes. 4 years of programming experience with Python, C#, VB.NET, and T-SQL. Minimum of 3 years of experience building end-to-end pipelines within the AWS stack.

Required candidate profile:
- Strong, collaborative, team-oriented style
- Impeccable customer service skills
- Experience with healthcare information systems and healthcare practice processes
- Experience with SaaS applications
- Good communication
Posted 2 weeks ago
7.0 - 8.0 years
7 - 9 Lacs
Bengaluru
Work from Office
We are seeking an experienced Data Engineer to join our innovative data team and help build the scalable data infrastructure, software consultancy, and development services that power business intelligence, analytics, and machine learning initiatives. The ideal candidate will design, develop, and maintain robust, high-performance data pipelines and solutions while ensuring data quality, reliability, and accessibility across the organization, working with cutting-edge technologies such as Python, Microsoft Fabric, Snowflake, Dataiku, SQL Server, Oracle, and PostgreSQL.

Required Qualifications:
- 5+ years of experience in a data engineering role.
- Programming languages: proficiency in Python.
- Cloud platforms: hands-on experience with Azure (Fabric, Synapse, Data Factory, Event Hubs).
- Databases: strong SQL skills and experience with both relational (Microsoft SQL Server, PostgreSQL, MySQL) and NoSQL (MongoDB, Cassandra) databases.
- Version control: proficiency with Git and collaborative development workflows.
- Proven track record of building production-grade data pipelines or solutions handling large-scale data.
- Desired: experience with containerization (Docker) and orchestration (Kubernetes) technologies.
- Knowledge of machine learning workflows and MLOps practices.
- Familiarity with data visualization tools (Tableau, Looker, Power BI).
- Experience with stream processing and real-time analytics.
- Experience with data governance and compliance frameworks (GDPR, CCPA).
- Contributions to open-source data engineering projects.
- Relevant cloud certifications (e.g., Microsoft Certified: Azure Data Engineer Associate, AWS Certified Data Engineer, Google Cloud Professional Data Engineer).
- Specific experience or certifications in Microsoft Fabric, Dataiku, or Snowflake.
Posted 2 weeks ago
9.0 - 14.0 years
20 - 32 Lacs
Hyderabad, Bengaluru, Delhi / NCR
Hybrid
Data Architect / Data Modeler role

Scope & Responsibilities:
- Enterprise data architecture for a functional domain or for a product group.
- Designs and governs delivery of the domain data architecture; ensures delivery as per design.
- Ensures consistency in approach for the data modelling of the different solutions of the domain.
- Designs and ensures delivery of data integration across the solutions of the domain.

General Expertise:
- Critical: methodology expertise on data architecture and modeling, from business requirements and functional specifications to data modeling.
- Critical: data warehousing and business intelligence data product modeling (Inmon / Kimball / Data Vault / Codd modeling patterns); an illustrative star-schema sketch follows this posting.
- Business/functional knowledge of the domain: understanding of business terminology, knowledge of business processes related to the domain, and awareness of key principles and objectives, business trends, and evolution.
- Awareness of master data management and of data management and stewardship processes.
- Knowledge of data persistency technologies: SQL (ANSI-2003 for structured relational data querying and ANSI-2023 for XML, JSON, and property graph querying), Snowflake specifics, and database structures for performance optimization.
- NoSQL: awareness of other data persistency technologies.
- Proficient level of business English and technical writing.
- Nice to have: project delivery expertise through agile approaches and methodologies (Scrum, SAFe 5.0, product-based organization).

Technical Stack expertise:
- SAP PowerDesigner modeling (CDM, LDM, PDM)
- Snowflake general concepts, specifically DDL & DML, Snowsight, and Data Exchange / Data Sharing concepts
- AWS S3 & Athena (as a query user)
- Confluence & Jira (as a contributing user)
- Nice to have: Bitbucket (as a basic user)
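As a small illustration of the Kimball-style dimensional modelling mentioned above, a sketch of star-schema DDL in Snowflake; all object names are hypothetical, and note that Snowflake treats primary/foreign key constraints as informational only (they are declared but not enforced):

```sql
-- Hypothetical star-schema pattern: one dimension and one fact table.
create table if not exists dim_customer (
    customer_sk   number autoincrement primary key,  -- surrogate key
    customer_id   varchar not null,                  -- natural/business key
    customer_name varchar,
    country       varchar,
    valid_from    timestamp_ntz,                     -- SCD2-style validity window
    valid_to      timestamp_ntz
);

create table if not exists fct_sales (
    sale_id     number,
    customer_sk number references dim_customer (customer_sk),
    sale_date   date,
    amount      number(18,2)
);
```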
Posted 2 weeks ago
2.0 - 4.0 years
4 - 6 Lacs
Hyderabad
Work from Office
About the Team:
The team is responsible for managing all reporting activities to senior management, Limited Partners (LPs), and other regulatory reporting. The role requires working closely with the Investor Product Strategy and Valuations team in New York. It has three major functions:
1. Data Collection & Governance: collecting the required data from portfolio companies, scrubbing the data, and uploading it to the firm's portfolio monitoring system.
2. Data Reporting: primarily responsible for the regular reporting needs of LPs, senior management, etc.
3. Data Analysis: handling substantive analytics and ad-hoc requests.

Key Responsibilities:
- Develop a good understanding of the investment strategies for Private Equity, Tactical Opportunity, Life Sciences, Infra, Energy, and Growth funds.
- Review reporting packages such as fund track records, deal attributes, valuation bridges, portfolio summaries, etc.
- Review the following quarterly activities while adhering to deadlines: new and ongoing due diligence questionnaires; LP requests, valuation memos, and other quarterly presentation materials/reports; Qvidian library upload and maintenance.
- Run monthly/quarterly reports for LPs (memos, bridges, one-pagers, etc.), corporates (DRPs, SOX reports, publics schedule, etc.), and internal stakeholders (KPI dashboards, portfolio watchlist, valuation summaries, etc.).
- Gross and net cash flow reporting across all the funds.
- Prepare ad-hoc reports/dashboards as per management's requirements.
- Monitor the deal portfolio through data collection and reconciliation.
- Calculate fund performance metrics such as IRR and MOIC (the standard definitions are shown after this posting).
- Prepare process documentation and create SOPs.
- Identify process gaps and initiate process improvement projects.

Desired Candidate Profile:
- Candidate must be a postgraduate in finance with 2-4 years of relevant experience in various private equity concepts, fee structures, and performing end-to-end business valuations.
- Basic conceptual knowledge of various valuation techniques such as DCF and comparable company analysis.
- Very good working knowledge of answering due diligence questionnaires and other fund/investment-level reporting, such as valuation reporting and LP reporting.
- Strong Microsoft Office skills (MS Excel, MS PowerPoint, and MS Word).
- Experience in iLevel, 73Strings, Tableau, Snowflake, or any data reporting/analysis software is advantageous.
- Ability to work effectively as an individual contributor, with strong analytical, problem-solving, critical thinking, and decision-making skills, and the ability to multitask and deliver under tight deadlines.
- The profile involves communicating effectively across the client's facilities globally, so excellent interpersonal and communication skills in verbal and written English are a must.
- A desire to work in an international team environment, often under pressure and with multiple stakeholders.
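For reference, the fund performance metrics named above have standard definitions in private equity reporting; the notation below is generic and not specific to this firm's methodology:

```latex
% MOIC: multiple on invested capital
\mathrm{MOIC} = \frac{\text{distributions} + \text{residual (unrealized) value}}{\text{paid-in capital}}

% IRR: the discount rate r that sets the net present value of the fund's dated
% cash flows CF_t (contributions negative, distributions positive) to zero
\sum_{t=0}^{N} \frac{CF_t}{(1+r)^{t}} = 0
```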
Posted 2 weeks ago
7.0 - 10.0 years
5 - 15 Lacs
Hyderabad
Hybrid
Job Summary:
We are seeking an experienced Lead Snowflake Data Engineer to join our Data & Analytics team. This role involves designing, implementing, and optimizing Snowflake-based data solutions while providing strategic direction and leadership to a team of junior and mid-level data engineers. The ideal candidate will have deep expertise in Snowflake, cloud data platforms, ETL/ELT processes, and Medallion data architecture best practices. The lead data engineer role has a strong focus on performance optimization, security, scalability, and Snowflake credit control and management (an illustrative credit-control sketch follows this posting). This is a tactical role requiring independent, in-depth data analysis and data discovery to understand our existing source systems and fact and dimension data models, and to implement an enterprise data warehouse solution in Snowflake.

Essential Functions and Tasks:
- Lead the design, development, and maintenance of a scalable Snowflake data solution serving our enterprise data & analytics team.
- Architect and implement data pipelines, ETL/ELT workflows, and data warehouse solutions using Snowflake and related technologies.
- Optimize Snowflake database performance, storage, and security.
- Provide guidance on Snowflake best practices.
- Collaborate with cross-functional teams of data analysts, business analysts, data scientists, and software engineers to define and implement data solutions.
- Ensure data quality, integrity, and governance across the organization.
- Provide technical leadership and mentorship to junior and mid-level data engineers.
- Troubleshoot and resolve data-related issues, ensuring high availability and performance of the data platform.

Education and Experience Requirements:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 7+ years of in-depth data engineering experience, with at least 3 years of dedicated experience engineering solutions in a Snowflake environment.
- Tactical expertise in ANSI SQL, performance tuning, and data modeling techniques.
- Strong experience with cloud platforms (preference for Azure) and their data services.
- Proficiency in ETL/ELT development using tools such as Azure Data Factory, dbt, Matillion, Talend, or Fivetran.
- Hands-on experience with scripting languages such as Python for data processing.
- Strong understanding of data governance, security, and compliance best practices.
- Snowflake SnowPro certification; preference for the engineering course path.
- Experience with CI/CD pipelines, DevOps practices, and Infrastructure as Code (IaC).
- Knowledge of streaming data processing frameworks such as Apache Kafka or Spark Streaming.
- Familiarity with BI and visualization tools such as Power BI.

Knowledge, Skills, and Abilities:
- Familiarity working in an agile scrum team, including sprint planning, daily stand-ups, backlog grooming, and retrospectives.
- Ability to self-manage large, complex deliverables and document user stories and tasks through Azure DevOps.
- Personal accountability for committed sprint user stories and tasks.
- Strong analytical and problem-solving skills with the ability to handle complex data challenges.
- Ability to read, understand, and apply state/federal laws, regulations, and policies.
- Ability to communicate with diverse personalities in a tactful, mature, and professional manner.
- Ability to remain flexible and work within a collaborative and fast-paced environment.
- Understand and comply with company policies and procedures.
- Strong oral, written, and interpersonal communication skills.
- Strong time management and organizational skills.

Physical Demands:
- 40 hours per week
- Occasional standing
- Occasional walking
- Sitting for prolonged periods of time
- Frequent hand and finger movement
- Communicate verbally and in writing
- Extensive use of computer keyboard and viewing of computer screen
- Specific vision abilities required by this job include close vision
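As an illustration of the Snowflake credit control mentioned in this posting, a sketch of a resource monitor that caps warehouse spend; the monitor name, warehouse name, and thresholds are assumptions for illustration, not details from the posting:

```sql
-- Hypothetical resource monitor limiting monthly credit consumption.
create or replace resource monitor analytics_wh_monitor
  with credit_quota = 100
       frequency = monthly
       start_timestamp = immediately
  triggers
    on 80 percent do notify    -- warn account admins as usage approaches the quota
    on 100 percent do suspend; -- suspend the warehouse once the quota is reached

-- Attach the monitor to a (hypothetical) warehouse.
alter warehouse analytics_wh set resource_monitor = analytics_wh_monitor;
```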
Posted 2 weeks ago
3.0 - 6.0 years
22 - 25 Lacs
Hyderabad
Remote
Company Overview: We are a fast-growing startup revolutionizing the contact center industry with GenAI-powered solutions. Our innovative platform is designed to enhance customer engagement.

Job Description: We are looking for a skilled and experienced Data Engineer to design, build, and optimize scalable data pipelines and architectures that power data-driven decision-making across the organization. The ideal candidate has a proven track record of writing complex stored procedures and optimizing query performance on large datasets (a small illustrative sketch follows this posting).

Requirements:
- Architect, develop, and maintain scalable and secure data pipelines to process structured and unstructured data from diverse sources.
- Collaborate with data scientists, BI analysts, and business stakeholders to understand data requirements.
- Optimize data workflows and processing for performance; ensure data quality, reliability, and governance.
- Hands-on experience with modern data platforms such as Snowflake, Redshift, BigQuery, or Databricks.
- Strong knowledge of T-SQL and SQL Server Management Studio (SSMS).
- Experience writing complex stored procedures and views, and tuning query performance on large datasets.
- Strong understanding of database management systems (SQL, NoSQL) and data warehousing concepts.
- Good knowledge of, and hands-on experience with, tuning the database at the memory level and tweaking SQL queries.
- In-depth knowledge of data modeling principles and methodologies (e.g., relational, dimensional, NoSQL).
- Excellent analytical and problem-solving skills with meticulous attention to detail.
- Hands-on experience with data transformation techniques, including data mapping, cleansing, and validation.
- Proven ability to work independently and manage multiple priorities in a fast-paced environment.
- Work closely with cross-functional teams to gather and analyse requirements, develop database solutions, and support application development efforts.
- Knowledge of cloud database solutions (e.g., Azure SQL Database, AWS RDS).
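A minimal, hypothetical sketch of the kind of T-SQL stored procedure this posting refers to; the object and parameter names are invented for illustration, and the real work would depend on the actual schema and indexes:

```sql
-- Hypothetical T-SQL stored procedure: parameterized, set-based retrieval.
CREATE OR ALTER PROCEDURE dbo.usp_GetOrdersByStatus
    @Status   varchar(20),
    @FromDate date
AS
BEGIN
    SET NOCOUNT ON;

    SELECT  o.OrderId,
            o.CustomerId,
            o.OrderDate,
            o.Amount
    FROM    dbo.Orders AS o     -- assumes a supporting index on (Status, OrderDate)
    WHERE   o.Status    = @Status
      AND   o.OrderDate >= @FromDate;
END;
```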
Posted 2 weeks ago
2.0 - 7.0 years
2 - 5 Lacs
Hyderabad
Work from Office
Totango System Administrator

We are seeking a Totango System Administrator to lead the implementation, configuration, and ongoing management of Totango, our Customer Success platform. This role will play a critical part in ensuring the platform effectively supports customer engagement, adoption, and retention strategies. The role will work closely with the Director of Customer Success and the Principal Program Manager for Digital Customer Success to align Totango's capabilities with business objectives, streamline processes, and optimize system performance. Additionally, this role will collaborate with cross-functional teams, including Customer Success, Sales, IT, and Data Analytics, to enhance customer journey tracking, automate workflows, and drive business insights.

Key Responsibilities:
- Totango Administration & Configuration: Configure, maintain, and optimize Totango's SuccessBlocks to align with business needs.
- User Management & Security: Administer user roles, permissions, and security settings while ensuring compliance with GDPR and data governance policies.
- Data Integration & Management: Manage and oversee integrations between Totango and other enterprise systems such as Salesforce, the Snowflake data warehouse, Gong, and other business-critical platforms.
- Process Automation: Implement automated workflows and triggers within Totango to support customer onboarding, adoption, and retention strategies.
- Partnership with Principal Program Manager: Work closely with the Principal Program Manager for Digital Customer Success to align Totango's usage with business priorities, drive user adoption, and enhance reporting capabilities.
- Troubleshooting & System Support: Identify and resolve system issues, working with vendors and internal teams to optimize platform performance.
- Reporting & Analytics: Develop dashboards and reports within Totango to provide actionable insights into customer health, engagement, and churn risk.
- End-User Training & Adoption: Conduct training sessions and create enablement materials to enhance Totango adoption among Customer Success teams.
- Continuous Improvement: Stay up to date with Totango updates, best practices, and new features, making recommendations to optimize system capabilities.

Required Qualifications & Experience:
- 2+ years of experience in Totango administration, configuration, or a similar Customer Success platform (Gainsight, ChurnZero, etc.).
- Strong understanding of Customer Success operations and customer lifecycle management.
- 2+ years of experience with CRM platforms such as Salesforce, HubSpot, or Microsoft Dynamics.
- Experience with API-based integrations and data management tools (Snowflake, SQL, ETL processes).
- Ability to translate business needs into technical solutions through system configuration and workflow automation.
- Strong analytical and problem-solving skills, with the ability to troubleshoot technical issues independently.
- Excellent communication skills, with experience working cross-functionally across technical and non-technical teams.
- Bachelor's degree in Computer Science, Business, Data Science, or a related field.

Preferred Qualifications:
- Experience in SaaS or enterprise software industries.
- Knowledge of customer engagement strategies and data analytics.
- Familiarity with enterprise security and compliance best practices.
- Certifications in Totango or related platforms are a plus.
- Certification in Salesforce administration is a plus.

Additional Information: We are an equal opportunity employer.
All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran, or disability status. At this time, insightsoftware is not able to offer sponsorship to candidates who are not eligible to work in the country where the position is located. Background checks are required for employment with insightsoftware, where permitted by country, state/province.
Posted 2 weeks ago
6.0 - 11.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Senior Software Engineer

Insightsoftware is a growing, dynamic computer software company that helps businesses achieve greater levels of financial intelligence across their organization with our world-class financial reporting solutions. At insightsoftware, you will learn and grow in a fast-paced, supportive environment that will take your career to the next level. We are looking for future Insighters who can demonstrate teamwork, results orientation, a growth mindset, disciplined execution, and a winning attitude to join our growing team!

The Equity Management team is growing fast, retaining an agile startup mindset but delivering enterprise solutions to the top players in the financial domain. This is your chance to be part of a team that delivers high-quality software that changes the way our customers do business, while having fun at work.

We are seeking a highly skilled Senior Software Engineer with a strong background in cloud-based ERP systems to join our dynamic technology team. The ideal candidate will bring a solid mix of technical and functional ERP expertise. A strong command of SQL/PL/SQL, experience with enterprise databases (e.g., Snowflake, Oracle, Azure, PostgreSQL), and the ability to troubleshoot and enhance performance are essential for success in this position.

Responsibilities:
- Design, develop, and maintain ERP-related solutions and customizations on platforms such as Oracle Cloud ERP, EBS, Workday, SAP S/4HANA, or similar.
- Work closely with functional teams to understand business requirements and translate them into technical specifications and solutions.
- Serve as a bridge between functional stakeholders and technical teams, offering insight into ERP system capabilities and configurations.
- Write and optimize complex SQL and PL/SQL queries to ensure fast, accurate data retrieval across large datasets.
- Manage and maintain various databases, including Oracle, Snowflake, Azure, and PostgreSQL.
- Build and maintain robust ETL pipelines for seamless data movement between systems.
- Ensure data integrity and consistency across ERP modules and third-party integrations.
- Identify performance bottlenecks and resolve system issues in ERP applications and underlying databases.
- Participate in Agile/Scrum ceremonies and contribute to continuous improvement initiatives.
- Develop automation scripts for routine tasks, deployments, or testing where applicable.
- Leverage scripting to improve operational efficiency and reduce manual effort.

Qualifications:
- A bachelor's degree in computer science, or equivalent experience.
- Overall 6+ years of experience in technical leadership.
- Experience working with database technologies including PostgreSQL, Snowflake, Oracle, and others.
- Proficient in SQL and PL/SQL, and familiar with ETL tools or frameworks.
- Excellent analytical and troubleshooting skills.
- Strong communication and interpersonal skills with a collaborative mindset.
- Experience in programming languages like C# is a plus.

Additional Information:
At this time, insightsoftware is not able to offer sponsorship to candidates who are not eligible to work in the country where the position is located. Background checks are required for employment with insightsoftware, where permitted by country, state/province.
Posted 2 weeks ago
6.0 - 10.0 years
16 - 25 Lacs
Hyderabad
Work from Office
Key Responsibilities:
- Architect and implement modular, test-driven ELT pipelines using dbt on Snowflake.
- Design layered data models (e.g., staging, intermediate, and mart layers / medallion architecture) aligned with dbt best practices.
- Lead ingestion of structured and semi-structured data from APIs, flat files, cloud storage (Azure Data Lake, AWS S3), and databases into Snowflake.
- Optimize Snowflake for performance and cost: warehouse sizing, clustering, materializations, query profiling, and credit monitoring.
- Apply advanced dbt capabilities including macros, packages, custom tests, sources, exposures, and documentation using dbt docs (a custom-test sketch follows this posting).
- Orchestrate workflows using dbt Cloud, Airflow, or Azure Data Factory, integrated with CI/CD pipelines.
- Define and enforce data governance and compliance practices using Snowflake RBAC, secure data sharing, and encryption strategies.
- Collaborate with analysts, data scientists, architects, and business stakeholders to deliver validated, business-ready data assets.
- Mentor junior engineers, lead architectural/code reviews, and help establish reusable frameworks and standards.
- Engage with clients to gather requirements, present solutions, and manage end-to-end project delivery in a consulting setup.

Required Qualifications:
- 5 to 8 years of experience in data engineering roles, with 3+ years of hands-on experience working with Snowflake and dbt in production environments.

Technical Skills:
- Cloud data warehouse & transformation stack: expert-level knowledge of SQL and Snowflake, including performance optimization, storage layers, query profiling, clustering, and cost management; experience in dbt development, including modular model design, macros, tests, documentation, and version control using Git.
- Orchestration and integration: proficiency in orchestrating workflows using dbt Cloud, Airflow, or Azure Data Factory; comfortable working with data ingestion from cloud storage (e.g., Azure Data Lake, AWS S3) and APIs.
- Data modelling and architecture: dimensional modelling (star/snowflake schemas) and slowly changing dimensions; knowledge of modern data warehousing principles; experience implementing Medallion Architecture (Bronze/Silver/Gold layers); experience working with Parquet, JSON, CSV, or other data formats.
- Programming languages: Python for data transformation, notebook development, and automation; a strong grasp of SQL for querying and performance tuning; Jinja (nice to have) for advanced dbt development.
- Data engineering & analytical skills: ETL/ELT pipeline design and optimization; exposure to AI/ML data pipelines, feature stores, or MLflow for model tracking (good to have); exposure to data quality and validation frameworks.
- Security & governance: experience implementing data quality checks using dbt tests; data encryption, secure key management, and security best practices for Snowflake and dbt.

Soft Skills & Leadership:
- Ability to thrive in client-facing roles with competing/changing priorities and fast-paced delivery cycles.
- Stakeholder communication: collaborate with business stakeholders to understand objectives and convert them into actionable data engineering designs.
- Project ownership: end-to-end delivery, including design, implementation, and monitoring.
- Mentorship: guide junior engineers, establish best practices, and build new skills in the team.
- Agile practices: work in sprints, participate in scrum ceremonies, and contribute to story estimation.

Education: Bachelor's or master's degree in computer science, data engineering, or a related field. Certifications such as Snowflake SnowPro Advanced or dbt Certified Developer are a plus.
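As a small illustration of the custom dbt tests mentioned above, a sketch of a generic test using dbt's standard test interface; the test name and the rule it enforces are hypothetical:

```sql
-- tests/generic/test_is_positive.sql
-- Hypothetical custom generic test: returns the rows that violate the rule,
-- so the test fails whenever any value is null or not strictly positive.
{% test is_positive(model, column_name) %}

select *
from {{ model }}
where {{ column_name }} <= 0
   or {{ column_name }} is null

{% endtest %}
```

It would then be attached to a column in a model's YAML properties file under its `tests:` list (e.g. `- is_positive`), and run with `dbt test` alongside the built-in tests.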
Posted 2 weeks ago
5.0 - 8.0 years
7 - 12 Lacs
Pune
Hybrid
We are looking for a highly skilled Senior Python Developer for a 6-month contractual role. The position involves designing and implementing data-oriented and scalable backend solutions using Python and related technologies. The candidate must have 5-8 years of experience and be well-versed in distributed systems, cloud platforms (AWS/GCP), and data pipelines. Strong expertise in Airflow, Kafka, SQL, and modern software development practices (TDD, CI/CD, DevSecOps) is essential. Exposure to AdTech, ML/AI, SaaS, and container technologies (Docker/Kubernetes) is a strong plus. The position is hybrid, based in Pune, and only immediate joiners are eligible.
Posted 2 weeks ago
5.0 - 10.0 years
17 - 30 Lacs
Pune, Bengaluru
Hybrid
Key Responsibilities:
- Data modeling and design: create data models and designs for data warehousing solutions.
- ETL/ELT development: develop ETL/ELT pipelines using Snowflake's data loading and transformation capabilities (a loading sketch follows this posting).
- Data quality and integrity: ensure data quality and integrity by implementing data validation, data cleansing, and data normalization techniques.
- Team leadership: lead a team of developers, provide technical guidance, and ensure timely delivery of projects.
- Collaboration with stakeholders: collaborate with stakeholders to understand business requirements, provide technical solutions, and ensure that solutions meet business needs.
- Performance optimization: optimize performance of data warehousing solutions by implementing best practices, indexing, and caching.
- Security and governance: ensure security and governance of data warehousing solutions by implementing access controls, auditing, and data masking.

Requirements:
- 5+ years of experience in data warehousing, ETL/ELT development, and data modeling.
- Snowflake experience: strong experience in Snowflake, including data loading, data transformation, and data querying.
- Data modeling skills: strong data modeling skills, including experience with data modeling tools such as Erwin or PowerDesigner.
- ETL/ELT development skills: strong ETL/ELT development skills, including experience with ETL/ELT tools such as Informatica or Talend.
- Leadership skills: strong leadership skills, including experience leading teams and managing projects.
- Communication skills: strong communication skills, including experience collaborating with stakeholders and communicating technical solutions.
- Bachelor's degree in Computer Science, Information Technology, or a related field.
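A brief, hypothetical sketch of the Snowflake bulk data-loading capability referenced above; the stage, file format, and table names are assumptions for illustration only:

```sql
-- Hypothetical Snowflake bulk load from an external stage into an existing table.
create or replace file format csv_std
  type = csv
  field_optionally_enclosed_by = '"'
  skip_header = 1;

copy into sales_raw                    -- assumed pre-created landing table
  from @landing_stage/sales/           -- assumed stage pointing at cloud storage
  file_format = (format_name = 'csv_std')
  on_error = 'abort_statement';        -- fail the whole load on any bad record
```

Downstream transformation (the "ELT" part) would then typically be expressed as SQL over `sales_raw` inside Snowflake.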
Posted 2 weeks ago
4.0 - 8.0 years
10 - 20 Lacs
Bengaluru, Mumbai (All Areas)
Hybrid
- Strong Snowflake cloud database experience as a database developer.
- Knowledge of Spark and Databricks is desirable.
- Strong technical background in data modelling, database design, and optimization for data warehouses, specifically on column-oriented MPP architectures.
- Familiar with technologies relevant to data lakes, such as Snowflake.
- Strong ETL and database design/modelling skills; experience creating data pipelines.
- Strong SQL skills, debugging knowledge, and performance tuning experience (a tuning sketch follows this posting).
- Experience with Databricks / Azure is an add-on / good to have.
- Experience working with global teams and global application environments.
- Strong understanding of SDLC methodologies with a track record of high-quality deliverables and data quality, including detailed technical design documentation (desired).
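As a hedged illustration of the kind of Snowflake performance tuning this posting asks for, a sketch using a clustering key and Snowflake's clustering-information function; the table and column names are hypothetical:

```sql
-- Hypothetical tuning step on a large table that is usually filtered by date:
-- a clustering key lets Snowflake prune micro-partitions on that column.
alter table sales_fact cluster by (sale_date);

-- Inspect how well the table is clustered on that key
-- (returns a JSON summary of clustering depth and overlap).
select system$clustering_information('sales_fact', '(sale_date)');
```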
Posted 2 weeks ago
3.0 - 6.0 years
25 - 33 Lacs
Bengaluru
Work from Office
Overview

Annalect is currently seeking a Senior Data Engineer to join our Technology team. In this role you will build Annalect products which sit atop cloud-based data infrastructure. We are looking for people who have a shared passion for technology, design & development, data, and fusing these disciplines together to build cool things. In this role, you will work on one or more software and data products in the Annalect Engineering Team. You will participate in technical architecture, design, and development of software products, as well as research and evaluation of new technical solutions.

Responsibilities:
- Design, build, test, and deploy data transfers across various cloud environments (Azure, GCP, AWS, Snowflake, etc.).
- Develop data pipelines, and monitor, maintain, and tune them.
- Write at-scale data transformations in SQL and Python (a small SQL sketch follows this posting).
- Perform code reviews and provide leadership and guidance to junior developers.

Qualifications:
- Curiosity in learning the business requirements that are driving the engineering requirements.
- Interest in new technologies and eagerness to bring those technologies and out-of-the-box ideas to the team.
- 3+ years of SQL experience.
- 3+ years of professional Python experience.
- 3+ years of professional Linux experience.
- Preferred familiarity with Snowflake, AWS, GCP, and Azure cloud environments.
- Intellectual curiosity and drive; self-starters will thrive in this position.
- Passion for technology: excitement for new technology, bleeding-edge applications, and a positive attitude towards solving real-world challenges.

Additional Skills:
- BS, MS, or PhD in Computer Science, Engineering, or equivalent real-world experience.
- Experience with big data and/or infrastructure; bonus for experience setting up petabytes of data so they can be easily accessed.
- Understanding of data organization, i.e., partitioning, clustering, file sizes, and file formats.
- Experience working with classical relational databases (Postgres, MySQL, MSSQL).
- Experience with Hadoop, Hive, Spark, Redshift, or other data processing tools (lots of time will be spent building and optimizing transformations).
- Proven ability to independently execute projects from concept to implementation to launch, and to maintain a live product.

Perks of working at Annalect:
- We have an incredibly fun, collaborative, and friendly environment, and often host social and learning activities such as game night, speaker series, and so much more!
- Halloween is a special day on our calendar since it is our Founding Day – we go all out with decorations, costumes, and prizes!
- Generous vacation policy. Paid time off (PTO) includes vacation days, personal days, and a Summer Friday program. Extended time off around the holiday season: our office is closed between Christmas and New Year to encourage our hardworking employees to rest, recharge, and celebrate the season with family and friends.
- As part of Omnicom, we have the backing and resources of a global billion-dollar company, but also the flexibility and pace of a "startup" – we move fast, break things, and innovate.
- Work with a modern stack and environment to keep on learning and improving, helping to experiment with and shape the latest technologies.
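A small sketch of the kind of at-scale SQL transformation described above; the table and column names are hypothetical, and the statement uses Snowflake's CREATE OR REPLACE TABLE ... AS form (other warehouses use a plain CREATE TABLE AS):

```sql
-- Hypothetical aggregation of a large raw event table into a reporting table.
create or replace table daily_campaign_metrics as
select
    event_date,
    campaign_id,
    count(*)                as impressions,
    count(distinct user_id) as unique_users,
    sum(spend_usd)          as total_spend_usd
from raw_ad_events
group by event_date, campaign_id;
```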
Posted 2 weeks ago
10.0 - 15.0 years
8 - 15 Lacs
Hyderabad
Hybrid
Job Description:
This person will help bring rigor and discipline to day-to-day operations and production support, with the ability to work in a fast-paced, high-energy environment and bring a sense of urgency and attention to detail to the table.
- Coordinates closely with other BI team members to help ensure meaningful prioritization.
- Escalates potential issues in a timely fashion and seeks paths for resolution.
- Excellent communication skills and ability to manage expectations.

Required skills/experience:
- 10+ years of progressive experience in Snowflake and BI-relevant cloud technologies, with extensive experience in extraction, modelling, and reporting.
- Worked on implementation, enhancement, and support projects.
- Conduct workshops with stakeholders to understand and analyze business requirements, problem statements, and design gaps in existing processes, in order to provide scope and solutions aligned with the organization's IT architectural landscape and tools.
- Familiar with the concepts of SDLC, with proficiency in mapping business requirements, technical documentation, application design, development, and troubleshooting for information systems management.
- Expertise in Power BI and dashboarding.
- Production support: experience in process chain management, and in monitoring and scheduling jobs.

Good to have:
- Experience in Informatica (IICS/IDMC) is a plus.
- Experience in upgrade projects for warehousing, ETL, and reporting applications.
- Hands-on experience in SQL Server and/or Oracle design and development; SAP functional knowledge and advanced analytics are a plus.

Professional Experience/Qualifications:
- 10+ years of progressive experience in Snowflake and BI-relevant cloud technologies, with extensive experience in extraction, modelling, and reporting, across implementation, enhancement, and support projects.
- Bachelor's or Master's or similar educational qualification.
Posted 2 weeks ago
15.0 - 20.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application development aligns with business objectives, overseeing project timelines, and facilitating communication among stakeholders to drive project success. You will also engage in problem-solving activities, providing guidance and support to your team while ensuring that best practices are followed throughout the development process. Your role will be pivotal in shaping the direction of application projects and ensuring that they meet the highest standards of quality and functionality.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate training and development opportunities for team members to enhance their skills.
- Monitor project progress and implement necessary adjustments to ensure timely delivery.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data warehousing concepts and architecture.
- Experience with ETL processes and data integration techniques.
- Familiarity with SQL and data modeling best practices.
- Ability to analyze and optimize performance of data queries and processes.

Additional Information:
- The candidate should have minimum 7.5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Posted 2 weeks ago
3.0 - 8.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Project Role: Data Governance Practitioner
Project Role Description: Establish and enforce data governance policies to ensure the accuracy, integrity, and security of organizational data. Collaborate with key stakeholders to define data standards; facilitate effective data collection, storage, access, and usage; and drive data stewardship initiatives for comprehensive and effective data governance.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Governance Practitioner, you will establish and enforce data governance policies to ensure the accuracy, integrity, and security of organizational data. Your typical day will involve collaborating with key stakeholders to define data standards, facilitating effective data collection, storage, access, and usage, and driving data stewardship initiatives for comprehensive and effective data governance. You will engage in discussions that shape the data landscape of the organization, ensuring that data practices align with established policies and standards.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute in providing solutions to work related problems.
- Assist in the development and implementation of data governance frameworks and policies.
- Monitor compliance with data governance policies and report on data quality metrics.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data governance principles and best practices.
- Experience with data quality assessment and improvement techniques.
- Familiarity with data management tools and technologies.
- Ability to communicate complex data concepts to non-technical stakeholders.

Additional Information:
- The candidate should have minimum 3 years of experience in Snowflake Data Warehouse.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.
Posted 2 weeks ago
7.0 - 12.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will collaborate with teams to ensure successful project delivery and implementation.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead the development and implementation of new software applications.
- Conduct code reviews and provide technical guidance to team members.
- Stay updated on industry trends and best practices to enhance application development processes.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of cloud-based data warehousing solutions.
- Experience in ETL processes and data modeling.
- Knowledge of SQL and database management systems.
- Hands-on experience with data integration and data migration.
- Good To Have Skills: Experience with AWS or Azure cloud platforms.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Posted 2 weeks ago
7.0 - 12.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will collaborate with teams to ensure successful project delivery and implementation.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead the application development process.
- Implement best practices for application design and development.
- Conduct code reviews and ensure code quality.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data warehousing concepts.
- Experience in ETL processes and data modeling.
- Knowledge of SQL and database management.
- Experience in cloud data platforms.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.
Posted 2 weeks ago
7.0 - 12.0 years
5 - 9 Lacs
Chennai
Work from Office
Project Role: Database Administrator
Project Role Description: Administer, develop, test, or demonstrate databases. Perform many related database functions across one or more teams or clients, including designing, implementing and maintaining new databases, backup/recovery and configuration management. Install database management systems (DBMS) and provide input for modification of procedures and documentation used for problem resolution and day-to-day maintenance.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Database Administrator, you will administer, develop, test, or demonstrate databases. You will perform many related database functions across one or more teams or clients, including designing, implementing, and maintaining new databases, backup/recovery, and configuration management. You will install database management systems (DBMS) and provide input for modification of procedures and documentation used for problem resolution and day-to-day maintenance.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Ensure data security and integrity.
- Optimize database performance.
- Implement data backup and recovery strategies.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of database management systems.
- Experience in database design and implementation.
- Knowledge of backup and recovery procedures.
- Familiarity with configuration management.
- Good To Have Skills: Experience with cloud-based data warehousing solutions.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Snowflake Data Warehouse.
- This position is based at our Chennai office.
- A 15 years full time education is required.
Posted 2 weeks ago
15.0 - 20.0 years
9 - 13 Lacs
Pune
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture strategy. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor and evaluate team performance to ensure alignment with project goals.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling concepts and best practices.
- Experience with ETL processes and data integration techniques.
- Familiarity with cloud-based data solutions and architectures.
- Ability to troubleshoot and optimize data workflows for performance.

Additional Information:
- The candidate should have minimum 7.5 years of experience in Snowflake Data Warehouse.
- This position is based in Pune.
- A 15 years full time education is required.
Posted 2 weeks ago
3.0 - 8.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will play a crucial role in developing solutions that align with organizational goals and objectives.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute in providing solutions to work related problems.
- Collaborate with cross-functional teams to analyze, design, and implement new features.
- Develop high-quality software design and architecture.
- Write clean, scalable code using Snowflake Data Warehouse.
- Perform code reviews and provide constructive feedback to team members.
- Stay updated on emerging technologies and apply them to projects.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of cloud-based data warehousing concepts.
- Experience in ETL processes and data modeling.
- Knowledge of SQL and database management systems.
- Hands-on experience with data integration and data migration.
- Good To Have Skills: Experience with AWS or Azure cloud platforms.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse.
- This position is based at our Hyderabad office.
- A 15 years full-time education is required.
Posted 2 weeks ago
7.0 - 12.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will collaborate with teams to ensure successful project delivery and implementation.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead and mentor junior professionals.
- Conduct regular team meetings to discuss progress and challenges.
- Stay updated on industry trends and best practices.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data warehousing concepts.
- Experience in ETL processes and data modeling.
- Knowledge of SQL and database management.
- Hands-on experience in cloud-based data platforms.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.
Posted 2 weeks ago
12.0 - 15.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application development aligns with business objectives, overseeing project timelines, and facilitating communication among stakeholders. You will also engage in problem-solving activities, providing guidance and support to your team while ensuring that best practices are followed throughout the development process. Your role will be pivotal in driving the success of application initiatives and fostering a collaborative environment.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate training and development opportunities for team members to enhance their skills.
- Monitor project progress and implement necessary adjustments to meet deadlines.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling and ETL processes.
- Experience with cloud-based data solutions and architecture.
- Familiarity with SQL and data querying techniques.
- Knowledge of data governance and security best practices.

Additional Information:
- The candidate should have minimum 12 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Posted 2 weeks ago
15.0 - 20.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions, and ensuring that applications are aligned with business objectives. You will also engage in problem-solving activities, providing support and enhancements to existing applications, while continuously seeking ways to improve processes and user experiences.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of data modeling and ETL processes.
- Experience with SQL and database management.
- Familiarity with cloud computing concepts and services.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have minimum 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Posted 2 weeks ago
Snowflake has become one of the most sought-after skills in the tech industry, with growing demand for professionals who are proficient in data warehousing and analytics on this cloud-based platform. In India, the job market for Snowflake roles is flourishing, offering numerous opportunities for job seekers with the right skill set.
Cities such as Bengaluru, Hyderabad, Pune, Chennai, Noida, and Mumbai are known for their thriving tech industries and have a high demand for Snowflake professionals.
The average salary range for Snowflake professionals in India varies based on experience level:
- Entry-level: INR 6-8 lakhs per annum
- Mid-level: INR 10-15 lakhs per annum
- Experienced: INR 18-25 lakhs per annum
A typical career path in Snowflake may include roles such as:
- Junior Snowflake Developer
- Snowflake Developer
- Senior Snowflake Developer
- Snowflake Architect
- Snowflake Consultant
- Snowflake Administrator
In addition to expertise in Snowflake, professionals in this field are often expected to have knowledge of:
- SQL
- Data warehousing concepts
- ETL tools
- Cloud platforms (AWS, Azure, GCP)
- Database management
As you explore opportunities in the Snowflake job market in India, remember to showcase your expertise in handling data analytics and warehousing using this powerful platform. Prepare thoroughly for interviews, demonstrate your skills confidently, and keep abreast of the latest developments in Snowflake to stay competitive in the tech industry. Good luck with your job search!