
179 Snowflake Jobs - Page 2

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

4.0 - 9.0 years

4 - 9 Lacs

Chennai, Tamil Nadu, India

On-site

Foundit logo

Core requirements:
- Solid SQL language skills
- Basic knowledge of data modeling
- Working knowledge of Snowflake on Azure and CI/CD processes (with any tooling)

Nice to have:
- Azure ADF
- ETL/ELT frameworks
- ER/Studio

Really nice to have:
- Healthcare / life sciences experience
- GxP processes

Sr DW Engineer (in addition to the above):
- Overseeing engineers while also performing the same work themselves
- Conducting design reviews, code reviews, and deployment reviews with engineers
- Solid data modeling, preferably using ER/Studio (or an equivalent tool)
- Solid Snowflake SQL optimization (recognizing and fixing poor-performing statements)
- Familiarity with medallion architecture (raw, refined, published, or similar terminology)
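The "recognizing and fixing poor-performing statements" requirement usually comes down to keeping predicates sargable so the engine can seek an index or prune partitions instead of scanning. A minimal sketch of the idea, using SQLite's query planner as a stand-in for Snowflake (the table, column, and index names are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL);
    CREATE INDEX idx_orders_amount ON orders (amount);
""")

def plan(sql):
    # The last column of each EXPLAIN QUERY PLAN row describes the step.
    return [row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql)]

# Arithmetic on the column hides it from the index: the planner must scan.
slow = plan("SELECT id FROM orders WHERE amount * 1.18 > 100")
# Moving the arithmetic to the constant side keeps the predicate sargable.
fast = plan("SELECT id FROM orders WHERE amount > 100 / 1.18")

print(slow)  # e.g. a SCAN step
print(fast)  # e.g. a SEARCH step using idx_orders_amount
```

The same rewrite applies to Snowflake, where a non-sargable predicate defeats micro-partition pruning rather than an index.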

Posted 1 day ago

Apply

4.0 - 9.0 years

4 - 9 Lacs

Chennai, Tamil Nadu, India

On-site

Foundit logo

Job Summary

Core requirements:
- Solid SQL language skills
- Basic knowledge of data modeling
- Working knowledge of Snowflake on Azure and CI/CD processes (with any tooling)

Nice to have:
- Azure ADF
- ETL/ELT frameworks
- ER/Studio

Really nice to have:
- Healthcare / life sciences experience
- GxP processes

Sr DW Engineer (in addition to the above):
- Overseeing engineers while also performing the same work themselves
- Conducting design reviews, code reviews, and deployment reviews with engineers
- Solid data modeling, preferably using ER/Studio (or an equivalent tool)
- Solid Snowflake SQL optimization (recognizing and fixing poor-performing statements)
- Familiarity with medallion architecture (raw, refined, published, or similar terminology)

Posted 1 day ago

Apply

2.0 - 5.0 years

2 - 5 Lacs

Hyderabad / Secunderabad, Telangana, India

On-site

Foundit logo

Must have:
- Experience working as a Snowflake Admin/Developer on Data Warehouse, ETL, and BI projects.
- Prior end-to-end implementation of the Snowflake cloud data warehouse, and end-to-end data warehouse implementations on-premise, preferably on Oracle/SQL Server.
- Expertise in Snowflake: data modelling, ELT using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL concepts.
- Expertise in advanced Snowflake concepts such as setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy clone, and time travel, and an understanding of how to use these features.
- Expertise in deploying Snowflake features such as data sharing.
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, and Big Data modelling techniques using Python.
- Experience in data migration from RDBMS to the Snowflake cloud data warehouse.
- Deep understanding of relational as well as NoSQL data stores, methods, and approaches (star and snowflake dimensional modelling).
- Experience with data security and data access control design.
- Experience with AWS or Azure data storage and management technologies such as S3 and Blob.
- Building processes supporting data transformation, data structures, metadata, dependency, and workload management.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting.
- Resolution of an extensive range of complicated data pipeline problems, proactively and as issues surface.
- Experience with Agile development methodologies.

Good to have:
- CI/CD in Talend using Jenkins and Nexus.
- TAC configuration with LDAP, job servers, log servers, and databases; job conductor, scheduler, and monitoring.
- GIT repository management: creating users and roles and providing access to them.
- Agile methodology and 24/7 Admin and Platform support.
- Effort estimation based on requirements.
- Strong written communication skills; effective and persuasive in both written and oral communication.
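Several of the advanced Snowflake concepts named here (resource monitors, RBAC, warehouse sizing) are exercised through plain DDL, which admin scripts often generate from configuration. A hedged sketch that only assembles the statements; the monitor, warehouse, role, and database names are invented, and the syntax follows Snowflake's documented DDL:

```python
# Build the DDL an admin would run to cap credit spend on a warehouse
# and grant a read-only role. Object names here are illustrative.

def resource_monitor_ddl(name, credit_quota, warehouse):
    return [
        f"CREATE RESOURCE MONITOR {name} WITH CREDIT_QUOTA = {credit_quota} "
        f"TRIGGERS ON 90 PERCENT DO NOTIFY ON 100 PERCENT DO SUSPEND;",
        f"ALTER WAREHOUSE {warehouse} SET RESOURCE_MONITOR = {name};",
    ]

def rbac_ddl(role, database, schema, warehouse):
    return [
        f"CREATE ROLE IF NOT EXISTS {role};",
        f"GRANT USAGE ON WAREHOUSE {warehouse} TO ROLE {role};",
        f"GRANT USAGE ON DATABASE {database} TO ROLE {role};",
        f"GRANT USAGE ON SCHEMA {database}.{schema} TO ROLE {role};",
        f"GRANT SELECT ON ALL TABLES IN SCHEMA {database}.{schema} TO ROLE {role};",
    ]

statements = resource_monitor_ddl("etl_monitor", 100, "ETL_WH") + rbac_ddl(
    "ANALYST_RO", "ANALYTICS", "PUBLIC", "ETL_WH"
)
for stmt in statements:
    print(stmt)
```

In practice these statements would be executed via SnowSQL or the Snowflake Python connector rather than printed.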

Posted 1 day ago

Apply

10.0 - 14.0 years

10 - 14 Lacs

Delhi, India

On-site

Foundit logo

Primary Skills (Workato):
- Minimum 10+ years of overall IT experience, with strong, recent hands-on experience on Workato
- Proven experience in designing and implementing integrations using Workato's low-code platform
- Good to have: experience with Salesforce (SFDC), Snowflake, and Oracle applications
- Strong knowledge of REST APIs, webhooks, and workflow automation
- Proficiency in developing, testing, and deploying Workato recipes
- Ability to handle error processing, scalability, and performance optimization in integrations

Boomi Integration Architect - Key Responsibilities:
- Design and architect robust integration solutions using Boomi AtomSphere
- Strong hands-on expertise in Boomi B2B/EDI integrations (e.g., X12, RN)
- Proficiency in Boomi API Management: development, deployment, and support
- Guide development teams on best practices, security, and performance standards
- Create and manage reusable integration patterns, components, and frameworks
- Lead the full integration lifecycle, from architecture/design to deployment and post-go-live support
- Ensure data integrity, compliance, and high availability across hybrid/multi-cloud environments
- Collaborate with enterprise architects and provide technical mentorship to Boomi developers
- Act as the technical escalation point for complex integration issues

Posted 1 day ago

Apply

10.0 - 14.0 years

10 - 14 Lacs

Pune, Maharashtra, India

On-site

Foundit logo

Primary Skills (Workato):
- Minimum 10+ years of overall IT experience, with strong, recent hands-on experience on Workato
- Proven experience in designing and implementing integrations using Workato's low-code platform
- Good to have: experience with Salesforce (SFDC), Snowflake, and Oracle applications
- Strong knowledge of REST APIs, webhooks, and workflow automation
- Proficiency in developing, testing, and deploying Workato recipes
- Ability to handle error processing, scalability, and performance optimization in integrations

Boomi Integration Architect - Key Responsibilities:
- Design and architect robust integration solutions using Boomi AtomSphere
- Strong hands-on expertise in Boomi B2B/EDI integrations (e.g., X12, RN)
- Proficiency in Boomi API Management: development, deployment, and support
- Guide development teams on best practices, security, and performance standards
- Create and manage reusable integration patterns, components, and frameworks
- Lead the full integration lifecycle, from architecture/design to deployment and post-go-live support
- Ensure data integrity, compliance, and high availability across hybrid/multi-cloud environments
- Collaborate with enterprise architects and provide technical mentorship to Boomi developers
- Act as the technical escalation point for complex integration issues

Posted 1 day ago

Apply

10.0 - 14.0 years

10 - 14 Lacs

Chennai, Tamil Nadu, India

On-site

Foundit logo

Primary Skills (Workato):
- Minimum 10+ years of overall IT experience, with strong, recent hands-on experience on Workato
- Proven experience in designing and implementing integrations using Workato's low-code platform
- Good to have: experience with Salesforce (SFDC), Snowflake, and Oracle applications
- Strong knowledge of REST APIs, webhooks, and workflow automation
- Proficiency in developing, testing, and deploying Workato recipes
- Ability to handle error processing, scalability, and performance optimization in integrations

Boomi Integration Architect - Key Responsibilities:
- Design and architect robust integration solutions using Boomi AtomSphere
- Strong hands-on expertise in Boomi B2B/EDI integrations (e.g., X12, RN)
- Proficiency in Boomi API Management: development, deployment, and support
- Guide development teams on best practices, security, and performance standards
- Create and manage reusable integration patterns, components, and frameworks
- Lead the full integration lifecycle, from architecture/design to deployment and post-go-live support
- Ensure data integrity, compliance, and high availability across hybrid/multi-cloud environments
- Collaborate with enterprise architects and provide technical mentorship to Boomi developers
- Act as the technical escalation point for complex integration issues

Posted 1 day ago

Apply

5.0 - 7.0 years

5 - 7 Lacs

Mumbai, Maharashtra, India

On-site

Foundit logo

About the Role: We are looking for an experienced Snowflake Admin to manage and optimize Snowflake cloud data platforms. The ideal candidate should have strong expertise in Snowflake architecture, performance tuning, security, and administration. This role requires the ability to troubleshoot issues, automate processes, and collaborate with cross-functional teams.

Key Responsibilities:
- Administer and optimize Snowflake environments for performance and security.
- Manage user roles, permissions, and access controls.
- Implement best practices for database performance tuning and query optimization.
- Monitor system performance and troubleshoot issues proactively.
- Work with data engineering teams to support ETL processes and integrations.
- Automate administrative tasks using SQL and scripting.

Required Skills:
- 5+ years of experience in Snowflake administration.
- Expertise in Snowflake architecture, data sharing, and workload optimization.
- Strong knowledge of SQL and Python/shell scripting for automation.
- Experience with data security, access management, and governance policies.
- Understanding of cloud environments (AWS/Azure/GCP) and Snowflake integrations.

Contract Duration: 3 Months (C2C)
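"Automate administrative tasks using SQL and scripting" often means scripted health checks, for example flagging queries that run long or spill to remote storage. A hedged sketch of the triage logic; the rows here are fabricated stand-ins for what would come from Snowflake's ACCOUNT_USAGE.QUERY_HISTORY view, and the thresholds are illustrative, not Snowflake recommendations:

```python
# Flag candidate queries for tuning: long elapsed time or remote spilling.

LONG_RUNNING_MS = 5 * 60 * 1000  # 5 minutes, an arbitrary example threshold

history = [  # stand-in rows for ACCOUNT_USAGE.QUERY_HISTORY
    {"query_id": "q1", "total_elapsed_time": 420_000, "bytes_spilled_to_remote_storage": 0},
    {"query_id": "q2", "total_elapsed_time": 12_000,  "bytes_spilled_to_remote_storage": 7_000_000},
    {"query_id": "q3", "total_elapsed_time": 8_000,   "bytes_spilled_to_remote_storage": 0},
]

def needs_attention(row):
    # Long-running queries and queries spilling to remote storage are the
    # usual first suspects for warehouse resizing or SQL rewrites.
    return (row["total_elapsed_time"] > LONG_RUNNING_MS
            or row["bytes_spilled_to_remote_storage"] > 0)

flagged = [r["query_id"] for r in history if needs_attention(r)]
print(flagged)  # ['q1', 'q2']
```

A production version would fetch the rows with the Snowflake connector on a schedule and alert on the flagged IDs.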

Posted 1 day ago

Apply

5.0 - 8.0 years

5 - 8 Lacs

Hyderabad / Secunderabad, Telangana, India

On-site

Foundit logo

- Bachelor's degree plus at least 5-7 years of experience, with a minimum of 3+ years in SQL development.
- Strong working knowledge of advanced SQL capabilities such as analytic and windowing functions.
- 3+ years of working knowledge of an RDBMS is a must-have.
- Exposure to shell scripts for invoking SQL calls.
- Exposure to ETL tools would be good to have.
- Working knowledge of Snowflake is good to have.
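The "analytic and windowing function" capability refers to SQL's OVER clause, which computes aggregates per partition while keeping every input row (unlike GROUP BY, which collapses them). A small runnable illustration using SQLite; the same SQL works in Snowflake, and the table and data are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("east", 10), ("east", 20), ("west", 5), ("west", 30)])

# Running total per region: the window function adds a column instead of
# collapsing rows the way GROUP BY would.
rows = conn.execute("""
    SELECT region, amount,
           SUM(amount) OVER (PARTITION BY region ORDER BY amount) AS running
    FROM sales
    ORDER BY region, amount
""").fetchall()

print(rows)
# [('east', 10, 10), ('east', 20, 30), ('west', 5, 5), ('west', 30, 35)]
```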

Posted 1 day ago

Apply

3.0 - 5.0 years

0 Lacs

Hyderabad / Secunderabad, Telangana, India

On-site

Foundit logo

Req ID: 325282

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Snowflake and Data Vault 2 (optional) Consultant to join our team in Hyderabad, Telangana (IN-TG), India (IN).

Requirements:
- Extensive expertise in dbt, including macros, modeling, and automation techniques.
- Proficiency in SQL, Python, or other scripting languages for automation.
- Experience leveraging Snowflake for scalable data solutions.
- Familiarity with Data Vault 2.0 methodologies is an advantage.
- Strong capability in optimizing database performance and managing large datasets.
- Excellent problem-solving and analytical skills.
- Minimum of 3+ years of relevant experience, with a total of 5+ years of overall experience.

About NTT DATA

NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future.

NTT DATA endeavors to make its website accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status.

Posted 2 days ago

Apply

0.0 years

0 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Foundit logo

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant - ML Engineer!

In this role, we are looking for candidates who have relevant years of experience in designing and developing machine learning and deep learning systems, who have professional software development experience, and who are hands-on in running machine learning tests and experiments and implementing appropriate ML algorithms.

Responsibilities:
- Drive the vision for a modern data and analytics platform to deliver well-architected and engineered data and analytics products leveraging the cloud tech stack and third-party products
- Close the gap between ML research and production to create ground-breaking new products and features and solve problems for our customers
- Design, develop, test, and deploy data pipelines, machine learning infrastructure, and client-facing products and services
- Build and implement machine learning models and prototype solutions for proof-of-concept
- Scale existing ML models into production on a variety of cloud platforms
- Analyze and resolve architectural problems, working closely with engineering, data science, and operations teams

Qualifications we seek in you!

Minimum Qualifications / Skills:
- Bachelor's degree in computer science engineering or information technology, or BSc in Computer Science, Mathematics, or a similar field; Master's degree is a plus
- Integration: APIs, microservices, and ETL/ELT patterns
- DevOps (good to have): Ansible, Jenkins, ELK
- Containerization: Docker, Kubernetes, etc.
- Orchestration: Google Cloud Composer
- Languages and scripting: Python, Scala, Java, etc.
- Cloud services: GCP, Snowflake
- Analytics and ML tooling: SageMaker, ML Studio
- Execution paradigm: low-latency/streaming and batch

Preferred Qualifications / Skills:
- Data platforms: dbt, Fivetran, and data warehouses (Teradata, Redshift, BigQuery, Snowflake, etc.)
- Visualization tools: Power BI, Tableau

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 2 days ago

Apply

0.0 years

0 Lacs

Hyderabad / Secunderabad, Telangana, India

On-site

Foundit logo

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant - ML Engineer!

In this role, we are looking for candidates who have relevant years of experience in designing and developing machine learning and deep learning systems, who have professional software development experience, and who are hands-on in running machine learning tests and experiments and implementing appropriate ML algorithms.

Responsibilities:
- Drive the vision for a modern data and analytics platform to deliver well-architected and engineered data and analytics products leveraging the cloud tech stack and third-party products
- Close the gap between ML research and production to create ground-breaking new products and features and solve problems for our customers
- Design, develop, test, and deploy data pipelines, machine learning infrastructure, and client-facing products and services
- Build and implement machine learning models and prototype solutions for proof-of-concept
- Scale existing ML models into production on a variety of cloud platforms
- Analyze and resolve architectural problems, working closely with engineering, data science, and operations teams

Qualifications we seek in you!

Minimum Qualifications / Skills:
- Bachelor's degree in computer science engineering or information technology, or BSc in Computer Science, Mathematics, or a similar field; Master's degree is a plus
- Integration: APIs, microservices, and ETL/ELT patterns
- DevOps (good to have): Ansible, Jenkins, ELK
- Containerization: Docker, Kubernetes, etc.
- Orchestration: Google Cloud Composer
- Languages and scripting: Python, Scala, Java, etc.
- Cloud services: GCP, Snowflake
- Analytics and ML tooling: SageMaker, ML Studio
- Execution paradigm: low-latency/streaming and batch

Preferred Qualifications / Skills:
- Data platforms: dbt, Fivetran, and data warehouses (Teradata, Redshift, BigQuery, Snowflake, etc.)
- Visualization tools: Power BI, Tableau

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 2 days ago

Apply

0.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Foundit logo

Inviting applications for the role of Lead Consultant - Data Engineer!

Responsibilities:
- Design, document, and implement the data pipelines that feed data models for subsequent consumption in Snowflake, using dbt and Airflow.
- Ensure correctness and completeness of the data being transformed via engineering pipelines for end consumption in analytical dashboards.
- Actively monitor and triage technical challenges in critical situations that require immediate resolution.
- Evaluate viable technical solutions and share MVPs or PoCs in support of the research.
- Develop relationships with external stakeholders to maintain awareness of data and security issues and trends.
- Review work from other tech team members and provide feedback for growth.
- Implement data performance and data security policies that align with governance objectives and regulatory requirements.
- Effectively mentor and develop your team members.

About you:
- You have experience in data warehousing, data modeling, and building data engineering pipelines.
- You are well versed in data engineering methods, such as ETL and ELT techniques through scripting and/or tooling.
- You are good at analyzing performance bottlenecks and providing enhancement recommendations; you have a passion for customer service and a desire to learn and grow as a professional and a technologist.
- Strong analytical skills related to working with structured, semi-structured, and unstructured datasets.
- Collaborating with product owners to identify requirements, define desired outcomes, and deliver trusted results.
- Building processes supporting data transformation, data structures, metadata, dependency, and workload management.
- SQL is heavily used in this role; an ideal candidate must have hands-on experience with SQL database design, plus Python.
- Demonstrably deep understanding of SQL (level: advanced) and analytical data warehouses (Snowflake preferred).
- Demonstrated ability to write new code that is well documented and stored in a version control system (we use GitHub and Bitbucket).
- Extremely talented in applying SCD, CDC, and DQ/DV frameworks.
- Familiar with Jira and Confluence.
- Must have exposure to technologies such as dbt, Apache Airflow, and Snowflake.
- Desire to continually keep up with advancements in data engineering practices.

Qualifications we seek in you!

Minimum qualifications:

Essential Education:
- Bachelor's degree or equivalent combination of education and experience; Bachelor's degree in information science, data management, computer science, or a related field preferred.

Essential Experience & Job Requirements:
- IT experience with a major focus on data warehouse/database-related projects.
- Must have exposure to technologies such as dbt, Apache Airflow, and Snowflake.
- Experience in other data platforms: Oracle, SQL Server, MDM, etc.
- Expertise in writing SQL and database objects (stored procedures, functions, and views); hands-on experience in ETL/ELT, data security, SQL performance optimization, and job orchestration tools and technologies (e.g., dbt, APIs, Apache Airflow).
- Experience in data modeling and relational database design.
- Well versed in applying SCD, CDC, and DQ/DV frameworks.
- Demonstrated ability to write new code that is well documented and stored in a version control system (we use GitHub and Bitbucket).
- Good to have: experience with cloud platforms such as AWS, Azure, GCP, and Snowflake.
- Good to have: strong programming/scripting skills (Python, PowerShell, etc.).
- Experience working with agile methodologies (Scrum, Kanban) and Meta Scrum with cross-functional teams (Product Owners, Scrum Masters, Architects, and data SMEs).
- Excellent written and oral communication and presentation skills to present architecture, features, and solution recommendations to global functional product portfolio technical leaders (Finance, HR, Marketing, Legal, Risk, IT), product owners, and functional area teams across levels, as well as Global Data Product Portfolio Management teams (Enterprise Data Model, Data Catalog, Master Data Management).

Preferred Qualifications:
- Knowledge of AWS cloud and Python is a plus.
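The SCD framework this listing asks for usually means Slowly Changing Dimensions; Type 2 keeps history by closing the current row and appending a new version. A minimal sketch of the merge logic in plain Python (column names are invented; in the dbt/Snowflake stack described here this would typically be a dbt snapshot or a MERGE statement instead):

```python
from datetime import date

def scd2_upsert(dim_rows, key, incoming, today):
    """Close the current row for `key` if attributes changed, append the new version."""
    current = next((r for r in dim_rows
                    if r["key"] == key and r["valid_to"] is None), None)
    if current is not None:
        if all(current[k] == v for k, v in incoming.items()):
            return dim_rows  # no attribute change: nothing to do
        current["valid_to"] = today  # close out the old version
    dim_rows.append({"key": key, **incoming, "valid_from": today, "valid_to": None})
    return dim_rows

dim = [{"key": "C1", "city": "Pune", "valid_from": date(2024, 1, 1), "valid_to": None}]
scd2_upsert(dim, "C1", {"city": "Chennai"}, date(2024, 6, 1))

print(len(dim))            # 2
print(dim[0]["valid_to"])  # 2024-06-01
print(dim[1]["city"], dim[1]["valid_to"])  # Chennai None
```

The same pattern underlies CDC-fed dimension loads: each change event either closes and versions a row or is a no-op.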

Posted 2 days ago

Apply

1.0 - 3.0 years

1 - 3 Lacs

Chennai, Tamil Nadu, India

On-site

Foundit logo

Skills: AWS, Python, SQL, Spark, Airflow, Snowflake

Responsibilities:
- Create and manage cloud resources in AWS.
- Ingest data from different data sources that expose data using different technologies, such as RDBMS, REST HTTP APIs, flat files, streams, and time-series data from various proprietary systems.
- Implement data ingestion and processing with the help of Big Data technologies.
- Process and transform data using technologies such as Spark and cloud services.
- Understand your part of the business logic and implement it using the language supported by the base data platform.
- Develop automated data quality checks to make sure the right data enters the platform and to verify the results of calculations.
- Develop infrastructure to collect, transform, combine, and publish/distribute customer data.
- Define process improvement opportunities to optimize data collection, insights, and displays.
- Ensure data and results are accessible, scalable, efficient, accurate, complete, and flexible.
- Identify and interpret trends and patterns from complex data sets.
- Construct a framework utilizing data visualization tools and techniques to present consolidated analytical and actionable results to relevant stakeholders.
- Be a key participant in regular Scrum ceremonies with the agile teams.
- Be proficient at developing queries, writing reports, and presenting findings.
- Mentor junior members and bring best industry practices.
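The automated data quality checks mentioned here can be as simple as asserting invariants on each ingested batch before it is published downstream. A hedged sketch under assumed rule names and thresholds (real pipelines would typically use a library such as Great Expectations or dbt tests):

```python
def run_quality_checks(rows, required_cols, non_null_cols, min_rows=1):
    """Return a list of failed-check descriptions for one ingested batch."""
    failures = []
    if len(rows) < min_rows:
        failures.append(f"expected at least {min_rows} rows, got {len(rows)}")
    for col in required_cols:
        # Every record must carry the column at all.
        if any(col not in r for r in rows):
            failures.append(f"missing column: {col}")
    for col in non_null_cols:
        # The column may exist but must never be null.
        nulls = sum(1 for r in rows if r.get(col) is None)
        if nulls:
            failures.append(f"{nulls} null(s) in {col}")
    return failures

batch = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": None},
]
print(run_quality_checks(batch, required_cols=["id", "amount"],
                         non_null_cols=["amount"]))
# ['1 null(s) in amount']
```

A non-empty failure list would typically fail the pipeline task so bad data never reaches the platform.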

Posted 2 days ago

Apply

5.0 - 8.0 years

2 - 11 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Foundit logo

Responsibilities:
- Data Warehouse Solution Design & Development: Lead the design and implementation of batch and real-time ingestion architectures for data warehouses. Ensure that solutions are scalable, reliable, and optimized for performance.
- Team Leadership & Mentoring: Lead and mentor a team of data engineers, fostering a collaborative environment that encourages knowledge sharing and continuous improvement. Ensure that the team meets high standards of quality and performance.
- Hands-on Technical Delivery: Actively engage in hands-on development and ensure seamless delivery of data solutions. Provide technical direction and hands-on support for complex issues.
- Issue Resolution & Troubleshooting: Troubleshoot issues that arise at runtime and provide quick resolutions to minimize disruptions and maintain system stability.
- API Management: Oversee the integration and management of APIs using APIM for seamless communication between internal and external systems. Implement and maintain API gateways and monitor API performance.
- Client Communication: Interact directly with clients, ensuring clear and convincing communication of technical ideas and project progress. Translate customer requirements into technical solutions and drive the implementation process.
- Cloud & DevOps: Ensure that data solutions are designed with cloud-native technologies such as Azure, Snowflake, and dbt. Use Azure DevOps for continuous integration and deployment pipelines.
- Mentoring & Best Practices: Guide the team on best practices for data engineering, code reviews, and performance optimization. Ensure the adoption of modern tools and techniques to improve delivery efficiency.

Mandatory Skills:
- Python for data engineering
- Snowflake and Postgres development experience
- Proficiency in API Management (APIM) and dbt
- Strong experience with Azure DevOps for CI/CD
- Proven experience in data warehouse solution design, development, and implementation

Desired Skills:
- Experience with Apache Kafka, Azure Event Hubs, Apache Airflow, and Apache Flink
- Familiarity with Grafana, Prometheus, Terraform, and Kubernetes
- Power BI for reporting and data visualization

Posted 2 days ago

Apply

2.0 - 6.0 years

2 - 7 Lacs

Bengaluru / Bangalore, Karnataka, India

Remote

Foundit logo

We're seeking an experienced MS SQL Server Developer to join our team. As a Senior MS SQL Server Developer, you will be responsible for designing, developing, and maintaining large-scale databases using MS SQL Server, NoSQL, and Snowflake. If you have a passion for database development and a strong understanding of database principles, we encourage you to apply.

Responsibilities:
- Design, develop, and maintain large-scale databases using MS SQL Server, NoSQL, and Snowflake
- Develop and optimize database queries, stored procedures, and functions
- Collaborate with cross-functional teams to identify and prioritize database requirements
- Implement data modeling, database design, and data warehousing best practices
- Develop and maintain database documentation, including data dictionaries and entity-relationship diagrams
- Troubleshoot and resolve database performance issues and errors
- Stay up to date with the latest database technologies and trends

Requirements:
- 8+ years of experience in database development using MS SQL Server, NoSQL, and Snowflake
- Strong understanding of database principles, including data modeling, database design, and data warehousing
- Experience with database performance tuning, optimization, and troubleshooting
- Proficiency in T-SQL, SQL, and database query languages
- Experience with agile development methodologies and version control systems (e.g., Git)
- Strong communication and collaboration skills
- Bachelor's degree in Computer Science, Information Technology, or a related field

Nice to Have:
- Experience with cloud-based databases (e.g., AWS, Azure)
- Knowledge of data governance, data quality, and data security best practices
- Experience with data visualization tools (e.g., Tableau, Power BI)
- Certifications in MS SQL Server, NoSQL, or Snowflake

What We Offer:
- Competitive salary and benefits package
- Opportunities for career growth and professional development
- Collaborative and dynamic work environment
- Flexible work arrangements, including remote work options
- Access to the latest database technologies and tools

Posted 2 days ago

Apply

2.0 - 6.0 years

2 - 7 Lacs

Hyderabad / Secunderabad, Telangana, India

Remote

Foundit logo

We're seeking an experienced MS SQL Server Developer to join our team. As a Senior MS SQL Server Developer, you will be responsible for designing, developing, and maintaining large-scale databases using MS SQL Server, NoSQL, and Snowflake. If you have a passion for database development and a strong understanding of database principles, we encourage you to apply.

Responsibilities:
- Design, develop, and maintain large-scale databases using MS SQL Server, NoSQL, and Snowflake
- Develop and optimize database queries, stored procedures, and functions
- Collaborate with cross-functional teams to identify and prioritize database requirements
- Implement data modeling, database design, and data warehousing best practices
- Develop and maintain database documentation, including data dictionaries and entity-relationship diagrams
- Troubleshoot and resolve database performance issues and errors
- Stay up to date with the latest database technologies and trends

Requirements:
- 8+ years of experience in database development using MS SQL Server, NoSQL, and Snowflake
- Strong understanding of database principles, including data modeling, database design, and data warehousing
- Experience with database performance tuning, optimization, and troubleshooting
- Proficiency in T-SQL, SQL, and database query languages
- Experience with agile development methodologies and version control systems (e.g., Git)
- Strong communication and collaboration skills
- Bachelor's degree in Computer Science, Information Technology, or a related field

Nice to Have:
- Experience with cloud-based databases (e.g., AWS, Azure)
- Knowledge of data governance, data quality, and data security best practices
- Experience with data visualization tools (e.g., Tableau, Power BI)
- Certifications in MS SQL Server, NoSQL, or Snowflake

What We Offer:
- Competitive salary and benefits package
- Opportunities for career growth and professional development
- Collaborative and dynamic work environment
- Flexible work arrangements, including remote work options
- Access to the latest database technologies and tools

Posted 2 days ago

Apply

2.0 - 6.0 years

2 - 7 Lacs

Delhi, India

Remote

Foundit logo

We're seeking an experienced MS SQL Server Developer to join our team. As a Senior MS SQL Server Developer, you will be responsible for designing, developing, and maintaining large-scale databases using MS SQL Server, NoSQL, and Snowflake. If you have a passion for database development and a strong understanding of database principles, we encourage you to apply.

Responsibilities:
- Design, develop, and maintain large-scale databases using MS SQL Server, NoSQL, and Snowflake
- Develop and optimize database queries, stored procedures, and functions
- Collaborate with cross-functional teams to identify and prioritize database requirements
- Implement data modeling, database design, and data warehousing best practices
- Develop and maintain database documentation, including data dictionaries and entity-relationship diagrams
- Troubleshoot and resolve database performance issues and errors
- Stay up-to-date with the latest database technologies and trends

Requirements:
- 8+ years of experience in database development using MS SQL Server, NoSQL, and Snowflake
- Strong understanding of database principles, including data modeling, database design, and data warehousing
- Experience with database performance tuning, optimization, and troubleshooting
- Proficiency in T-SQL, SQL, and database query languages
- Experience with agile development methodologies and version control systems (e.g., Git)
- Strong communication and collaboration skills
- Bachelor's degree in Computer Science, Information Technology, or a related field

Nice to Have:
- Experience with cloud-based databases (e.g., AWS, Azure)
- Knowledge of data governance, data quality, and data security best practices
- Experience with data visualization tools (e.g., Tableau, Power BI)
- Certifications in MS SQL Server, NoSQL, or Snowflake

What We Offer:
- Competitive salary and benefits package
- Opportunities for career growth and professional development
- Collaborative and dynamic work environment
- Flexible work arrangements, including remote work options
- Access to the latest database technologies and tools
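The stored-procedure and query-tuning work this role describes might look like the following T-SQL sketch; the table, column, and procedure names are hypothetical, invented purely for illustration:

```sql
-- Hypothetical example of the stored-procedure and tuning work above.
-- A covering index supports the procedure's predicate and avoids key lookups.
CREATE INDEX IX_Orders_CustomerId_OrderDate
    ON dbo.Orders (CustomerId, OrderDate)
    INCLUDE (TotalAmount);
GO

CREATE PROCEDURE dbo.GetCustomerOrders
    @CustomerId INT,
    @FromDate   DATE
AS
BEGIN
    SET NOCOUNT ON;
    SELECT OrderId, OrderDate, TotalAmount
    FROM dbo.Orders
    WHERE CustomerId = @CustomerId
      AND OrderDate >= @FromDate   -- sargable predicate keeps the index usable
    ORDER BY OrderDate DESC;
END;
```

Recognizing and rewriting non-sargable predicates (e.g., functions wrapped around indexed columns) is a typical part of the performance-tuning duties listed.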

Posted 2 days ago

Apply

1.0 - 4.0 years

1 - 4 Lacs

Indore, Madhya Pradesh, India

On-site


Contract Duration: 6 Months

Responsibilities:
- Snowflake administration and development in data warehouse, ETL, and BI projects
- End-to-end implementation of Snowflake cloud data warehouse and on-premise data warehouse solutions (Oracle/SQL Server)
- Expertise in Snowflake: data modeling, ELT using Snowflake SQL, complex stored procedures, and standard DWH/ETL concepts
- Advanced features: resource monitors, RBAC controls, virtual warehouse sizing, and query performance tuning
- Deployment of zero-copy cloning, time travel, and data sharing
- Hands-on experience with Snowflake utilities: SnowSQL, Snowpipe, and big data modeling techniques using Python
- Data migration from RDBMS to the Snowflake cloud data warehouse
- Deep understanding of relational and NoSQL data stores, including star and snowflake dimensional modeling
- Expertise in designing data security and access controls
- Experience with AWS/Azure data storage and management technologies (S3, Blob)
- Process development for data transformation, metadata, dependency, and workload management
- Proficiency in RDBMS: complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Resolution of complex data pipeline issues, both proactively and as they arise
- Experience with agile development methodologies

Good-to-Have Skills:
- CI/CD in Talend using Jenkins and Nexus
- TAC configuration with LDAP, job servers, log servers, and databases
- Expertise in job conductor, scheduler, and monitoring
- Git repository management, including user roles and access control
- Agile methodology and 24/7 admin and platform support
- Effort estimation based on requirements
- Strong written communication skills; effective and persuasive in both written and oral communication
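The Snowflake administration features named above (resource monitors, zero-copy cloning, time travel) can be sketched in Snowflake SQL as follows; the warehouse, database, and table names are invented for illustration:

```sql
-- Hypothetical sketch of the admin features listed above.
-- Resource monitor: suspend a warehouse when it nears its credit quota.
CREATE RESOURCE MONITOR dev_monitor WITH CREDIT_QUOTA = 100
    TRIGGERS ON 90 PERCENT DO SUSPEND;
ALTER WAREHOUSE dev_wh SET RESOURCE_MONITOR = dev_monitor;

-- Zero-copy clone: an instant copy that consumes no extra storage
-- until the clone's data diverges from the source.
CREATE DATABASE analytics_dev CLONE analytics_prod;

-- Time travel: query a table as it existed one hour ago.
SELECT * FROM analytics_prod.public.orders AT(OFFSET => -3600);
```

These statements require appropriate privileges (e.g., ACCOUNTADMIN for resource monitors) and an account-level time travel retention period covering the requested offset.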

Posted 2 days ago

Apply

1.0 - 4.0 years

1 - 4 Lacs

Hyderabad / Secunderabad, Telangana, India

On-site


Contract Duration: 6 Months

Responsibilities:
- Snowflake administration and development in data warehouse, ETL, and BI projects
- End-to-end implementation of Snowflake cloud data warehouse and on-premise data warehouse solutions (Oracle/SQL Server)
- Expertise in Snowflake: data modeling, ELT using Snowflake SQL, complex stored procedures, and standard DWH/ETL concepts
- Advanced features: resource monitors, RBAC controls, virtual warehouse sizing, and query performance tuning
- Deployment of zero-copy cloning, time travel, and data sharing
- Hands-on experience with Snowflake utilities: SnowSQL, Snowpipe, and big data modeling techniques using Python
- Data migration from RDBMS to the Snowflake cloud data warehouse
- Deep understanding of relational and NoSQL data stores, including star and snowflake dimensional modeling
- Expertise in designing data security and access controls
- Experience with AWS/Azure data storage and management technologies (S3, Blob)
- Process development for data transformation, metadata, dependency, and workload management
- Proficiency in RDBMS: complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Resolution of complex data pipeline issues, both proactively and as they arise
- Experience with agile development methodologies

Good-to-Have Skills:
- CI/CD in Talend using Jenkins and Nexus
- TAC configuration with LDAP, job servers, log servers, and databases
- Expertise in job conductor, scheduler, and monitoring
- Git repository management, including user roles and access control
- Agile methodology and 24/7 admin and platform support
- Effort estimation based on requirements
- Strong written communication skills; effective and persuasive in both written and oral communication

Posted 2 days ago

Apply

12.0 - 22.0 years

3 - 6 Lacs

Chennai, Tamil Nadu, India

On-site


We are hiring an ESA Solution Architect - COE for a CMMI Level 5 client. If you have relevant experience and are looking for a challenging opportunity, we invite you to apply.

Key Responsibilities:
- Design and implement enterprise solutions that align with business and technical requirements
- Lead migration projects from on-premise to cloud or cloud-to-cloud (preferably Snowflake)
- Provide expertise in ETL technologies such as Informatica, Matillion, and Talend
- Develop Snowflake-based solutions and optimize data architectures
- Analyze project constraints, mitigate risks, and recommend process improvements
- Act as a liaison between technical teams and stakeholders, translating business needs into technical solutions
- Conduct architectural system evaluations to ensure scalability and efficiency
- Define processes and procedures to streamline solution delivery
- Create solution prototypes and participate in technology selection
- Ensure compliance with strategic guidelines, technical standards, and business objectives
- Oversee solution development and collaborate closely with project management and IT teams

Required Skills & Experience:
- 10+ years of experience in technical solutioning and enterprise solution architecture
- Proven experience in cloud migration projects (on-prem to cloud/cloud-to-cloud)
- Strong expertise in Snowflake architecture and solutioning
- Hands-on experience with ETL tools such as Informatica, Matillion, and Talend
- Excellent problem-solving and risk mitigation skills
- Ability to work with cross-functional teams and align technical solutions with business goals

If you are interested, please share your updated profile.
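An on-premise-to-Snowflake migration of the kind this role leads typically lands extracted files in cloud storage and bulk-loads them with `COPY INTO`. A minimal sketch, with the stage, bucket, path, and table names all hypothetical:

```sql
-- Illustrative load path for a legacy-to-Snowflake migration.
-- In practice the stage would reference a storage integration for credentials.
CREATE STAGE legacy_extract
    URL = 's3://example-bucket/exports/'
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- Bulk-load extracted CSVs into a staging table.
COPY INTO staging.customers
    FROM @legacy_extract/customers/
    ON_ERROR = 'CONTINUE';  -- load what parses; review rejected rows afterward
```

Validation queries comparing row counts and checksums between source and target usually follow each load in such projects.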

Posted 2 days ago

Apply

7.0 - 15.0 years

20 - 36 Lacs

Chennai, Tamil Nadu, India

On-site


Bounteous x Accolite is a premier end-to-end digital transformation consultancy dedicated to partnering with ambitious brands to create digital solutions for today's complex challenges and tomorrow's opportunities. With uncompromising standards for technical and domain expertise, we deliver innovative and strategic solutions in Strategy, Analytics, Digital Engineering, Cloud, Data & AI, Experience Design, and Marketing. Our Co-Innovation methodology is a unique engagement model designed to align interests and accelerate value creation. Our clients worldwide benefit from the skills and expertise of over 4,000 expert team members across the Americas, APAC, and EMEA. By partnering with leading technology providers, we craft transformative digital experiences that enhance customer engagement and drive business success.

About Bounteous ( https://www.bounteous.com/ ): Founded in 2003 in Chicago, Bounteous is a leading digital experience consultancy that co-innovates with the world's most ambitious brands to create transformative digital experiences. With services in Strategy, Experience Design, Technology, Analytics and Insight, and Marketing, Bounteous elevates brand experiences through technology partnerships and drives superior client outcomes. For more information, please visit www.bounteous.com

Information Security Responsibilities:
- Promote and enforce awareness of key information security practices, including acceptable use of information assets, malware protection, and password security protocols
- Identify, assess, and report security risks, focusing on how these risks impact the confidentiality, integrity, and availability of information assets
- Understand and evaluate how data is stored, processed, or transmitted, ensuring compliance with data privacy and protection standards (GDPR, CCPA, etc.)
- Ensure data protection measures are integrated throughout the information lifecycle to safeguard sensitive information

Preferred Qualifications:
- 7+ years of experience in a Data Engineer role, with a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field
- Working knowledge of ETL technology: Talend, Apache NiFi, or AWS Glue
- Experience with relational SQL and NoSQL databases
- Experience with big data tools: Hadoop, Spark, Kafka, etc. (nice to have)
- Advanced Alteryx Designer (mandatory at this point; relaxing this requirement would be difficult)
- Tableau dashboarding
- AWS (familiarity with Lambda, EC2, AMI)
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. (nice to have)
- Experience with cloud services: EMR, RDS, Redshift, or Snowflake
- Experience with stream-processing systems: Storm, Spark Streaming, etc. (nice to have)
- Experience with object-oriented or functional scripting languages: Python, Java, Scala, etc.

Responsibilities:
- Work with project managers, senior architects, and other team members from Bounteous and client teams to evaluate data systems and project requirements
- In cooperation with platform developers, develop scalable and fault-tolerant Extract-Transform-Load (ETL) and integration systems for various data platforms that operate at appropriate scale, meeting security, logging, fault tolerance, and alerting requirements
- Work on data migration projects
- Effectively communicate the data requirements of various data platforms to team members
- Evaluate and document existing data ecosystems and platform capabilities
- Configure CI/CD pipelines
- Implement the proposed architecture and assist in infrastructure setup

We invite you to stay connected with us by subscribing to our monthly job openings alert. Research shows that women and other underrepresented groups apply only if they meet 100% of the criteria of a job posting. If you have passion and intelligence, and possess a technical knack (even if you're missing some of the above), we encourage you to apply.

Bounteous x Accolite is focused on promoting an inclusive environment and is proud to be an equal opportunity employer. We celebrate the different viewpoints and experiences our diverse group of team members bring to Bounteous x Accolite. Bounteous x Accolite does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, physical or mental disability, national origin, veteran status, or any other status protected under federal, state, or local law. In addition, you have the opportunity to participate in several Team Member Networks, sometimes referred to as employee resource groups (ERGs), that host space for individuals with shared identities, interests, and passions. Our Team Member Networks celebrate communities of color, life as a working parent or caregiver, the 2SLGBTQIA+ community, wellbeing, and more. Regardless of your respective identity, there are various avenues we involve team members in the Bounteous x Accolite community. Bounteous x Accolite is willing to sponsor eligible candidates for employment visas.

Posted 2 days ago

Apply

10.0 - 15.0 years

10 - 15 Lacs

Hyderabad / Secunderabad, Telangana, India

On-site


How You Will Fulfill Your Potential:
- Be part of a team, working closely with product development, UX designers, sales, and other engineers
- Write high-quality, well-tested, and scalable code
- Evaluate the short- and long-term implications of every implementation decision
- Grow professionally and learn from other accomplished software engineers through pair coding, while helping your peers do the same
- Collaborate on critical system architecture decisions
- Review and provide feedback on other developers' code and designs
- Evaluate modern technologies, prototype innovative approaches to problems, and make the business case for change
- Use infrastructure-as-code to build and deploy cloud-native services in AWS

Requirements:
- Experience leading and mentoring junior software engineers in a professional setting
- Meaningful experience in, but not limited to, any one of the following: Java, C#, Ruby on Rails, Go, Python, AWS (Amazon Web Services), JavaScript, React/Redux; should include experience working with non-relational as well as relational databases
- Experience with data pipelines, data warehouses, or Snowflake is a plus
- Strong knowledge of data structures and algorithms
- Excellent object-oriented or functional analysis and design skills
- Comfortable multi-tasking, managing multiple stakeholders, and working as part of a global team
- Proven communication and interpersonal ability
- Experience building client- and consumer-facing products is a plus, but far from required
- Knowledge of existing strategic firmwide platforms is a plus, but far from required

Posted 5 days ago

Apply

8.0 - 12.0 years

8 - 12 Lacs

Chennai, Tamil Nadu, India

On-site


Accountabilities:
- Lead the design, development, and deployment of high-performance, scalable data warehouses and data pipelines
- Collaborate closely with multi-functional teams to understand business requirements and translate them into technical solutions
- Oversee and optimize the use of Snowflake for data storage and analytics
- Develop and maintain SQL-based ETL processes
- Implement data workflows and orchestrations using Airflow
- Apply DBT for data transformation and modeling tasks
- Mentor and guide junior data engineers, fostering a culture of learning and innovation within the team
- Conduct performance tuning and optimization for both ongoing and new data projects

Requirements:
- Proven ability to handle large, complex data sets and develop data-centric solutions
- Strong problem-solving skills and a keen analytical mentality
- Excellent communication and leadership skills, with the ability to work effectively in a team-oriented environment
- 8-12 years of experience in data engineering roles, focusing on data warehousing, data integration, and data product development

Essential Skills/Experience: Snowflake, SQL, Airflow, DBT

Desirable Skills/Experience: SnapLogic, Python

Academic Qualifications: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
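The dbt transformation work mentioned above expresses each model as a plain SELECT that dbt materializes as a table or view. A minimal sketch, where the source model `stg_orders` and its columns are hypothetical:

```sql
-- Hypothetical dbt-style model: dbt resolves {{ ref('stg_orders') }} to the
-- upstream model's relation and tracks the dependency for orchestration.
WITH daily_orders AS (
    SELECT order_date,
           customer_id,
           SUM(amount) AS daily_total
    FROM {{ ref('stg_orders') }}
    GROUP BY order_date, customer_id
)
SELECT order_date,
       COUNT(DISTINCT customer_id) AS active_customers,
       SUM(daily_total)            AS revenue
FROM daily_orders
GROUP BY order_date
```

In a typical setup, Airflow would schedule the `dbt run` that builds this model after the upstream Snowflake loads complete.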

Posted 5 days ago

Apply

4.0 - 7.0 years

3 - 7 Lacs

Chennai, Tamil Nadu, India

On-site


Accountabilities:
- Strategic Leadership: Evaluate current platforms and lead the design of future-ready solutions, embedding AI-driven efficiencies and proactive interventions.
- Innovation & Integration: Introduce and integrate AI technologies to enhance ways of working, driving cost-effectiveness and operational excellence.
- Platform Maturity & Management: Ensure platforms are scalable and compliant, with robust automation and optimized technology stacks.
- Lead Deliveries: Oversee and manage the delivery of projects, ensuring timely execution and alignment with strategic goals.
- Thought Leadership: Champion data mesh and product-oriented work methodologies to continuously evolve our data landscapes.
- Quality and Compliance: Implement quality assurance processes, emphasizing data accuracy and security.
- Collaborative Leadership: Foster an environment that supports cross-functional collaboration and continuous improvement.

Essential Skills/Experience:
- Extensive experience with Snowflake, AI platforms, and cloud infrastructure.
- Proven track record in thought leadership, platform strategy, and cross-disciplinary innovation.
- Expertise in AI/GenAI integration with a focus on practical business applications.
- Strong experience in DataOps, DevOps, and cloud environments such as AWS.
- Excellent stakeholder management and the ability to lead diverse teams toward innovative solutions.
- Background in the pharmaceutical sector is a plus.

Posted 5 days ago

Apply

5.0 - 7.0 years

5 - 7 Lacs

Noida, Uttar Pradesh, India

On-site


Role & Required Skills:
- Proven experience in Snowflake
- Good experience in SQL and Python
- Experience in data warehousing
- Experience in data migration from SQL to Snowflake
- AWS experience is nice to have
- Good communication skills

Responsibilities:
- Implement business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional, and NoSQL) and data tools (reporting, visualization, analytics)
- Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models
- Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models
- Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization
- Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POC work
- Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks
- Analyze and translate business needs into long-term solution data models
- Evaluate existing data systems
- Work with the development team to create conceptual data models and data flows
- Develop best practices for data coding to ensure consistency within the system
- Review modifications of existing systems for cross-compatibility
- Implement data strategies and develop physical data models
- Update and optimize local and metadata models
- Evaluate implemented data systems for variances, discrepancies, and efficiency
- Troubleshoot and optimize data systems
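The dimensional (star-schema) modeling this role calls for separates descriptive attributes into dimension tables keyed by surrogate keys, with measures in a fact table. A minimal Snowflake DDL sketch; the entity and column names are invented for illustration:

```sql
-- Hypothetical star-schema fragment: one dimension, one fact table.
CREATE TABLE dim_customer (
    customer_key  INT IDENTITY PRIMARY KEY,  -- surrogate key
    customer_id   VARCHAR(32),               -- natural key from the source system
    customer_name VARCHAR(200),
    region        VARCHAR(50)
);

CREATE TABLE fact_sales (
    sale_date    DATE,
    customer_key INT REFERENCES dim_customer (customer_key),
    quantity     INT,
    amount       NUMBER(12, 2)
);
```

Surrogate keys decouple the warehouse from source-system identifiers, which simplifies the SQL-to-Snowflake migrations the posting mentions.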

Posted 6 days ago

Apply