
2799 Snowflake Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

You will be working as an OutSystems or Snowflake Developer at KPMG in India, a professional services firm affiliated with KPMG International Limited. Established in India in August 1993, KPMG leverages a global network of firms and possesses in-depth knowledge of local laws, regulations, and markets. With offices in multiple cities across India, KPMG offers services to national and international clients across various sectors. As an OutSystems or Snowflake Developer, you will be responsible for developing and maintaining applications using OutSystems or Snowflake technologies. Your role will involve collaborating with the team to deliver high-quality solutions that meet client requirements. Additionally, you will contribute to the enhancement of technology-enabled services, leveraging your expertise in global and local industries. To qualify for this position, you should hold a Bachelor's degree or equivalent in a relevant field. Your ability to work effectively in a team, adapt to changing technology landscapes, and deliver innovative solutions will be crucial to your success in this role. Join KPMG in India and be part of a dynamic team that values equal employment opportunities and encourages professional growth and development.

Posted 22 hours ago

Apply

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra

On-site

Cowbell is signaling a new era in cyber insurance by harnessing technology and data to provide small and medium-sized enterprises (SMEs) with advance warning of cyber risk exposures bundled with cyber insurance coverage adaptable to the threats of today and tomorrow. Championing adaptive insurance, Cowbell follows policyholders' cyber risk exposures as they evolve through continuous risk assessment and continuous underwriting. With its unique AI-based approach to risk selection and pricing, Cowbell's underwriting platform, powered by Cowbell Factors, compresses the insurance process from submission to issue to less than 5 minutes. Founded in 2019 and based in the San Francisco Bay Area, Cowbell has grown rapidly and now operates across the U.S., Canada, U.K., and India. This growth was recently bolstered by a successful Series C fundraising round of $60 million from Zurich Insurance. The investment not only underscores confidence in Cowbell's mission but also accelerates our capacity to revolutionize cyber insurance on a global scale. With the backing of over 25 prominent reinsurance partners, Cowbell is poised to redefine how SMEs navigate the evolving landscape of cyber threats.

In support of our business objectives, we are looking for an ambitious person who is not afraid of hard work and embraces ambiguity to join our Information Security team as a Sr. Developer, Application Security. The InfoSec team drives security, privacy, and compliance improvements to reduce risk by building out key security programs. We enable our colleagues to keep the company secure and support our customers' security journey with tried and true best practices. We are a Java, Python, and React shop combined with world-class cloud infrastructure such as AWS and Snowflake. Balancing proper security while enabling execution speed for our colleagues is our ultimate goal. It's challenging and rewarding; if you are up for the challenge, come join us.

Responsibilities:
- Cure security defects in code and burn down new and existing vulnerabilities; you can fix the code yourself, and continuous patching is your north star.
- Champion the safeguards and standards that keep our code secure and reduce the introduction of new vulnerabilities.
- Partner and collaborate with internal stakeholders on the overall security posture, with an emphasis on the Engineering and Operations/IT areas.
- Work across engineering, product, and business systems teams to enhance and evangelize security in applications and infrastructure.
- Research emerging technologies and maintain awareness of current security risks in support of security enhancement and development efforts.
- Develop and maintain application scanning solutions to inform stakeholders of security weaknesses and vulnerabilities.
- Review outstanding vulnerabilities with product teams and assist in remediation efforts to reduce risk.

Requirements:
- Bachelor's degree in computer science or another STEM discipline and 8 to 10+ years of professional experience in security software development.
- Majority of prior experience as a Security Engineer focused on remediating security vulnerabilities and defects in Java and Python.
- Prior in-depth, demonstrable experience developing in Java and Python; you are a developer first and a security engineer second. Applicants without this experience will not be considered.
- Experience developing in, and securing, JavaScript and React is a plus.
- Experience securing integrations and code that uses Elasticsearch, Snowflake, Databricks, or RDS is a big plus.
- Detail-oriented, with strong problem-solving, communication, and analytical skills.
- Expert understanding of CVE and CVSS scoring and how to use this data for validation, prioritization, and remediation.
- Excellent understanding and use of OWASP.
- Demonstrated ability to secure APIs; techniques and patterns will be assessed.
- Experience designing and implementing application security solutions for web and/or mobile applications.
- Experience developing and reporting vulnerability metrics, and articulating how to reproduce and resolve security defects.
- Experience in application penetration testing and an understanding of remediation techniques for common misconfigurations and vulnerabilities.
- Demonstrable experience with patching and library upgrade paths, including interdependencies.
- Familiarity with CI/CD tools; previous admin experience in CI/CD is not required but is a big plus.
- Ability to deploy, maintain, and operationalize scanning solutions, including hands-on scans across application repositories and infrastructure.
- Willingness to work extended hours and weekends as needed.
- Great at and enjoys documenting solutions: repeatable instructions for others, operational documentation, technical diagrams, and similar artifacts.

Preferred Qualifications:
- You can demonstrate and document threat modeling scenarios using well-known frameworks such as STRIDE.
- Proficient with penetration testing tools such as Burp Suite, Metasploit, or ZAP.
- Already proficient with SAST and SCA tools; proficiency with DAST and/or OAST tools and techniques would be even better.
- As a mentor, you have the experience and desire to give fellow engineering teams technical guidance on the impact and priority of security issues and to drive remediation.
- Able to develop operational processes from scratch or improve current processes and procedures through well-thought-out hand-offs, integrations, and automation.
- Familiarity with multiple security domains such as application security, infrastructure security, network security, incident response, and regulatory compliance and certifications.
- Understanding of modern endpoint security technologies and concepts.
- Adept at working with distributed team members.

What Cowbell brings to the table:
- Employee equity plan for all and a wealth enablement plan for select customer-facing roles.
- Comprehensive wellness program, meditation app subscriptions, lunch and learns, book club, happy hours, and much more.
- Professional development and the opportunity to learn the ins and outs of cyber insurance and cybersecurity while continuing to build your professional skills in a team environment.

Equal Employment Opportunity: Cowbell is a leading innovator in cyber insurance, dedicated to empowering businesses to always deliver their intended outcomes as the cyber threat landscape evolves. Guided by our core values of TRUE (Transparency, Resiliency, Urgency, and Empowerment), we are on a mission to be the gold standard for businesses to understand, manage, and transfer cyber risk. At Cowbell, we foster a collaborative and dynamic work environment where every employee is empowered to contribute and grow. We pride ourselves on our commitment to transparency and resilience, ensuring that we not only meet but exceed industry standards. We are proud to be an equal opportunity employer, promoting a diverse and inclusive workplace where all voices are heard and valued. Our employees enjoy competitive compensation, comprehensive benefits, and continuous opportunities for professional development.
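
For illustration only, here is a minimal Python sketch of the kind of triage automation this role describes: ranking scanner findings by CVSS v3 base score so the riskiest items are remediated first. The data class, field names, and threshold are assumptions, not the output format of any particular scanner.

from dataclasses import dataclass

@dataclass
class Finding:
    cve_id: str
    component: str
    cvss_score: float  # CVSS v3.x base score, 0.0 - 10.0

def severity(score: float) -> str:
    # Standard CVSS v3 qualitative bands
    if score >= 9.0:
        return "Critical"
    if score >= 7.0:
        return "High"
    if score >= 4.0:
        return "Medium"
    return "Low"

def triage(findings, fix_now_threshold=7.0):
    """Return findings at or above the threshold, worst first."""
    urgent = [f for f in findings if f.cvss_score >= fix_now_threshold]
    return sorted(urgent, key=lambda f: f.cvss_score, reverse=True)

sample = [
    Finding("CVE-2021-44228", "log4j-core", 10.0),   # Log4Shell
    Finding("CVE-2022-22965", "spring-beans", 9.8),   # Spring4Shell
    Finding("CVE-2099-0001", "example-lib", 5.3),     # hypothetical entry for illustration
]
for f in triage(sample):
    print(f"{severity(f.cvss_score):8} {f.cve_id:18} {f.component}")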

Posted 23 hours ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a Senior Software Engineer specializing in Debezium, Snowflake, Business Objects, Power BI, Java/Python, and SQL, with 3 to 6 years of experience in Software Development/Engineering, you will be a crucial member of our team in either Bangalore or Hyderabad (Position ID: J1124-1679). In this permanent role, your primary responsibility will be the development and maintenance of our applications to ensure they are robust, user-friendly, and scalable. Your key duties and responsibilities will include designing, developing, and maintaining web applications utilizing technologies such as Debezium, Snowflake, Business Objects, Power BI, and Pentaho. You will collaborate with cross-functional teams to define, design, and implement new features, ensuring clean, scalable, and efficient code. Additionally, you will conduct code reviews, perform unit testing and continuous integration, as well as troubleshoot and resolve technical issues promptly. Staying abreast of emerging technologies and industry trends will be essential, and active participation in Agile/Scrum development processes is expected. To excel in this role, you should hold a Bachelor's degree in Computer Science, Information Technology, or a related field, coupled with at least 3 years of experience in full-stack development. Possessing analytical and multitasking skills will be advantageous, along with familiarity with tools like JIRA, Gitlab, and Confluence. Proficiency in database technologies such as SQL, MySQL, PostgreSQL, or NoSQL databases, as well as experience with version control systems like Git, is preferred. Knowledge of cloud services like AWS, Azure, or Google Cloud, understanding CI/CD pipelines, and DevOps practices will be beneficial. Soft skills are paramount in this role, with a strong emphasis on problem-solving, communication, collaboration, and the ability to thrive in a fast-paced, agile environment. The successful candidate will exhibit a strong work ethic and a commitment to turning insights into actionable solutions. At CGI, we prioritize ownership, teamwork, respect, and belonging. As a CGI Partner, you will have the opportunity to contribute from day one, shaping our collective success and actively participating in the company's strategy and direction. Your work will be valued and impactful, allowing you to innovate, build relationships, and leverage global capabilities. CGI offers a supportive environment for career growth, health, and well-being, providing opportunities to enhance your skills and broaden your horizons. Join our team at CGI, one of the world's largest IT and business consulting services firms, and embark on a fulfilling career journey with us.,
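
As a hedged illustration of the Debezium-to-Snowflake pattern this posting mentions, the sketch below consumes change events from a Kafka topic and applies them to a Snowflake table. The topic, table, column, and credential values are assumptions, and error handling and batching are omitted.

import json
import snowflake.connector
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "dbserver1.public.customers",            # assumed Debezium topic name
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v) if v else None,
)

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="LOAD_WH", database="RAW", schema="CDC",
)
cur = conn.cursor()

MERGE_SQL = """
MERGE INTO customers t
USING (SELECT %(id)s AS id, %(name)s AS name, %(email)s AS email) s
ON t.id = s.id
WHEN MATCHED THEN UPDATE SET t.name = s.name, t.email = s.email
WHEN NOT MATCHED THEN INSERT (id, name, email) VALUES (s.id, s.name, s.email)
"""

for msg in consumer:
    if not msg.value:                         # skip Debezium tombstone records
        continue
    event = msg.value.get("payload", msg.value)
    op = event.get("op")                      # c=create, u=update, d=delete, r=snapshot read
    if op in ("c", "u", "r"):
        row = event["after"]
        cur.execute(MERGE_SQL, {"id": row["id"], "name": row["name"], "email": row["email"]})
    elif op == "d":
        cur.execute("DELETE FROM customers WHERE id = %(id)s", {"id": event["before"]["id"]})
    conn.commit()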

Posted 23 hours ago

Apply

10.0 - 14.0 years

0 Lacs

Karnataka

On-site

As a Data Architect / Data Modeling Expert, you will be an essential part of our offshore team based in India, collaborating closely with Business Analysts and Technical Analysts. Your primary responsibilities will revolve around designing and implementing efficient data models in Snowflake, along with creating source-to-target mapping documents. Your expertise in data modeling principles, coupled with exposure to ETL tools, will play a crucial role in architecting databases and driving data modeling initiatives leading to AI solutions. Your key responsibilities will include: - Designing and implementing normalized and denormalized data models in Snowflake based on business and technical requirements. - Collaborating with Business Analysts/Technical Analysts to gather data needs and document requirements effectively. - Developing source-to-target mapping documents to ensure accurate data transformations. - Working on data ingestion, transformation, and integration pipelines using SQL and cloud-based tools. - Optimizing Snowflake queries, schema designs, and indexing for enhanced performance. - Maintaining clear documentation of data models, mappings, and data flow processes. - Ensuring data accuracy, consistency, and compliance with best practices in data governance and quality. You should possess: - 10+ years of experience in Data Modeling, Data Engineering, or related roles. - A strong understanding of data modeling concepts such as OLTP, OLAP, Star Schema, and Snowflake Schema. - Hands-on experience in Snowflake including schema design and query optimization. - The ability to create detailed source-to-target mapping documents. - Proficiency in SQL-based data transformations and queries. - Exposure to ETL tools, with familiarity in Matillion considered advantageous. - Strong problem-solving and analytical skills. - Excellent communication skills for effective collaboration with cross-functional teams. Preferred qualifications include experience in cloud-based data environments (AWS, Azure, or GCP), hands-on exposure to Matillion or other ETL tools, understanding of data governance and security best practices, and familiarity with Agile methodologies.,
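
To make the source-to-target mapping deliverable concrete, here is a small illustrative Python sketch that keeps the mapping as plain data and generates the Snowflake load SQL from it; every table, column, and transformation rule shown is an assumption for the example.

# Represent a source-to-target mapping as data and generate the load SQL from it.
STG_TO_DIM_CUSTOMER = [
    # (source column, target column,   transformation rule)
    ("cust_id",       "customer_key",  "TO_NUMBER(cust_id)"),
    ("cust_name",     "customer_name", "INITCAP(TRIM(cust_name))"),
    ("email_addr",    "email",         "LOWER(email_addr)"),
    ("created_ts",    "created_date",  "TO_DATE(created_ts)"),
]

def build_load_sql(source_table: str, target_table: str, mapping) -> str:
    targets = ", ".join(tgt for _, tgt, _ in mapping)
    exprs = ",\n       ".join(f"{rule} AS {tgt}" for _, tgt, rule in mapping)
    return (
        f"INSERT INTO {target_table} ({targets})\n"
        f"SELECT {exprs}\n"
        f"FROM {source_table};"
    )

print(build_load_sql("STG.CUSTOMERS_RAW", "EDW.DIM_CUSTOMER", STG_TO_DIM_CUSTOMER))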

Posted 1 day ago

Apply

6.0 - 10.0 years

0 Lacs

Delhi

On-site

The client, a leading MNC, specializes in technology consulting and digital solutions for global enterprises. With a vast workforce of over 145,000 professionals across 90+ countries, they cater to 1100+ clients in various industries. The company offers a comprehensive range of services including consulting, IT solutions, enterprise applications, business processes, engineering, network services, customer experience, AI & analytics, and cloud infrastructure services. Notably, they have been recognized for their commitment to sustainability with the Terra Carta Seal, showcasing their dedication to building a climate and nature-positive future. As a Data Engineer with a minimum of 6 years of experience, you will be responsible for constructing and managing data pipelines. The ideal candidate should possess expertise in Databricks, AWS/Azure, and data storage technologies such as databases and distributed file systems. Familiarity with the Spark framework is essential, and prior experience in the retail sector would be advantageous. Key Responsibilities: - Design, develop, and maintain scalable ETL pipelines for processing large data volumes from diverse sources. - Implement and oversee data integration solutions utilizing tools like Databricks, Snowflake, and other relevant technologies. - Develop and optimize data models and schemas to support analytical and reporting requirements. - Write efficient and sustainable Python code for data processing and transformations. - Utilize Apache Spark for distributed data processing and large-scale analytics. - Translate business needs into technical solutions. - Ensure data quality and integrity through rigorous unit testing. - Collaborate with cross-functional teams to integrate data pipelines with other systems. Technical Requirements: - Proficiency in Databricks for data integration and processing. - Experience with ETL tools and processes. - Strong Python programming skills with Apache Spark, emphasizing data processing and automation. - Solid SQL skills and familiarity with relational databases. - Understanding of data warehousing concepts and best practices. - Exposure to cloud platforms such as AWS and Azure. - Hands-on troubleshooting ability and problem-solving skills for complex data issues. - Practical experience with Snowflake.,
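
A minimal PySpark sketch of the kind of ETL step described above: read raw files, clean and aggregate them, and write the result as a Delta table. The paths, column names, and schema are illustrative assumptions only.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_sales_etl").getOrCreate()

# Extract: raw CSV files landed in object storage (assumed path)
raw = (spark.read
       .option("header", "true")
       .csv("s3://example-bucket/raw/sales/2024-01-01/"))

# Transform: type casting, filtering, and daily aggregation per store
daily = (raw
         .withColumn("amount", F.col("amount").cast("double"))
         .withColumn("sale_date", F.to_date("sale_ts"))
         .filter(F.col("amount") > 0)
         .groupBy("sale_date", "store_id")
         .agg(F.sum("amount").alias("total_amount"),
              F.countDistinct("order_id").alias("orders")))

# Load: write as a partitioned Delta table for downstream analytics
(daily.write
      .format("delta")
      .mode("overwrite")
      .partitionBy("sale_date")
      .saveAsTable("analytics.daily_store_sales"))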

Posted 1 day ago

Apply

2.0 - 10.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Principal Data Engineer at Brillio, you will play a key role in leveraging your expertise in Data Modeling, particularly with tools like ER Studio and ER Win. Brillio, known for its digital technology services and partnership with Fortune 1000 companies, is committed to transforming disruptions into competitive advantages through innovative digital solutions. With over 10 years of IT experience and at least 2 years of hands-on experience in Snowflake, you will be responsible for building and maintaining data models that support the organization's Data Lake/ODS, ETL processes, and data warehousing needs. Your ability to collaborate closely with clients to deliver physical and logical model solutions will be critical to the success of various projects. In this role, you will demonstrate your expertise in Data Modeling concepts at an advanced level, with a focus on modeling in large volume-based environments. Your experience with tools like ER Studio and your overall understanding of database technologies, data warehouses, and analytics will be essential in designing and implementing effective data models. Additionally, your strong skills in Entity Relationship Modeling, knowledge of database design and administration, and proficiency in SQL query development will enable you to contribute to the design and optimization of data structures, including Star Schema design. Your leadership abilities and excellent communication skills will be instrumental in leading teams and ensuring the successful implementation of data modeling solutions. While experience with AWS ecosystems is a plus, your dedication to staying at the forefront of technological advancements and your passion for delivering exceptional client experiences will make you a valuable addition to Brillio's team of "Brillians." Join us in our mission to create innovative digital solutions and make a difference in the world of technology.,
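
For illustration, a hypothetical star schema of the kind such a role might hand off for a Snowflake build is sketched below (one fact and two dimensions, created through the Snowflake Python connector). The names, grain, and connection details are assumptions, and Snowflake treats the primary/foreign key constraints as informational only.

import snowflake.connector

DDL = [
    """CREATE TABLE IF NOT EXISTS dim_customer (
           customer_key  NUMBER AUTOINCREMENT PRIMARY KEY,
           customer_id   VARCHAR NOT NULL,        -- natural/business key
           customer_name VARCHAR,
           segment       VARCHAR
       )""",
    """CREATE TABLE IF NOT EXISTS dim_date (
           date_key      NUMBER PRIMARY KEY,      -- e.g. 20240101
           calendar_date DATE NOT NULL,
           month_name    VARCHAR,
           year          NUMBER
       )""",
    """CREATE TABLE IF NOT EXISTS fact_orders (
           order_key     NUMBER AUTOINCREMENT PRIMARY KEY,
           customer_key  NUMBER REFERENCES dim_customer (customer_key),
           date_key      NUMBER REFERENCES dim_date (date_key),
           order_amount  NUMBER(12, 2),
           quantity      NUMBER
       )""",
]

conn = snowflake.connector.connect(account="my_account", user="modeler",
                                    password="***", warehouse="XFORM_WH",
                                    database="EDW", schema="SALES")
cur = conn.cursor()
for statement in DDL:
    cur.execute(statement)
conn.close()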

Posted 1 day ago

Apply

6.0 - 10.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Senior Data Engineer at our Pune location, you will play a critical role in designing, developing, and maintaining scalable data pipelines and architectures using Databricks on Azure/AWS cloud platforms. With 6 to 9 years of experience in the field, you will collaborate with stakeholders to integrate large datasets, optimize performance, implement ETL/ELT processes, ensure data governance, and work closely with cross-functional teams to deliver accurate solutions. Your responsibilities will include building, maintaining, and optimizing data workflows, integrating datasets from various sources, tuning pipelines for performance and scalability, implementing ETL/ELT processes using Spark and Databricks, ensuring data governance, collaborating with different teams, documenting data pipelines, and developing automated processes for continuous integration and deployment of data solutions. To excel in this role, you should have 6 to 9 years of hands-on experience as a Data Engineer, expertise in Apache Spark, Delta Lake, and Azure/AWS Databricks, proficiency in Python, Scala, or Java, advanced SQL skills, and experience with cloud data platforms, data warehousing solutions, data modeling, ETL tools, version control systems, and automation tools. Additionally, soft skills such as problem-solving, attention to detail, and the ability to work in a fast-paced environment are essential. Nice-to-have skills include experience with Databricks SQL and Databricks Delta, knowledge of machine learning concepts, and experience with CI/CD pipelines for data engineering solutions. Joining our team offers challenging work with international clients, growth opportunities, a collaborative culture, and global project involvement. We provide competitive salaries, flexible work schedules, health insurance, performance-based bonuses, and other standard benefits. If you are passionate about data engineering, possess the required skills and qualifications, and thrive in a dynamic and innovative environment, we welcome you to apply for this exciting opportunity.
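
As a brief illustration of the incremental ETL/ELT work described, the sketch below performs an upsert into a Delta table on Databricks using a MERGE; the table names, join key, and staging path are assumptions.

from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Incremental batch of changed records landed by an upstream extract (assumed path)
updates = spark.read.format("parquet").load("s3://example-bucket/staging/customers/")

target = DeltaTable.forName(spark, "silver.customers")

(target.alias("t")
 .merge(updates.alias("s"), "t.customer_id = s.customer_id")
 .whenMatchedUpdateAll()        # overwrite changed attributes
 .whenNotMatchedInsertAll()     # add brand-new customers
 .execute())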

Posted 1 day ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Quality Engineer, your primary responsibility will be to analyze business and technical requirements to design, develop, and execute comprehensive test plans for ETL pipelines and data transformations. You will perform data validation, reconciliation, and integrity checks across various data sources and target systems. Additionally, you will be expected to build and automate data quality checks using SQL and/or Python scripting. It will be your duty to identify, document, and track data quality issues, anomalies, and defects. Collaboration is key in this role, as you will work closely with data engineers, developers, QA, and business stakeholders to understand data requirements and ensure that data quality standards are met. You will define data quality KPIs and implement continuous monitoring frameworks. Participation in data model reviews and providing input on data quality considerations will also be part of your responsibilities. In case of data discrepancies, you will be expected to perform root cause analysis and work with teams to drive resolution. Ensuring alignment to data governance policies, standards, and best practices will also fall under your purview. To qualify for this position, you should hold a Bachelor's degree in Computer Science, Information Technology, or a related field. Additionally, you should have 4 to 7 years of experience as a Data Quality Engineer, ETL Tester, or a similar role. A strong understanding of ETL concepts, data warehousing principles, and relational database design is essential. Proficiency in SQL for complex querying, data profiling, and validation tasks is required. Familiarity with data quality tools, testing methodologies, and modern cloud data ecosystems (AWS, Snowflake, Apache Spark, Redshift) will be advantageous. Moreover, advanced knowledge of SQL, data pipeline tools like Airflow, DBT, or Informatica, as well as experience with integrating data validation processes into CI/CD pipelines using tools like GitHub Actions, Jenkins, or similar, are desired qualifications. An understanding of big data platforms, data lakes, non-relational databases, data lineage, master data management (MDM) concepts, and experience with Agile/Scrum development methodologies will be beneficial for excelling in this role. Your excellent analytical and problem-solving skills along with a strong attention to detail will be valuable assets in fulfilling the responsibilities of a Data Quality Engineer.,
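
A minimal, hypothetical sketch of the kind of automated data quality check described above: reconciling row counts between a staging source and a Snowflake target and flagging null business keys. The tables, connection details, and check definitions are assumptions.

import snowflake.connector

CHECKS = {
    "row_count_match": (
        "SELECT (SELECT COUNT(*) FROM RAW.ORDERS_STG) "
        "     - (SELECT COUNT(*) FROM EDW.FACT_ORDERS) AS diff",
        lambda value: value == 0,
    ),
    "no_null_order_ids": (
        "SELECT COUNT(*) FROM EDW.FACT_ORDERS WHERE ORDER_ID IS NULL",
        lambda value: value == 0,
    ),
}

def run_checks(conn) -> dict:
    results = {}
    cur = conn.cursor()
    for name, (sql, passes) in CHECKS.items():
        value = cur.execute(sql).fetchone()[0]
        results[name] = "PASS" if passes(value) else f"FAIL (value={value})"
    return results

conn = snowflake.connector.connect(account="my_account", user="dq_user",
                                   password="***", warehouse="QA_WH")
for check, outcome in run_checks(conn).items():
    print(f"{check:20} {outcome}")
conn.close()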

Posted 1 day ago

Apply

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

The successful candidate will be responsible for developing and maintaining applications using Python, SQL, React.js, and Java. You will also be involved in building and managing data pipelines on platforms such as Databricks, DBT, Snowflake, RSL (Report Specification Language, Geneva), and RDL (Report Definition Language). Your experience with non-functional aspects like performance management, scalability, and availability will be crucial. Additionally, you will collaborate closely with front-office, operations, and finance teams to enhance reporting and analysis for alternative investments. Working with cross-functional teams, you will drive automation, workflow efficiencies, and reporting enhancements. Troubleshooting system issues, implementing enhancements, and ensuring optimal system performance as part of a follow-the-sun model for end-to-end application coverage will also be part of your responsibilities.

Qualifications & Experience:
- Education: Bachelor's degree in Computer Science, Engineering, or a related field.
- Experience: Minimum of 2 years of experience in enterprise software development and production management, preferably within financial services.
- Proficiency in at least one programming language: Python, Java, React.js, or SQL.
- Familiarity with alternative investments and their reporting requirements.
- Hands-on experience with relational databases and complex query authoring.
- Ability to thrive in a fast-paced work environment with quick iterations.
- Must be able to work out of our Bangalore office.

Preferred Qualifications:
- Knowledge of AWS/Azure services.
- Previous experience in the asset management/private equity domain.

This role provides an exciting opportunity to work in a fast-paced, engineering-focused startup environment and contribute to meaningful projects that address complex business challenges. Join our team and become part of a culture that values innovation, collaboration, and excellence.

FS Investments: 30 years of leadership in private markets. FS Investments is an alternative asset manager focused on delivering attractive returns across private equity, private credit, and real estate. With the acquisition of Portfolio Advisors in 2023, FS Investments now manages over $85 billion for institutional and wealth management clients globally. With over 30 years of experience and more than 500 employees across nine global offices, the firm's investment professionals oversee a variety of strategies across private markets and maintain relationships with 300+ sponsors. FS Investments' active partnership model fosters superior market insights and deal flow, informing the underwriting process and contributing to strong returns. FS is an Equal Opportunity Employer. FS Investments does not accept unsolicited resumes from recruiters or search firms. Any resume or referral submitted without a signed agreement is the property of FS Investments, and no fee will be paid.

Posted 1 day ago

Apply

10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

The Applications Development Technology Lead Analyst role at our organization involves working closely with the Technology team to establish and implement new or updated application systems and programs. Your primary responsibility will be to lead applications systems analysis and programming activities. As the Applications Development Technology Lead Analyst, you will collaborate with various management teams to ensure seamless integration of functions to achieve organizational goals. You will also be responsible for identifying necessary system enhancements for deploying new products and process improvements. Additionally, you will play a key role in resolving high-impact problems and projects by evaluating complex business processes and industry standards. Your expertise in applications programming will be crucial in ensuring that application design aligns with the overall architecture blueprint. You will need to have a deep understanding of system flow and develop coding, testing, debugging, and implementation standards. Furthermore, you will be expected to have a comprehensive knowledge of how different business areas integrate to achieve business objectives. In this position, you will provide in-depth analysis and innovative solutions to address issues effectively. You will also serve as an advisor or coach to mid-level developers and analysts, assigning work as needed. It is essential to assess risks carefully when making business decisions, with a focus on upholding the firm's reputation and complying with relevant laws and regulations. To qualify for this role, you should have 6-10 years of relevant experience in Apps Development or systems analysis. You must also possess extensive experience in system analysis and software application programming, along with a track record of managing and implementing successful projects. Being a Subject Matter Expert (SME) in at least one area of Applications Development will be advantageous. A Bachelor's degree or equivalent experience is required, while a Master's degree is preferred. The ability to adjust priorities swiftly, demonstrated leadership and project management skills, and clear written and verbal communication are also essential qualifications for this position. The job description provides an overview of the typical responsibilities associated with this role. As a Vice President (VP) in this capacity, you will lead a specific technical vertical (Frontend, Backend, or Data), mentor developers, and ensure timely, scalable, and testable delivery within your domain. Your responsibilities will include leading a team of engineers, translating architecture into execution, reviewing complex components, and driving data platform migration projects. Additionally, you will be expected to evaluate and implement AI-based tools for enhanced productivity, testing, and code improvement. The required skills for this role include having 10-14 years of experience in leading development teams, delivering cloud-native solutions, and proficiency in programming languages such as Java, Python, and JavaScript/TypeScript. Familiarity with frameworks like Spring Boot/WebFlux, Angular, Node.js, databases including Oracle and MongoDB, cloud technologies such as ECS, S3, Lambda, and Kubernetes, as well as data technologies like Apache Spark and Snowflake, are also essential. Strong mentoring, conflict resolution, and cross-team communication skills are important attributes for success in this position.,

Posted 1 day ago

Apply

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

You are a strategic thinker passionate about driving solutions in BI and Analytics (Alteryx, SQL, Tableau), and you have found the right team. As a Business Intelligence Developer Associate within our Asset and Wealth Management Finance Transformation and Analytics team, you will be tasked with defining, refining, and achieving set objectives for our firm on a daily basis. You will be responsible for designing the technical and information architecture for the MIS (DataMarts) and Reporting Environments. Additionally, you will support the MIS team in query optimization and deployment of BI technologies, including but not limited to Alteryx, Tableau, MS SQL Server (T-SQL programming), SSIS, and SSRS. You will scope, prioritize, and coordinate activities with the product owners, design and develop complex queries for data inputs, and work on agile improvements by sharing experiences and knowledge with the team. Furthermore, you will advocate and steer the team to implement CI/CD (DevOps) workflow and design and develop complex dashboards from large and/or different data sets. The ideal candidate for this position will be highly skilled in reporting methodologies, data manipulation & analytics tools, and have expertise in the visualization and presentation of enterprise data. Required qualifications, capabilities, and skills include a Bachelor's Degree in MIS, Computer Science, or Engineering. A different field of study with significant professional experience in BI Development is also acceptable. Strong DW-BI skills are required with a minimum of 7 years of experience in Data warehouse and visualization. You should have strong work experience in data wrangling tools like Alteryx and working proficiency in Data Visualizations Tools, including but not limited to Alteryx, Tableau, MS SQL Server (SSIS, SSRS). Working knowledge in querying data from databases such as MS SQL Server, Snowflake, Databricks, etc., is essential. You must have a strong knowledge of designing database architecture, building scalable visualization solutions, and the ability to write complicated yet efficient SQL queries and stored procedures. Experience in building end-to-end ETL processes, working with multiple data sources, handling large volumes of data, and converting data into information is required. Experience in the end-to-end implementation of Business Intelligence (BI) reports & dashboards, as well as good communication and analytical skills, are also necessary. Preferred qualifications, capabilities, and skills include exposure to Data Science and allied technologies like Python, R, etc. Exposure to automation tools like UIPath, Blue Prism, Power Automate, etc., working knowledge of CI/CD workflows and automated deployment, and experience with scheduling tools like Control M are considered advantageous.,

Posted 1 day ago

Apply

8.0 - 12.0 years

0 Lacs

Hyderabad, Telangana

On-site

The Marketing Cloud Technical Design Architect at Novartis DDIT, Hyderabad, plays a key role in translating business requirements into IT solution design specifications. You will collaborate with business customers and Strategic Business Partners to analyze demands, propose solutions, and provide funding estimates. Your responsibilities include contributing to technology delivery, leading Rapid-Prototyping engagements, ensuring on-time delivery of engagements, engaging with SI Partners, and driving enterprise-grade Solution Design and Architecture. You will also be responsible for DevSecOps management, following industry trends, ensuring security and compliance, and enhancing user experience. To qualify for this role, you should have a university degree in a business/technical area with at least 8 years of experience in Solution Design, including 3 years in Salesforce Marketing Cloud. Marketing Cloud certifications are advantageous. You must have practical knowledge of Marketing Automation projects, Salesforce Marketing Cloud integrations, data modeling, AMPScript, SQL, and Data Mapping. Proficiency in HTML, CSS, and tools that integrate with Marketing Cloud is preferred. Experience in managing global Marketing Automation projects, knowledge of Marketing automation concepts, and familiarity with tools like Data Cloud, CDP, MCP, MCI, Google Analytics, Salesforce CRM, MDM, and Snowflake are required. Novartis is dedicated to reimagining medicine to enhance and prolong lives, with a vision to become the most valued and trusted pharmaceutical company globally. By joining Novartis, you will be part of a mission-driven organization that values diversity and inclusion. If you are a dependable professional with excellent communication skills, attention to detail, and the ability to work in a fast-paced, multicultural environment, this role offers an opportunity to contribute to groundbreaking healthcare advancements. Novartis is committed to fostering an inclusive work environment and building diverse teams that reflect the patients and communities we serve. If you are looking to be part of a community of dedicated individuals working towards a common goal of improving patient outcomes, consider joining the Novartis Network to stay informed about future career opportunities. Novartis offers a range of benefits and rewards to support your personal and professional growth. If you are passionate about making a difference in the lives of patients and are eager to collaborate with like-minded individuals, explore the opportunities at Novartis and be part of a community focused on creating a brighter future together. For more information about Novartis and our commitment to diversity and inclusion, visit: https://www.novartis.com/about/strategy/people-and-culture To stay connected and learn about future career opportunities at Novartis, join our talent community here: https://talentnetwork.novartis.com/network To learn more about the benefits and rewards offered by Novartis, read our handbook: https://www.novartis.com/careers/benefits-rewards,

Posted 1 day ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

The Data Services ETL Developer specializes in data transformations and integration projects using Zeta's tools, third-party software, and custom code. An understanding of CRM methodologies related to marketing operations is essential. Responsibilities include manipulating client and internal marketing data across various platforms, automating scripts for data transfer, building and managing cloud-based data pipelines using AWS services, managing tasks with competing priorities, and collaborating with technical staff to support a proprietary ETL environment. Collaborating with database/CRM teams, modelers, analysts, and application programmers is crucial for delivering results to clients. The ideal candidate should cover the US time zone, be in the office a minimum of three days per week, have experience in database marketing, knowledge of US and international postal addresses (including SAP postal products), proficiency with AWS services (S3, Airflow, RDS, Athena), experience with Oracle and Snowflake SQL, and familiarity with tools such as Snowflake, Airflow, GitLab, Grafana, LDAP, OpenVPN, DCWEB, Postman, and Microsoft Excel. Additionally, knowledge of SQL Server, SFTP, PGP, large-scale customer databases, and the project life cycle, along with proficiency with editors such as Notepad++ and UltraEdit, is required. Strong communication and collaboration skills and the ability to manage multiple tasks simultaneously are essential. Minimum qualifications include a Bachelor's degree or equivalent with 5+ years of experience in database marketing and cloud-based technologies, a strong understanding of data engineering concepts and cloud infrastructure, and excellent oral and written communication skills.
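
To give a flavor of the cloud-based pipelines described, here is a minimal, hypothetical Airflow DAG that waits for a daily file in S3 and then runs a Snowflake COPY INTO; the DAG id, connection ids, bucket, stage, and table names are all assumptions.

from datetime import datetime
from airflow import DAG
from airflow.providers.amazon.aws.sensors.s3 import S3KeySensor
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="daily_customer_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 6 * * *",     # 6:00 AM daily
    catchup=False,
) as dag:

    wait_for_file = S3KeySensor(
        task_id="wait_for_customer_file",
        bucket_name="example-marketing-bucket",
        bucket_key="incoming/customers_{{ ds_nodash }}.csv",
        aws_conn_id="aws_default",
    )

    load_to_snowflake = SnowflakeOperator(
        task_id="copy_into_customers",
        snowflake_conn_id="snowflake_default",
        sql="""
            COPY INTO RAW.CUSTOMERS
            FROM @RAW.S3_STAGE/incoming/customers_{{ ds_nodash }}.csv
            FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        """,
    )

    wait_for_file >> load_to_snowflake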

Posted 1 day ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

Agivant is seeking a talented and passionate Senior Data Engineer to join our growing data team. In this role, you will play a key part in building and scaling our data infrastructure, enabling data-driven decision-making across the organization. You will be responsible for designing, developing, and maintaining efficient and reliable data pipelines for both ELT (Extract, Load, Transform) and ETL (Extract, Transform, Load) processes.

Responsibilities:
- Design, develop, and maintain robust and scalable data pipelines for ELT and ETL processes, ensuring data accuracy, completeness, and timeliness.
- Work with stakeholders to understand data requirements and translate them into efficient data models and pipelines.
- Build and optimize data pipelines using a variety of technologies, including Elasticsearch, AWS S3, Snowflake, and NFS.
- Develop and maintain data warehouse schemas and ETL/ELT processes to support business intelligence and analytics needs.
- Implement data quality checks and monitoring to ensure data integrity and identify potential issues.
- Collaborate with data scientists and analysts to ensure data accessibility and usability for various analytical purposes.
- Stay current with industry best practices, CI/CD/DevSecFinOps, Scrum, and emerging technologies in data engineering.
- Contribute to the development and enhancement of our data warehouse architecture.

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience as a Data Engineer with a strong focus on ELT/ETL processes.
- At least 3+ years of experience with Snowflake data warehousing technologies.
- At least 3+ years of experience creating and maintaining Airflow ETL pipelines.
- Minimum 3+ years of professional experience with Python for data manipulation and automation.
- Working experience with Elasticsearch and its application in data pipelines.
- Proficiency in SQL and experience with data modeling techniques.
- Strong understanding of cloud-based data storage solutions such as AWS S3.
- Experience working with NFS and other file storage systems.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
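
Purely as an illustration of the extract-and-land step in such an ELT pipeline, the sketch below pulls documents from an Elasticsearch index and writes them to S3 as newline-delimited JSON, ready for a downstream Snowflake COPY; the index, bucket, field, and date values are assumptions.

import json
import boto3
from elasticsearch import Elasticsearch
from elasticsearch.helpers import scan

es = Elasticsearch("http://localhost:9200")
s3 = boto3.client("s3")

def export_index(index: str, bucket: str, key: str, batch_date: str) -> int:
    # Pull only documents updated since the batch date (assumed field name)
    query = {"query": {"range": {"updated_at": {"gte": batch_date}}}}
    lines = [json.dumps(hit["_source"]) for hit in scan(es, index=index, query=query)]
    # Land as newline-delimited JSON in the data lake
    s3.put_object(Bucket=bucket, Key=key, Body="\n".join(lines).encode("utf-8"))
    return len(lines)

count = export_index(
    index="customer-events",
    bucket="example-data-lake",
    key="landing/customer_events/2024-01-01.jsonl",
    batch_date="2024-01-01",
)
print(f"exported {count} documents")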

Posted 1 day ago

Apply

8.0 - 12.0 years

0 Lacs

Karnataka

On-site

Sykatiya Technology Pvt Ltd is a leading semiconductor industry innovator committed to leveraging cutting-edge technology to solve complex problems. We are currently looking for a highly skilled and motivated Data Scientist to join our dynamic team and contribute to our mission of driving innovation through data-driven insights. As the Lead Data Scientist and Machine Learning Engineer at Sykatiya Technology Pvt Ltd, you will play a crucial role in analyzing large datasets to uncover patterns, develop predictive models, and implement AI/ML solutions. Your responsibilities will include working on projects involving neural networks, deep learning, data mining, and natural language processing (NLP) to drive business value and enhance our products and services.

Key Responsibilities:
- Lead the design and implementation of machine learning models and algorithms to address complex business problems.
- Utilize deep learning techniques to enhance neural network models and improve prediction accuracy.
- Conduct data mining and analysis to extract actionable insights from both structured and unstructured data.
- Apply natural language processing (NLP) techniques for advanced text analytics.
- Develop and maintain end-to-end data pipelines, ensuring data integrity and reliability.
- Collaborate with cross-functional teams to understand business requirements and deliver data-driven solutions.
- Mentor and guide junior data scientists and engineers in best practices and advanced techniques.
- Stay updated with the latest advancements in AI/ML, neural networks, deep learning, data mining, and NLP.

Technical Skills:
- Proficiency in Python and its libraries such as NumPy, pandas, scikit-learn, TensorFlow, Keras, and PyTorch.
- Strong understanding of machine learning algorithms and techniques.
- Extensive experience with neural networks and deep learning frameworks.
- Hands-on experience with data mining and analysis techniques.
- Proficiency in natural language processing (NLP) tools and libraries such as NLTK, spaCy, and transformers.
- Proficiency in big data technologies, including Sqoop, Hadoop, HDFS, Hive, and PySpark.
- Experience with cloud platforms, including AWS services such as S3, Step Functions, EventBridge, Athena, RDS, Lambda, and Glue.
- Strong knowledge of database management systems such as SQL Server, Teradata, MySQL, PostgreSQL, and Snowflake.
- Familiarity with other tools such as ExactTarget, Marketo, SAP BO, Agile, and JIRA.
- Strong analytical skills to analyze large datasets and derive actionable insights.
- Excellent problem-solving skills with the ability to think critically and creatively.
- Effective communication and teamwork abilities for collaborating with various stakeholders.

Experience: At least 8 to 12 years of experience in a similar role.
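
As a toy illustration of the NLP work mentioned, here is a minimal scikit-learn pipeline that classifies short support messages using TF-IDF features and logistic regression; the tiny labeled dataset is purely hypothetical.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

# Illustrative labeled examples (hypothetical data)
texts = [
    "cannot log in to my account",
    "password reset link is broken",
    "please cancel my subscription",
    "how do I upgrade my plan",
]
labels = ["auth", "auth", "billing", "billing"]

model = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2))),   # unigram + bigram features
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(texts, labels)

print(model.predict(["I forgot my password"]))        # expected: ['auth']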

Posted 1 day ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Forward Deployed Engineer at Salesforce, you will play a crucial role in delivering transformative AI solutions to our strategic clients. Your responsibilities will include leading the design, development, and implementation of bespoke solutions using cutting-edge technologies like the Agentforce platform. You will be at the forefront of driving technical vision, mentoring team members, and ensuring the successful delivery of mission-critical AI applications in real-world environments. Your impact will be significant as you lead the architectural design of scalable production systems, strategize complex data ecosystems, drive innovation on the Agentforce platform, and operate with a proactive and strategic mindset. Building strong relationships with senior client teams, ensuring seamless deployment, and optimizing solutions for long-term reliability will be key aspects of your role. Additionally, you will act as a bridge between customer needs and product evolution, providing valuable feedback to shape future enhancements. To excel in this role, you are required to have a Bachelor's degree in Computer Science or a related field, with 5+ years of experience in delivering scalable production solutions. Proficiency in programming languages like JavaScript, Java, Python, and expertise in AI technologies are essential. Strong communication skills, a proactive attitude, and the ability to travel as needed are also important qualifications. Preferred qualifications include expert-level experience with Salesforce Data Cloud and the Agentforce platform, as well as knowledge of Salesforce CRM across various clouds. Experience in developing complex conversational AI solutions and Salesforce platform certifications would be advantageous. If you are passionate about leveraging AI to drive business transformation and have a track record of impactful delivery in agile environments, this role offers a unique opportunity to make a difference.,

Posted 1 day ago

Apply

3.0 - 7.0 years

0 Lacs

Haryana

On-site

As an Analyst Programmer in the WPFH department in Gurgaon, you will be an integral part of the Data team responsible for building data integration and distribution solutions within the Distribution Data and Reporting team. Your role will involve working closely with technical leads, business analysts, and product teams to design, develop, and troubleshoot ETL jobs for various operational data stores. You will be expected to demonstrate innovative problem-solving skills, strong interpersonal abilities, and a high level of ownership in a dynamic working environment.

Key Responsibilities:
- Collaborate with technical leads, business analysts, and subject matter experts.
- Design and develop ETL jobs based on data model requirements.
- Utilize the Informatica PowerCenter tool set and Oracle database for ETL job development.
- Provide development estimates and ensure adherence to standards and best practices.
- Coordinate dependencies and deliverables with cross-functional teams.

Essential Skills:
Technical:
- Minimum 3 years of experience using the Informatica PowerCenter tool set.
- Proficiency in Snowflake, source control tools, Control-M, UNIX scripting, and SQL/PL-SQL.
- Experience with data warehouse, data mart, and ODS concepts, and with Oracle/SQL Server utilities.
- Knowledge of data normalization, OLAP, and Oracle performance optimization.

Functional:
- Minimum 3 years of experience in financial organizations with broad-based business process knowledge.
- Strong communication, interpersonal, and client-facing skills.
- Ability to work closely with cross-functional teams and data stewards.

About You:
- Bachelor's degree in B.E./B.Tech/MBA/M.C.A or equivalent.
- Minimum 3 years of experience in data integration and distribution.
- Experience building web services and APIs.
- Knowledge of Agile software development methodologies.

At our organization, we value your wellbeing, support your development, and offer a comprehensive benefits package. We prioritize a flexible work environment that promotes a healthy work-life balance. Join our team at WPFH and be part of a collaborative environment where you can contribute to building better financial futures. Explore more about our dynamic working approach and future opportunities at careers.fidelityinternational.com.

Posted 1 day ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You are invited to join Avaap's India-based Center of Excellence (ACE) team as a dedicated Workday Innovation Consultant. In this role, you will be instrumental in accelerating the innovation pipeline by building and demonstrating cutting-edge solutions using Workday Extend applications and Snowflake integrations with embedded insights. This newly created position offers the opportunity to be part of a small, high-impact pilot group focused on innovation builds and prototypes, supported by structured technical guidance. As a Workday Innovation Consultant at Avaap, you will have the chance to work on high-visibility initiatives in a collaborative and forward-thinking environment. You will be involved in designing and developing Workday Extend applications that incorporate Snowflake-powered insights into the Workday UI. Additionally, you will be responsible for building and showcasing integrations between Workday and Snowflake, utilizing Python, SQL, and Workday Extend APIs. Collaboration with innovation leads and solution architects to understand technical requirements and deliver iterative prototypes will be a key aspect of your role. To excel in this position, you are expected to have experience with Workday Extend, preferably certified. Hands-on experience with Snowflake and SQL-based data modeling, proficiency in Python and/or Java/Javascript, and a strong understanding of Workday's object model and integration points are highly valued. Demonstrated ability to create demos or proof-of-concept applications, as well as Snowflake certification or project experience, is preferred. Previous experience in consulting or client-facing environments and exposure to Workday Reporting, Prism, or People Analytics would be advantageous. The minimum qualifications for this role include a willingness to work from 1 PM to 10 PM IST, excellent verbal and written communication skills, the ability to work indoors at a computer, and the capacity to thrive in a fast-paced and deadline-oriented environment. A passion for exceptional customer service and collaboration will be key to your success in this role. Join Avaap's innovation team and be a part of shaping go-to-market prototypes while expanding your expertise across the Workday and Snowflake ecosystems. Take advantage of structured internal training opportunities and work alongside a supportive team dedicated to fostering innovation and creativity.,
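
A hedged sketch of the Snowflake half of such an embedded-insights prototype is shown below: it queries an aggregated workforce metric and shapes it as JSON that a Workday Extend app (or any API layer in front of it) could render. The table, columns, and connection details are assumptions, and the Extend side of the integration is not shown.

import json
import snowflake.connector

SQL = """
SELECT department, COUNT(*) AS headcount, AVG(tenure_years) AS avg_tenure
FROM HR_ANALYTICS.WORKER_SNAPSHOT
GROUP BY department
ORDER BY headcount DESC
"""

def headcount_insights() -> str:
    conn = snowflake.connector.connect(account="my_account", user="extend_svc",
                                       password="***", warehouse="BI_WH")
    try:
        rows = conn.cursor().execute(SQL).fetchall()
    finally:
        conn.close()
    # Shape rows as JSON for the consuming application
    payload = [
        {"department": dept, "headcount": int(hc), "avg_tenure": round(float(t), 1)}
        for dept, hc, t in rows
    ]
    return json.dumps(payload)

print(headcount_insights())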

Posted 1 day ago

Apply

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

We are looking for a mid-level Full Stack Developer specializing in data visualization and reporting, particularly skilled in tools like Power BI, SSIS, and Snowflake. The role is based in Bangalore under a hybrid work model and requires 7-10 years of experience, with an immediate to 15 days' notice period. As a Full Stack Developer, you will be responsible for analyzing, designing, and developing online dashboards, visualizations, and offline reports in an agile environment. You will work across the complete secure software development life cycle, from concept to deployment. Your mandatory skills should include proficiency in reporting tools such as Sigma Computing and Power BI, Microsoft SQL Server, and microservices and event-driven architecture using C#/.NET. Additionally, you should have a strong understanding of artificial intelligence (AI) and GenAI tools to accelerate development. Experience in data modeling and data engineering using Snowflake is crucial for this role. Knowledge of Agile methodologies and GenAI would be considered a bonus. The interview process for this position will be conducted virtually. If you are interested in this opportunity, please share your resume with netra.s@twsol.com.

Posted 1 day ago

Apply

2.0 - 10.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Principal Data Engineer at Brillio, you will play a crucial role in leveraging your expertise in data modeling, particularly with tools like ER Studio and ER Win. Your specialization lies in transforming disruptive technologies into a competitive advantage for Fortune 1000 companies through innovative digital adoption. Brillio prides itself on being a rapidly growing digital technology service provider that excels in integrating cutting-edge digital skills with a client-centric approach. As a preferred employer, Brillio attracts top talent by offering opportunities to work on exclusive digital projects and engage with groundbreaking technologies. To excel in this role, you should have over 10 years of IT experience, with a minimum of 2 years dedicated to Snowflake and hands-on modeling experience. Your proficiency in Data Lake/ODS, ETL concepts, and data warehousing principles will be critical in delivering effective solutions to clients. Collaboration with clients to drive both physical and logical model solutions will be a key aspect of your responsibilities. Your technical skills should encompass advanced data modeling concepts, experience in modeling large volumes of data, and proficiency in tools like ER Studio. A solid grasp of database concepts, including data warehouses, reporting, and analytics, will be essential. Familiarity with platforms like SQLDBM and expertise in entity relationship modeling will further strengthen your profile. Moreover, your communication skills should be excellent, enabling you to lead teams effectively and facilitate seamless collaboration with clients. While exposure to AWS ecosystems is a plus, your ability to design and administer databases, develop SQL queries for analysis, and implement data modeling for star schema designs will be instrumental in your success as a Principal Data Engineer at Brillio.,

Posted 1 day ago

Apply

8.0 - 12.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a Senior Data Engineering Architect at Iris Software, you will play a crucial role in leading enterprise-level data engineering projects on public cloud platforms like AWS, Azure, or GCP. Your responsibilities will include engaging with client managers to understand their business needs, conceptualizing solution options, and finalizing strategies with stakeholders. You will also be involved in team building, delivering Proof of Concepts (PoCs), and enhancing competencies within the organization. Your role will focus on building competencies in Data & Analytics, including Data Engineering, Analytics, Data Science, AI/ML, and Data Governance. Staying updated with the latest tools, best practices, and trends in the Data and Analytics field will be essential to drive innovation and excellence in your work. To excel in this position, you should hold a Bachelor's or Master's degree in a Software discipline and have extensive experience in Data architecture and implementing large-scale Data Lake/Data Warehousing solutions. Your background in Data Engineering should demonstrate leadership in solutioning, architecture, and successful project delivery. Strong communication skills in English, both written and verbal, are essential for effective collaboration with clients and team members. Proficiency in tools such as AWS Glue, Redshift, Azure Data Lake, Databricks, Snowflake, and databases, along with programming skills in Spark, Spark SQL, PySpark, and Python, are mandatory competencies for this role. Joining Iris Software offers a range of perks and benefits designed to support your financial, health, and overall well-being. From comprehensive health insurance and competitive salaries to flexible work arrangements and continuous learning opportunities, we are dedicated to providing a supportive and rewarding work environment where your success and happiness are valued. If you are inspired to grow your career in Data Engineering and thrive in a culture that values talent and personal growth, Iris Software is the place for you. Be part of a dynamic team where you can be valued, inspired, and encouraged to be your best professional and personal self.,

Posted 1 day ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

You are an experienced IICS (Informatica Intelligent Cloud Services) Developer with a strong background in the IICS platform. You possess in-depth knowledge of Snowflake and excel in creating and managing integrations across various systems and databases. Your role involves collaborating on cloud-based integration solutions, ensuring seamless data flow between platforms, and optimizing performance for large-scale data processes. Your primary responsibilities include designing, developing, and implementing data integration solutions using IICS. You will work extensively with Snowflake data warehouse solutions, handling tasks such as data loading, transformation, and querying. Building, monitoring, and maintaining efficient data pipelines between cloud-based systems and Snowflake will be crucial. Troubleshooting and resolving integration issues within the IICS platform and Snowflake are part of your routine tasks. You will also focus on ensuring optimal data processing performance and managing data flow among different cloud applications and databases. Collaboration with data architects, analysts, and stakeholders to gather requirements and design integration solutions is essential. Implementation of best practices for data governance, security, and data quality within the integration solutions is a key aspect of your role. Additionally, you will conduct unit testing and debugging of IICS data integration tasks and optimize integration workflows to meet performance and scalability requirements. Your skill set encompasses hands-on experience with IICS, a strong understanding of Snowflake as a cloud data warehouse, and proficiency in building ETL/ELT workflows. You are adept at integrating various data sources into Snowflake and possess experience with SQL for writing complex queries for data transformation and manipulation. Familiarity with data integration techniques and best practices for cloud-based platforms, along with experience in working with RESTful APIs and other integration protocols, is vital. Your ability to troubleshoot, optimize, and maintain data pipelines effectively, coupled with knowledge of data governance, security principles, and data quality standards, sets you apart. In terms of qualifications, you hold a Bachelor's degree in Computer Science, Information Technology, or a related field, or possess equivalent experience. You have a minimum of 5 years of experience in data integration development, with proficiency in Snowflake and cloud-based data solutions. A strong understanding of ETL/ELT processes and integration design principles, as well as experience working in Agile or similar development methodologies, further enhance your profile.,

Posted 1 day ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

You will be a seasoned Senior ETL/DB Tester with expertise in data validation and database testing across modern data platforms. Your role will involve designing, developing, and executing comprehensive test plans for ETL and database validation processes. You will validate data transformations and integrity across multiple stages and systems such as Talend, ADF, Snowflake, and Power BI.

Your responsibilities will include performing manual testing and defect tracking using tools like Zephyr or Tosca. You will analyze business and data requirements to ensure full test coverage, and write and execute complex SQL queries for data reconciliation. Identifying data-related issues and conducting root cause analysis in collaboration with developers will be crucial aspects of your role. You will track and manage bugs and enhancements through appropriate tools, and optimize testing strategies for performance, scalability, and accuracy in ETL processes.

Your skills should include proficiency in ETL Tools such as Talend and ADF, experience working on Data Platforms like Snowflake, and experience with Reporting/Analytics tools like Power BI and VPI. Additionally, you should have expertise in Testing Tools like Zephyr or Tosca, manual testing, and strong SQL skills for validating complex data. Exposure to API Testing and familiarity with advanced Power BI features such as Dashboards, DAX, and Data Modeling will be beneficial for this role.
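
Reconciliation queries of the kind mentioned above generally compare row counts and aggregates between the source and target layers. A minimal sketch, assuming illustrative STAGING and REPORTING tables and placeholder credentials, might look like this:

```python
import snowflake.connector

# Placeholder connection -- point this at the environment under test.
conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="QA_WH", database="EDW",
)

# Row counts and amount checksums per load date, compared across layers.
recon_sql = """
SELECT COALESCE(s.load_date, t.load_date) AS load_date,
       s.row_count  AS src_rows,   t.row_count  AS tgt_rows,
       s.amount_sum AS src_amount, t.amount_sum AS tgt_amount
FROM (SELECT load_date, COUNT(*) AS row_count, SUM(amount) AS amount_sum
      FROM STAGING.SALES GROUP BY load_date) s
FULL OUTER JOIN
     (SELECT load_date, COUNT(*) AS row_count, SUM(amount) AS amount_sum
      FROM REPORTING.SALES GROUP BY load_date) t
  ON s.load_date = t.load_date
WHERE COALESCE(s.row_count, -1)  <> COALESCE(t.row_count, -1)
   OR COALESCE(s.amount_sum, -1) <> COALESCE(t.amount_sum, -1)
"""

with conn.cursor() as cur:
    mismatches = cur.execute(recon_sql).fetchall()

# Any returned row is a load date where source and target disagree.
for row in mismatches:
    print("Mismatch:", row)
conn.close()
```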

Posted 1 day ago

Apply

5.0 - 9.0 years

0 Lacs

indore, madhya pradesh

On-site

You will be joining Beyond Key, a Microsoft Gold Partner and a Great Place to Work-certified company that prioritizes the happiness of both team members and clients. Established in 2005, Beyond Key is an international IT consulting and software services firm known for delivering cutting-edge services and products to clients across the United States, Canada, Europe, Australia, the Middle East, and India. With a team of over 350 skilled software professionals, Beyond Key creates and designs IT solutions tailored to their clients' requirements. For more information, visit https://www.beyondkey.com/about.

As a Snowflake DevOps Engineer within the BI TEC team, your primary responsibility will be to support and enhance a multi-region Snowflake data warehouse infrastructure. This role will involve developing and maintaining robust CI/CD pipelines using tools like GitHub, Git Actions, Python, TeamCity, and SDA. Proficiency in Control-M for batch scheduling and a solid background in data warehousing are crucial for this position. Collaboration with cross-functional technical teams and a proactive delivery approach are essential aspects of this role. While experience in the Broker Dealer domain is advantageous, a proven track record in managing large-scale data warehouse projects will also be highly valued.

Key Responsibilities:

  • Develop and maintain CI/CD pipelines for Snowflake.
  • Collaborate with different teams to improve deployment and automation processes.
  • Manage batch scheduling using Control-M.
  • Ensure quality and security compliance, including conducting Veracode scan reviews.
  • Contribute to data warehouse design following Kimball methodologies.
  • Translate technical concepts into easily understandable language for business purposes.
  • Provide support for production reporting and be available for on-call support when necessary.

Required Skills & Experience:

  • Minimum 5 years of experience in Snowflake CI/CD.
  • Minimum 5 years of Python development experience.
  • Proficiency in GitHub, Git Actions, TeamCity, and SDA.
  • Strong understanding of Data Warehousing and Kimball methodology.
  • Experience with Control-M for batch processing and job scheduling.
  • Familiarity with Veracode or similar security scanning tools.
  • Experience working in large-scale database development teams.
  • Knowledge of Capital Markets or Broker Dealer domain (preferred).
  • Oracle PL/SQL experience is a plus.

If you are seeking a role where you can contribute to innovative data solutions and work collaboratively with a dynamic team, this opportunity at Beyond Key may be perfect for you. Explore all our job openings and share this opportunity with someone exceptional.
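
As a rough illustration of the CI/CD work described above, a pipeline step often boils down to applying versioned SQL scripts to Snowflake. The sketch below shows one hedged way to do that in Python; the migrations/ folder layout, role, and warehouse names are assumptions, and credentials are read from environment variables a CI job would set.

```python
import os
from pathlib import Path

import snowflake.connector

# Credentials are expected as environment variables set by the CI pipeline.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    role="DEPLOY_ROLE",          # illustrative role/warehouse/database names
    warehouse="DEPLOY_WH",
    database="EDW",
)

def apply_migrations(folder: str = "migrations") -> None:
    """Apply .sql files in name order, e.g. V001__init.sql, V002__add_table.sql."""
    for script in sorted(Path(folder).glob("*.sql")):
        print(f"Applying {script.name}")
        with conn.cursor() as cur:
            # Run each semicolon-separated statement in the script.
            for statement in script.read_text().split(";"):
                if statement.strip():
                    cur.execute(statement)

if __name__ == "__main__":
    try:
        apply_migrations()
    finally:
        conn.close()
```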

Posted 1 day ago

Apply

5.0 - 9.0 years

0 Lacs

kolkata, west bengal

On-site

Genpact is a global professional services and solutions firm dedicated to delivering outcomes that shape the future. With a team of over 125,000 professionals across more than 30 countries, we are motivated by curiosity, entrepreneurial agility, and the desire to create lasting value for our clients. Our purpose is the relentless pursuit of a world that works better for people, and we serve and transform leading enterprises, including the Fortune Global 500, leveraging our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

We are currently looking for a Principal Consultant - Snowflake Sr. Data Engineer (Snowflake + Python/Pyspark) to join our team! As a Snowflake Sr. Data Engineer, you will provide technical direction and lead a group of developers toward a common goal. You should have experience in the IT industry and be proficient in building productionized data ingestion and processing pipelines in Snowflake. Additionally, you should be well-versed in data warehousing concepts and have expertise in Snowflake features and integration with other data processing tools. Experience with Python programming and Pyspark for data analysis is essential for this role.

Key Responsibilities:

  • Work on requirement gathering, analysis, designing, development, and deployment
  • Write SQL queries against Snowflake and develop scripts for Extract, Load, and Transform data
  • Understand Data Warehouse concepts and Snowflake Architecture
  • Hands-on experience with Snowflake utilities such as SnowSQL, SnowPipe, tables, Tasks, Streams, and more
  • Experience with Snowflake AWS data services or Azure data services
  • Proficiency in the Python programming language and knowledge of packages like pandas, NumPy, etc.
  • Design and develop efficient ETL jobs using Python and Pyspark
  • Use Python and Pyspark for data cleaning, pre-processing, and transformation tasks
  • Implement CDC or SCD type 2 and build data ingestion pipelines
  • Work with workflow management tools for scheduling and managing ETL jobs

Qualifications:

  • B.E./Masters in Computer Science, Information Technology, or Computer Engineering
  • Relevant years of experience as a Snowflake Sr. Data Engineer
  • Skills in Snowflake, Python/Pyspark, AWS/Azure, ETL concepts, Airflow or any orchestration tool, and Data Warehousing concepts

If you are passionate about leveraging your skills to drive innovative solutions and create value in a dynamic environment, we encourage you to apply for this exciting opportunity. Join us in shaping the future and making a difference!
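
As a hedged illustration of the CDC/SCD type 2 responsibility listed above, one common pattern is to expire changed dimension rows and insert fresh versions. The sketch below runs that as two statements through the Snowflake Python connector; the table and column names are made up for the example.

```python
import snowflake.connector

# Placeholder connection details -- replace with your own account settings.
conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="ETL_WH", database="EDW", schema="CORE",
)

# Step 1: expire current dimension rows whose attributes changed in staging.
close_out_sql = """
UPDATE CORE.DIM_CUSTOMER d
SET effective_to = CURRENT_TIMESTAMP(), is_current = FALSE
FROM STAGING.CUSTOMER_UPDATES s
WHERE d.is_current = TRUE
  AND s.customer_id = d.customer_id
  AND (s.email <> d.email OR s.segment <> d.segment)
"""

# Step 2: insert new versions for customers that have no current row left.
insert_sql = """
INSERT INTO CORE.DIM_CUSTOMER
    (customer_id, email, segment, effective_from, effective_to, is_current)
SELECT s.customer_id, s.email, s.segment, CURRENT_TIMESTAMP(), NULL, TRUE
FROM STAGING.CUSTOMER_UPDATES s
LEFT JOIN CORE.DIM_CUSTOMER d
  ON d.customer_id = s.customer_id AND d.is_current = TRUE
WHERE d.customer_id IS NULL
"""

with conn.cursor() as cur:
    cur.execute(close_out_sql)
    cur.execute(insert_sql)
conn.close()
```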

Posted 1 day ago

Apply

Exploring Snowflake Jobs in India

Snowflake has become one of the most sought-after skills in the tech industry, with a growing demand for professionals who are proficient in handling data warehousing and analytics using this cloud-based platform. In India, the job market for Snowflake roles is flourishing, offering numerous opportunities for job seekers with the right skill set.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Chennai

These cities are known for their thriving tech industries and have a high demand for Snowflake professionals.

Average Salary Range

The average salary range for Snowflake professionals in India varies based on experience levels:

  • Entry-level: INR 6-8 lakhs per annum
  • Mid-level: INR 10-15 lakhs per annum
  • Experienced: INR 18-25 lakhs per annum

Career Path

A typical career path in Snowflake may include roles such as:

  • Junior Snowflake Developer
  • Snowflake Developer
  • Senior Snowflake Developer
  • Snowflake Architect
  • Snowflake Consultant
  • Snowflake Administrator

Related Skills

In addition to expertise in Snowflake, professionals in this field are often expected to have knowledge of:

  • SQL
  • Data warehousing concepts
  • ETL tools
  • Cloud platforms (AWS, Azure, GCP)
  • Database management

Interview Questions

  • What is Snowflake and how does it differ from traditional data warehousing solutions? (basic)
  • Explain how Snowflake handles data storage and compute resources in the cloud. (medium)
  • How do you optimize query performance in Snowflake? (medium)
  • Can you explain how data sharing works in Snowflake? (medium)
  • What are the different stages in the Snowflake architecture? (advanced)
  • How do you handle data encryption in Snowflake? (medium)
  • Describe a challenging project you worked on using Snowflake and how you overcame obstacles. (advanced)
  • How does Snowflake ensure data security and compliance? (medium)
  • What are the benefits of using Snowflake over traditional data warehouses? (basic)
  • Explain the concept of virtual warehouses in Snowflake. (medium)
  • How do you monitor and troubleshoot performance issues in Snowflake? (medium)
  • Can you discuss your experience with Snowflake's semi-structured data handling capabilities? (advanced)
  • What are Snowflake's data loading options and best practices? (medium)
  • How do you manage access control and permissions in Snowflake? (medium)
  • Describe a scenario where you had to optimize a Snowflake data pipeline for efficiency. (advanced)
  • How do you handle versioning and change management in Snowflake? (medium)
  • What are the limitations of Snowflake and how would you work around them? (advanced)
  • Explain how Snowflake supports semi-structured data formats like JSON and XML. (medium)
  • What are the considerations for scaling Snowflake for large datasets and high concurrency? (advanced)
  • How do you approach data modeling in Snowflake compared to traditional databases? (medium)
  • Discuss your experience with Snowflake's time travel and data retention features (see the short sketch after this list). (medium)
  • How would you migrate an on-premise data warehouse to Snowflake in a production environment? (advanced)
  • What are the best practices for data governance and metadata management in Snowflake? (medium)
  • How do you ensure data quality and integrity in Snowflake pipelines? (medium)
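
Several of these questions, notably the ones on virtual warehouses and time travel, are easy to ground with a couple of statements. The snippet below is a hedged illustration using the Snowflake Python connector against an imaginary ORDERS table; account details are placeholders.

```python
import snowflake.connector

# Placeholder connection -- fill in your own account details.
conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    database="DEMO", schema="PUBLIC",
)

with conn.cursor() as cur:
    # Virtual warehouse: an independent compute cluster you size and suspend.
    cur.execute("""
        CREATE WAREHOUSE IF NOT EXISTS INTERVIEW_WH
        WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 60 AUTO_RESUME = TRUE
    """)
    cur.execute("USE WAREHOUSE INTERVIEW_WH")

    # Time travel: query the table as it looked one hour ago.
    cur.execute("SELECT COUNT(*) FROM ORDERS AT (OFFSET => -3600)")
    print("Row count one hour ago:", cur.fetchone()[0])

conn.close()
```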

Closing Remark

As you explore opportunities in the Snowflake job market in India, remember to showcase your expertise in handling data analytics and warehousing using this powerful platform. Prepare thoroughly for interviews, demonstrate your skills confidently, and keep abreast of the latest developments in Snowflake to stay competitive in the tech industry. Good luck with your job search!
