
172 Snowflake Jobs - Page 7

JobPe aggregates these listings for easy access; applications are submitted directly on the original job portal.

3.0 - 6.0 years

0 Lacs

India

On-site


About the Role: One of the most valuable assets in today's financial industry is data, which can give businesses the intelligence essential to making business and financial decisions with conviction. This role offers the opportunity to work on Ratings and Research data, using cutting-edge big data technologies, with responsibility for developing both data feeds and APIs.

The Team: RatingsXpress is at the heart of financial workflows when it comes to providing and analyzing data. We provide Ratings and Research information to clients. Our work covers content ingestion, data feed generation, and exposing data to clients via API calls. This position is part of the RatingsXpress team and is focused on providing clients the critical data they need to make the most informed investment decisions possible.

Impact: As a member of the Xpressfeed team in S&P Global Market Intelligence, you will work with a group of intelligent and visionary engineers to build impactful content management tools for investment professionals across the globe. Our software engineers are involved in the full product life cycle, from design through release. You will be expected to participate in application design, write high-quality code, and innovate on how to improve overall system performance and customer experience. If you are a talented developer who wants to help drive the next phase of Data Management Solutions at S&P Global, can contribute great ideas, solutions, and code, and understands the value of cloud solutions, we would like to talk to you.

What's in it for you: We are currently seeking a Software Developer with a passion for full-stack development. In this role, you will have the opportunity to work on cutting-edge cloud technologies such as Databricks, Snowflake, and AWS, while also engaging in Scala and SQL Server-based database development.
This position offers a unique opportunity to grow both as a Full Stack Developer and as a Cloud Engineer, expanding your expertise across modern data platforms and backend development.

Responsibilities:
- Analyze, design, and develop solutions within a multi-functional Agile team to support key business needs for the data feeds
- Design, implement, and test solutions using AWS EMR for content ingestion
- Work on complex SQL Server projects involving high-volume data
- Engineer components and common services based on standard corporate development models, languages, and tools
- Apply software engineering best practices while leveraging automation across all elements of solution delivery
- Collaborate effectively with technical and non-technical stakeholders
- Document and demonstrate technical solutions through documentation, diagrams, and code comments

Basic Qualifications:
- Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field
- 3-6 years of experience in application development
- Minimum of 2 years of hands-on experience with Scala
- Minimum of 2 years of hands-on experience with Microsoft SQL Server
- Solid understanding of Amazon Web Services (AWS) and cloud-based development
- In-depth knowledge of system architecture, object-oriented programming, and design patterns
- Excellent communication skills, with the ability to convey complex ideas clearly both verbally and in writing

Preferred Qualifications:
- Familiarity with AWS services: EMR, Auto Scaling, EKS
- Working knowledge of Snowflake
- Experience in Python development
- Familiarity with the Financial Services domain and Capital Markets is a plus
- Experience developing systems that handle large volumes of data and require high computational performance

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion.
Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all: from finding new ways to measure sustainability, to analyzing the energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global. Our benefits include:

Health & Wellness: Health care coverage designed for the mind and body.
Flexible Downtime: Generous time off helps keep you energized for your time on.

Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.

Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.

Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.

Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.

For more information on benefits by country visit:

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Equal Opportunity Employer: S&P Global is an equal opportunity employer, and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster describes discrimination protections under federal law.
Pay Transparency Nondiscrimination Provision

20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority - Ratings (Strategic Workforce Planning)

Posted 2 weeks ago

Apply


8.0 - 14.0 years

15 - 30 Lacs

Mangalore, Karnataka, India

On-site


Job Title: Snowflake & SQL Developer
Location: Bangalore/Mangalore
Type: Full-Time
Experience: 8+ years

Why MResult: Founded in 2004, MResult is a global digital solutions partner trusted by leading Fortune 500 companies in industries such as pharma & healthcare, retail, and BFSI. MResult's expertise in data and analytics, data engineering, machine learning, AI, and automation helps companies streamline operations and unlock business value. As part of our team, you will collaborate with top minds in the industry to deliver cutting-edge solutions that solve real-world challenges.

Website: https://mresult.com/
LinkedIn: https://www.linkedin.com/company/mresult/

What We Offer: At MResult, you can leave your mark on projects at the world's most recognized brands, access opportunities to grow and upskill, and do your best work with the flexibility of hybrid work models. Great work is rewarded, and leaders are nurtured from within. Our values of Agility, Collaboration, Client Focus, Innovation, and Integrity are woven into our culture, guiding every decision.

What This Role Requires: In the role of Snowflake & SQL Developer, you will be a key contributor to MResult's mission of empowering our clients with data-driven insights and innovative digital solutions. Each day brings exciting challenges and growth opportunities. Here is what you will do:

Roles and responsibilities:
- Overall 8+ years of experience, with 5+ years of relevant experience in Snowflake and PostgreSQL
- Design, develop, and optimize data pipelines and batch processes using PostgreSQL and Snowflake
- Maintain and enhance the rules-based engine that drives territory alignment logic, ensuring scalability and performance
- Troubleshoot and resolve data integration issues

Manage, Master, and Maximize with MResult: MResult is an equal-opportunity employer committed to building an inclusive environment free of discrimination and harassment.
Take the next step in your career with MResult where your ideas help shape the future.

Posted 3 weeks ago

Apply

4.0 - 6.0 years

0 Lacs

India

On-site


At Roche you can show up as yourself, embraced for the unique qualities you bring. Our culture encourages personal expression, open dialogue, and genuine connections, where you are valued, accepted, and respected for who you are, allowing you to thrive both personally and professionally. This is how we aim to prevent, stop, and cure diseases and ensure everyone has access to healthcare today and for generations to come. Join Roche, where every voice matters.

The Position: In Roche Informatics, we build on Roche's 125-year history as one of the world's largest biotech companies, globally recognized for providing transformative innovative solutions across major disease areas. We combine human capabilities with cutting-edge technological innovations to do now what our patients need next. Our commitment to our patients' needs motivates us to deliver technology that evolves the practice of medicine. Be part of our inclusive team at Roche Informatics, where we're driven by a shared passion for technological novelties and optimal IT solutions.

About the position: We are looking for a Data Engineer who will work closely with multi-disciplinary and multi-cultural teams to build structured, high-quality data solutions, and who may lead technical squads. These solutions will be leveraged across Enterprise, Pharma, and Diagnostics solutions to help our teams fulfill our mission: to do now what patients need next. This position requires hands-on expertise in ETL pipeline development and data engineering. You should also be able to provide direction and guidance to developers, oversee development and unit testing, and document the developed solution. Building strong customer relationships for ongoing business is also a key aspect of this role.
To succeed in this position, you should have experience with cloud-based data solution architectures, the software development life cycle (including both Agile and waterfall methodologies), data engineering and ETL tools/platforms, and data modeling practices.

Your key responsibilities:
- Building and optimizing ETL data pipelines to support data analytics
- Developing and implementing data integrations with other systems and platforms
- Maintaining documentation for data pipelines and related processes
- Logical and physical modeling of datasets and applications
- Making Roche data assets accessible and findable across the organization
- Exploring new ways of building, processing, and analyzing data in order to deliver insights to our business partners
- Continuously refining data quality with testing, tooling, and performance evaluation
- Working with business and functional stakeholders to understand data requirements and downstream analytics needs
- Partnering with the business to ensure appropriate integration of functions to meet goals, and identifying and defining necessary system enhancements to deploy new products and process improvements
- Fostering a data-driven culture throughout the team and leading data engineering projects that have an impact throughout the organization
- Working with data and analytics experts to strive for greater functionality in our data systems and products, and helping grow our data team with exceptional engineers

Your qualifications and experience:
- Education in a related field (Computer Science, Computer Engineering, Mathematical Engineering, Information Systems) or equivalent job experience, preferably with multiple data engineering technologies
- 4+ years of experience with ETL development, data engineering, and data quality assurance
- Good experience with Snowflake and its features
- Hands-on experience as a data engineer building cloud data solutions using Snowflake
- Experience working with cloud platform services (AWS/Azure/GCP)
- Experience with ETL/ELT technologies such as Talend or dbt, or other ETL platforms
- Experience in preparing and reviewing new data flow patterns
- Excellent Python skills
- Strong RDBMS concepts and SQL development skills
- Strong focus on data pipeline automation
- Exposure to quality assurance and data quality activities is an added advantage
- DevOps/DataOps experience (data operations especially preferred)
- Readiness to work with multiple tech domains and streams
- Passion for new technologies and experimentation
- Experience with Immuta and Monte Carlo is a plus

What you get:
- A good and stable working environment with an attractive compensation and rewards package (according to local regulations)
- Annual bonus payment based on performance
- Access to various internal and external training platforms (e.g. LinkedIn Learning)
- Experienced and professional colleagues and a workplace that supports innovation
- Multiple savings plans with employer match
- An emphasis on employee wellness and work-life balance (e.g. generous vacation days and OneRoche Wellness Days)
- Workplace flexibility policy
- State-of-the-art working environment and facilities
- And many more that the Talent Acquisition Partner will be happy to talk about!

Who we are: A healthier future drives us to innovate. Together, more than 100,000 employees across the globe are dedicated to advancing science, ensuring everyone has access to healthcare today and for generations to come. Our efforts result in more than 26 million people treated with our medicines and over 30 billion tests conducted using our Diagnostics products. We empower each other to explore new possibilities, foster creativity, and keep our ambitions high, so we can deliver life-changing healthcare solutions that make a global impact. Let's build a healthier future, together.

Roche is an Equal Opportunity Employer.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

5 - 10 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site


Role & responsibilities:
- Design, develop, and optimize scalable data pipelines for ETL/ELT processes
- Develop and maintain Python-based data processing scripts and automation tools
- Write and optimize complex SQL queries (preferably in Snowflake) for data transformation and analytics
- Implement version control best practices using Git or other tools to manage code changes
- Collaborate with cross-functional teams (analysts, product managers, and engineers) to understand business needs and translate them into technical data solutions
- Ensure data integrity, security, and governance across multiple data sources
- Optimize query performance and database architecture for efficiency and scalability
- Lead troubleshooting and debugging efforts for data-related issues
- Document data workflows, architectures, and best practices to ensure maintainability and knowledge sharing

Preferred candidate profile:
- 5+ years of experience in Data Engineering, Software Engineering, or a related field
- Bachelor's or Master's degree in Computer Science, Computer Engineering, or a related discipline
- High proficiency in SQL (preferably Snowflake) for data modeling, performance tuning, and optimization
- Strong expertise in Python for data processing and automation
- Experience developing with Snowflake as the data platform
- Experience with ETL/ELT tools (preferably Fivetran, dbt)
- Experience with Jenkins or other CI/CD tools
- Experience with Git or other version control tools in a collaborative development environment
- Strong communication skills and ability to collaborate with cross-functional teams for requirements gathering and solution design
- Experience working with large-scale, distributed data systems and cloud data warehouses

Posted 3 weeks ago

Apply

5.0 - 10.0 years

5 - 10 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site


EagleView, the leader in aerial imagery, is hiring an Analytics and AI Development Lead on the Enterprise Data & Analytics team to help implement the organization's data and analytics strategy. This role entails leading a team of developers and data engineers, overseeing AI tool development, automation, and analytics, and working closely with various departments across the US and India to ensure that data-driven insights are successfully integrated into business operations. The ideal candidate should possess a robust analytical background, exceptional leadership abilities, and a passion for using data to address complex business challenges. This role is ideal for an experienced manager with strong experience in team management, Python, AWS, SQL (preferably Snowflake), Git, and Jenkins. You will help build and manage our AI operations platform, helping deliver insights to a variety of stakeholders across all departments.

Responsibilities:
- Oversee and lead development for AI asset design, with a focus on efficiency
- Apply data governance to support quality AI, security, and compliance best practices
- Collaborate with cross-functional teams (analysts, product managers, and engineers) to understand business needs and translate them into technical solutions
- Lead team project delivery and communication to leadership
- Debug and optimize complex Python and SQL (preferably in Snowflake)
- Lead troubleshooting and debugging efforts for data pipelines and data quality issues
- Deliver high-quality analytics for executive leadership on occasion

Qualifications:
- 5+ years of experience in a lead role for a team within the data engineering, software engineering, or artificial intelligence domains
- Bachelor's or Master's degree in Computer Science, Computer Engineering, or a related discipline
- Experience developing with Snowflake as the data platform
- Strong communication skills and ability to collaborate with cross-functional teams for requirements gathering and solution design
- Experience working with large-scale, distributed data systems and cloud data warehouses
- Understanding of data modeling principles and database design
- High proficiency in SQL (preferably Snowflake) for data modeling, performance tuning, and optimization
- High proficiency in Python, particularly for building data sets (including ingestion and transformation), ideally with production-level projects using data science and AI libraries (including LLMs, semantic models, or predictive analytics)

Preferred Experience:
- Designing, developing, and optimizing scalable data pipelines for ETL/ELT processes
- Experience with ETL/ELT tools (preferably Fivetran, dbt)
- Experience with Git or other version control tools in a collaborative development environment
- Developing and maintaining Python-based data processing scripts and automation tools
- Administration of big data platforms, such as Snowflake
- Experience with Microsoft Power BI and other visualization tools
- Knowledge of cloud platforms (AWS, GCP, Azure) and related services (e.g., S3, Lambda, BigQuery)
- Experience in AWS, particularly for building and deploying AI solutions (ideally using Bedrock or OpenAI)

Posted 3 weeks ago

Apply

7.0 - 12.0 years

7 - 11 Lacs

Delhi, India

On-site


Key deliverables:
- Enhance and maintain the MDM platform to support business needs
- Develop data pipelines using Snowflake, Python, SQL, and orchestration tools like Airflow
- Monitor and improve system performance and troubleshoot data pipeline issues
- Resolve production issues and ensure platform reliability

Role responsibilities:
- Collaborate with data engineering and analytics teams on scalable solutions
- Apply DevOps practices to streamline deployment and automation
- Integrate cloud-native tools and services (AWS, Azure) with the data platform
- Utilize dbt and version control (Git) for data transformation and management

Posted 3 weeks ago

Apply

7.0 - 12.0 years

7 - 11 Lacs

Pune, Maharashtra, India

On-site


Key deliverables:
- Enhance and maintain the MDM platform to support business needs
- Develop data pipelines using Snowflake, Python, SQL, and orchestration tools like Airflow
- Monitor and improve system performance and troubleshoot data pipeline issues
- Resolve production issues and ensure platform reliability

Role responsibilities:
- Collaborate with data engineering and analytics teams on scalable solutions
- Apply DevOps practices to streamline deployment and automation
- Integrate cloud-native tools and services (AWS, Azure) with the data platform
- Utilize dbt and version control (Git) for data transformation and management

Posted 3 weeks ago

Apply

5.0 - 10.0 years

8 - 18 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site


Total Experience: 5+ years
Location: Hyderabad, Bangalore
Work Mode: Hybrid

Responsibilities:
- Translate business requirements into technical requirements as needed
- Design and develop automated scripts for data pipelines to process and transform data per the requirements, and monitor them
- Produce artifacts such as data flow diagrams, designs, and data models, along with Git code, as deliverables
- Use tools and programming languages such as SQL, Snowflake, Airflow, dbt, and Salesforce Data Cloud
- Ensure data accuracy, timeliness, and reliability throughout the pipeline
- Complete QA and data profiling to ensure data is ready for UAT as per the requirements
- Collaborate with business stakeholders and the visualization team, and support enhancements
- Provide timely updates on sprint boards and tasks
- As team lead, provide timely project updates across all projects
- Project experience with version control systems and CI/CD tools such as Git, GitFlow, Bitbucket, Jenkins, etc.
- Participate in UAT to resolve findings and plan Go-Live/production deployment

Posted 3 weeks ago

Apply

8.0 - 13.0 years

7 - 13 Lacs

Chennai, Tamil Nadu, India

Remote

Foundit logo

We are seeking a highly skilled and experienced Data Architect with strong expertise in data modeling and Snowflake to design, develop, and optimize enterprise data architecture. The ideal candidate will play a critical role in shaping data strategy, building scalable models, and ensuring efficient data integration and governance.

Key Responsibilities:
- Design and implement end-to-end data architecture using Snowflake.
- Develop and maintain conceptual, logical, and physical data models.
- Define and enforce data architecture standards, best practices, and policies.
- Collaborate with data engineers, analysts, and business stakeholders to gather requirements and design data solutions.
- Optimize Snowflake performance, including data partitioning, caching, and query tuning.
- Create and manage data dictionaries, metadata, and lineage documentation.
- Ensure data quality, consistency, and security across all data platforms.
- Support data integration from various sources (cloud/on-premises) into Snowflake.

Required Skills and Experience:
- 8+ years of experience in data architecture, data modeling, or similar roles.
- Hands-on expertise with Snowflake, including Snowpipe, Streams, Tasks, and Secure Data Sharing.
- Strong experience with data modeling tools (e.g., Erwin, ER/Studio, dbt).
- Proficiency in SQL, ETL/ELT pipelines, and data warehousing concepts.
- Experience working with structured, semi-structured (JSON, XML), and unstructured data.
- Solid understanding of data governance, data cataloging, and security frameworks.
- Excellent analytical, communication, and stakeholder management skills.

Preferred Qualifications:
- Experience with cloud platforms like AWS, Azure, or GCP.
- Familiarity with data lakehouse architecture and real-time data processing.
- Snowflake certification(s) or relevant cloud certifications.
- Knowledge of Python or scripting for data automation is a plus.
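Working with the semi-structured JSON data this role calls for usually means flattening nested documents into relational rows, which Snowflake handles natively via its VARIANT type and LATERAL FLATTEN. As a plain-Python illustration of the same transformation, with a document shape invented for the example:

```python
# Sketch of flattening semi-structured JSON into relational rows, the
# transformation Snowflake's VARIANT type and LATERAL FLATTEN perform
# natively. The document shape here is invented for illustration.
import json

doc = json.loads("""
{"order_id": 42, "items": [
    {"sku": "SNOW-1", "qty": 2},
    {"sku": "SNOW-2", "qty": 1}
]}
""")

def flatten_order(order):
    """One output row per line item, carrying the parent order_id."""
    return [
        {"order_id": order["order_id"], "sku": it["sku"], "qty": it["qty"]}
        for it in order["items"]
    ]

rows = flatten_order(doc)
```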

Posted 3 weeks ago

Apply

8.0 - 13.0 years

7 - 13 Lacs

Bengaluru / Bangalore, Karnataka, India

Remote

Foundit logo

We are seeking a highly skilled and experienced Data Architect with strong expertise in data modeling and Snowflake to design, develop, and optimize enterprise data architecture. The ideal candidate will play a critical role in shaping data strategy, building scalable models, and ensuring efficient data integration and governance.

Key Responsibilities:
- Design and implement end-to-end data architecture using Snowflake.
- Develop and maintain conceptual, logical, and physical data models.
- Define and enforce data architecture standards, best practices, and policies.
- Collaborate with data engineers, analysts, and business stakeholders to gather requirements and design data solutions.
- Optimize Snowflake performance, including data partitioning, caching, and query tuning.
- Create and manage data dictionaries, metadata, and lineage documentation.
- Ensure data quality, consistency, and security across all data platforms.
- Support data integration from various sources (cloud/on-premises) into Snowflake.

Required Skills and Experience:
- 8+ years of experience in data architecture, data modeling, or similar roles.
- Hands-on expertise with Snowflake, including Snowpipe, Streams, Tasks, and Secure Data Sharing.
- Strong experience with data modeling tools (e.g., Erwin, ER/Studio, dbt).
- Proficiency in SQL, ETL/ELT pipelines, and data warehousing concepts.
- Experience working with structured, semi-structured (JSON, XML), and unstructured data.
- Solid understanding of data governance, data cataloging, and security frameworks.
- Excellent analytical, communication, and stakeholder management skills.

Preferred Qualifications:
- Experience with cloud platforms like AWS, Azure, or GCP.
- Familiarity with data lakehouse architecture and real-time data processing.
- Snowflake certification(s) or relevant cloud certifications.
- Knowledge of Python or scripting for data automation is a plus.

Posted 3 weeks ago

Apply

1.0 - 4.0 years

1 - 4 Lacs

Pune, Maharashtra, India

On-site

Foundit logo

Duties & Responsibilities:
- Collaborate with cross-functional teams to understand business requirements and translate them into data integration solutions.
- Develop and maintain ETL/ELT pipelines using modern tools like Informatica IDMC to connect source systems to Snowflake.
- Ensure data accuracy, consistency, and security in all integration workflows.
- Monitor, troubleshoot, and optimize data integration processes to meet performance and scalability goals.
- Support ongoing integration projects, including Salesforce and SAP data pipelines, while adhering to best practices in data governance.
- Document integration designs, workflows, and operational processes for effective knowledge sharing.
- Assist in implementing and improving data quality controls at the start of processes to ensure reliable outcomes.
- Stay informed about the latest developments in integration technologies and contribute to team learning and improvement.

Qualifications:

Required Skills and Experience:
- 5+ years of hands-on experience in data integration, ETL/ELT development, or data engineering.
- Proficiency in SQL and experience working with relational databases such as Snowflake, PostgreSQL, or SQL Server.
- Familiarity with data integration tools such as Fivetran, Informatica Intelligent Data Management Cloud (IDMC), or similar platforms.
- Basic understanding of cloud platforms like AWS, Azure, or GCP.
- Experience working with structured and unstructured data in varying formats (e.g., JSON, XML, CSV).
- Strong problem-solving skills and the ability to troubleshoot data integration issues effectively.
- Excellent verbal and written communication skills, with the ability to document technical solutions clearly.

Preferred Skills and Experience:
- Exposure to integrating business systems such as Salesforce or SAP into data platforms.
- Knowledge of data warehousing concepts and hands-on experience with Snowflake.
- Familiarity with APIs, event-driven pipelines, and automation workflows.
- Understanding of data governance principles and data quality best practices.

Education: Bachelor's degree in Computer Science, Data Engineering, or a related field, or equivalent practical experience.

What We Offer:
- A collaborative and mission-driven work environment at the forefront of EdTech innovation.
- Opportunities for growth, learning, and professional development.
- Competitive salary and benefits package, including support for certifications like Snowflake SnowPro Core and Informatica Cloud certifications.
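The posting's emphasis on data quality controls "at the start of processes" is a common ELT pattern: validate records before loading rather than after. A toy sketch of that gate, using an in-memory SQLite table as a stand-in for Snowflake; the field names and validation rules are invented:

```python
# Toy load step with an up-front data-quality gate, loading into an
# in-memory SQLite database as a stand-in for a warehouse table.
# Field names and rules are invented for illustration.
import sqlite3

def validate(record):
    """Reject bad records early rather than letting them propagate downstream."""
    return bool(record.get("email")) and record.get("amount", 0) >= 0

source = [
    {"email": "a@example.com", "amount": 10.0},
    {"email": "", "amount": 5.0},              # fails: missing email
    {"email": "b@example.com", "amount": -1},  # fails: negative amount
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (email TEXT, amount REAL)")
good = [r for r in source if validate(r)]
conn.executemany("INSERT INTO payments VALUES (:email, :amount)", good)
loaded = conn.execute("SELECT COUNT(*) FROM payments").fetchone()[0]
```

Rejected records would typically be routed to a quarantine table for review rather than silently dropped.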

Posted 3 weeks ago

Apply

10.0 - 15.0 years

25 - 40 Lacs

Hyderabad / Secunderabad, Telangana, Telangana, India

Remote

Foundit logo

Responsibilities:
- Lead and manage an offshore team of data engineers, providing strategic guidance, mentorship, and support to ensure the successful delivery of projects and the development of team members.
- Collaborate closely with onshore stakeholders to understand project requirements, allocate resources efficiently, and ensure alignment with client expectations and project timelines.
- Drive the technical design, implementation, and optimization of data pipelines, ETL processes, and data warehouses, ensuring scalability, performance, and reliability.
- Define and enforce engineering best practices, coding standards, and data quality standards to maintain high-quality deliverables and mitigate project risks.
- Stay abreast of emerging technologies and industry trends in data engineering, and provide recommendations for tooling, process improvements, and skill development.
- Assume a data architect role as needed, leading the design and implementation of data architecture solutions, data modeling, and optimization strategies.
- Demonstrate proficiency in AWS services, including:
  - Expertise in cloud data services such as Amazon Redshift, Amazon EMR, and AWS Glue to design and implement scalable data solutions.
  - Experience with cloud infrastructure services such as AWS EC2 and AWS S3 to optimize data processing and storage.
  - Knowledge of cloud security best practices, IAM roles, and encryption mechanisms to ensure data privacy and compliance.
  - Proficiency in managing or implementing cloud data warehouse solutions, including data modeling, schema design, performance tuning, and optimization techniques.
- Demonstrate proficiency in modern data platforms such as Snowflake and Databricks, including:
  - Deep understanding of Snowflake's architecture, capabilities, and best practices for designing and implementing data warehouse solutions.
  - Hands-on experience with Databricks for data engineering, data processing, and machine learning tasks, leveraging Spark clusters for scalable data processing.
  - Ability to optimize Snowflake and Databricks configurations for performance, scalability, and cost-effectiveness.
- Manage the offshore team's performance, including resource allocation, performance evaluations, and professional development, to maximize team productivity and morale.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field; advanced degree preferred.
- 10+ years of experience in data engineering, with a proven track record of leadership and technical expertise in managing complex data projects.
- Proficiency in programming languages such as Python, Java, or Scala, and expertise in SQL and relational databases (e.g., PostgreSQL, MySQL).
- Strong understanding of distributed computing, cloud technologies (e.g., AWS), and big data frameworks (e.g., Hadoop, Spark).
- Experience with data architecture design, data modeling, and optimization techniques.
- Excellent communication, collaboration, and leadership skills, with the ability to effectively manage remote teams and engage with onshore stakeholders.
- Proven ability to adapt to evolving project requirements and effectively prioritize tasks in a fast-paced environment.

Posted 3 weeks ago

Apply

0.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Foundit logo

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Assistant Manager/Manager - Decision Analyst (SAS, R, Python)

This position is responsible for providing reporting and insights on the client's digital banking channel. The digital channel management team is responsible for achieving channel account, balance, and fee revenue growth targets across all business segments and product lines. Additionally, the team is responsible for idea and concept generation, business case development, valuation of new concepts, and business execution. You will need to become an expert in providing data support and analysis to help define strategy and tactics to meet portfolio goals. In this role, you will be partnering with leadership, product teams, and the broader digital team.

Responsibilities:
- Critical thinking skills to come up with the right questions to ask or problems that need to be solved.
- Ability to define the data necessary to build strategy, solve a problem, or make a recommendation.
- Ability to quickly learn about our data environment to source data and query large datasets across multiple databases.
- Overall digital channel performance and insights delivered to key stakeholders and senior leadership.
- Business case and initiative performance tracking delivered to key stakeholders and the broader digital channel management team.
- Ad-hoc analysis and insights needed to achieve channel performance goals.
- Analyze the results of queries to create meaningful insights.
- Ability to effectively visualize and summarize work product for a variety of audiences.
- Strong presentation, collaboration, and communication skills.
- Ability to simply lay out clear options and recommendations for decision makers.

Qualifications we seek in you!

Minimum Qualifications/Skills:
- Bachelor's degree in business information systems, computer science, mathematical disciplines, statistics, finance, economics, or another technical degree.
- Relevant years of experience in financial services.
- Strong knowledge of and working experience in data manipulation tools such as Tableau and SQL. Experience in SAS, R, or Python to query large databases and manipulate large datasets would be an added advantage.
- Experience presenting analytical findings to a non-technical audience to guide business decision-making.

Preferred Qualifications/Skills:
- Experience in banking or finance is preferred.
- Proficiency in Tableau and SQL.
- SAS BASE language certification is a plus.
- Experience with cloud-based analytics services such as Snowflake and AWS.
- Strong attention to detail and an ability to prioritize work in a fast-paced environment.
- Ability to manage a queue of deliverables and requests with minimum supervision.
- Strong communication skills, written and verbal.
- Excellent interpersonal skills.
- Excellent skills with MS Word, Excel, and PowerPoint.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. For more information, visit . Follow us on Twitter, Facebook, LinkedIn, and YouTube.

Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Posted 3 weeks ago

Apply

0.0 years

2 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Foundit logo

We are seeking a skilled and detail-oriented Data Warehouse Engineer to design, build, and maintain scalable data warehouse solutions. You will be responsible for developing efficient data pipelines, integrating diverse data sources, ensuring data accuracy, and enabling high-quality analytics to drive business decisions.

Key Responsibilities:
- Design, develop, and maintain data warehouse architectures and systems.
- Build robust ETL (Extract, Transform, Load) processes for structured and unstructured data sources.
- Optimize data models, database performance, and storage solutions.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements.
- Implement data quality checks and ensure data governance best practices.
- Develop and maintain documentation related to data warehouse design, data flow, and processes.
- Monitor system performance and proactively identify areas for improvement.
- Support ad-hoc data requests and reporting needs.
- Stay up to date with emerging data technologies and industry best practices.

Preferred Skills:
- Technology > ETL & Data Quality > ETL - Others
- Technology > Database > Data Modeling
- Technology > Data Management - DB > DB2
- Technology > Data on Cloud - DataStore > Snowflake

Posted 3 weeks ago

Apply

8.0 - 9.0 years

8 - 9 Lacs

Gurgaon / Gurugram, Haryana, India

On-site

Foundit logo

Overview:
We are looking for a skilled Snowflake Developer with 8+ years of experience in developing and managing data warehouse solutions using Snowflake. The ideal candidate should have expertise in stored procedures, SQL scripting, and dbt development using models, macros, and jobs. The candidate should also have a strong understanding of DWH concepts, along with experience in developing ETL solutions and implementing CI/CD pipelines using Bitbucket, Jenkins, dbt, and Snowflake. Additionally, the candidate should have experience in collaborating with stakeholders to gather requirements, develop logic, and deploy solutions.

In This Role, You Will:
- Manage and maintain the Snowflake platform, ensuring optimal performance and uptime.
- Design and implement Snowflake architecture, considering best practices for scalability, security, and compliance.
- Conduct performance optimization activities to ensure efficient use of resources and credits.
- Oversee governance and compliance practices, enabling the right audit logs and ensuring data security using RBAC, masking, etc.
- Perform POCs to evaluate new features and functionalities.
- Enable and configure new features on the Snowflake platform.
- Develop and implement integration design strategies using AWS services such as S3, Lambda, SQS, and Kinesis.
- Design and implement API-based integrations to ensure seamless data flow between systems.
- Collaborate with cross-functional teams to ensure the successful implementation of Snowflake projects.
- Utilize programming languages, particularly Python, to develop custom solutions and automation scripts.

Here's What You Need:
- Proven experience working with Snowflake and AWS cloud platforms.
- In-depth knowledge of Snowflake architecture, design, and best practices.
- Strong understanding of compliance and governance practices, with the ability to enable and manage audit logs.
- Expertise in performance optimization and credit usage management on the Snowflake platform.
- Experience with AWS services such as S3, Lambda, SQS, and Kinesis.
- Proficiency in API-based integrations and data integration strategies.
- Strong programming skills, particularly in Python.
- Excellent collaboration and communication skills, with the ability to work effectively with cross-functional teams.

Experience: 8 - 9 years
Salary: Not Disclosed
Location: Gurugram
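The credit-usage management this role mentions is often handled by small Python automation scripts. A hedged sketch of the core logic: in practice the per-day figures would come from Snowflake's ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY view, but here the usage records and the budget threshold are invented so the idea stands alone:

```python
# Illustrative credit-usage check of the kind a Snowflake automation
# script might perform. Real data would come from Snowflake's
# ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY view; the records and the
# budget threshold below are invented for illustration.

def flag_overruns(usage, daily_budget):
    """Return days (sorted) whose summed credits exceed the budget."""
    totals = {}
    for day, credits in usage:
        totals[day] = totals.get(day, 0.0) + credits
    return sorted(d for d, c in totals.items() if c > daily_budget)

usage = [
    ("2024-01-01", 3.5), ("2024-01-01", 2.0),
    ("2024-01-02", 1.0),
    ("2024-01-03", 6.5),
]
overruns = flag_overruns(usage, daily_budget=5.0)
```

A production version would typically alert (email, Slack) or resize the warehouse rather than just return the list.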

Posted 3 weeks ago

Apply

11.0 - 12.0 years

11 - 12 Lacs

Gurgaon / Gurugram, Haryana, India

On-site

Foundit logo

Overview:
We are looking for a skilled Snowflake Developer with 8+ years of experience in developing and managing data warehouse solutions using Snowflake. The ideal candidate should have expertise in stored procedures, SQL scripting, and dbt development using models, macros, and jobs. The candidate should also have a strong understanding of DWH concepts, along with experience in developing ETL solutions and implementing CI/CD pipelines using Bitbucket, Jenkins, dbt, and Snowflake. Additionally, the candidate should have experience in collaborating with stakeholders to gather requirements, develop logic, and deploy solutions.

In This Role, You Will:
- Manage and maintain the Snowflake platform, ensuring optimal performance and uptime.
- Design and implement Snowflake architecture, considering best practices for scalability, security, and compliance.
- Conduct performance optimization activities to ensure efficient use of resources and credits.
- Oversee governance and compliance practices, enabling the right audit logs and ensuring data security using RBAC, masking, etc.
- Perform POCs to evaluate new features and functionalities.
- Enable and configure new features on the Snowflake platform.
- Develop and implement integration design strategies using AWS services such as S3, Lambda, SQS, and Kinesis.
- Design and implement API-based integrations to ensure seamless data flow between systems.
- Collaborate with cross-functional teams to ensure the successful implementation of Snowflake projects.
- Utilize programming languages, particularly Python, to develop custom solutions and automation scripts.

Here's What You Need:
- Proven experience working with Snowflake and AWS cloud platforms.
- In-depth knowledge of Snowflake architecture, design, and best practices.
- Strong understanding of compliance and governance practices, with the ability to enable and manage audit logs.
- Expertise in performance optimization and credit usage management on the Snowflake platform.
- Experience with AWS services such as S3, Lambda, SQS, and Kinesis.
- Proficiency in API-based integrations and data integration strategies.
- Strong programming skills, particularly in Python.
- Excellent collaboration and communication skills, with the ability to work effectively with cross-functional teams.

Posted 3 weeks ago

Apply

10.0 - 14.0 years

10 - 14 Lacs

Gurgaon / Gurugram, Haryana, India

On-site

Foundit logo

The Deployment Manager is responsible for leading and managing deployment processes that involve the deployment of new releases, patches, data migration, integration, and transformation. This role requires a strong background in deployment management and a solid understanding of database administration, as well as excellent project management and communication skills. The Deployment Manager will work with various teams, including IT, operations, and business units, to ensure that deployment activities are aligned with the project objectives and scope, and will also oversee the quality, performance, and security of the deployed solutions and provide training and support to end-users and stakeholders when required.

Job Responsibilities:
- Lead and manage deployment projects from initiation to closure, ensuring timely and successful implementation of solutions.
- Develop detailed deployment plans, including timelines, milestones, resource allocation, and risk mitigation strategies.
- Coordinate with cross-functional teams, including IT, operations, and business units, to ensure seamless deployment activities and stakeholder satisfaction.
- Monitor and maintain the quality, performance, and security of the deployed solutions, including regular backups, updates, and patching.
- Identify and resolve any issues or challenges that arise during deployment, such as database performance, data quality, or integration errors.
- Maintain clear and effective communication with all stakeholders, providing regular updates on project status, milestones, and any issues that arise.
- Ensure that deployment activities meet quality standards and comply with organizational policies and procedures.
- Prepare and maintain comprehensive deployment documentation, including project plans, status reports, data dictionaries, and post-deployment reviews.
- Provide training and support to end-users and stakeholders to ensure successful adoption of deployed solutions.
- Identify opportunities for process improvements and implement best practices to enhance the efficiency and effectiveness of deployment activities.

Education: BE/B.Tech, Master of Computer Application

Work Experience:
- Cloud platforms: Azure, AWS, Oracle Cloud.
- Proficiency in SQL and experience with relational database management systems (e.g., MySQL, Postgres, Redshift, Snowflake, SQL Server).
- Familiarity with the Agile/Scrum framework.
- Strong understanding of CI/CD (Continuous Integration/Continuous Deployment) principles and practices, including experience with CI/CD tools such as Jenkins, GitLab CI, etc.
- Bachelor's degree in Computer Science, Information Technology, Business Administration, or a related field.
- Minimum 10 years of experience in project management, IT deployment, or a related role.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

5 - 10 Lacs

Noida, Uttar Pradesh, India

On-site

Foundit logo

This position is part of the technical leadership in the data warehousing and Business Intelligence areas, for someone who can work on multiple project streams and clients to support better business decision making, especially in the Life Sciences/Pharmaceutical domain.

Job Responsibilities:
- Technology Leadership: Lead and guide the team, independently or with little support, to design, implement, and deliver complex cloud data management and BI project assignments.
- Technical Portfolio: Expertise in a range of BI and data hosting technologies like the AWS stack (Redshift, EC2), Snowflake, Spark, full stack, Qlik, Tableau, MicroStrategy.
- Project Management: Get accurate briefs from the client and translate them into tasks for team members with priorities and timeline plans. Must maintain high standards of quality and thoroughness. Should be able to monitor the accuracy and quality of others' work. Ability to think in advance about potential risks and mitigation plans.
- Logical Thinking: Able to think analytically and use a systematic and logical approach to analyze data, problems, and situations. Must be able to guide team members in analysis.
- Client Relationship and P&L: Manage the client relationship and client expectations independently. Should be able to deliver results back to the client independently. Should have excellent communication skills.

Education: BE/B.Tech, Master of Computer Application

Work Experience:
- Minimum of 5 years of relevant experience in the Pharma domain.
- Technical: Should have 15 years of hands-on experience in the following tools, with working knowledge of at least 2 of them: QlikView, QlikSense, Tableau, MicroStrategy, Spotfire. Aware of techniques such as UI design, report modeling, performance tuning, and regression testing. Basic expertise with MS Excel. Advanced expertise with SQL.
- Functional: Should have experience with the following concepts and technologies: Pharma data sources like IMS, Veeva, Symphony, Cegedim, etc.; business processes like alignment, market definition, segmentation, sales crediting, and activity metrics calculation.

Relevant Experience:
- 0-2 years of relevant experience in a large/midsize IT services/consulting/analytics company.
- 1-3 years of relevant experience in a large/midsize IT services/consulting/analytics company.
- 3-5 years of relevant experience in a large/midsize IT services/consulting/analytics company.

Posted 3 weeks ago

Apply

4.0 - 7.0 years

4 - 7 Lacs

Gurgaon / Gurugram, Haryana, India

On-site

Foundit logo

The Cvent Analytics team is looking to hire a Senior Analyst and is currently accepting applications. The selected candidate will focus on providing customer support through data-driven analysis, partnering with the US-based Sales team to drive impactful client conversations using Cvent's powerful sourcing and product data.

In This Role, You Will:
- Collaborate with the Sales and Product teams to diligently monitor and evaluate essential product metrics and performance indicators.
- Partner with the Product team to conduct thorough analysis of usage patterns and identify trends in product adoption.
- Effectively communicate the data narrative through expert analysis, interpretation, and data visualization, conveying significant insights using PowerPoint or other data visualization tools.
- Analyze and interpret data into comprehensive charts and high-quality graphics, ensuring the effective communication and presentation of analytical insights to internal stakeholders.
- Collaborate with fellow analysts within the team to define and refine customer segmentation, which will serve as a foundation for the support matrix.
- Partner with Product leadership to furnish data-driven recommendations for product enhancements.
- Develop and design scalable market insights and customer insights content suitable for internal office hours, webinars, and industry publications.
- Employ a research-led approach to identify both internal and external factors that influence customer performance.
- Assume full management responsibility and deliver periodic outputs (repeatable, scalable short analyses for stakeholders), ensuring project success and quality.

Here's What You Need:
- 4-7 years of experience in a product or strategy role in the analytics domain.
- Bachelor's degree (in technology, statistics, sciences, or mathematics) and/or engineering with a good academic record.
- Strong verbal and written communication skills, with attention to precision of language and the ability to organize information logically.
- Experience working with SQL or Snowflake, Advanced Excel, and any BI visualization tool (mandatory).
- Hands-on experience building PowerPoint decks, and storyboarding skills.
- Good presentation skills to deliver insights to a larger audience.
- Excellent project and time management skills; consultative experience and exposure; proven competence in meeting deadlines, multi-tasking under pressure, and managing work under ambiguity.
- Self-driven and able to work with geographically spread teams.

Posted 3 weeks ago

Apply

3.0 - 4.0 years

3 - 4 Lacs

Gurgaon / Gurugram, Haryana, India

On-site

Foundit logo

At Cvent, we value the diverse perspectives that each individual brings. Whether working with a team of colleagues or with clients, we ensure that we foster a culture that celebrates differences and builds on shared connections.

In This Role, You Will:
- Technology Implementation and Project Management: Collect and document requirements, plan, manage risk, implement in Sandbox, run rigorous UAT testing, obtain sign-off, and deploy to production.
- Solution Architecting: Convert business requirements into technology designs/solutions. Provide RCAs, what-if simulations, and impact analysis on a need basis. Implement large system integrations, mergers and acquisitions in SF, and new business systems in multiple locations. Ensure that the whole team meets statutory, regulatory, and compliance requirements.

Additional Responsibilities:
- Provide regular status reporting; identify, track, and mitigate key risks, issues, and dependencies, including escalations and liaising with the required stakeholders.
- Ensure appropriate program communications are in place to address and engage all stakeholder groups.
- Ensure that a culture of improvement is in place to identify key complaint drivers, and champion remedial work, working in collaboration with other departments to strengthen the processes, systems, and controls that will mitigate issues.
- Implement and maintain performance improvement projects as agreed by the goals of the department.
- Front-end business continuity planning for the local site.
- Support the local implementation of the global strategy for Service Readiness, delivering suitable go-to-market and commercial models.

Here's What You Need:
- Bachelor's degree in an engineering or business discipline, or equivalent experience.
- Strong know-how of the overarching SFDC solution, including SF metadata, Salesforce Flow, Lightning Components, SF Service Reporting, dashboards, and data migration.
- Salesforce Admin certification; preferred: SF BA, PD1, FinancialForce PSA, CPQ, Gainsight, Snowflake, Sigma.
- Strong skills in complex process analysis, project management, problem solving, and business process design, with a focus on process efficiency.
- Experienced and well-versed in change management methodologies, the System Development Life Cycle, system administration, call-center-specific technologies, messaging systems, and emerging technologies.
- Strong experience and knowledge of agile projects and support, constantly evolving the intake process, with a willingness to quickly understand and embrace new processes and technologies.
- Track record of working cross-functionally to deliver large-scale, multi-million-dollar change and continuous improvement initiatives with a focus on user/customer experience.
- Exceptional stakeholder management and communication skills, with the ability to form relationships in person and in writing with ease.
- Excellent interpersonal and analytical skills.
- Ability to influence decision-making processes and negotiate win-win situations.
- At ease with introducing processes and driving change in an unstructured environment.
- Open to global work timings.

Posted 3 weeks ago

Apply

8.0 - 10.0 years

0 Lacs

Hyderabad / Secunderabad, Telangana, Telangana, India

Remote

Foundit logo

JOB DESCRIPTION Are you ready to make an impact at DTCC Do you want to work on innovative projects, collaborate with a dynamic and supportive team, and receive investment in your professional development At DTCC, we are at the forefront of innovation in the financial markets. We're committed to helping our employees grow and succeed. We believe that you have the skills and drive to make a real impact. We foster a thriving internal community and are committed to creating a workplace that looks like the world that we serve. Pay and Benefits: Competitive compensation, including base pay and annual incentive Comprehensive health and life insurance and well-being benefits, based on location Pension / Retirement benefits Paid Time Off and Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being. DTCC offers a flexible/hybrid model of 3 days onsite and 2 days remote (onsite Tuesdays, Wednesdays and a third day unique to each team or employee). The Impact you will have in this role: The Enterprise Intelligence Lead will be responsible for building data pipelines using their deep knowledge of Talend, SQL and Data Analysis on the bespoke Snowflake data warehouse for Enterprise Intelligence This role will be in the Claw Team within Enterprise Data & Corporate Technology (EDCT). The Enterprise Intelligence team maintains the firm's business intelligence tools and data warehouse. 
Your Primary Responsibilities:
- Working on and leading engineering and development focused projects from start to finish with minimal supervision
- Providing technical and operational support for our customer base, as well as for other technical areas within the company that utilize Claw
- Risk management functions such as reconciliation of vulnerabilities and security baselines, as well as other risk- and audit-related objectives
- Administrative functions for our tools, such as keeping tool documentation current and handling service requests
- Participating in user training to increase awareness of Claw
- Ensuring incident, problem and change tickets are addressed in a timely fashion, and escalating technical and managerial issues
- Following DTCC's ITIL process for incident, change and problem resolution

Qualifications:
- Minimum of 8 years of related experience
- Bachelor's degree preferred, or equivalent experience

Talents Needed for Success:
- Must have experience in Snowflake or SQL
- Minimum of 5 years of related data warehousing work experience
- 5+ years managing data warehouses in a production environment, covering all phases of lifecycle management: planning, design, deployment, upkeep and retirement
- Strong understanding of star/snowflake schemas and data integration methods and tools
- Moderate to advanced competency with Windows and Unix-like operating system principles
- Developed competencies around essential project management, communication (oral and written) and personal effectiveness
- Working experience with MS Office tools such as Outlook, Excel, PowerPoint, Visio and Project
- Optimizing/tuning source streams, queries and Power BI dashboards
- Good knowledge of the technical components of Claw (i.e. Snowflake, Talend, Power BI, PowerShell, Autosys)

ABOUT THE TEAM
IT Architecture and Enterprise Services are responsible for enabling the digital transformation of DTCC. The group manages the complexity of the technology landscape within DTCC and enhances the agility, robustness and security of the technology footprint. It does so by serving as the focal point for all technology architectural activities in the organization, as well as by engineering a portfolio of foundational technology assets to enable our digital transformation.
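The star-schema knowledge this listing asks for can be illustrated with a minimal sketch. The example below is hypothetical (all table and column names are invented, not taken from the posting) and uses SQLite only to stay self-contained; in practice the same query shape would run against a Snowflake warehouse. It shows the core star-schema pattern: a central fact table joined to its dimension tables, with a measure rolled up by dimension attributes.

```python
import sqlite3

# Hypothetical star schema: one fact table (fact_sales) referencing
# two dimension tables (dim_date, dim_product).
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales  (date_id INTEGER, product_id INTEGER, amount REAL);
INSERT INTO dim_date    VALUES (1, 2023), (2, 2024);
INSERT INTO dim_product VALUES (10, 'Ratings'), (20, 'Research');
INSERT INTO fact_sales  VALUES (1, 10, 100.0), (2, 10, 150.0), (2, 20, 75.0);
""")

# Typical star-schema aggregation: join the fact table to its
# dimensions and aggregate the measure by dimension attributes.
rows = con.execute("""
    SELECT d.year, p.category, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_date d    ON f.date_id = d.date_id
    JOIN dim_product p ON f.product_id = p.product_id
    GROUP BY d.year, p.category
    ORDER BY d.year, p.category
""").fetchall()
print(rows)
# → [(2023, 'Ratings', 100.0), (2024, 'Ratings', 150.0), (2024, 'Research', 75.0)]
```

In a snowflake schema (as opposed to a star), the dimension tables themselves would be further normalized, e.g. `dim_product` referencing a separate `dim_category` table.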

Posted 3 weeks ago

Apply
