
2818 Snowflake Jobs - Page 6

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

5.0 - 9.0 years

0 Lacs

Maharashtra

On-site

You should have good experience in AWS, Informatica PowerCenter, IICS, Unix, and Unix or Python scripting. Strong SQL experience is required, along with expert-level Snowflake skills. At least 5 years of experience in a Data Engineering role is necessary, and good communication skills are essential for this position.

Posted 4 days ago

Apply

0.0 - 4.0 years

0 Lacs

Jaipur, Rajasthan

On-site

As a Support Engineer at UiPath in Jaipur, you will work in a customer-facing role that involves problem-solving and assisting customers with technical issues related to the Peak platform and their deployed applications. This role is ideal for individuals with a genuine interest in technology, particularly in fields like Data Engineering, Data Science, or Platform Ops. While some basic knowledge of SQL and Python is required, this position does not involve software development.

Your primary responsibilities will include troubleshooting and resolving customer issues on the Peak platform, taking ownership of problems from investigation to resolution, analyzing application logs and system outputs to identify errors, scripting to automate support tasks, monitoring system health, assisting with infrastructure security, communicating updates clearly to both internal teams and customers, contributing to internal documentation, and participating in an on-call rotation to support customers when needed.

To be successful in this role, you should have a computer science degree or equivalent academic experience in technology, be proficient in Python, Bash, and SQL, be familiar with Linux and cloud platforms such as AWS, GCP, or Azure, possess strong communication skills in English, be well organized with strong problem-solving abilities, and have the interpersonal skills to work effectively in a team environment. While these are the preferred qualifications, UiPath encourages individuals with varying levels of experience and a passion for the job to apply.

UiPath values flexibility in work arrangements, with a mix of hybrid, office-based, and remote options available depending on business needs and role requirements. Applications are reviewed on a rolling basis with no fixed deadline; the application window may change depending on the volume of applicants or if a suitable candidate is selected promptly. If you believe you have the drive and enthusiasm to excel in this role, we encourage you to apply and join our dynamic team at UiPath.

Posted 4 days ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a Technical Content Manager (Full Stack), you will play a key leadership role in content development, contributing to the platform's technical depth and quality. Your primary focus will be technical content creation: designing technical problems for the platform and clients, reviewing code, and drafting software designs. You will also be responsible for understanding client requirements and translating them into effective, problem-based content. Your contributions will span full-stack development problems, database design problems, core middle-tier modules, performance optimization, containerized applications, and cloud technologies along with DevOps practices. Your role will also involve expanding your expertise in continuous delivery and other domains within modern software engineering.

Key Responsibilities
- Lead technical content creation by designing software problems and reviewing code for accuracy and clarity.
- Collaborate closely with clients to understand their technical requirements and develop tailored content solutions.
- Design, develop, and maintain an internal content library across the entire tech stack, ensuring performance and scalability.
- Create content focusing on relational, NoSQL, and in-memory databases, cloud infrastructure, and container technologies.
- Provide coaching, mentoring, and guidance to enhance organizational capabilities and content practices.
- Stay updated on the latest open-source libraries, frameworks, and technologies, actively participating in Agile software development practices to foster innovation within the team.

Qualifications and Skills
- Bachelor's degree in Computer Science, Engineering, or a related technical field.
- At least 3 years of experience in software development, with expertise in complex enterprise systems.
- Strong skills in technologies such as Java, Python, Spark, secure coding, and synchronous/asynchronous microservices.
- Proficiency in frontend development with an understanding of core principles.
- Experience with relational, NoSQL, and in-memory databases, cloud technologies, and container technologies.
- Knowledge of Terraform, Docker, and Kubernetes, and experience managing containerized applications in production.
- Familiarity with Big Data platforms and modern engineering practices such as code refactoring, continuous integration, and security best practices.
- Proficiency in Agile methodologies, application scalability, and performance optimization tools.
- Strong analytical and problem-solving skills with a history of innovative solutions.
- Excellent written and verbal communication skills for managing client expectations and delivering under pressure.

Desired Attributes
- Passion for learning new technologies and improving processes.
- Ability to thrive in a fast-paced, dynamic environment as a collaborative team player.
- Capability to manage multiple projects simultaneously while ensuring quality and meeting deadlines.

Join Us For
- The opportunity to lead and innovate in developing technical content for a diverse client base.
- A supportive environment for continuous learning and professional growth.
- An organization that values creativity, collaboration, and a deep commitment to engineering excellence.

Posted 4 days ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

We are seeking a skilled and seasoned Senior Data Engineer to join our innovative team. The ideal candidate will have a solid foundation in data engineering and proficiency in Azure, Azure Data Factory (ADF), Azure Fabric, Databricks, and Snowflake. This position involves creating, developing, and maintaining data pipelines, ensuring data quality and accessibility, and collaborating with various teams to support our data-centric initiatives.

Your responsibilities will include designing, developing, and maintaining robust data pipelines utilizing Azure Data Factory, Azure Fabric, Databricks, and Snowflake. You will work closely with data scientists, analysts, and stakeholders to understand data requirements and guarantee the availability and quality of data. Implementing and refining ETL processes to handle the ingestion, transformation, and loading of data from diverse sources into data warehouses, data lakes, and Snowflake will also be a key aspect of your role. You will be responsible for upholding data integrity and security through the implementation of best practices and compliance with data governance policies. Monitoring and resolving data pipeline issues to ensure the timely and accurate delivery of data, as well as enhancing data storage and retrieval processes to boost performance and scalability, will be essential tasks. It is crucial to stay abreast of industry trends and best practices in data engineering and cloud technologies. You will also have the opportunity to mentor and guide junior data engineers, offering technical expertise and assistance as required.

To qualify for this role, you should hold a Bachelor's or Master's degree in Computer Science, Information Technology, or a related field, along with over 5 years of experience in data engineering with a strong emphasis on Azure, ADF, Azure Fabric, Databricks, and Snowflake. Proficiency in SQL, experience in data modeling and database design, and strong programming skills in Python, Scala, or Java are also essential. Familiarity with big data technologies like Apache Spark, Hadoop, and Kafka, as well as a solid grasp of data warehousing concepts and experience with data warehousing solutions (e.g., Azure Synapse Analytics, Snowflake), is required. Knowledge of data governance, data quality, and data security best practices, excellent problem-solving abilities, and effective communication and collaboration skills within a team setting are all highly valued. Preferred qualifications include experience with other Azure services such as Azure Blob Storage, Azure SQL Database, and Azure Cosmos DB, familiarity with DevOps practices and tools for CI/CD in data engineering, and certifications in Azure Data Engineering, Snowflake, or related fields.

Posted 4 days ago

Apply

8.0 - 12.0 years

0 Lacs

Karnataka

On-site

As a Senior Data Engineer at Buildnetic, a Singapore-headquartered company with an office in Bangalore, you will leverage your 8 to 12 years of experience to play a crucial role in designing, implementing, and managing the data infrastructure that drives data-driven decision-making processes. In this hybrid role, you will work with cutting-edge technologies to construct data pipelines, architect data models, and uphold data integrity.

Your key responsibilities will include designing, developing, and maintaining scalable data pipelines and architectures, working with large datasets to create efficient ETL processes, and partnering with data scientists, analysts, and stakeholders to discern business requirements. Ensuring data quality through cleaning, validation, and profiling, implementing data models for optimal performance in data warehousing and data lakes, and managing cloud data infrastructure on platforms like AWS, Azure, or GCP will be essential aspects of your role. You will work with a variety of programming languages including Python, SQL, Java, and Scala, alongside data warehousing and data lake tools such as Snowflake, Redshift, Databricks, Hadoop, Hive, and Spark. Your expertise in data modeling techniques, ETL tools like Informatica and Talend, and management of both NoSQL and relational databases will be critical. Additionally, experience with CI/CD pipelines, Git for version control, troubleshooting complex data infrastructure issues, and proficiency in Linux/Unix systems will be advantageous.

If you possess strong problem-solving skills, effective communication abilities, and prior experience working in a hybrid work environment, Buildnetic offers you an opportunity to be part of a forward-thinking company that prioritizes innovation and technological advancement. You will collaborate with a talented and collaborative team, enjoy a flexible hybrid working model, and receive a competitive salary and benefits package. If you are passionate about data engineering and eager to work with the latest technologies, we look forward to hearing from you.

Posted 4 days ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You should have at least 5 years of ETL testing experience, with expertise in ETL test planning, functional testing, performance testing, developer management, and defect management using Application Lifecycle Management (ALM) tools. Your experience should include ETL testing with tools such as Informatica, SSIS, Power BI, and Snowflake. You should also be proficient in test data management, process analysis, and documentation using industry-standard tools. Experience working with Scrum/Agile or other project management methodologies is essential, and knowledge or experience in the life sciences/biotech space is preferred. Strong written and verbal communication skills are required for this role. Experience in testing or leading testing for ETL is preferred, and experience with other account management systems is a strong plus. Knowledge of GxP regulations, Python scripting, and automation testing would be beneficial. The work locations available for this position are Chennai, Bangalore, Coimbatore, and Pune. The work hours are from 2 pm to 11 pm, and the work mode is hybrid.

Posted 4 days ago

Apply

2.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Senior Principal Technologist (Data Architect) at Medtronic, you will be responsible for delivering data architecture solutions that align with business capability needs and enterprise standards. In this role, you will collaborate with Enterprise Solution Architects, Business Solution Architects, Technical Architects, and external service providers to ensure that data and information models and technologies are in line with architecture strategies and Medtronic's standards.

Your role will involve working with Business Analysts to review business capability needs, define requirements, conduct data analysis, develop data models, write technical specifications, and collaborate with development teams to ensure the successful delivery of designs. Your technical expertise will be crucial in leveraging tools such as the webMethods suite, Informatica, ETL tools, Kafka, and data transformation techniques to design and implement robust integration solutions. You will oversee the implementation of integration solutions, ensuring they meet technical specifications, quality standards, and best practices. Additionally, you will lead continuous improvement initiatives to enhance integration processes, troubleshoot and resolve integration-related issues, mentor junior team members, collaborate with vendors, optimize performance, and contribute to documentation and knowledge management efforts.

To be successful in this role, you should have at least 8 years of IT experience with a Bachelor's degree in Engineering, an MCA, or an MSc. You should also have experience in relevant architecture disciplines (integrations, data, services, infrastructure), Oracle, SAP, or big data platforms, Informatica, PowerDesigner, Python coding, and Snowflake. Specialized knowledge of enterprise-class architecture concepts, data integration, data modeling methodologies, cloud-based solutions, and data governance would be advantageous. A high degree of learning agility, experience with large enterprise systems, technical modeling and design skills, awareness of architecture frameworks, and strong leadership, teamwork, analytical, and communication skills are also beneficial. Experience in the medical device industry or other regulated industries, as well as the ability to work independently and collaboratively, would be valuable.

At Medtronic, we offer a competitive salary, a flexible benefits package, and a commitment to recognizing and supporting the contributions of our employees. Our mission is to alleviate pain, restore health, and extend life by boldly addressing the most challenging health problems. As part of our global team of passionate individuals, you will have the opportunity to engineer real solutions for real people and contribute to our mission of making healthcare technology accessible to all. Join us at Medtronic and be a part of a team that is dedicated to innovation, collaboration, and making a meaningful impact on global healthcare technology.

Posted 4 days ago

Apply

5.0 - 9.0 years

0 Lacs

Kochi, Kerala

On-site

As a Data Architect at Beinex, located in Kochi, Kerala, you will collaborate with the Sales team on RFPs and pre-sales activities as well as project delivery and support. Your role will involve delivering on-site technical engagements with customers, participating in pre-sales visits, understanding customer requirements, defining project timelines, and implementing solutions. Additionally, you will work on both on-site and off-site projects to assist customers in migrating from their existing data warehouses to Snowflake and other databases.

You should have at least 8 years of experience in IT platform implementation, development, DBA work, and data migration in relational database management systems (RDBMS), along with 5+ years of hands-on experience implementing and performance-tuning MPP databases. Proficiency in Snowflake, Redshift, Databricks, or Azure Synapse is essential, along with the ability to prioritize projects effectively. Experience analyzing data warehouses such as Teradata, Netezza, Oracle, and SAP will be valuable in this role. Your responsibilities will also include designing database environments, analyzing production deployments, optimizing performance, writing SQL and stored procedures, conducting data validation and data quality tests, and planning migrations to Snowflake. You should possess strong communication skills, problem-solving abilities, and the capacity to work effectively both independently and as part of a team.

At Beinex, you will have access to perks including comprehensive health plans, learning and development opportunities, workation and outdoor training, a hybrid working environment, and on-site travel opportunities. Join us to be part of a dynamic team and advance your career in a supportive and engaging work environment.

Posted 4 days ago

Apply

1.0 - 10.0 years

0 Lacs

Karnataka

On-site

You should have a minimum of 5 to 10 years of experience with at least two project engagements, or a minimum of 1 year of experience in React JS. The required skills include proficiency in coding using Java, Spring Boot, Maven, JDBC, JavaScript, React, Postman, Docker, Jenkins, and Hibernate, along with knowledge of databases such as MongoDB, Snowflake, and MySQL. A willingness to quickly learn and apply good software development practices and patterns is essential. You should also be aware of standard software engineering design patterns, object-oriented concepts, and coding standards and practices, and understand web services, including SOAP and REST standards, security models, integration patterns, and methods. The job location is Pan India.

Posted 4 days ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

Are you someone with an in-depth understanding of ETL and a strong background in developing Snowflake and ADF ETL-based solutions? Do you have a knack for developing, documenting, unit testing, and maintaining ETL applications to deliver successful code that meets customer expectations? If your answer is yes, this opportunity could be the next step in your career. Keep reading.

Join our Data Leverage team, a group of high-energy individuals who excel in a fast-paced and agile product development environment. As a Developer, you will play a crucial role in the ETL and Data Integration space, from the development phase through delivery. Working closely with the Project Manager, Technical Lead, and client teams, your primary responsibilities will include developing bug-free code with proper unit testing and documentation. Your insights will be valuable in planning, estimation, scheduling, and coordinating technical activities related to ETL-based applications. Your primary goal will be to meet development schedules and deliver high-quality ETL-based solutions that align with technical specifications and design requirements, ensuring customer satisfaction. A good understanding of tools like Snowflake and ADF will be essential for this role.

Key Responsibilities:
- Develop, implement, and optimize complex SQL queries and functions using Snowflake.
- Write Snowflake scripts and demonstrate a strong grasp of SQL JOINs, CASE statements, and data format conversion functions.
- Work with heterogeneous sources, transforming data into output files.
- Understand the business requirements for data flow processes.
- Develop mapping documents and transformation business rules per scope and source-to-target requirements.
- Create Airflow DAGs for data flow process needs.
- Analyze existing SQL/Snowflake queries for performance improvements.
- Collaborate closely with onsite lead data analysts on dependencies and requirements.
- Maintain continuous formal and informal communication on project status.
- Understand the JIRA stories process for SQL development activities.

Requirements:
- Strong experience in Snowflake and ADF.
- Experience working in an onsite/offshore model.
- 3-5 years of experience in Snowflake and ADF development.
- Skills in ETL, data engineering, ADF, SQL, and Snowflake.
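
Purely for illustration and not part of the posting: a minimal sketch of the kind of Airflow DAG and Snowflake SQL this role describes, combining a JOIN with a CASE expression. The DAG name, connection ID, schedule, and table names are all hypothetical.

from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

# Hypothetical transformation: join two source tables and derive a tier
# column with a CASE expression, then load the result into a target table.
TRANSFORM_SQL = """
    INSERT INTO analytics.orders_enriched
    SELECT o.order_id,
           c.customer_name,
           CASE WHEN o.amount >= 1000 THEN 'HIGH' ELSE 'STANDARD' END AS tier
    FROM raw.orders o
    JOIN raw.customers c ON c.customer_id = o.customer_id
"""

with DAG(
    dag_id="orders_daily_load",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load_orders = SnowflakeOperator(
        task_id="load_orders",
        snowflake_conn_id="snowflake_default",  # assumes a configured Airflow connection
        sql=TRANSFORM_SQL,
    )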

Posted 4 days ago

Apply

5.0 - 10.0 years

0 Lacs

Karnataka

On-site

You will play a crucial role as a Data Engineer, leading the development of data infrastructure at the forefront. Your responsibilities will involve creating and maintaining systems that ensure the seamless flow, availability, and reliability of data.

Your key tasks at Coforge will include:
- Developing and managing data pipelines to facilitate efficient data extraction, transformation, and loading (ETL) processes.
- Designing and enhancing data storage solutions such as data warehouses and data lakes.
- Ensuring data quality and integrity by implementing data validation, cleansing, and error-handling mechanisms.
- Collaborating with data analysts, data architects, and software engineers to understand data requirements and provide relevant data sets for business intelligence purposes.
- Automating and enhancing data processes and workflows to drive scalability and efficiency.
- Staying updated on industry trends and emerging technologies in the field of data engineering.
- Documenting data pipelines, processes, and best practices to facilitate knowledge sharing.
- Contributing to data governance and compliance initiatives to adhere to regulatory standards.
- Working closely with cross-functional teams to promote data-driven decision-making across the organization.

Key skills required for this role:
- Proficiency in data modeling and database management.
- Strong programming capabilities, particularly in Python, SQL, and PL/SQL.
- Sound knowledge of Airflow, Snowflake, and DBT.
- Hands-on experience with ETL (Extract, Transform, Load) processes.
- Familiarity with data warehousing and cloud platforms, especially Azure.

Your 5-10 years of experience will be instrumental in successfully fulfilling the responsibilities of this role, which is located in Greater Noida with a shift timing of 2:00 PM IST to 10:30 PM IST.

Posted 4 days ago

Apply

2.0 - 6.0 years

0 Lacs

Maharashtra

On-site

You will be part of the Morgan Stanley Investment Management Technology (IMIT) team, which collaborates with the Investment Management business division to develop systems and integrate vendor products for global full-life-cycle business processing. Your responsibilities will include supporting Portfolio Analysis, Risk, Trading, Operations, and Sales & Marketing functions, as well as providing holistic support and quality assurance across applications used in the MSIM environment.

Morgan Stanley is looking for an experienced full stack developer to join the Fixed Income Technology team. You will work as part of a global development team to design and develop a modern technology stack for digital platforms that enhance efficiencies for Investment Teams. As a backend developer, you will collaborate with the Business to create strategic solutions.

Must Have:
- Core Java / Spring Boot for backend technologies
- Python / FastAPI / Flask API
- Experience with relational databases

Good To Have:
- Frontend technologies like Angular / React
- Knowledge of Linux, cloud stacks, Docker, Kubernetes
- Proficiency with build tooling such as Jenkins / Gradle
- Familiarity with database technologies like NoSQL MongoDB / Snowflake

Your responsibilities will include:
- Developing modern, modular applications to modern coding and testing standards
- Actively participating in agile ceremonies and driving towards team goals
- Utilizing modern software development practices
- Collaborating with a global team of technologists
- Leading with ideas and innovation
- Communicating and partnering with end users to design solutions

Competencies:
- At least 2 years of experience developing enterprise Java web applications
- Ability to write unit, component, and integration tests
- Proficiency in working with relational databases
- Experience with microservices and distributed systems
- Proficiency in Python
- Basic understanding of financial markets and various financial instruments, with a focus on Equities
- Strong Computer Science fundamentals

At Morgan Stanley, we are committed to delivering top-notch service and maintaining excellence. Our values include putting clients first, doing the right thing, leading with exceptional ideas, committing to diversity and inclusion, and giving back. We offer a supportive and empowering environment where you can work alongside talented individuals. Our teams are collaborative and innovative, driven by diverse backgrounds and experiences. We provide comprehensive employee benefits and perks, with opportunities for growth and advancement for those who demonstrate passion and dedication in their work.

Posted 4 days ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

You will be working as a Principal Consultant - Talend Developer at Genpact, a global professional services and solutions firm with a workforce of over 125,000 people across more than 30 countries. Driven by curiosity, agility, and a commitment to delivering long-term value to clients, we aim to create a world that works better for people. Our expertise lies in serving and transforming leading enterprises, including Fortune Global 500 companies, through deep industry knowledge, digital operations services, and proficiency in data, technology, and AI.

In this role, you will draw on strong ETL knowledge, particularly with a Snowflake background. Your key responsibilities will include building and uploading data tables, automating process logic for various systems, demonstrating strong SQL skills and an understanding of database concepts, and maintaining a solid grasp of logical and physical data models within Snowflake. You will also document and maintain a knowledge base for current BI solutions, covering their logic, enhancements, configurations, and best practices, and you should have excellent communication skills.

The minimum qualification for this role is a Bachelor's degree. Preferred qualifications and skills include expertise in Talend, Snowflake, and SQL. This is a full-time position based in Bangalore, India. If you are passionate about digital technologies and possess the necessary skills and qualifications, we invite you to apply for this role of Principal Consultant - Talend Developer at Genpact.

Posted 4 days ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderabad, Telangana

On-site

You will be responsible for developing ETL/ELT pipelines using Python and working with Snowflake for data transformation and modeling. Your role will involve writing efficient SQL queries within the Snowflake environment and integrating data from various sources while ensuring data quality. Collaboration with data engineers and analysts on scalable solutions will also be a key aspect of your responsibilities. As a Python + Snowflake Developer, you are expected to have strong programming skills in Python, hands-on experience with Snowflake, a solid understanding of SQL and data warehousing concepts, and familiarity with cloud platforms (AWS, GCP, or Azure is a plus).
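
As a rough, hypothetical illustration of the pipeline work described above (not taken from the posting), here is a minimal Python sketch using the snowflake-connector-python package; the account, credentials, and table names are placeholders.

import snowflake.connector

# Placeholder credentials; in practice these would come from a secrets manager.
conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)
try:
    cur = conn.cursor()
    # One ELT step: materialize a transformed slice of a raw table.
    cur.execute("""
        CREATE OR REPLACE TABLE staging.daily_signups AS
        SELECT DATE_TRUNC('day', created_at) AS signup_date,
               COUNT(*) AS signups
        FROM raw.users
        GROUP BY 1
    """)
finally:
    conn.close()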

Posted 4 days ago

Apply

6.0 - 10.0 years

0 Lacs

Thiruvananthapuram, Kerala

On-site

You are an experienced Data Engineer with at least 6 years of relevant experience. In this role, you will work as part of a team to develop Data and Analytics solutions, participating in the development of cloud data warehouses, data-as-a-service, and business intelligence solutions. You should be able to provide forward-thinking solutions in data integration and ensure the delivery of a quality product. Experience developing Modern Data Warehouse solutions using the Azure or AWS stack is required. You should have a Bachelor's degree in computer science and engineering or equivalent demonstrable experience; cloud certifications in the Data, Analytics, or Ops/Architect space are desirable.

Your primary skills should include:
- 6+ years of experience as a Data Engineer, with a key/lead role in implementing large data solutions
- Programming experience in Scala or Python, and SQL
- At least 1 year of experience in MDM/PIM solution implementation with tools like Ataccama, Syndigo, or Informatica
- At least 2 years of experience implementing data engineering pipelines and solutions in Snowflake
- At least 2 years of experience implementing data engineering pipelines and solutions in Databricks
- Working knowledge of AWS and Azure services such as S3, ADLS Gen2, AWS Redshift, AWS Glue, Azure Data Factory, and Azure Synapse
- Demonstrated analytical and problem-solving skills
- Excellent written and verbal communication skills in English

Your secondary skills should include familiarity with Agile practices, version control platforms like Git and CodeCommit, strong problem-solving skills, an ownership mentality, and a proactive rather than reactive approach.

This is a permanent position based in Trivandrum/Bangalore. If you meet the requirements and are looking for a challenging opportunity in the field of Data Engineering, we encourage you to apply before the close date on 11-10-2024.

Posted 4 days ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

You will be working as a Snowflake developer at our organization, with the job location being Noida, Chennai, or Pune (onsite three times per week). You must be proficient in Python, Snowflake, DBT, SQL, data quality, and data modelling; experience with Snowpipe and Fivetran would be advantageous. As a successful candidate, you should be an expert in DBT and SQL, capable of developing and maintaining DBT models, understanding data flow, and conducting data quality assessments and data testing using DBT. Your role will involve ensuring efficient data processing and maintaining high data quality standards throughout development and maintenance.
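
For illustration only (not from the posting): the data-quality checks described here are the kind dbt expresses with its built-in not_null and unique schema tests. A hypothetical equivalent, written as plain SQL driven from Python, might look like the following; all table and column names are invented.

import snowflake.connector

# Each check counts violating rows; zero means the check passes. These mirror
# dbt's built-in `unique` and `not_null` tests (table/column names invented).
QUALITY_CHECKS = {
    "orders.order_id is unique": """
        SELECT COUNT(*) FROM (
            SELECT order_id FROM analytics.orders
            GROUP BY order_id HAVING COUNT(*) > 1
        )
    """,
    "orders.customer_id is not null": """
        SELECT COUNT(*) FROM analytics.orders WHERE customer_id IS NULL
    """,
}

conn = snowflake.connector.connect(account="my_account", user="qa_user", password="***")
try:
    cur = conn.cursor()
    for name, sql in QUALITY_CHECKS.items():
        failures = cur.execute(sql).fetchone()[0]
        print(f"{name}: {'PASS' if failures == 0 else f'FAIL ({failures} rows)'}")
finally:
    conn.close()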

Posted 4 days ago

Apply

2.0 - 6.0 years

0 Lacs

Hyderabad, Telangana

On-site

The Data Analyst provides analytics and decision support at a corporate level, assisting the executive staff and finance community. You will work on a team responsible for a shared reporting and analytics infrastructure and directly provide reporting, analysis, and insights to the executive teams. Leveraging various tools and analytical capabilities, you will support analysis across different functional areas of the business and their impacts on company financial performance.

Key responsibilities include gathering and developing required datasets, applying different techniques to analyze and test business hypotheses, and summarizing findings with clear, data- and method-driven substantiation. You will access and integrate information from various sources into a structured environment for analytics and reporting; develop, maintain, and tune complex databases to ensure data availability and integrity; and automate integrations between data sources and processes to expedite information availability, ensure consistency in results, and minimize costs. You will perform and document detailed root cause analyses, investigating and isolating issues in the code base and outlier data affecting key performance metrics. You will also develop and automate standardized reporting for executive-level presentations or distribution to end users, apply a variety of analytical techniques both reactively to leadership requests and proactively out of curiosity and a desire to move the business forward, define and document core business metrics, definitions, and logic in collaboration with business teams, deliver analysis and findings to leadership and business teams, and stay current on developments in technology, business intelligence, data warehousing, and emerging technologies.

Technical Qualifications:
- Ability to extract and integrate data using technologies such as SQL, Snowflake, ODBC, ETL tools, Oracle, DB2, and MS Access
- Familiarity with reporting and analytical applications including Oracle BI, Power BI, Tableau, SAS, Cognos, and Business Objects
- Ability to automate reporting, analysis, and operational processes using Python or R
- Comfort presenting information and responding to questions from senior management, department heads, and Finance, IT, and business teams
- Attention to quality, reconciliation, and accuracy, as the audience is often executive-level staff

Education and Experience:
- Bachelor's degree in Computer Science, Engineering, Business, or a related field required (a Master's in Analytics or Mathematics is a plus)
- 2-5 years of systems/business analysis experience, or an equivalent combination of education and experience
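
By way of illustration (not part of the posting), a minimal sketch of the reporting automation this role describes: pulling a metric from Snowflake into pandas and writing a distributable file. Connection details and the table name are placeholders.

import pandas as pd
import snowflake.connector

conn = snowflake.connector.connect(account="my_account", user="analyst", password="***")
try:
    # pandas accepts any DB-API connection here (it may warn for
    # non-SQLAlchemy connections, but the query still runs).
    df = pd.read_sql(
        """
        SELECT fiscal_month, SUM(revenue) AS revenue
        FROM finance.monthly_results  -- hypothetical reporting table
        GROUP BY fiscal_month
        ORDER BY fiscal_month
        """,
        conn,
    )
    df.to_csv("executive_revenue_summary.csv", index=False)  # standardized delivery file
finally:
    conn.close()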

Posted 4 days ago

Apply

3.0 - 7.0 years

0 Lacs

Maharashtra

On-site

As an integral part of our Data Automation & Transformation team, you will experience unique challenges every day. We are looking for someone with a positive attitude, an entrepreneurial spirit, and a willingness to dive in and get things done. This role is crucial to the team and will provide exposure to various aspects of managing a banking office.

In this role, you will focus on building curated data products and modernizing data by moving it to Snowflake. Your responsibilities will include working with cloud databases such as AWS and Snowflake, along with coding languages like SQL, Python, and PySpark. You will analyze data patterns across large multi-platform ecosystems and develop automation solutions, analytics frameworks, and data consumption architectures used by Decision Sciences, Product Strategy, Finance, Risk, and Modeling teams. Ideally, you should have a strong analytical and technical background in financial services, particularly in the small business banking or commercial banking segments. Your key responsibilities will involve migrating Private Client Office data to the public cloud (AWS and Snowflake), collaborating closely with the Executive Director of Automation and Transformation on new projects, and partnering with various teams to support data analytics needs. You will also develop data models, automate data assets, identify technology gaps, and support data integration projects with external providers.

To qualify for this role, you should have at least 3 years of experience in analytics, business intelligence, data warehousing, or data governance. A Master's or Bachelor's degree in a related field (e.g., Data Analytics, Computer Science, Math/Statistics, or Engineering) is preferred. You must have a solid understanding of programming languages such as SQL, SAS, Python, Spark, Java, or Scala, and experience building relational data models across different technology platforms. Excellent communication, time management, and multitasking skills are essential, along with experience in data visualization tools and compliance with regulatory standards. Knowledge of risk classification, internal controls, and commercial banking products and services is desirable. Preferred qualifications include experience with Big Data and cloud platforms, data wrangling tools, dynamic reporting applications like Tableau, and proficiency in data architecture, data mining, and analytical methodologies. Familiarity with job scheduling workflows, code versioning software, and change management tools would be advantageous.

Posted 4 days ago

Apply

3.0 - 7.0 years

0 Lacs

Maharashtra

On-site

As a Data Integration Specialist, your primary responsibility will be working closely with customers to understand their data integration challenges and develop customized solutions using industry-leading tools such as Talend, Snowflake, Databricks, SAP, and AWS. You will leverage your technical expertise in Talend Data Integration (DI) and other relevant tools to ensure that the solutions you design are both technically robust and aligned with customers' specific needs.

Your role will also entail engaging with clients through product demonstrations and presentations to showcase the value and functionality of the proposed solutions. Collaborating with the sales team, you will contribute to detailed proposals, presentations, and solution briefs that address clients' business objectives. You will provide technical support to clients by addressing their queries and offering guidance on best practices for data integration, cloud-based solutions, and data architecture. It is crucial to stay updated on the latest features and best practices related to Talend, Snowflake, Databricks, AWS, and other technologies so that the solutions you deliver are innovative and relevant, particularly within the banking industry. Furthermore, you will be involved in pre-sales activities such as creating Proofs of Concept (POCs) and tailored demonstrations to show the effectiveness of the proposed solutions to potential clients. Your expertise and collaboration with the sales team will be instrumental in driving successful engagements and delivering impactful data integration solutions.

Posted 4 days ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Principal/Sr. Lead Engineer - Automation at FIS, you will be part of the team responsible for developing the next-generation compliance product. Your role will involve playing a key part in automation for cloud-based applications, focusing on understanding data flow and configurations for multiple environments to expedite release/build verifications, ultimately improving quality and stability. Leveraging your expertise in scripting, AWS, Jenkins, and Snowflake, you will collaborate with cross-functional teams to solve challenging problems and drive strategic automation initiatives. Your responsibilities will include devising and utilizing automation frameworks and DevOps best practices to set up end-to-end automation pipelines, reviewing and validating data for uniformity and accuracy, analyzing results for failures, and interpreting them with clear objectives in mind. You will have the opportunity to work with multiple products and businesses, gaining a deep understanding of the trading and compliance domain.

Key Requirements:
- Minimum of five years of experience in AWS, Snowflake, and DevOps automation, with a proven track record of delivering impactful solutions.
- Strong SQL skills and proficiency in programming languages such as Python and Unix scripting.
- Experience with Jenkins build pipelines and release deployment automation.
- Strong analytical and problem-solving skills to translate business requirements into analytical solutions.
- Excellent communication and presentation skills to convey complex technical concepts effectively.
- Experience with big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, Azure, GCP) is a plus.
- Demonstrated ability to work collaboratively in a team environment, manage multiple priorities, and automate data creation processes to reduce manual effort.

Responsibilities:
- Define the automation plan and own it end to end for release verification and CI/CD pipeline setup.
- Understand product architecture and workflow to build an optimized automation pipeline for continuous delivery.
- Collaborate with product and solution management teams to convert business use cases into efficient automation setups and execution.
- Stay updated on the latest advancements in DevOps and AWS automation to leverage current concepts and methodologies.
- Contribute to the development and implementation of best practices for DevOps, automation, and release deployment/verification.
- Set up new and upgrade existing environments for the automation pipeline, and monitor them for failure analysis during daily sanity and regression verification.

Qualifications:
- Bachelor's or master's degree in computer science or a related field.

Competencies:
- Fluent in English.
- Excellent communicator, able to discuss automation initiatives and provide optimized solutions.
- Attention to detail and quality focus.
- Organized approach to managing and adapting priorities according to client and internal requirements.
- Self-starter with a team mindset, capable of working autonomously and as part of a global team.

FIS offers a multifaceted job with a high degree of responsibility, visibility, and ownership, along with opportunities for growth and learning in the trading and compliance space. You will benefit from a competitive salary and benefits, a variety of career development tools, resources, and opportunities, and a supportive work environment. Join us at FIS, the global leader in financial technology solutions, and be part of a dynamic team dedicated to innovation and excellence.

Posted 4 days ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we are a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth: bringing real positive changes in an increasingly virtual world, and it drives us beyond generational gaps and the disruptions of the future.

We are seeking to hire Selenium with Java professionals with 3-7 years of experience for the following responsibilities:
- Understanding requirements and translating them into test cases
- Understanding integrated systems and deriving system integration test cases
- Identifying and selecting automation test cases
- Configuring the test automation suite setup
- Automating the design of a framework and implementing it per the project structure
- Creating, enhancing, debugging, and executing test cases
- Sharing daily status reports with program/project stakeholders
- Collating and monitoring the defect management process
- Managing changes and executing regression tests
- Providing solutions for problems related to object identity and error handling
- Interacting with Caterpillar to resolve various issues and keeping them updated on the situation and related risks

The required tools and technologies include:
- Strong knowledge of UI automation (Selenium + Java) and API automation
- Proficiency in API automation (Rest Assured / Karate)
- Experience in task automation (optional: Python scripting)
- Hands-on experience with cloud technologies (AWS)
- Familiarity with tools like Maven, Git, Docker, and Postman
- Knowledge of writing SQL queries (Snowflake / Postgres)

At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided by technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles:
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and an ethical corporate culture

Posted 4 days ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Gurugram

Remote

Data Visualization and BI Manager

Join a leading independent service provider specializing in critical telecommunication and renewable energy infrastructure. With a global presence, our company delivers comprehensive engineering, maintenance, repair, and repowering solutions to ensure seamless operations in wireless and wireline telecom, commercial and utility-scale solar and wind projects, EV charging stations, and large-scale power generation and energy storage assets.

Your Impact: As the Data Visualization and BI Manager, you will lead a talented team of Power BI developers, leveraging Power BI and Salesforce for advanced reporting and analytics. Your strategic expertise will drive data-driven initiatives to enhance business decisions and performance monitoring.

Core Responsibilities:
- Lead and mentor a globally distributed team of data visualization and dashboard specialists.
- Oversee the creation and maintenance of comprehensive reports and dashboards using Power BI and Salesforce.
- Collaborate with data engineers and technical teams to enhance data pipelines, process automation, and data quality.
- Ensure data accuracy and integrity across all BI tools and systems.
- Promote the adoption of data analytics best practices and continuously improve BI processes.
- Present insights and recommendations clearly to stakeholders and executive leadership.
- Stay updated with industry trends and advancements in data analysis and BI technologies.

Core Qualifications:
- Minimum of 5 years of experience in an English-based work environment.
- Experience in professional services or telecommunications is preferred.
- Proven expertise in a BI management role focusing on data visualization and reporting best practices.
- Strong proficiency in Power BI, including DAX and data modeling, and familiarity with Salesforce reporting.
- Knowledge of SQL and database management; familiarity with Snowflake is a plus.
- Excellent leadership and team management skills.
- Strong analytical and problem-solving abilities.
- Effective communication and presentation skills.
- Bachelor's degree in Business Intelligence, Data Science, Computer Science, or a related field.

Benefits & Work Environment:
- Competitive salary and benefits package tailored for employees in India.
- Flexible work environment with opportunities for professional growth.
- Supportive, inclusive company culture that promotes collaboration and innovation.
- Access to necessary tools, a company-provided laptop, and software resources.

Posted 4 days ago

Apply

12.0 - 16.0 years

30 - 40 Lacs

Pune

Work from Office

The Associate Director, Data Engineering will serve as a strategic leader within our Data & Analytics team, responsible for the development and delivery of scalable, secure, and high-performance data pipelines and analytical models. This individual will play a critical role in shaping the enterprise data ecosystem, driving innovation across analytics and data integration initiatives, and ensuring operational excellence in data engineering practices. You will lead a global team of data engineers to support enterprise-wide analytics across Sales, Marketing, Services, Customer Success, Finance, HR, Product, and Engineering. As a technical leader, you will balance hands-on contributions with strategic direction, governance, and execution oversight. You'll collaborate closely with enterprise architects, data and reporting analysts, data scientists, and business stakeholders to enable actionable insights and build a modern data infrastructure that supports our company's growth. This role goes beyond traditional BI: it includes developing predictive models and uncovering insights with Gen AI-powered features to drive data-driven decision making. You'll use tools like Python and Snowflake Cortex, alongside SQL, DBT, Airflow, and Snowflake, to deliver scalable solutions that influence strategy and execution across the enterprise.

In this role, you will...
- Provide strategic and technical leadership to the data engineering function, including architecture, development, quality, and delivery of data solutions.
- Lead and mentor a global team of engineers to build and support robust, performant data pipelines using Snowflake, DBT, Python, and AWS-based infrastructure.
- Partner with business stakeholders, Data Governance, and IT teams to define and execute a multi-year roadmap for data platform modernization and analytics enablement.
- Own the ETL/ELT development lifecycle, including design, orchestration, configuration, automation, testing, and support, ensuring adherence to SDLC, architectural standards, and security practices.
- Establish and enforce scalable best practices for data ingestion, transformation, modeling, documentation, and operational support.
- Define current- and future-state BI and data engineering architectures to support enterprise reporting, self-service analytics, and advanced data science use cases.
- Drive adoption of modern tools, techniques, and reusable frameworks that increase engineering velocity and reliability (e.g., CI/CD for data pipelines, automated testing, monitoring).
- Proactively identify areas for data quality, performance, and process improvement across the data pipeline ecosystem, and lead remediation efforts.
- Act as an escalation point for complex production issues and strategic project deliverables.
- Serve as a technical thought leader and advocate for data engineering practices across the organization.

You have what it takes if you have...
- 12+ years of experience in data engineering, data integration, or data architecture roles, including 4+ years leading high-performing teams in enterprise environments.
- Deep expertise in modern cloud data platforms (Snowflake, Azure, and/or AWS Redshift) and ETL/ELT tools such as DBT, Informatica, Talend, or equivalent.
- Strong hands-on experience designing and delivering scalable, secure, and resilient data pipelines and APIs in cloud environments (AWS preferred).
- Proficiency in SQL and Python for data manipulation, pipeline development, and automation.
- Solid understanding of data warehousing concepts (e.g., CDC, SCD types, dimensional modeling), data lake architectures, and real-time streaming.
- Proven ability to balance tactical execution with long-term strategic planning and stakeholder alignment.
- Experience working in Agile/Scrum environments using tools like Jira, Git, and Confluence.
- Demonstrated success collaborating across global teams and partnering with business, IT, and analytics leaders.
- Strong communication, documentation, and stakeholder management skills, including experience working cross-functionally with business and IT teams.
- Strong domain knowledge of CRM and marketing tech applications (Salesforce, Marketo, Gong, Gainsight).
- Bachelor's degree in Computer Science, Data Science, or equivalent.

Extra dose of awesome if you have...
- Knowledge of Tableau, Looker, or similar BI tools.
- Experience with predictive analytics in go-to-market functions (e.g., campaign attribution, customer lifetime value modeling, lead prioritization, churn).
- Experience with Oracle ERP, CPQ, and quote-to-cash processes in a SaaS / recurring-revenue company.
- Ability to work seamlessly as part of a multi-site, multicultural development and testing team, onshore and offshore, with internal and external resources.

Posted 4 days ago

Apply

3.0 - 6.0 years

10 - 15 Lacs

Hyderabad

Work from Office

Are you ready to make an impact at DTCC? Do you want to work on innovative projects, collaborate with a dynamic and supportive team, and receive investment in your professional development? At DTCC, we are at the forefront of innovation in the financial markets. We are committed to helping our employees grow and succeed, and we believe that you have the skills and drive to make a real impact. We foster a thriving internal community and are committed to creating a workplace that looks like the world that we serve.

Pay and Benefits:
- Competitive compensation, including base pay and annual incentive.
- Comprehensive health and life insurance and well-being benefits, based on location.
- Pension / retirement benefits.
- Paid time off, personal/family care, and other leaves of absence when needed to support your physical, financial, and emotional well-being.
- DTCC offers a flexible/hybrid model of 3 days onsite and 2 days remote (onsite Tuesdays, Wednesdays, and a third day unique to each team or employee).

The impact you will have in this role: We are seeking a motivated and detail-oriented Associate Software Engineer to join our development team. The ideal candidate will contribute to the design, development, and unit testing of software solutions using Python, Java, and SQL, with a strong focus on Snowflake for data warehousing. This role requires hands-on experience with AWS cloud services and container technologies, and the ability to leverage AI productivity tools such as Snowflake Cortex, Microsoft Copilot, and Amazon Q. The candidate must also be flexible with work hours and able to coordinate closely with the onshore team.

Your primary responsibilities:
- Design, develop, and maintain scalable software applications using Python, Java, and SQL.
- Write and execute unit tests to ensure code quality and reliability.
- Develop and optimize data pipelines and queries in Snowflake.
- Deploy and manage applications in AWS using services like EC2, Lambda, S3, and RDS.
- Utilize container technologies such as Docker and Kubernetes for application packaging and orchestration.
- Integrate AI tools like Snowflake Cortex, Microsoft Copilot, and Amazon Q into the development workflow to improve productivity and code quality.
- Collaborate with multi-functional teams including QA, DevOps, and Product Management.
- Participate in code reviews, sprint planning, and agile ceremonies.
- Document technical designs, processes, and best practices.
- Maintain flexibility in work hours to support collaboration with onshore teams across time zones.

Qualifications:
- At least 3 years of related experience.
- Bachelor's degree preferred, or equivalent experience.

Talents needed for success:
- Proficiency in Python, Java, and SQL.
- Experience with Snowflake or similar cloud data platforms.
- Hands-on experience with AWS cloud services.
- Familiarity with Docker, Kubernetes, or other container technologies.
- Exposure to AI-assisted development tools like Snowflake Cortex, Microsoft Copilot, and Amazon Q.
- Strong problem-solving skills and attention to detail.
- Excellent communication and teamwork abilities.
- Willingness to work flexible hours and coordinate effectively with onshore teams.
- Fosters a culture where integrity and transparency are encouraged.
- Stays ahead of changes in their own specialist area and seeks out learning opportunities to ensure knowledge is up to date.

Actual salary is determined based on the role, location, individual experience, skills, and other considerations. We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 4 days ago

Apply

6.0 - 11.0 years

10 - 14 Lacs

Chennai

Remote

What You'll Need
- BS or MS degree in Computer Science, Engineering, or a related technical field
- Strong SQL skills
- 6+ years of experience working with event instrumentation, data pipelines, and data warehouses, preferably acting as a data architect in a previous role
- Proficiency with systems design and data modeling
- Fluency with workflow management tools, like Airflow or dbt
- Experience with modern data warehouses, like Snowflake or BigQuery
- Expertise breaking down complex problems, documenting solutions, and sequencing work to make iterative improvements
- Familiarity with data visualization tools such as Mode, Tableau, and Looker
- Programming skills, preferably in Python
- Familiarity with software design principles, including test-driven development

About the Role
Analytics Platform is on a mission to democratize learning by building systems that enable company-wide analytics and experimentation. By implementing sufficient instrumentation, designing intuitive data models, and building batch/streaming pipelines, we will allow for deep and scalable investigation and optimization of the business. By developing self-serve tools, we will empower executives, PMs, Marketing leadership, and marketing managers to understand company performance at a glance and uncover insights to support decision making. Finally, by building capabilities such as forecasting, alerting, and experimentation, we will enable more, better, and faster decisions.

What You'll Do
- Drive direct business impact with executive-level visibility
- Design technical architecture and implement components from the ground up as we transition to event-based analytics
- Work on the unique challenge of joining a variety of online and offline data sets, not just big data
- Learn and grow Data Science and Data Analytics skills (we sit in the same org!)
- Grow into a Tech Lead/Manager role and mentor junior team members as we quickly grow the team
- Partner with infrastructure and product engineers to instrument our backend services and end-to-end user journeys to create visibility for the rest of the business
- Design, develop, and monitor scalable and cost-efficient data pipelines, and build out new integrations with third-party tools
- Work with data analysts and data scientists to design our data models as inputs to metrics and machine learning models
- Establish best practices for data engineering
- Assess build-vs-buy tradeoffs for components in our company-wide analytics platform, which will inform decision-making for executives, PMs, Ops, and others
- Be a founding member of the Data Engineering team based in India, with the autonomy to help shape the vision, influence the roadmap, and establish best practices for the team

Posted 4 days ago

Apply