2.0 - 4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Role: Junior Engineer
Location: Bangalore
Duration: Full-Time
Timings: 1-10 PM IST (cabs & benefits provided)
Experience: 2+ yrs
Work Mode: Completely onsite (no hybrid/remote model)
Required Skills: SQL; programming language; ETL tools; cloud

Position Overview
The Data Engineer will report to the Data Engineering Manager and play a crucial role in designing, building, and maintaining scalable data pipelines within Kaseya. You will be responsible for ensuring data is readily available, accurate, and optimized for analytics and strategic decision-making.

Required Qualifications:
Bachelor's degree (or equivalent) in Computer Science, Engineering, or a related field.
2+ years of experience in data engineering or a related role.
Proficient in SQL and at least one programming language (Python, Scala, or Java).
Hands-on experience with data integration/ETL tools (e.g., Matillion, Talend, Airflow).
Familiarity with modern cloud data warehouses (Snowflake, Redshift, or BigQuery).
Strong problem-solving skills and attention to detail.
Excellent communication and team collaboration skills.
Ability to work in a fast-paced, high-growth environment.

Roles & Responsibilities:
Design and Develop ETL Pipelines: Create high-performance data ingestion and transformation processes, leveraging tools like Matillion, Airflow, or similar.
Implement Data Lake and Warehouse Solutions: Develop and optimize data warehouses/lakes (Snowflake, Redshift, BigQuery, or Databricks), ensuring best-in-class performance.
Optimize Query Performance: Continuously refine queries and storage strategies to support large volumes of data and multiple use cases.
Ensure Data Governance & Security: Collaborate with the Data Governance team to ensure compliance with privacy regulations and corporate data policies.
Troubleshoot Complex Data Issues: Investigate and resolve bottlenecks, data quality problems, and system performance challenges.
Document Processes & Standards: Maintain clear documentation on data pipelines, schemas, and operational processes to facilitate knowledge sharing.
Collaborate with Analytics Teams: Work with BI, Data Science, and Business Analyst teams to deliver timely, reliable, and enriched datasets for reporting and advanced analytics.
Evaluate Emerging Technologies: Stay informed about the latest tools, frameworks, and methodologies, recommending improvements where applicable.

Company Description:
Kaseya is the leading cloud provider of IT systems management software, offering a complete IT management solution delivered both via cloud and on-premise. Kaseya technology empowers MSPs and mid-sized enterprises to proactively manage and control their IT environments remotely, easily, and efficiently from a single platform. Kaseya solutions are in use by more than 10,000 customers worldwide in a wide variety of industries, including retail, manufacturing, healthcare, education, government, media, technology, finance, and more. Kaseya has a presence in over 20 countries. To learn more, please visit http://www.kaseya.com.
Posted 1 day ago
3.0 - 7.0 years
0 Lacs
thiruvananthapuram, kerala
On-site
As an Ignition Application Administrator at EY, you will be a key member of the Enterprise Services Data team. Your role will involve collaborating closely with peer platform administrators, developers, Product/Project Seniors, and customers to administer the existing analytics platforms. While focusing primarily on Ignition, you will also be cross-trained on other tools such as Qlik Sense, Tableau, Power BI, SAP Business Objects, and more. Your willingness to tackle complex problems and find innovative solutions will be crucial in this role.

In this position, you will have the opportunity to work in a start-up-like environment within a Fortune 50 company, driving digital transformation and leveraging insights to enhance products and services. Your responsibilities will include installing and configuring Ignition, monitoring the platform, troubleshooting issues, managing data source connections, and contributing to the overall data platform architecture and strategy. You will also be involved in integrating Ignition with other ES Data platforms and Business Unit installations.

To succeed in this role, you should have at least 3 years of experience in customer success or a customer-facing engineering capacity, along with expertise in large-scale implementations and complex solutions environments. Experience with the Linux command line, cloud operations, Kubernetes application deployment, and cloud platform architecture is essential. Strong communication skills, both interpersonal and written, are also key for this position.

Ideally, you should hold a BA/BS degree in technology, computing, or a related field, although relevant work experience may be considered in place of formal education. The position may require flexibility in working hours, including weekends, to meet deadlines and fulfill application administration obligations. Join us at EY and contribute to building a better working world by leveraging data, technology, and your unique skills to drive innovation and growth for our clients and society.
Posted 2 days ago
8.0 - 12.0 years
0 Lacs
indore, madhya pradesh
On-site
You are a highly skilled and experienced ETL Developer with expertise in data ingestion and extraction, sought to join our team. With 8-12 years of experience, you specialize in building and managing scalable ETL pipelines, integrating diverse data sources, and optimizing data workflows specifically for Snowflake. Your role will involve collaborating with cross-functional teams to extract, transform, and load large-scale datasets in a cloud-based data ecosystem, ensuring data quality, consistency, and performance.

Your responsibilities will include designing and implementing processes to extract data from various sources such as on-premise databases, cloud storage (S3, GCS), APIs, and third-party applications. You will ensure seamless data ingestion into Snowflake, utilizing tools like SnowSQL, COPY INTO commands, Snowpipe, and third-party ETL tools (Matillion, Talend, Fivetran). Developing robust solutions for handling data ingestion challenges such as connectivity issues, schema mismatches, and data format inconsistencies will be a key aspect of your role.

Within Snowflake, you will perform complex data transformations using SQL-based ELT methodologies, implement incremental loading strategies, and track data changes using Change Data Capture (CDC) techniques. You will optimize transformation processes for performance and scalability, leveraging Snowflake's native capabilities such as clustering, materialized views, and UDFs.

Designing and maintaining ETL pipelines capable of efficiently processing terabytes of data will be part of your responsibilities. You will optimize ETL jobs for performance, parallelism, and data compression, ensuring error logging, retry mechanisms, and real-time monitoring for robust pipeline operation. Your role will also involve implementing mechanisms for data validation, integrity checks, duplicate handling, and consistency verification. Collaborating with stakeholders to ensure adherence to data governance standards and compliance requirements will be essential. You will work closely with data engineers, analysts, and business stakeholders to define requirements and deliver high-quality solutions. Documenting data workflows, technical designs, and operational procedures will also be part of your responsibilities.

Your expertise should include 8-12 years of experience in ETL development and data engineering, with significant experience in Snowflake. You should be proficient in tools and technologies such as Snowflake (SnowSQL, COPY INTO, Snowpipe, external tables), ETL tools (Matillion, Talend, Fivetran), cloud storage (S3, GCS, Azure Blob Storage), databases (Oracle, SQL Server, PostgreSQL, MySQL), and APIs (REST, SOAP for data extraction). Strong SQL skills, performance optimization techniques, data transformation expertise, and soft skills like strong analytical thinking, problem-solving abilities, and excellent communication skills are essential for this role.

Location: Bhilai, Indore
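For illustration only, a minimal sketch of the bulk-load and Snowpipe ingestion patterns this posting describes; the stage, table, and file-format names (s3_landing, raw_orders, csv_std) are hypothetical, not taken from the role:

-- Hedged sketch of Snowflake ingestion; all object names are placeholders.
CREATE FILE FORMAT IF NOT EXISTS csv_std TYPE = CSV SKIP_HEADER = 1;

-- One-off bulk load from an external stage
COPY INTO raw_orders
  FROM @s3_landing/orders/
  FILE_FORMAT = (FORMAT_NAME = 'csv_std')
  ON_ERROR = 'CONTINUE';  -- record bad rows instead of aborting the load

-- Continuous micro-batch loading of newly arriving files
CREATE PIPE IF NOT EXISTS orders_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw_orders
  FROM @s3_landing/orders/
  FILE_FORMAT = (FORMAT_NAME = 'csv_std');

COPY INTO suits scheduled bulk loads, while a pipe with AUTO_INGEST picks up new files as they land in cloud storage.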
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
haryana
Hybrid
As a Snowflake Data Engineer at our organization, you will play a vital role in designing, developing, and maintaining our data infrastructure. Your responsibilities will include ingesting, transforming, and distributing data using Snowflake and AWS technologies. You will collaborate with various stakeholders to ensure efficient data pipelines and secure data operations.

Your key responsibilities will involve designing and implementing data pipelines using Snowflake and AWS technologies. You will leverage tools like SnowSQL, Snowpipe, NiFi, Matillion, and DBT to ingest, transform, and automate data integration processes. Implementing role-based access controls and managing AWS resources will be crucial for ensuring data security and supporting Snowflake operations. Additionally, you will be responsible for optimizing Snowflake queries and data models for performance and scalability.

To excel in this role, you should have strong proficiency in SQL and Python, along with hands-on experience with Snowflake and AWS services. Understanding ETL/ELT tools, data warehousing concepts, and data quality techniques will be essential. Your analytical skills, problem-solving abilities, and excellent communication skills will enable you to collaborate effectively with data analysts, data scientists, and other team members.

Preferred skills include experience with data virtualization, machine learning and AI concepts, and data governance and security best practices. Staying updated with the latest advancements in Snowflake and AWS technologies will be essential for this role. If you are a passionate and experienced Snowflake Data Engineer with 5 to 7 years of experience, we invite you to apply and be a part of our team. This is a full-time position based in Gurgaon, with a hybrid work mode accommodating India, UK, and US work shifts.
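As a rough illustration of the role-based access controls this posting mentions, a hedged Snowflake sketch; the role, warehouse, and schema names are invented for the example:

-- Hedged RBAC sketch; all names are assumptions, not from the posting.
CREATE ROLE IF NOT EXISTS analyst_ro;
GRANT USAGE ON WAREHOUSE analytics_wh TO ROLE analyst_ro;
GRANT USAGE ON DATABASE analytics TO ROLE analyst_ro;
GRANT USAGE ON SCHEMA analytics.marts TO ROLE analyst_ro;
GRANT SELECT ON ALL TABLES IN SCHEMA analytics.marts TO ROLE analyst_ro;
GRANT SELECT ON FUTURE TABLES IN SCHEMA analytics.marts TO ROLE analyst_ro;  -- covers tables created later
GRANT ROLE analyst_ro TO USER some_analyst;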
Posted 3 days ago
2.0 - 6.0 years
0 Lacs
telangana
On-site
You will provide analytics support to Novartis internal customers (CPOs & regional marketing and sales teams) on various low-medium complexity analytical reports. You will support and facilitate data-enabled decision-making for Novartis internal customers by providing and communicating qualitative and quantitative analytics. Additionally, you will support the GBS - GCO business in building the practice by getting involved in various initiatives such as knowledge sharing, on-boarding and training support, supporting the team lead in all business-related tasks/activities, building process documentation, and maintaining knowledge repositories. You will also be an integral part of a comprehensive design team responsible for designing promotional marketing materials.

As an Analyst at Novartis, your key responsibilities will include creating and delivering Field Excellence insights as per agreed SLAs; designing, developing, and/or maintaining ETL-based solutions that optimize field excellence activities; delivering services through an Agile project management approach; maintaining standard operating procedures (SOPs) and quality checklists; and developing and maintaining knowledge repositories that collect qualitative and quantitative data on field excellence trends across Novartis operating markets.

Essential requirements for this role include 2 years of experience in SQL and Excel, learning agility, the ability to manage multiple stakeholders, experience with pharma datasets, and experience in Python or another scripting language. Desirable requirements include a university/advanced degree, ideally a Master's degree or equivalent experience in fields such as business administration, finance, computer science, or a technical field. At least 3 years of experience in using ETL tools (Alteryx, DataIKU, Matillion, etc.) and hands-on experience with cloud-based platforms like Snowflake are mandatory.

Novartis's purpose is to reimagine medicine to improve and extend people's lives, with a vision to become the most valued and trusted medicines company in the world. By joining Novartis, you will be part of a mission-driven organization where associates drive the company to reach its ambitions. If you are passionate about making a difference in patients' lives and want to be part of a community of smart and dedicated individuals, consider joining Novartis.

For more information about benefits and rewards at Novartis, refer to the Novartis Life Handbook at https://www.novartis.com/careers/benefits-rewards. If you are interested in staying connected with Novartis and learning about future career opportunities, you can join the Novartis Network here: https://talentnetwork.novartis.com/network.
Posted 6 days ago
4.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Data Engineer with 4 to 6 years of hands-on experience in Microsoft Fabric, Snowflake, and Matillion, you will be a valuable asset to our team. Your primary responsibility will involve supporting MS Fabric and leading the migration process to Snowflake and Matillion. Your expertise and attention to detail will play a crucial role in the success of these projects.
Posted 1 week ago
4.0 - 8.0 years
15 - 27 Lacs
Indore, Hyderabad
Hybrid
Data Engineer - D365 OneLake Integration Specialist

Position Overview:
We are seeking an experienced Data Engineer with expertise in Microsoft D365 ERP and OneLake integration to support a critical acquisition integration project. The successful candidate will assess existing data integrations, collaborate with our data team to migrate pipelines to Snowflake using Matillion, and ensure seamless data flow for go-live critical reports by November 2025.

Role & responsibilities:
Assessment & Documentation: Analyze and document existing D365 to OneLake/Fabric integrations and data flows
Data Pipeline Migration: Collaborate with the current data team to redesign and migrate data integrations from D365 to Snowflake using Matillion
Integration Architecture: Understand and map current Power BI reporting dependencies and data sources
Go-Live Support: Identify critical reports for go-live and recommend optimal data integration strategies
Technical Collaboration: Work closely with the existing data engineering team to leverage current Snowflake and Matillion expertise
Knowledge Transfer: Document findings and provide recommendations on existing vs. new integration approaches
ERP Implementation Support: Support the acquired company's ERP go-live timeline and requirements

Required Qualifications:
Technical Skills
3+ years of experience with Microsoft Dynamics 365 ERP data integrations
2+ years of hands-on experience with Microsoft OneLake and the Fabric ecosystem
Strong experience with the Snowflake data warehouse platform
Proficiency in the Matillion ETL tool for data pipeline development
Experience with Power BI data modeling and reporting architecture
Strong SQL skills and data modeling expertise
Knowledge of Azure Data Factory or similar cloud ETL tools
Experience with REST APIs and data connector frameworks

Business & Soft Skills
Experience supporting ERP implementation projects and go-live activities
Strong analytical and problem-solving skills for complex data integration challenges
Excellent documentation and communication skills
Ability to work in fast-paced, deadline-driven environments
Experience in M&A integration projects (preferred)
Project management skills and ability to prioritize go-live critical deliverables

Preferred candidate profile:
Microsoft Azure certifications (DP-203, DP-900)
Snowflake SnowPro certification
Previous experience with acquisition integration projects
Knowledge of financial and operational reporting requirements
Familiarity with data governance and compliance frameworks
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
You will be joining a fast-growing data-analytics consultancy focused on Life Sciences / Pharmaceutical commercial analytics. Our team specializes in building cloud-native data platforms to provide sales, marketing, and patient-centric insights for top global pharma brands, ensuring compliant and high-impact solutions on an enterprise scale.

As a Data Engineer in this role, you will be responsible for architecting, constructing, and optimizing Snowflake data warehouses and ELT pipelines using SQL, Streams, Tasks, UDFs, and stored procedures to cater to complex commercial-analytics workloads. You will also work on integrating various pharma data sources such as Veeva, Salesforce, IQVIA, Symphony, RWD, and patient-services feeds through Fivetran, ADF, or Python-based frameworks to ensure end-to-end data quality.

Your duties will involve establishing robust data models (star, snowflake, Data Vault) that are tailored for sales reporting, market-share analytics, and AI/ML use cases. You will drive governance and compliance efforts (HIPAA, GDPR, GxP) by implementing fine-grained access controls, masking, lineage, and metadata management. Additionally, you will lead code reviews, mentor engineers, optimize performance, and ensure cost-efficient compute usage. Collaboration with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights will be a key aspect of your role.

You will need to have at least 7 years of data-engineering / warehousing experience, including a minimum of 4 years of hands-on Snowflake design and development experience. Expertise in SQL, data modeling (Dimensional, Data Vault), and ETL/ELT optimization, plus proficiency in Python (or similar) for automation, API integrations, and orchestration, are essential qualifications. Strong governance/security acumen within regulated industries (HIPAA, GDPR, PII), a Bachelor's degree in Computer Science, Engineering, or Information Systems (Master's preferred), and excellent client-facing communication and problem-solving skills in fast-paced, agile environments are required.

Direct experience with pharma commercial datasets, cloud-platform depth (AWS, Azure, or GCP), familiarity with tools like Matillion/DBT/Airflow and Git, Snowflake certifications (SnowPro Core / Advanced), and knowledge of Tableau, Power BI, or Qlik connectivity are preferred qualifications. This is a full-time position that requires in-person work. If you are interested in this opportunity, please speak with the employer at +91 9008078505.
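A minimal, hedged sketch of the Streams-and-Tasks ELT pattern named in this posting; the raw and mart tables, columns, and warehouse are assumptions:

-- Capture row-level changes on a raw table
CREATE STREAM IF NOT EXISTS rx_claims_stream ON TABLE raw.rx_claims;

-- Scheduled task that moves newly arrived rows into a curated mart table
CREATE TASK IF NOT EXISTS load_rx_claims
  WAREHOUSE = transform_wh
  SCHEDULE = '15 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('rx_claims_stream')
AS
  INSERT INTO marts.rx_claims_clean
  SELECT claim_id, patient_id, fill_date, ndc, quantity
  FROM rx_claims_stream
  WHERE METADATA$ACTION = 'INSERT';

ALTER TASK load_rx_claims RESUME;  -- tasks are created suspended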
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
hyderabad, telangana
On-site
As an Enterprise Snowflake L1/L2 AMS Support engineer, your primary responsibilities will include monitoring and supporting Snowflake data warehouse performance, optimizing queries, and overseeing job execution. You will be tasked with troubleshooting data loading failures, managing access control, and addressing role-based security issues. Additionally, you will be expected to carry out patching, software upgrades, and security compliance checks while upholding SLA commitments for query execution and system performance. To excel in this role, you should have 2-5 years of experience working with Snowflake architecture, SQL scripting, and query optimization. Familiarity with ETL tools such as Talend, Matillion, and Alteryx for seamless Snowflake integration would be beneficial.
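For illustration, one hedged way to watch SLA commitments on query performance is Snowflake's account usage views; the 60-second threshold below is an arbitrary example, not a value from the posting:

-- Queries from the last day that ran longer than 60 seconds
SELECT query_id, user_name, warehouse_name,
       total_elapsed_time / 1000 AS elapsed_s
FROM snowflake.account_usage.query_history
WHERE start_time >= DATEADD('day', -1, CURRENT_TIMESTAMP())
  AND total_elapsed_time > 60000  -- milliseconds
ORDER BY total_elapsed_time DESC;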
Posted 1 week ago
12.0 - 16.0 years
0 Lacs
hyderabad, telangana
On-site
YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we are a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth - bringing real positive changes in an increasingly virtual world - and it drives us beyond generational gaps and disruptions of the future.

We are currently seeking SnowFlake Professionals with at least 12+ years of experience in the following areas:
- Strong communication and proactive skills, with the ability to lead conversations
- Experience architecting and delivering solutions on AWS
- Hands-on experience with cloud warehouses like Snowflake
- Strong knowledge of data integrations, data modeling (Dimensional & Data Vault), and visualization practices
- Good understanding of data management (Data Quality, Data Governance, etc.)
- Zeal to pick up new technologies, conduct PoCs, and present PoVs

Technical Skills (strong experience in at least one item in each category):
- Cloud: AWS
- Data Integration: Qlik Replicate, Snaplogic, Matillion & Informatica
- Visualization: PowerBI & Thoughtspot
- Storage & Databases: Snowflake, AWS

Having certifications in Snowflake and Snaplogic would be considered a plus. At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale.

Our Hyperlearning workplace is grounded upon four principles:
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and ethical corporate culture
Posted 1 week ago
9.0 - 14.0 years
30 - 40 Lacs
Pune, Chennai
Work from Office
Designing, implementing, and optimizing data solutions using both Azure and Snowflake. Experience working with the Matillion tool and with Azure and Snowflake, including data modeling, ETL processes, and data warehousing. Proficiency in SQL and data integration tools.
Posted 2 weeks ago
5.0 - 8.0 years
10 - 20 Lacs
Hyderabad
Work from Office
7+ years of experience as a Data Engineer or Snowflake Developer. Expert-level knowledge of SQL (joins, subqueries, CTEs). Experience with ETL tools (e.g., Informatica, Talend, Matillion). Experience with cloud platforms like AWS, Azure, or GCP.
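A small, hedged example of the SQL constructs this posting lists (CTEs, joins, subqueries), written against an invented orders/customers schema:

-- CTE aggregates monthly sales; the join and subquery then filter it
WITH monthly_sales AS (
  SELECT customer_id,
         DATE_TRUNC('month', order_date) AS month,
         SUM(amount) AS total
  FROM orders
  GROUP BY customer_id, DATE_TRUNC('month', order_date)
)
SELECT c.customer_name, m.month, m.total
FROM monthly_sales m
JOIN customers c ON c.customer_id = m.customer_id
WHERE m.total > (SELECT AVG(total) FROM monthly_sales);  -- above-average months only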
Posted 3 weeks ago
0.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models onward, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today.

Inviting applications for the role of Consultant - Sr. Data Engineer (DBT + Snowflake)! In this role, the Sr. Data Engineer is responsible for providing technical direction and leading a group of one or more developers to address a goal.

Job Description:
Develop, implement, and optimize data pipelines using Snowflake, with a focus on Cortex AI capabilities.
Extract, transform, and load (ETL) data from various sources into Snowflake, ensuring data integrity and accuracy.
Implement Conversational AI solutions using Snowflake Cortex AI to facilitate data interaction through ChatBot agents.
Collaborate with data scientists and AI developers to integrate predictive analytics and AI models into data workflows.
Monitor and troubleshoot data pipelines to resolve data discrepancies and optimize performance.
Utilize Snowflake's advanced features, including Snowpark, Streams, and Tasks, to enable data processing and analysis.
Develop and maintain data documentation, best practices, and data governance protocols.
Ensure data security, privacy, and compliance with organizational and regulatory guidelines.

Responsibilities:
Bachelor's degree in Computer Science, Data Engineering, or a related field.
Experience in data engineering, with experience working with Snowflake.
Proven experience in Snowflake Cortex AI, focusing on data extraction, chatbot development, and Conversational AI.
Strong proficiency in SQL, Python, and data modeling.
Experience with data integration tools (e.g., Matillion, Talend, Informatica).
Knowledge of cloud platforms such as AWS, Azure, or GCP.
Excellent problem-solving skills, with a focus on data quality and performance optimization.
Strong communication skills and the ability to work effectively in a cross-functional team.
Proficiency in using DBT's testing and documentation features to ensure the accuracy and reliability of data transformations.
Understanding of data lineage and metadata management concepts, and the ability to track and document data transformations using DBT's lineage capabilities.
Understanding of software engineering best practices and the ability to apply these principles to DBT development, including version control, code reviews, and automated testing.
Should have experience building data ingestion pipelines.
Should have experience with Snowflake utilities such as SnowSQL, Snowpipe, bulk copy, Snowpark, tables, Tasks, Streams, Time Travel, Cloning, Optimizer, Metadata Manager, data sharing, stored procedures and UDFs, and Snowsight.
Should have good experience in implementing CDC or SCD Type 2.
Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs.
Good to have experience in repository tools like GitHub/GitLab or Azure Repos.

Qualifications/Minimum qualifications
B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or any equivalent degree with good IT experience and relevant working experience as a Sr. Data Engineer with DBT + Snowflake skillsets.

Skill Matrix: DBT (Core or Cloud), Snowflake, AWS/Azure, SQL, ETL concepts, Airflow or any orchestration tool, Data Warehousing concepts

Why join Genpact?
Be a transformation leader - Work at the cutting edge of AI, automation, and digital innovation
Make an impact - Drive change for global enterprises and solve business challenges that matter
Accelerate your career - Get hands-on experience, mentorship, and continuous learning opportunities
Work with the best - Join 140,000+ bold thinkers and problem-solvers who push boundaries every day
Thrive in a values-driven culture - Our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
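As a hedged illustration of the CDC/SCD Type 2 experience this posting asks for, a common two-step pattern in Snowflake SQL; dim_customer, stg_customer, and their columns are assumptions, not Genpact's design:

-- Step 1: close out current rows whose tracked attributes changed
MERGE INTO dim_customer d
USING stg_customer s
  ON d.customer_id = s.customer_id AND d.is_current = TRUE
WHEN MATCHED AND (d.email <> s.email OR d.segment <> s.segment) THEN
  UPDATE SET is_current = FALSE, valid_to = CURRENT_TIMESTAMP();

-- Step 2: insert fresh current versions for changed and brand-new keys
INSERT INTO dim_customer
SELECT s.customer_id, s.email, s.segment,
       CURRENT_TIMESTAMP() AS valid_from, NULL AS valid_to, TRUE AS is_current
FROM stg_customer s
LEFT JOIN dim_customer d
  ON d.customer_id = s.customer_id AND d.is_current = TRUE
WHERE d.customer_id IS NULL;

The MERGE retires changed rows; the INSERT then finds keys with no surviving current row and adds new versions.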
Posted 3 weeks ago
0.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models onward, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today.

Inviting applications for the role of Principal Consultant - Sr. Data Engineer (DBT + Snowflake)! In this role, the Sr. Data Engineer is responsible for providing technical direction and leading a group of one or more developers to address a goal.

Job Description:
Develop, implement, and optimize data pipelines using Snowflake, with a focus on Cortex AI capabilities.
Extract, transform, and load (ETL) data from various sources into Snowflake, ensuring data integrity and accuracy.
Implement Conversational AI solutions using Snowflake Cortex AI to facilitate data interaction through ChatBot agents.
Collaborate with data scientists and AI developers to integrate predictive analytics and AI models into data workflows.
Monitor and troubleshoot data pipelines to resolve data discrepancies and optimize performance.
Utilize Snowflake's advanced features, including Snowpark, Streams, and Tasks, to enable data processing and analysis.
Develop and maintain data documentation, best practices, and data governance protocols.
Ensure data security, privacy, and compliance with organizational and regulatory guidelines.

Roles and Responsibilities:
Bachelor's degree in Computer Science, Data Engineering, or a related field.
Experience in data engineering, with experience working with Snowflake.
Proven experience in Snowflake Cortex AI, focusing on data extraction, chatbot development, and Conversational AI.
Strong proficiency in SQL, Python, and data modeling.
Experience with data integration tools (e.g., Matillion, Talend, Informatica).
Knowledge of cloud platforms such as AWS, Azure, or GCP.
Excellent problem-solving skills, with a focus on data quality and performance optimization.
Strong communication skills and the ability to work effectively in a cross-functional team.
Proficiency in using DBT's testing and documentation features to ensure the accuracy and reliability of data transformations.
Understanding of data lineage and metadata management concepts, and the ability to track and document data transformations using DBT's lineage capabilities.
Understanding of software engineering best practices and the ability to apply these principles to DBT development, including version control, code reviews, and automated testing.
Should have experience building data ingestion pipelines.
Should have experience with Snowflake utilities such as SnowSQL, Snowpipe, bulk copy, Snowpark, tables, Tasks, Streams, Time Travel, Cloning, Optimizer, Metadata Manager, data sharing, stored procedures and UDFs, and Snowsight.
Should have good experience in implementing CDC or SCD Type 2.
Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs.
Good to have experience in repository tools like GitHub/GitLab or Azure Repos.

Qualifications/Minimum qualifications
B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or any equivalent degree with good IT experience and relevant working experience as a Sr. Data Engineer with DBT + Snowflake skillsets.

Skill Matrix: DBT (Core or Cloud), Snowflake, AWS/Azure, SQL, ETL concepts, Airflow or any orchestration tools, Data Warehousing concepts

Why join Genpact?
Be a transformation leader - Work at the cutting edge of AI, automation, and digital innovation
Make an impact - Drive change for global enterprises and solve business challenges that matter
Accelerate your career - Get hands-on experience, mentorship, and continuous learning opportunities
Work with the best - Join 140,000+ bold thinkers and problem-solvers who push boundaries every day
Thrive in a values-driven culture - Our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
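For illustration of the DBT-on-Snowflake work this posting describes, a minimal hedged sketch of a dbt incremental model; the model, source, and column names are invented:

-- models/staging/stg_events.sql: only new rows are processed on reruns
{{ config(materialized='incremental', unique_key='event_id') }}

SELECT event_id, user_id, event_type, event_ts
FROM {{ source('raw', 'events') }}
{% if is_incremental() %}
WHERE event_ts > (SELECT MAX(event_ts) FROM {{ this }})
{% endif %}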
Posted 3 weeks ago
5.0 - 20.0 years
10 - 35 Lacs
Hyderabad, Pune, Delhi / NCR
Work from Office
Mandatory Skill - Snowflake, Matillion
Posted 1 month ago
5.0 - 6.0 years
12 - 16 Lacs
Gurugram
Work from Office
Role: Senior Data Architect - Snowflake & Matillion (Remote). Design and implement data architecture, support analytics, and collaborate with stakeholders. Min. 5 yrs experience. Strong data modeling, analytics, and communication skills required.
Posted 1 month ago
8.0 - 13.0 years
3 - 18 Lacs
Pune, Maharashtra, India
On-site
Your specific responsibilities will include:
Design and implementation of last-mile data products using the most up-to-date technologies and software / data / DevOps engineering practices
Enable data science & analytics teams to drive data modeling and feature engineering activities aligned with business questions, utilizing datasets in an optimal way
Develop deep domain expertise and business acumen to ensure that all specificities and pitfalls of data sources are accounted for
Build data products based on automated data models, aligned with use case requirements, and advise data scientists, analysts, and visualization developers on how to use these data models
Develop analytical data products for reusability, governance, and compliance by design
Align with organization strategy and implement a semantic layer for analytics data products
Support data stewards and other engineers in maintaining data catalogs, data quality measures, and governance frameworks

Education: B.Tech / B.S., M.Tech / M.S., or PhD in Engineering, Computer Science, Pharmaceuticals, Healthcare, Data Science, Business, or a related field.

Required experience:
8+ years of relevant work experience in the pharmaceutical/life sciences industry, with demonstrated hands-on experience in analyzing, modeling, and extracting insights from commercial/marketing analytics datasets (specifically, real-world datasets)
High proficiency in SQL, Python, and AWS
Experience creating / adopting data models to meet requirements from Marketing, Data Science, and Visualization stakeholders
Experience with feature engineering
Experience with cloud-based (AWS / GCP / Azure) data management platforms and typical storage/compute services (Databricks, Snowflake, Redshift, etc.)
Experience with modern data stack tools such as Matillion, Starburst, ThoughtSpot, and low-code tools (e.g., Dataiku)
Excellent interpersonal and communication skills, with the ability to quickly establish productive working relationships with a variety of stakeholders
Experience in analytics use cases of pharmaceutical products and vaccines
Experience in market analytics and related use cases

Preferred experience:
Experience in analytics use cases focused on informing marketing strategies and commercial execution of pharmaceutical products and vaccines
Experience with Agile ways of working, leading or working as part of scrum teams
Certifications in AWS and/or modern data technologies
Knowledge of the commercial/marketing analytics data landscape and key data sources/vendors
Experience in building data models for data science and visualization/reporting products, in collaboration with data scientists, report developers, and business stakeholders
Experience with data visualization technologies (e.g., Power BI)
Posted 1 month ago
0.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models onward, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today.

Inviting applications for the role of Senior Associate - Sr. Data Engineer (DBT + Snowflake)! In this role, the Sr. Data Engineer is responsible for providing technical direction and leading a group of one or more developers to address a goal.

Job Description:
Develop, implement, and optimize data pipelines using Snowflake, with a focus on Cortex AI capabilities.
Extract, transform, and load (ETL) data from various sources into Snowflake, ensuring data integrity and accuracy.
Implement Conversational AI solutions using Snowflake Cortex AI to facilitate data interaction through ChatBot agents.
Collaborate with data scientists and AI developers to integrate predictive analytics and AI models into data workflows.
Monitor and troubleshoot data pipelines to resolve data discrepancies and optimize performance.
Utilize Snowflake's advanced features, including Snowpark, Streams, and Tasks, to enable data processing and analysis.
Develop and maintain data documentation, best practices, and data governance protocols.
Ensure data security, privacy, and compliance with organizational and regulatory guidelines.

Responsibilities:
Bachelor's degree in Computer Science, Data Engineering, or a related field.
Experience in data engineering, with experience working with Snowflake.
Proven experience in Snowflake Cortex AI, focusing on data extraction, chatbot development, and Conversational AI.
Strong proficiency in SQL, Python, and data modeling.
Experience with data integration tools (e.g., Matillion, Talend, Informatica).
Knowledge of cloud platforms such as AWS, Azure, or GCP.
Excellent problem-solving skills, with a focus on data quality and performance optimization.
Strong communication skills and the ability to work effectively in a cross-functional team.
Proficiency in using DBT's testing and documentation features to ensure the accuracy and reliability of data transformations.
Understanding of data lineage and metadata management concepts, and the ability to track and document data transformations using DBT's lineage capabilities.
Understanding of software engineering best practices and the ability to apply these principles to DBT development, including version control, code reviews, and automated testing.
Should have experience building data ingestion pipelines.
Should have experience with Snowflake utilities such as SnowSQL, Snowpipe, bulk copy, Snowpark, tables, Tasks, Streams, Time Travel, Cloning, Optimizer, Metadata Manager, data sharing, stored procedures and UDFs, and Snowsight.
Should have good experience in implementing CDC or SCD Type 2.
Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs.
Good to have experience in repository tools like GitHub/GitLab or Azure Repos.

Qualifications/Minimum qualifications
B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or any equivalent degree with good IT experience and relevant working experience as a Sr. Data Engineer with DBT + Snowflake skillsets.

Skill Matrix: DBT (Core or Cloud), Snowflake, AWS/Azure, SQL, ETL concepts, Airflow or any orchestration tools, Data Warehousing concepts

Why join Genpact?
Be a transformation leader - Work at the cutting edge of AI, automation, and digital innovation
Make an impact - Drive change for global enterprises and solve business challenges that matter
Accelerate your career - Get hands-on experience, mentorship, and continuous learning opportunities
Work with the best - Join 140,000+ bold thinkers and problem-solvers who push boundaries every day
Thrive in a values-driven culture - Our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
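A brief, hedged illustration of the Time Travel and zero-copy Cloning utilities listed above; all object names are placeholders:

-- Query a table as it was one hour ago
SELECT * FROM marts.orders AT(OFFSET => -3600);

-- Recover an accidentally dropped table within the retention window
UNDROP TABLE marts.orders;

-- Zero-copy clone of a whole database for a dev environment
CREATE DATABASE analytics_dev CLONE analytics;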
Posted 1 month ago
7.0 - 10.0 years
20 - 30 Lacs
Hyderabad, Chennai
Work from Office
Proficient in designing and delivering data pipelines in cloud data warehouses (e.g., Snowflake, Redshift), using various ETL/ELT tools such as Matillion, dbt, Striim, etc. Solid understanding of database systems (relational/NoSQL) and data modeling techniques.
Required candidate profile: Looking for candidates with strong experience in data architecture. Potential companies: Tiger Analytics, Tredence, Quantiphi, Data Engineering Group within Infosys/TCS/Cognizant, Deloitte Consulting.
Perks and benefits: 5 working days - Onsite
Posted 1 month ago
5.0 - 10.0 years
19 - 30 Lacs
Hyderabad
Work from Office
For Data Engineer: Years of experience - 3-5 years; Number of openings - 2
For Sr. Data Engineer: Years of experience - 6-10 years; Number of openings - 2

About Us
Logic Pursuits provides companies with innovative technology solutions for everyday business problems. Our passion is to help clients become intelligent, information-driven organizations, where fact-based decision-making is embedded into daily operations, which leads to better processes and outcomes. Our team combines strategic consulting services with growth-enabling technologies to evaluate risk, manage data, and leverage AI and automated processes more effectively. With deep, big four consulting experience in business transformation and efficient processes, Logic Pursuits is a game-changer in any operations strategy.

Job Description
We are looking for an experienced and results-driven Data Engineer to join our growing Data Engineering team. The ideal candidate will be proficient in building scalable, high-performance data transformation pipelines using Snowflake and dbt and be able to work effectively in a consulting setup. In this role, you will be instrumental in ingesting, transforming, and delivering high-quality data to enable data-driven decision-making across the client's organization.

Key Responsibilities
Design and build robust ELT pipelines using dbt on Snowflake, including ingestion from relational databases, APIs, cloud storage, and flat files.
Reverse-engineer and optimize SAP Data Services (SAP DS) jobs to support scalable migration to cloud-based data platforms.
Implement layered data architectures (e.g., staging, intermediate, mart layers) to enable reliable and reusable data assets.
Enhance dbt/Snowflake workflows through performance optimization techniques such as clustering, partitioning, query profiling, and efficient SQL design.
Use orchestration tools like Airflow, dbt Cloud, and Control-M to schedule, monitor, and manage data workflows.
Apply modular SQL practices, testing, documentation, and Git-based CI/CD workflows for version-controlled, maintainable code.
Collaborate with data analysts, scientists, and architects to gather requirements, document solutions, and deliver validated datasets.
Contribute to internal knowledge sharing through reusable dbt components and participate in Agile ceremonies to support consulting delivery.

Required Qualifications
Data Engineering Skills
3-5 years of experience in data engineering, with hands-on experience in Snowflake and basic to intermediate proficiency in dbt.
Capable of building and maintaining ELT pipelines using dbt and Snowflake with guidance on architecture and best practices.
Understanding of ELT principles and foundational knowledge of data modeling techniques (preferably Kimball/Dimensional).
Intermediate experience with SAP Data Services (SAP DS), including extracting, transforming, and integrating data from legacy systems.
Proficient in SQL for data transformation and basic performance tuning in Snowflake (e.g., clustering, partitioning, materializations).
Familiar with workflow orchestration tools like dbt Cloud, Airflow, or Control-M.
Experience using Git for version control and exposure to CI/CD workflows in team environments.
Exposure to cloud storage solutions such as Azure Data Lake, AWS S3, or GCS for ingestion and external staging in Snowflake.
Working knowledge of Python for basic automation and data manipulation tasks.
Understanding of Snowflake's role-based access control (RBAC), data security features, and general data privacy practices like GDPR.

Data Quality & Documentation
Familiar with dbt testing and documentation practices (e.g., dbt tests, dbt docs).
Awareness of standard data validation and monitoring techniques for reliable pipeline development.

Soft Skills & Collaboration
Strong problem-solving skills and ability to debug SQL and transformation logic effectively.
Able to document work clearly and communicate technical solutions to a cross-functional team.
Experience working in Agile settings, participating in sprints, and handling shifting priorities.
Comfortable collaborating with analysts, data scientists, and architects across onshore/offshore teams.
High attention to detail, proactive attitude, and adaptability in dynamic project environments.

Nice to Have
Experience working in client-facing or consulting roles.
Exposure to AI/ML data pipelines or tools like feature stores and MLflow.
Familiarity with enterprise-grade data quality tools.

Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field. Certifications such as Snowflake SnowPro and dbt Certified Developer (Data Engineering) are a plus.

Additional Information
Why Join Us?
Opportunity to work on diverse and challenging projects in a consulting environment.
Collaborative work culture that values innovation and curiosity.
Access to cutting-edge technologies and a focus on professional development.
Competitive compensation and benefits package.
Be part of a dynamic team delivering impactful data solutions.

Required Qualification: Bachelor of Engineering - Bachelor of Technology (B.E./B.Tech.)
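As a hedged sketch of the layered (staging to mart) dbt architecture this posting describes, with invented model names:

-- models/marts/fct_orders.sql: mart layer built from staging models
SELECT o.order_id, o.customer_id, o.order_date, o.amount, c.segment
FROM {{ ref('stg_orders') }} o
LEFT JOIN {{ ref('stg_customers') }} c
  ON c.customer_id = o.customer_id

Staging models standardize raw sources; marts like this one join them into analysis-ready facts, which keeps transformations reusable and testable.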
Posted 1 month ago
12.0 - 22.0 years
3 - 6 Lacs
Chennai, Tamil Nadu, India
On-site
We are hiring an ESA Solution Architect - COE for a CMMI Level 5 client. If you have relevant experience and are looking for a challenging opportunity, we invite you to apply. Key Responsibilities: Design and implement enterprise solutions that align with business and technical requirements. Lead migration projects from on-premise to cloud or cloud-to-cloud (preferably Snowflake). Provide expertise in ETL technologies such as Informatica, Matillion, and Talend . Develop Snowflake-based solutions and optimize data architectures. Analyze project constraints, mitigate risks, and recommend process improvements. Act as a liaison between technical teams and stakeholders , translating business needs into technical solutions. Conduct architectural system evaluations to ensure scalability and efficiency. Define processes and procedures to streamline solution delivery. Create solution prototypes and participate in technology selection . Ensure compliance with strategic guidelines, technical standards, and business objectives. Oversee solution development and collaborate closely with project management and IT teams. Required Skills & Experience: 10+ years of experience in technical solutioning and enterprise solution architecture. Proven experience in cloud migration projects (on-prem to cloud/cloud-to-cloud). Strong expertise in Snowflake architecture and solutioning . Hands-on experience with ETL tools such as Informatica, Matillion, and Talend . Excellent problem-solving and risk mitigation skills. Ability to work with cross-functional teams and align technical solutions with business goals. If you are interested, please share your updated profile.
Posted 1 month ago
5.0 - 10.0 years
16 - 31 Lacs
Pune, Chennai, Bengaluru
Work from Office
Hi connections, I am looking for a Matillion Lead for one of our MNC clients. Exp: 5+ yrs. Please email your resumes to parul@mounttalent.com. Skills required: Matillion, Python, SQL. Location: Pune, Mumbai, Noida, Chennai, Bangalore, Hyderabad.
Posted 1 month ago
5.0 - 8.0 years
22 - 25 Lacs
Pune, Chennai, Coimbatore
Hybrid
We are seeking a highly skilled Senior Full Stack Developer with expertise in .NET Core, React, and Azure Cloud to design, develop, and deploy scalable applications. The ideal candidate should have a strong technical background, a problem-solving mindset, and the ability to collaborate effectively with cross-functional teams.

Key Responsibilities:
Design, develop, and maintain web applications using .NET Core and React.
Architect and optimize Azure cloud-based solutions for performance, security, and scalability.
Implement best practices and design patterns to ensure high-quality, maintainable code.
Collaborate with cross-functional teams including designers, testers, and DevOps engineers for seamless integration and deployment.
Troubleshoot and resolve technical issues in cloud and application environments.
Stay updated with the latest technologies and trends in .NET, React, and cloud computing.

Required Skills:
.NET Core - Strong experience in building backend services and APIs.
React - Hands-on expertise in developing dynamic front-end applications.
Azure Cloud - Proficiency in deploying and managing applications on Azure.
Experience with SQL and NoSQL databases.
Familiarity with CI/CD pipelines, Git, and DevOps best practices.
Strong problem-solving and debugging skills.
Excellent communication and teamwork abilities.

Nice to Have:
Experience with Microservices architecture.
Knowledge of Docker and Kubernetes.
Understanding of Agile methodologies.
Posted 1 month ago
3.0 - 8.0 years
22 - 25 Lacs
Mohali, Panchkula
Work from Office
The Customer Support Database Engineer is responsible for enabling the day-to-day operations associated with our SaaS offerings. The team is responsible for frequent data loads of client data, monitoring ETL and data processing, removing/resolving points of failure, and determining methods to improve query performance. This role will work both independently and as a team member, performing a large variety of tasks. This role is customer facing: the individual will provide expertise to customer business and IT departments, and 2nd and 3rd level support will be required. This individual will be responsible for investigating and resolving easy to extremely complex issues.

What you'll do:
Be a key player in the delivery of the SLA and work with the customer support team to define and execute a 24x7 customer support plan
Work queued cases from internal and external customers
Triage cases, assist customers, resolve issues and bugs
Assist in coordinating responses to major incidents, including post-incident root cause analysis
Act as an escalation point to resolve critical and major client-related issues
Monitor ETL processing using Rundeck, AWS tools, Matillion, and other services
SQL, Tableau, and script development for ongoing improvements
Work closely with the professional services team to understand the needs of each client implementation
Assist with the onboarding of new customers

What you need:
This role requires weekend duties, and you may be asked to work an alternative schedule in the evenings
Experience in technical support, issue management, and conflict resolution
Intermediate to expert knowledge of SQL (MySQL, SQL Server) and experience using Python
Demonstrable experience diagnosing bugs/issues in customized software solutions
Experience with Salesforce and Salesforce-based apps; Salesforce administration and/or development a plus
Experience with Tableau or other data visualization tools
Great organization, collaboration, communication, and coordination skills
Ability to work across the organization and collaborate with customers, sales, services, and account management
Experience working in a structured change management process for a highly available environment a plus
Experience supporting a critical client-facing web application

Technical Skills: SQL, MySQL, SQL Server, Python, AWS, Rundeck, Matillion, Tableau, Salesforce, ETL, Change Management, Web Application Support
Posted 1 month ago
5.0 - 10.0 years
12 - 18 Lacs
Pune, Bengaluru, Delhi / NCR
Work from Office
Skill combinations sought (any one set):
SQL, Snowflake, Tableau
SQL, Snowflake, DBT, Data warehousing
SQL, Snowflake, Python, DBT, Data warehousing
SQL, Snowflake, Data warehousing, any ETL tool (Matillion preferred)
Posted 1 month ago