6.0 - 11.0 years
10 - 18 Lacs
bengaluru
Remote
Notice Period: Immediate joiner or max 10 days (do not share profiles with a long notice period). Permanent payroll: Anlage Infotech. Client: NTT DATA. Location: Remote (to meet compliance, the candidate has to visit the office one day a month).

Role & responsibilities

Description: Total experience: 6+ years. Minimum skills required:
- 5+ years of experience performing administrative activities for BI tools (Informatica MDM/Cognos/Matillion/Tableau).
- Experience in installation, configuration, administration, and security of BI tools.
- Experience in software upgrades, implementation of hotfixes, implementation of new software offerings, and coordination of testing activities with project teams.
- Experience in implementation of SSL and different authentication methods.
- Informatica MDM & IDQ.
- Responsible for BI and analytics administration at an enterprise level; identify and improve infrastructure and processes; ensure environments are stable and available at all times.
- Administer and support BI/analytics infrastructure, including upgrades, testing, security administration, license management, troubleshooting of ETL, reports, and workbooks, performance monitoring, and system maintenance.
- Sarbanes-Oxley experience a plus.

Informatica MDM Platform
- Design, install, configure, and administer Informatica MDM Platform v10 or higher (currently on v10.2) on Linux; installation experience with Informatica MDM 10.x.
- Lead software upgrades, implementation of hotfixes, implementation of new software offerings and infrastructure, and maintenance; coordinate testing activities with project teams.
- Research and provide recommendations for capacity modifications; collaborate with the PM to document tasks and update status.
- Create and maintain architecture diagrams; troubleshoot Informatica/Data Integration/Data Quality tools and UNIX; automate daily tasks.
- Informatica/Data Integration/Data Quality tools security.
- Informatica MDM platform administration and integration support.
- Coordinate patching and other infrastructure-related activities with different teams.
- Monitor servers and services.

Tableau & Matillion (good to have)
- Install, upgrade, administer, and provide technical support for the enterprise-wide Tableau implementation and the Matillion enterprise server.
- Experience developing in and administering Tableau Server 2025.x or later, Tableau Desktop, and Matillion, including administering the Matillion server.
- Full knowledge of SSL and different authentication methods.
- Working knowledge of cloud SaaS and PaaS applications.

IBM Cognos (good to have)
- Design, install, and configure Cognos BI 10.x or higher (currently on v11) on Windows; install and configure IIS; Cognos Analytics (v11) knowledge preferred.
- Create and maintain installation documentation and architectural diagrams.
- Implement package deployments and troubleshoot package issues with complex, long-running reports.
- Experience working with Motio products a plus.
- Devise or modify procedures to solve complex problems related to Cognos administration and/or cube and report generation.
- Perform backups and upgrades, set up security roles and users, and troubleshoot by understanding the various log messages.
- Implement and maintain Cognos security.
- Manage Cognos Analytics reporting, utilizing best practices and innovative ideas to build, maintain, distribute, and educate users on the reporting and query functionality of Cognos Analytics.
- Apply expertise to the implementation of BI applications to support global reporting.
Posted 23 hours ago
5.0 - 8.0 years
15 - 20 Lacs
chennai
Remote
5+ years in data architecture. Strong Snowflake experience with an Advanced Data Architect certification. Strong Matillion experience with Matillion certification. Strong in Snowflake and data modeling. Data Management & Analytics.
Posted 4 days ago
2.0 - 7.0 years
7 - 17 Lacs
noida, hyderabad, gurugram
Hybrid
Inviting applications for the role of Consultant, Matillion ETL Engineer. We are looking for a skilled Matillion ETL Engineer with strong expertise in support, operations, configuration, and development. The role involves managing day-to-day Matillion operations, performing environment setups, creating and maintaining workflows/projects, and ensuring smooth ETL pipeline execution on cloud data platforms.

Responsibilities
- Monitor and manage Matillion ETL jobs/workflows in production and non-production environments.
- Provide L2/L3 support, including troubleshooting job failures, debugging errors, and applying fixes.
- Proactively identify performance issues and optimize ETL jobs for efficiency.
- Maintain KB articles and SOPs for common operational issues.
- Perform environment setup and administration in Matillion.
- Configure projects, version control, task scheduling, and user/access management.
- Manage connections to databases, cloud storage, and third-party data sources.
- Ensure system availability, reliability, and adherence to compliance/security standards.
- Design and develop ETL workflows and pipelines in Matillion to support data integration projects.
- Create and manage projects, workflows, and access configurations.
- Implement error handling, logging, and recovery mechanisms in workflows (see the sketch after this posting).
- Collaborate with data engineers, architects, and business teams to gather requirements and deliver solutions.

Qualifications we seek in you!

Minimum qualifications / skills
- Bachelor's degree in Computer Science, Information Technology, Electrical Engineering, or a related field. Advanced degrees or relevant professional training are a plus.
- Strong hands-on exposure to Matillion ETL operations/support.
- Experience in Matillion setup/configuration, including environment setup, project creation, and access control.
- Working knowledge of Matillion development (projects, workflows, pipeline creation, access management).
- Strong SQL skills (querying, performance tuning, stored procedures).
- Experience with cloud data warehouses like Snowflake, Redshift, or BigQuery.
- Familiarity with cloud services (AWS, Azure, or GCP) such as S3, IAM, ADLS, or GCS.
- Strong troubleshooting and problem-solving abilities.

Preferred qualifications / skills
- Knowledge of Python or scripting for automation.
- Experience with DevOps/CI-CD pipelines for ETL deployment.
- Familiarity with orchestration/scheduling tools (Airflow, Control-M, etc.).
- Exposure to data governance, quality, and compliance frameworks.
- Matillion or Snowflake certification.
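As an illustration of the error handling, logging, and recovery responsibilities named above, here is a minimal, hedged Python sketch of a retry-with-logging wrapper around an ETL step. The step function, retry limits, and backoff are assumptions for illustration only, not calls to any Matillion API.

```python
import logging
import time

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("etl")


def run_with_retry(step, *, attempts=3, backoff_seconds=30):
    """Run one ETL step, logging every failure and retrying with a fixed backoff.

    `step` is any callable representing a unit of work (e.g. triggering a load
    or transformation). This is a hypothetical wrapper, not a Matillion call.
    """
    for attempt in range(1, attempts + 1):
        try:
            result = step()
            log.info("step %s succeeded on attempt %d", step.__name__, attempt)
            return result
        except Exception:
            log.exception("step %s failed on attempt %d", step.__name__, attempt)
            if attempt == attempts:
                raise  # surface the failure so the scheduler can alert/recover
            time.sleep(backoff_seconds)


def load_customers():
    # Placeholder for the real load logic (e.g. executing a SQL script).
    pass


if __name__ == "__main__":
    run_with_retry(load_customers)
```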
Posted 4 days ago
6.0 - 11.0 years
10 - 20 Lacs
bengaluru
Remote
Key Responsibilities
- Design, build, and maintain ETL pipelines using Azure Data Factory (preferably Fabric Data Factory) and SQL.
- Write and optimize complex SQL logic to ensure performance and scalability across large datasets.
- Ensure data quality, monitoring, and observability with restartability, idempotency, and debugging principles in mind (see the sketch after this posting).
- Enhance ETL processes with Python scripting where applicable.
- Collaborate with business unit partners to translate requirements into effective data solutions.
- Document workflows, standards, and best practices; mentor junior team members.
- Implement version control (GitHub) and CI/CD practices across SQL and ETL processes.
- Work with Azure components such as Blob Storage and integrate with orchestration tools.
- Apply troubleshooting and performance-tuning techniques to improve data pipelines.

Required Skills & Experience
- Strong hands-on SQL development with a focus on integration, optimization, and performance tuning.
- Proven experience with Azure Data Factory (ADF), with preference for Fabric Data Factory.
- Exposure to ETL/orchestration tools such as Matillion (preferred but not mandatory).
- Proficiency in Python for ETL enhancements and automation.
- Understanding of cloud platforms, particularly Microsoft Azure services.
- Familiarity with version control (GitHub) and CI/CD in data environments.
- Excellent communication and technical writing skills to engage with stakeholders.
- Advanced Azure certifications would be a plus.

Technology & Skill Areas
- Core: Azure Data Factory / Fabric Data Factory, SQL, Python
- Secondary: Matillion, Azure Blob Storage
- Skill Areas: Data Integration, Data Quality, Performance Optimization, Cloud Data Engineering
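To make the restartability and idempotency principles above concrete, here is a minimal sketch of a watermark-driven incremental load whose MERGE keys on a business key, so re-running a failed window never duplicates rows. The table and column names are assumptions for illustration, and `conn` is any DB-API connection (e.g. pyodbc against Azure SQL); this is not the employer's actual pipeline.

```python
# Hypothetical watermark-based incremental load; names are illustrative only.

INCREMENTAL_EXTRACT = """
    SELECT order_id, customer_id, amount, updated_at
    FROM staging.orders
    WHERE updated_at > ?            -- only rows newer than the last watermark
"""

IDEMPOTENT_MERGE = """
    MERGE dw.fact_orders AS tgt
    USING (SELECT ? AS order_id, ? AS customer_id, ? AS amount, ? AS updated_at) AS src
        ON tgt.order_id = src.order_id      -- business key: reruns update, never duplicate
    WHEN MATCHED THEN UPDATE SET
        tgt.customer_id = src.customer_id, tgt.amount = src.amount, tgt.updated_at = src.updated_at
    WHEN NOT MATCHED THEN INSERT (order_id, customer_id, amount, updated_at)
        VALUES (src.order_id, src.customer_id, src.amount, src.updated_at);
"""


def load_orders(conn, last_watermark):
    cur = conn.cursor()
    cur.execute(INCREMENTAL_EXTRACT, (last_watermark,))
    rows = cur.fetchall()
    for row in rows:
        cur.execute(IDEMPOTENT_MERGE, tuple(row))
    conn.commit()  # a single commit keeps the whole batch restartable
    # Return the new watermark (max updated_at seen), or keep the old one.
    return max((r[3] for r in rows), default=last_watermark)
```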
Posted 5 days ago
5.0 - 7.0 years
0 Lacs
pune, maharashtra, india
On-site
The Company

Gentrack provides leading utilities across the world with innovative cleantech solutions. The global pace of change is accelerating, and utilities need to rebuild for a more sustainable future. Working with some of the world's biggest energy and water companies, as well as innovative challenger brands, we are helping companies reshape what it means to be a utilities business. We are driven by our passion to create positive impact. That is why utilities rely on us to drive innovation, deliver great customer experiences, and secure profits. Together, we are renewing utilities.

Our Values and Culture

Colleagues at Gentrack are one big team, working together to drive efficiency in two of the planet's most precious resources, energy and water. We are passionate people who want to drive change through technology and believe in making a difference. Our values drive decisions and how we interact and communicate with customers, partners, shareholders, and each other. Our core values are respect for the planet, respect for our customers, and respect for each other. Gentrackers are a group of smart thinkers and dedicated doers. We are a diverse team who love our work and the people we work with, and who collaborate and inspire each other to deliver creative solutions that make our customers successful. We are a team that shares knowledge, asks questions, raises the bar, and are expert advisers. At Gentrack we care about doing honest business that is good for not just customers but families, communities, and ultimately the planet. Gentrackers continuously look for a better way and drive quality into everything they do. This is a truly exciting time to join Gentrack, with a clear growth strategy and a world-class leadership team working to fulfil Gentrack's global aspirations by having the most talented people, an inspiring culture, and a technology-first, people-centric business.

The Opportunity

As a Senior BI Developer, you will play a key role in designing, governing, and optimizing the core data models within our Data & Analytics platform. You will lead the development of scalable, modular, and business-aligned models that serve as the foundation for analytics and reporting across multiple regions and clients. Collaborating closely with data engineering, BI, and business stakeholders, you will ensure that business logic is accurately embedded in our models, maintain semantic consistency, and support high-performance, secure, and compliant data structures. Your expertise will help translate complex business requirements into robust technical models that enable efficient decision-making and insight generation. Operating in the B2B software and services space, you will contribute to delivering mission-critical solutions for large-scale clients across APAC, the UK, and globally, in line with our commitment to innovation and excellence.

In addition to this, you will:
- Support initiatives run by the GSTF and demonstrate our company values by providing a clear commitment to environmental and social responsibility.
- Contribute by identifying/proposing local sustainable practices and ideas in accordance with our Sustainability Charter.
- Utilize our sustainability app by taking part in challenges and improving behaviours to be more sustainable.

The Specifics
- Lead, define, and deliver enterprise-scale BI workstreams, ensuring solutions are aligned with business objectives, scalable, and maintainable.
- Design, develop, and maintain enterprise-level BI solutions that provide accurate, timely, and actionable insights across the organization.
- Work closely with business stakeholders to gather requirements, translate them into optimised BI datasets, reports, and dashboards, and validate outputs against expectations.
- Define and maintain BI data models in alignment with the enterprise data architecture, ensuring scalability, modularity, and alignment with regional and global requirements.
- Implement and uphold governance standards to ensure consistency, data quality, and compliance across all BI deliverables.
- Collaborate with Data Engineering and Data Modelling teams to ensure semantic consistency, optimal data flow into BI layers, and reusability of core datasets.
- Optimise report and dashboard performance for large datasets and high-volume environments.
- Apply best practices for version control using Git/Bitbucket to manage BI artefacts, releases, and controlled deployments.
- Support region- and country-specific customisations while safeguarding the integrity of the global BI solution core.
- Mentor and guide junior BI developers, reviewing their work for quality, performance, and adherence to standards.
- Ensure BI solutions adhere to data security, RBAC, and privacy requirements.
- Leverage ETL/ELT tools such as Matillion or equivalent platforms to manage and optimise data pipelines feeding BI solutions.
- Maintain comprehensive documentation for BI solutions, including metadata, lineage, and user guides.
- Fulfil any additional duties reasonably requested by your direct line leader.

What we're looking for (you don't need to be a guru at all; we're looking forward to coaching and collaborating with you)
- 5+ years of BI development experience, with at least 3 years in a senior or lead role.
- Proven expertise in designing and delivering BI solutions using Power BI (must-have), and preferably Tableau or equivalent, for cloud data platforms, preferably Snowflake.
- Demonstrated experience delivering enterprise-scale BI solutions that support large datasets, multiple business units, and global/regional variations.
- Proven ability to define, implement, and enforce BI development standards, best practices, and governance frameworks.
- Strong understanding of data modelling principles (dimensional, Data Vault, and semantic layer design).
- Advanced SQL skills and the ability to optimise queries for performance.
- Experience integrating BI tools with ELT/ETL pipelines (Matillion, dbt, Talend, Azure Data Factory, etc.).
- Knowledge of Snowflake-specific features (streams, tasks, dynamic tables) for BI integration.
- Familiarity with DevOps practices, including Git/Bitbucket for version control of BI artefacts.
- Experience with data governance, metadata management, and lineage tools to support BI usage.
- Excellent stakeholder engagement skills, with experience leading workshops and solution walkthroughs.
- Bachelor's degree in Computer Science, Data Management, Business Analytics, or a related field.
- Can-do attitude with a focus on delivering excellence.
- Optimistic outlook with common sense and a sense of humour.
- High levels of energy, sound judgment, and determination with a sense of urgency.
- Exceptional attention to detail.
- Excellent relationship management and interpersonal skills.
- Open-minded consultative approach.
- Ability to give and receive positive and constructive feedback.
- Creative problem-solving skills.
What we offer in return
- Personal growth in leadership, commercial acumen, and technical excellence.
- To be part of a global, winning, high-growth organization with a career path to match.
- A vibrant culture full of people passionate about transformation and making a difference, with a one-team, collaborative ethos.
- A competitive reward package that truly rewards our top talent.
- A chance to make a true impact on society and the planet.

Gentrack wants to work with the best people, no matter their background. So, if you are passionate about learning new things and keen to join the mission, you will fit right in.
Posted 5 days ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As a Senior Data Modeller, you will play a crucial role in designing, governing, and optimizing the core data models within our Data & Analytics platform. Your responsibilities will include leading the development of scalable, modular, and business-aligned models to support analytics and reporting across multiple regions and clients. Collaborating closely with data engineering, BI, and business stakeholders, you will ensure accurate embedding of business logic in the models, maintain semantic consistency, and support high-performance, secure, and compliant data structures. Your expertise will be instrumental in translating complex business requirements into robust technical models that facilitate efficient decision-making and insight generation. Working in the B2B software and services sector, you will contribute to delivering mission-critical solutions for large-scale clients globally, in accordance with our commitment to innovation and excellence. Additionally, you will support initiatives led by the GSTF, demonstrating our company's values of environmental and social responsibility. You will contribute by identifying and proposing local sustainable practices aligned with our Sustainability Charter and participate in challenges to improve sustainable behaviors through our sustainability app. Key Responsibilities: - Design, implement, and maintain scalable and modular data models for Snowflake, incorporating region and country-specific extensions without impacting the global core. - Define, document, and approve changes to the core enterprise data model, embedding business logic into model structures. - Lead data modelling workshops with stakeholders to gather requirements and ensure alignment between business, engineering, and BI teams. - Collaborate with developers, provide technical guidance, and review outputs related to data modelling tasks. - Optimize models for performance, data quality, and governance compliance. - Work with BI teams to ensure semantic consistency and enable self-service analytics. - Ensure adherence to data security, RBAC, and compliance best practices. - Utilize DevOps tools like Git/Bitbucket for version control of data models and related artifacts. - Maintain documentation, metadata, and data lineage for all models. - Preferred: Utilize tools like Matillion or equivalent ETL/ELT tools for model integration workflows. - Fulfill any additional duties as reasonably requested by your direct line leader. Required Skills: - Proven expertise in designing enterprise-level data models for cloud data platforms, preferably Snowflake. - Strong understanding of data warehouse design patterns like dimensional, Data Vault, and other modeling approaches. - Ability to embed business logic into models and translate functional requirements into technical architecture. - Experience managing and approving changes to the core data model, ensuring scalability, semantic consistency, and reusability. - Proficiency in SQL with experience in Snowflake-specific features. - Familiarity with ELT/ETL tools such as Matillion, DBT, Talend, or Azure Data Factory. - Experience with DevOps practices, including version control of modeling artifacts. - Knowledge of metadata management, data lineage, and data cataloging tools. - Strong understanding of data privacy, governance, and RBAC best practices. - Excellent communication and stakeholder engagement skills. - Positive attitude with a focus on delivering excellence. - Strong attention to detail and exceptional relationship management skills. 
- Open-minded consultative approach and ability to provide and receive constructive feedback.
- Creative problem-solving skills and ability to work effectively in a team environment.
Posted 6 days ago
8.0 - 12.0 years
0 Lacs
noida, uttar pradesh
On-site
We are seeking a Snowflake Analytics Support Engineer (AnzoGraph, Matillion) for our esteemed client in Noida. This is a full-time role requiring 8 to 12 years of experience. As a Data & Analytics Support Engineer, you will play a crucial role in supporting and optimizing data pipelines, graph analytics, and cloud-based data platforms. Your responsibilities will include providing technical support for AnzoGraph-based solutions, collaborating with data engineering teams on designing and troubleshooting data pipelines, monitoring and optimizing the performance of graph queries, ensuring seamless integration of data platforms, and supporting data workflows across AWS services. Additionally, you will be involved in incident management, documentation, and knowledge sharing. To excel in this role, you must possess proven experience with AnzoGraph DB or similar graph database technologies, proficiency in Snowflake data warehousing and Matillion ETL, hands-on experience with AWS services, and proficiency in SQL, SPARQL, and graph query languages. Strong problem-solving skills, communication abilities, and the capacity to work both independently and within a team are essential. Preferred qualifications include experience in data support roles within enterprise-scale environments and certifications in AWS, Snowflake, or related technologies. If you are excited about this opportunity and possess the required skills and qualifications, please get in touch with Swapnil Thakur at swapnil.t@ipeopleinfosystems.com to express your interest. Thank you for considering this opportunity.

Best regards,
Swapnil Thakur
Recruitment & Delivery Lead
iPeople Infosystems LLC
Posted 6 days ago
12.0 - 14.0 years
0 Lacs
bengaluru, karnataka, india
On-site
Location: Indore, Pune, Bangalore, Noida & Gurgaon (Immediate Joiners Preferred)
- Overall 12+ years of experience.
- Experience and ability to lead end-to-end delivery and produce results: requirement gathering, working with cross-functional teams, offshore/onshore coordination, and delivery planning.
- Excellent in communication, presentation, stakeholder management, tracking, and delivery.
- Ability to coordinate with onshore and offshore cross-functional teams based out of India and US time zones to deliver concurrent projects.
- Experience in SDLC processes like Agile (Scrum/Kanban) and Waterfall.
- Experience working with complex client ecosystems on large projects.
- Experience in data engineering as a Lead/Architect.
- Experience in SQL, Python, PySpark, Azure Data Services (such as Azure Data Factory, Azure Blob, Azure Data Lake, Key Vault, etc.), Databricks, and data warehousing/modelling.
- Quick learner, self-starter, motivated, result-oriented, and able to work with ownership.
- Good to have: experience in Databricks, including Medallion Architecture, DLT, and Unity Catalog.
- Good to have: experience on Snowflake.
- Understanding of ETL (tools like Matillion, DataStage, etc.) and data warehousing concepts.
- Integration experience with data sources like REST web services, Oracle, SAP HANA, Salesforce, etc.
Posted 6 days ago
3.0 - 8.0 years
10 - 19 Lacs
pune, bengaluru, mumbai (all areas)
Hybrid
CitiusTech is conducting a drive this weekend, on 13th Sep-25, for the below skillset. Required Skillset: Snowflake+Matillion or Snowflake+ADF. Total years of experience: 3 to 12 years. Relevant experience: minimum 3 to 4 years. Work Location: Chennai, Mumbai, Pune, Bengaluru. Work Mode: Hybrid. Interview Mode: Virtual. Interview Date: 13th Sep-25. Interested candidates, kindly share your updated resume to gopinath.r@citiustech.com. Thanks & Regards, Gopinath R.
Posted 1 week ago
5.0 - 10.0 years
5 - 15 Lacs
noida, bengaluru, mumbai (all areas)
Work from Office
Job Title: Matillion Developer, Banking Domain (Snowflake)
Location: Noida
Type: Full-Time
Experience: 6+ years
Must-Have Skills: Matillion, Snowflake, SQL, Banking/Financial Data

Job Summary: We're looking for an experienced Matillion Developer with strong knowledge of Snowflake and exposure to the banking or financial services domain. You will design and maintain scalable ETL pipelines, ensuring data accuracy, performance, and compliance.

Key Responsibilities:
• Develop and optimize ETL workflows using Matillion and Snowflake
• Work with banking data (transactions, compliance, risk, etc.)
• Collaborate with analysts and business teams to deliver data solutions
• Ensure data integrity, security, and performance tuning
• Support cloud-based data integration, primarily on Azure and AWS

Requirements:
• 6+ years in Data Engineering / ETL development
• 4+ years with Matillion and Snowflake
• Strong SQL and data warehousing experience
• Understanding of banking/financial data structures and compliance
• Familiarity with Git, CI/CD, and cloud platforms
Posted 1 week ago
5.0 - 10.0 years
24 - 42 Lacs
pune
Work from Office
Responsibilities: Design, develop, and maintain Snowflake databases using Python and Matillion. Optimize database performance through Python scripting.
Posted 1 week ago
3.0 - 8.0 years
1 - 2 Lacs
hyderabad
Work from Office
Position: ETL Engineer
Location: Gachibowli, Telangana
Employment Type: Full-time / Contract
Experience: 3–10 years

Required Skills
- Hands-on experience with Fivetran (connectors, transformations, monitoring).
- Strong working knowledge of Matillion ETL (jobs, orchestration, transformation, performance tuning).
- Proficiency in SQL (complex queries, optimization, stored procedures).
- Experience with cloud data warehouses (Snowflake, Redshift, BigQuery, or Azure Synapse).
- Familiarity with data modelling techniques (star schema, snowflake schema, slowly changing dimensions); a worked example follows this posting.
- Exposure to Qlik Sense or other BI platforms (dashboard integration, data prep).
- Strong problem-solving skills and attention to detail.

Nice-to-Have
- Experience with data orchestration tools (Airflow, dbt, Control-M).
- Familiarity with Python/JavaScript for custom transformations or APIs.
- Understanding of data governance, security, and compliance.
- Exposure to DevOps/CI-CD for ETL (Git, Jenkins, Kubernetes).

Qualifications
- Bachelor's/Master's in Computer Science, IT, or a related field.
- 3–7 years of data integration/ETL development experience.
- Prior experience with cloud-native ETL stacks (Fivetran, Matillion, Stitch, dbt) preferred.
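For the slowly changing dimension technique named above, here is a minimal Type 2 maintenance sketch: close out the current row when tracked attributes change, then insert a new current version. The dimension and staging table names, tracked columns, and the `snowflake.connector`-style cursor are assumptions for illustration only.

```python
# Hypothetical SCD Type 2 maintenance for a customer dimension.
# `cur` is any DB-API cursor, e.g. from snowflake.connector; names are illustrative.

CLOSE_CHANGED_ROWS = """
    UPDATE dim_customer
    SET effective_to = CURRENT_TIMESTAMP(), is_current = FALSE
    FROM stg_customer s
    WHERE dim_customer.customer_id = s.customer_id
      AND dim_customer.is_current = TRUE
      AND (dim_customer.email <> s.email OR dim_customer.segment <> s.segment)
"""

INSERT_NEW_VERSIONS = """
    INSERT INTO dim_customer (customer_id, email, segment, effective_from, effective_to, is_current)
    SELECT s.customer_id, s.email, s.segment, CURRENT_TIMESTAMP(), NULL, TRUE
    FROM stg_customer s
    LEFT JOIN dim_customer d
           ON d.customer_id = s.customer_id AND d.is_current = TRUE
    WHERE d.customer_id IS NULL                           -- brand-new customers
       OR d.email <> s.email OR d.segment <> s.segment    -- changed customers get a new row
"""


def apply_scd2(cur):
    # Order matters: closing current rows first means changed customers fall
    # into the "no current row" branch of the insert and get a fresh version.
    cur.execute(CLOSE_CHANGED_ROWS)
    cur.execute(INSERT_NEW_VERSIONS)
```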
Posted 1 week ago
6.0 - 10.0 years
22 - 25 Lacs
chennai
Remote
We are seeking a skilled Data Architect with strong expertise in Snowflake, Matillion, and data modeling. The role involves designing scalable data architectures, ensuring data quality, and delivering analytics solutions with business impact. Required Candidate profile Ideal candidate: 5+ years of experience in Snowflake & Matillion with certifications, strong data modeling and analytics expertise, and excellent communication skills to collaborate with stakeholders.
Posted 1 week ago
8.0 - 12.0 years
15 - 25 Lacs
noida
Remote
**Urgent Hiring for the role of Snowflake Developer for our team**

Job Title: Snowflake Developer
Work Timing: 11:00 AM IST onwards
Duration: Contract
Experience: 8+ years
Location: India (Remote)

Job Summary: We are seeking a highly skilled Snowflake Developer to design, develop, and maintain scalable data solutions on the Snowflake platform. The ideal candidate will bring expertise in data warehousing, ETL/ELT processes, and cloud-based data architecture, with a strong ability to translate business needs into robust data solutions. This role requires fluency in English and close collaboration with cross-functional teams, including business analysts, data engineers, and stakeholders.

Key Responsibilities
- Design and implement data pipelines using Snowflake, SQL, and ETL tools.
- Develop and optimize complex SQL queries for data extraction and transformation.
- Create and manage Snowflake objects (databases, schemas, tables, views, stored procedures); a sketch follows this posting.
- Integrate Snowflake with various data sources and third-party tools.
- Monitor and troubleshoot performance issues within Snowflake environments.
- Collaborate with data engineers, analysts, and business stakeholders to gather and fulfill data requirements.
- Ensure compliance with data quality, governance, and security standards.
- Automate data workflows and adopt best practices for data management.

Required Skills and Qualifications
- 5–15 years of experience in data engineering or development roles.
- Strong proficiency in Snowflake SQL and Snowflake architecture.
- Experience with ETL/ELT tools (e.g., Informatica, Talend, dbt, Matillion).
- Strong understanding of cloud platforms (AWS, Azure, or GCP).
- Familiarity with data modeling and data warehousing concepts.
- Proficiency in writing high-performance SQL queries/scripts for analytics and reporting.
- Strong problem-solving skills with attention to detail.
- Experience with Agile-based development methodologies.
- Excellent communication skills; fluency in English is mandatory.

Preferred Qualifications
- Snowflake certification (e.g., SnowPro Core).
- Experience with Python, Java, or Shell scripting.
- Knowledge of CI/CD pipelines and DevOps practices.
- Experience with BI tools (Power BI, Tableau, Looker).

Note: Interested candidates can drop their resumes at aagnihotri@fcsltd.com
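For the "create and manage Snowflake objects" responsibility above, here is a minimal sketch using the snowflake-connector-python package. Account credentials, database, schema, and object names are placeholders, not real values, and the DDL is illustrative rather than any employer's actual schema.

```python
# Minimal sketch of creating Snowflake objects with snowflake-connector-python.
import snowflake.connector

DDL_STATEMENTS = [
    "CREATE DATABASE IF NOT EXISTS analytics_db",
    "CREATE SCHEMA IF NOT EXISTS analytics_db.sales",
    """
    CREATE TABLE IF NOT EXISTS analytics_db.sales.orders (
        order_id NUMBER,
        customer_id NUMBER,
        amount NUMBER(12, 2),
        order_ts TIMESTAMP_NTZ
    )
    """,
    """
    CREATE OR REPLACE VIEW analytics_db.sales.daily_revenue AS
    SELECT DATE_TRUNC('day', order_ts) AS order_day, SUM(amount) AS revenue
    FROM analytics_db.sales.orders
    GROUP BY 1
    """,
]

conn = snowflake.connector.connect(
    account="<account_identifier>",   # placeholder credentials
    user="<user>",
    password="<password>",
    warehouse="<warehouse>",
    role="<role>",
)
try:
    cur = conn.cursor()
    for ddl in DDL_STATEMENTS:
        cur.execute(ddl)              # each DDL statement runs individually
finally:
    conn.close()
```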
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
punjab
On-site
You are a skilled and proactive Senior Data Engineer with 5-8 years of hands-on experience in Snowflake, Python, Streamlit, and SQL. You also have expertise in consuming REST APIs and working with modern ETL tools such as Matillion, Fivetran, etc. Your strong foundation in data modeling, data warehousing, and data profiling will be crucial as you play a key role in designing and implementing robust data solutions that drive business insights and innovation. Your responsibilities will include designing, developing, and maintaining data pipelines and workflows using Snowflake and an ETL tool. You will develop data applications and dashboards using Python and Streamlit, as well as create and optimize complex SQL queries for data extraction, transformation, and loading. Integrating REST APIs for data access and process automation, performing data profiling, quality checks, and troubleshooting to ensure data accuracy and integrity, and designing and implementing scalable and efficient data models aligned with business requirements are also part of your key responsibilities. To excel in this role, you must have experience in HR data and databases, along with 5-8 years of professional experience in a data engineering or development role. Strong expertise in Snowflake, proficiency in Python (including data manipulation with libraries like Pandas), experience building web-based data tools using Streamlit, and a solid understanding and experience with RESTful APIs and JSON data structures are essential. Strong SQL skills, experience with advanced data transformation logic, and hands-on experience in data modeling, data warehousing concepts, and data profiling techniques are also required. Familiarity with version control (e.g., Git) and CI/CD processes is a plus. Preferred qualifications include experience working in cloud environments (AWS, Azure, or GCP), knowledge of data governance and cataloging tools, experience with agile methodologies and working in cross-functional teams, and experience in HR data and databases. Experience in Azure Data Factory would also be beneficial.,
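The posting above combines Streamlit, REST API consumption, and pandas around HR data. Here is a minimal, hedged sketch of that combination; the endpoint URL, bearer token, and field names (department, month, headcount) are hypothetical and only stand in for whatever the real API returns.

```python
# Hedged sketch: a small Streamlit page that pulls rows from a REST endpoint
# and charts them with pandas. URL, token, and columns are placeholders.
import pandas as pd
import requests
import streamlit as st

API_URL = "https://example.internal/api/headcount"   # placeholder endpoint

st.title("Headcount by Department")

response = requests.get(API_URL, headers={"Authorization": "Bearer <token>"}, timeout=30)
response.raise_for_status()
df = pd.DataFrame(response.json())                   # expects a list of JSON records

department = st.selectbox("Department", sorted(df["department"].unique()))
filtered = df[df["department"] == department]

st.dataframe(filtered)                                   # raw rows for the selection
st.bar_chart(filtered.set_index("month")["headcount"])   # simple monthly trend
```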
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
hyderabad, telangana
On-site
You are an accomplished and dynamic Data Engineering Lead with 7 to 10 years of experience in data engineering, specializing in data pipeline development, architecture, and system optimization. Your proven track record includes leading data engineering teams and successfully managing end-to-end project delivery. Your technical expertise includes proficiency in programming languages such as Python or Scala, designing and delivering data pipelines in Cloud Data Warehouses (e.g., Snowflake, Redshift) using ETL/ELT tools like Matillion, dbt, Striim, etc. You have a solid understanding of database systems (relational and NoSQL), data modeling techniques, and hands-on experience in designing and developing data integration solutions using tools like Matillion and/or dbt. You are well-versed in data engineering and integration frameworks, architecting data solutions, implementing end-to-end projects with multiple transformation layers, defining coding standards and testing strategies, working with cloud platforms (AWS, Azure, GCP) and associated data services, Agile methodology, DevOps processes, containerization (Docker), orchestration tools (Airflow, Control-M), and CI/CD pipelines. As the Lead Data Engineer, you will be responsible for architecting, designing, and implementing end-to-end data pipelines and systems, ensuring optimal system architecture for performance, scalability, and reliability, evaluating and integrating new technologies, and implementing best practices in ETL/ELT processes, data integration, and data warehousing. You will lead technical project execution, collaborate with cross-functional teams, act as the primary point of contact for clients, manage and mentor a team of 5 to 7 data engineers, and optimize existing data workflows for performance and cost-efficiency. Your adaptability, innovation, and client collaboration skills will be crucial as you embrace a consulting mindset, identify opportunities for innovation, engage with stakeholders, present technical concepts to both technical and non-technical audiences, and stay updated with emerging data technologies. You will drive internal initiatives, contribute to the organization's eminence, and continuously learn, innovate, and collaborate to deliver impactful, scalable data solutions in a fast-paced consulting environment.,
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
You will be working with Sonata Software Limited, a prominent Modernization engineering company headquartered in Bangalore, with a global presence. Specializing in cloud and data modernization, Microsoft Dynamics Modernization, Digital contact center setup, and management, managed cloud services, and digital transformation services, Sonata focuses on innovation and customer-centricity to facilitate businesses in accelerating their modernization and digital transformation journeys. As a full-time Matillion Developer/Admin based in Hyderabad, you will undertake various responsibilities. These include managing day-to-day tasks such as installation, patching, and software development using Matillion and other tools. You will be involved in implementing data pipelines, troubleshooting, and performance tuning activities. Collaboration with cross-functional teams will be essential to ensure the successful execution of projects. To excel in this role, you should possess a minimum of 3 years of experience with the Matillion ETL tool, demonstrating expertise in developing ETL processes and SQL. A strong understanding of data modeling, data warehousing concepts, and data integration techniques is crucial. Proficiency in maintaining data pipelines, troubleshooting data workflows, and Object-Oriented Programming (OOP) is required. Familiarity with cloud platforms such as AWS, Azure, or GCP, Back-End Web Development, and Programming skills are advantageous. Moreover, you should exhibit excellent analytical and problem-solving skills, along with the ability to collaborate effectively in cross-functional teams. A Bachelor's degree in Computer Science, Information Technology, or a related field is mandatory. Possession of relevant certifications in Matillion or data integration technologies is considered a plus. If you are an immediate joiner with the desired qualifications and experience, we look forward to your application for this exciting opportunity.,
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
You will be responsible for designing and implementing scalable data models using Snowflake to support business intelligence and analytics solutions. This will involve implementing ETL/ELT solutions with complex business transformations and handling end-to-end data warehousing solutions. Additionally, you will be tasked with migrating data from legacy systems to Snowflake and writing complex SQL queries to extract, transform, and load data with a focus on high performance and accuracy. Your role will also include optimizing SnowSQL queries for better processing speeds and integrating Snowflake with third-party applications.

To excel in this role, you should have a strong understanding of Snowflake architecture, features, and best practices. Experience in using Snowpipe and Snowpark/Streamlit, as well as familiarity with cloud platforms such as AWS, Azure, or GCP and other cloud-based data technologies, will be beneficial. Knowledge of data modeling concepts like star schema, snowflake schema, and data partitioning is essential. Experience with tools like dbt, Matillion, or Airbyte for data transformation and automation is preferred, along with familiarity with Snowflake's Time Travel, Streams, and Tasks features. Proficiency in data pipeline orchestration using tools like Airflow or Prefect, as well as scripting and automation skills in Python or Java, is required. Additionally, experience with data visualization tools like Tableau, Power BI, QlikView/QlikSense, or Looker will be advantageous.
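Since the posting above calls out Snowflake Streams and Tasks, here is an illustrative stream-plus-task pair for incremental processing, executed through any Snowflake cursor. The object names, warehouse, schedule, and merged columns are assumptions for illustration only.

```python
# Illustrative Snowflake stream + task deployment; names and schedule are assumed.
STREAM_AND_TASK = [
    # Capture row-level changes on the raw table.
    "CREATE OR REPLACE STREAM raw_orders_stream ON TABLE raw.orders",
    # A scheduled task that merges captured changes every 15 minutes,
    # but only when the stream actually has data.
    """
    CREATE OR REPLACE TASK merge_orders_task
        WAREHOUSE = transform_wh
        SCHEDULE = '15 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('RAW_ORDERS_STREAM')
    AS
        MERGE INTO curated.orders t
        USING raw_orders_stream s
            ON t.order_id = s.order_id
        WHEN MATCHED THEN UPDATE SET t.amount = s.amount, t.status = s.status
        WHEN NOT MATCHED THEN INSERT (order_id, amount, status)
            VALUES (s.order_id, s.amount, s.status)
    """,
    # Tasks are created suspended; resume to start the schedule.
    "ALTER TASK merge_orders_task RESUME",
]


def deploy(cur):
    for stmt in STREAM_AND_TASK:
        cur.execute(stmt)
```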
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
ahmedabad, gujarat
On-site
As a skilled Snowflake Data Engineer with over 4 years of experience, you will be responsible for designing and implementing scalable data solutions using Snowflake as a cloud data platform. Your expertise in data modelling, ETL processes, and performance optimization will be crucial for the success of our projects. Your key responsibilities will include developing and optimizing data pipelines on Snowflake, managing databases and schemas, and integrating data from various sources using ETL/ELT tools like Talend, Informatica, or Matillion. You will also be expected to monitor and optimize query performance, design data models based on business requirements, and collaborate with data analysts and other stakeholders to deliver effective data solutions. To excel in this role, you must have at least 2 years of experience working specifically with Snowflake, strong SQL skills, and knowledge of data warehousing concepts. Your familiarity with cloud platforms like AWS, Azure, or GCP, hands-on experience with Snowflake utilities, and proficiency in ETL tools and scripting languages will be essential. Preferred qualifications for this position include SnowPro Certification, experience in Agile/Scrum development methodologies, and knowledge of data governance and compliance standards such as GDPR and HIPAA. Strong problem-solving, analytical, communication, and teamwork skills are also required to succeed in this role. If you are passionate about data engineering, have a proven track record in Snowflake development, and possess the technical and soft skills necessary to thrive in a collaborative environment, we encourage you to apply for this full-time, permanent position with us. The work schedule is day shift, Monday to Friday, and the location is remote. Kindly respond to the following application questions: 1. Do you have SnowPro Certification or equivalent credentials 2. How many years of experience do you have in Agile/Scrum development methodologies 3. How many years of experience do you have as a Snowflake Data Engineer 4. What is your current monthly salary 5. What is your expected salary 6. What is your notice period Join us in revolutionizing data solutions and making a real impact in the industry!,
Posted 2 weeks ago
8.0 - 12.0 years
32 - 35 Lacs
indore, hyderabad, chennai
Work from Office
We are seeking a Senior Data Engineer with expertise in Azure Data Factory, SQL, & Python. The role involves building and optimizing ETL pipelines, ensuring data quality, performance tuning, and collaborating with stakeholders for scalable data solutions.

Required Candidate profile: Experienced Data Engineer (7+ yrs) skilled in Azure Data Factory, SQL, Python, & ETL tools. Strong in data integration, quality, performance tuning, & cloud-based solutions, with excellent communication skills.
Posted 2 weeks ago
3.0 - 8.0 years
6 - 16 Lacs
bengaluru
Hybrid
***We are looking for immediate joiners for our Bangalore location***

Role & responsibilities
- Hands-on experience in admin activities on Snowflake, DBT, Matillion & AWS Glue, with exposure to development and support environments.
- Experience in real-time CDC tools: IBM InfoSphere and Qlik Replicate.
- Willingness to quickly learn/adapt to other ETL/BI cloud tools if and as needed.
- Strong Snowflake SQL and database skills.
- Strong written and oral communication skills.
- Able to understand business users and their issues, and take them to closure.
- Ability to learn and adapt, and to understand business requirements, data models, and mapping documents.
- Independently design, develop, and test ETLs as per requirements.
- Familiarity with ETL review and change management processes.
- Ability to work with the testing team to test deliverables and fix defects.
- Hands-on experience with ticketing tools such as ServiceNow or JIRA.
- Ability to work with multiple teams and handle tasks beyond day-to-day activities.
- Flexible and ready to work in rotational shifts on a need basis.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
You are a highly skilled and experienced Data Architect with expertise in cloud-based solutions. You will be responsible for designing, implementing, and optimizing data architecture to meet the organization's current and future needs. Your role will involve data modeling, transformation, governance, and hands-on experience with modern cloud platforms and tools such as Snowflake, Spark, Data Lakes, and Data Warehouses. Collaboration with cross-functional teams and stakeholders is crucial, and you will establish and enforce standards and guidelines across data platforms to ensure consistency, scalability, and best practices. You will be accountable for architecting and implementing scalable, secure, and high-performance cloud data platforms that integrate data lakes, data warehouses, and databases. Developing comprehensive data models to support analytics, reporting, and operational needs will be a key responsibility. Leading the design and execution of ETL/ELT pipelines to process and transform data efficiently using tools like Talend, Matillion, SQL, BigData, Hadoop, AWS EMR, and Apache Spark is essential. You will integrate diverse data sources into cohesive and reusable datasets for business intelligence and machine learning purposes. Establishing, documenting, and enforcing standards and guidelines for data architecture, data modeling, transformation, and governance across all data platforms will be part of your role. Ensuring consistency and best practices in data storage, integration, and security throughout the organization is critical. You will establish and enforce data governance standards to ensure data quality, security, and compliance with regulatory requirements, implementing processes and tools to manage metadata, lineage, and data access controls. Your expertise will be utilized in utilizing Snowflake for advanced analytics and data storage needs, optimizing performance and cost efficiency. Leveraging modern cloud platforms to manage data lakes and ensure seamless integration with other services is also a key responsibility. Collaboration with business stakeholders, data engineers, and analysts to gather requirements and translate them into technical designs is essential, along with effectively communicating architectural decisions, trade-offs, and progress to both technical and non-technical audiences. Continuous improvement is part of your role, where you will stay updated on emerging trends in cloud and data technologies, recommending innovations to enhance the organization's data capabilities and optimizing existing architectures to improve scalability, performance, and maintainability. Your technical skills should include expertise in data modeling, data architecture design principles, Talend, Matillion, SQL, BigData, Hadoop, AWS EMR, Apache Spark, Snowflake, and cloud-based data platforms. Experience with data lakes, data warehouses, relational and NoSQL databases, data transformation techniques, ETL/ELT pipelines, DevOps/DataOps/MLOps tools, and standards and governance frameworks is necessary. You should have exceptional written and verbal communication skills to interact effectively with technical teams and business stakeholders. Ideally, you should have 5+ years of experience in data architecture focusing on cloud technologies, a proven track record of delivering scalable, cloud-based data solutions, and a Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. 
Preferred qualifications include certifications in Snowflake, AWS data services, any RDBMS/NoSQL, AI/ML, Data Governance, familiarity with machine learning workflows and data pipelines, and experience working in Agile development environments.
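The Data Architect posting above names Apache Spark and AWS EMR among the pipeline technologies. As a small illustration of that kind of transformation step, here is a hedged PySpark sketch; the S3 paths, columns, and aggregation are placeholders, not the role's actual workload.

```python
# Minimal PySpark sketch of a batch rollup, e.g. run on AWS EMR; names are assumed.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_daily_rollup").getOrCreate()

orders = spark.read.parquet("s3://example-bucket/raw/orders/")   # placeholder path

daily = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "region")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.countDistinct("customer_id").alias("unique_customers"),
    )
)

# Write the curated rollup partitioned by day for downstream BI consumption.
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/orders_daily/"
)
```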
Posted 2 weeks ago
1.0 - 5.0 years
0 Lacs
hyderabad, telangana
On-site
You will be part of a team of Novartis specialists within Data and Product Solutions, embarking on a data and digital transformation journey to leverage analytics for generating actionable insights that impact millions of patients worldwide. The primary goal is to facilitate easier, faster, and more reliable decision-making for Novartis divisions globally. Your responsibilities will include exploring, developing, implementing, and evaluating innovative solutions to address customer needs. You will collaborate with key partners to establish partnerships and collaborations, as well as develop and coordinate project plans to ensure successful delivery within set KPIs. Working closely with brand teams, technical teams, and all functions to enhance value will be crucial. Additionally, you will collaborate with global and local Brand teams on Project Planning and delivery management through analytics-based solutions. To be successful in this role, you should have 1-3 years of experience in data analytics within a market research firm, pharmaceutical company, or Pharma KPO. Proficiency in SQL, Dataiku, PowerBI, Alteryx, Excel, and PowerPoint is essential. Exposure to US pharma datasets and DevOps tools like Azure DevOps and JIRA x-Ray is preferred. Strong communication, presentation, and stakeholder management skills are required, along with a proactive business results-focus and the ability to provide insights. Desirable qualifications include exposure to Python, experience in an international company with healthcare analytics exposure, and working in a cross-cultural environment. Novartis is dedicated to fostering an inclusive work environment and diverse teams that reflect the patients and communities served. The company is also committed to providing reasonable accommodation to individuals with disabilities during the recruitment process. If you require accommodation, please contact [email protected] Novartis offers a community of smart, passionate individuals working together to achieve breakthroughs that positively impact patients" lives. If you are ready to contribute to a brighter future, consider joining us at Novartis. To explore career opportunities within the Novartis Network, sign up to our talent community: https://talentnetwork.novartis.com/network For further details on benefits and rewards offered by Novartis, refer to our handbook: https://www.novartis.com/careers/benefits-rewards,
Posted 2 weeks ago
3.0 - 5.0 years
0 Lacs
gurugram, haryana, india
On-site
At Yum! we're looking for a Software Engineer to add to our dynamic and rapidly scaling team. We're making this investment to help us optimize our digital channels and technology innovations with the end goal of creating competitive advantages for our restaurants around the globe. We're looking for a solid lead engineer who brings fresh ideas from past experiences and is eager to tackle new challenges in our company. We're in search of a candidate who is knowledgeable about and loves working with modern data integration frameworks, big data, and cloud technologies. Candidates must also be proficient with data programming languages (e.g., Python and SQL). The Yum! data engineer will build a variety of data pipelines and models to support advanced AI/ML analytics projects, with the intent of elevating the customer experience and driving revenue and profit growth in our restaurants globally. The candidate will work in our office in Gurgaon, India.

As a Software Engineer, You Will
- Partner with KFC, Pizza Hut, Taco Bell & Habit Burger to build data pipelines that enable best-in-class restaurant technology solutions.
- Play a key role in our AIDA team, developing data solutions responsible for driving Yum! growth.
- Develop and maintain high-performance, scalable data solutions.
- Design and develop data pipelines (streaming and batch) to move data from point-of-sale, back-of-house, operational platforms, and more to our Global Data Hub.
- Contribute to standardizing and developing a framework to extend these pipelines across brands and markets.
- Develop on the Yum! data platform by building applications using a mix of open-source frameworks (PySpark, Kubernetes, Airflow, etc.) and best-in-breed SaaS tools (Informatica Cloud, Snowflake, Domo, etc.).
- Implement and manage production support processes around data lifecycle, data quality, coding utilities, storage, reporting, and other data integration points.
- Develop scalable REST APIs in Python.
- Develop and maintain backend services using Python (e.g., FastAPI, Flask, Django); a brief sketch follows this posting.

Minimum Requirements
- Vast background in all things data related (3+ years of experience).
- AWS platform development experience (EKS, S3, API Gateway, Lambda, etc.).
- Experience with modern ETL tools such as Informatica, Matillion, or dbt; Informatica CDI is a plus.
- High level of proficiency with SQL (Snowflake a big plus).
- Proficiency with Python for transforming data and automating tasks.
- Experience with Kafka, Pulsar, or other streaming technologies.
- Experience orchestrating complex task flows across a variety of technologies.
- Bachelor's degree from an accredited institution or relevant experience.
- Experience with at least one NoSQL database (MongoDB, Elasticsearch, etc.).
We put pizza, chicken and tacos in the hands of customers through customized ordering, unique delivery approaches, app experiences, click-and-collect services, and consumer data analytics, creating unique customer dining experiences, and we are only getting started. Employees may work for a single brand and potentially grow to support all company-owned brands depending on their role. Regardless of where they work, as a company opening an average of 8 restaurants a day worldwide, the growth opportunities are endless. Taco Bell has been named one of the 10 Most Innovative Companies in the World by Fast Company; Pizza Hut delivers more pizzas than any other pizza company in the world; and KFC still uses its 75-year-old Finger Lickin' Good recipe, including secret herbs and spices, to hand-bread its chicken every day. Yum! and its brands have offices in Chicago, IL, Louisville, KY, Irvine, CA, Plano, TX and other markets around the world. We don't just say we are a great place to work; our commitments to the world and our employees show it. Yum! has been named to the Dow Jones Sustainability North America Index and ranked among the top 100 Best Corporate Citizens by Corporate Responsibility Magazine, in addition to being named to the Bloomberg Gender-Equality Index. Our employees work in an environment where the value of "believe in all people" is lived every day, enjoying benefits including but not limited to: 4 weeks vacation PLUS holidays, sick leave and 2 paid days to volunteer at the cause of their choice and a dollar-for-dollar matching gift program; generous parental leave; competitive benefits including medical, dental, vision and life insurance; as well as a 6% 401k match, all encompassed in Yum!'s world-famous recognition culture.
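For the Python backend-services requirement noted in the posting above (FastAPI named as one option), here is a minimal, hedged FastAPI sketch. The route, model fields, and in-memory store are illustrative stand-ins, not Yum!'s actual services or data.

```python
# Hedged sketch of a small FastAPI backend service; everything here is illustrative.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="store-orders-api")


class Order(BaseModel):
    order_id: str
    store_id: str
    total: float


# In-memory stand-in for a real database or warehouse-backed service layer.
ORDERS: dict[str, Order] = {}


@app.post("/orders", status_code=201)
def create_order(order: Order) -> Order:
    ORDERS[order.order_id] = order
    return order


@app.get("/orders/{order_id}")
def get_order(order_id: str) -> Order:
    if order_id not in ORDERS:
        raise HTTPException(status_code=404, detail="order not found")
    return ORDERS[order_id]

# Run locally with: uvicorn main:app --reload   (assuming this file is main.py)
```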
Posted 2 weeks ago
5.0 - 8.0 years
0 - 3 Lacs
hyderabad
Remote
Job Description:
- Experience in ETL tools with strong SQL skills, minimum 5+ years; Matillion preferred.
- Experience in creating pipelines for data warehouses, with hands-on Snowflake data manipulation and tuning.
- Experience in backend programming, including schema and table design, stored procedures, triggers, views, and indexes.
- Conduct data analysis, mapping and transformation, and data modeling, applying data warehouse concepts.
- Strong working experience with Agile, Scrum, Kanban, and Waterfall methodologies.
- Knowledge of the Matillion tool is an added advantage.
- Strong communication skills, written and oral.
- Azure cloud platform experience required.
Posted 2 weeks ago