
169 Snowflake DB Jobs - Page 2

Set Up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 8.0 years

9 - 13 Lacs

Chennai

Work from Office

About The Opportunity:
Join a pioneering consulting firm in the Data Analytics and Cloud Solutions sector, where transformative data architectures empower global enterprises. We specialize in leveraging cutting-edge Snowflake technologies and innovative cloud solutions to drive real-time insights and business intelligence. This remote role, based in India, offers the opportunity to work on high-impact projects while collaborating with a diverse team of experts.

Role & Responsibilities:
- Design, implement, and optimize scalable data warehousing solutions using Snowflake Cortex.
- Develop robust ETL pipelines that ensure data quality, reliability, and efficient integration across platforms.
- Collaborate with cross-functional teams to translate business requirements into innovative cloud data solutions.
- Monitor system performance, analyze query execution, and implement optimization strategies to drive efficiency.
- Troubleshoot and resolve data integration issues, ensuring continuity and minimizing downtime.
- Mentor junior team members on best practices and the latest trends in Snowflake and cloud technologies.

Skills & Qualifications:
Must-Have:
- 5+ years of hands-on experience with Snowflake Cortex and cloud-based data warehousing solutions.
- Proficiency in SQL and extensive experience in building, managing, and optimizing ETL pipelines.
- Proven track record of integrating large-scale data solutions in a dynamic consulting environment.
Preferred:
- Familiarity with major cloud platforms such as AWS, Azure, or Google Cloud.
- Experience working remotely and within agile development frameworks.
- Excellent problem-solving, communication, and team collaboration skills.

Benefits & Culture Highlights:
- Competitive salary paired with performance-based incentives.
- Flexible remote work opportunities fostering a healthy work-life balance.
- A dynamic, inclusive, and collaborative work culture committed to continuous learning and professional growth.
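For candidates unfamiliar with Snowflake Cortex, here is a minimal, illustrative sketch (not part of the posting) of calling a Cortex function from Python with the snowflake-connector-python package; the credentials and the reviews table are hypothetical assumptions.

```python
# Hedged sketch: invoking a Snowflake Cortex function over warehouse data.
# Connection arguments and the 'reviews' table are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # hypothetical account identifier
    user="etl_user",
    password="...",
    warehouse="analytics_wh",
    database="demo_db",
    schema="public",
)

# SNOWFLAKE.CORTEX.SENTIMENT scores free text directly in SQL, so the
# AI step runs next to the data instead of in a separate service.
rows = conn.cursor().execute(
    """
    SELECT review_id,
           SNOWFLAKE.CORTEX.SENTIMENT(review_text) AS sentiment_score
    FROM reviews
    LIMIT 10
    """
).fetchall()

for review_id, score in rows:
    print(review_id, score)

conn.close()
```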

Posted 2 days ago

Apply

10.0 - 12.0 years

20 - 25 Lacs

Hyderabad

Work from Office

Key Responsibilities:
As an Enterprise Data Architect, you will:
- Lead Data Architecture: Design, develop, and implement comprehensive enterprise data architectures, primarily leveraging the Azure and Snowflake platforms.
- Data Transformation & ETL: Oversee and guide complex data transformation and ETL processes for large and diverse datasets, ensuring data integrity, quality, and performance.
- Customer-Centric Data Design: Specialize in designing and optimizing customer-centric datasets from various sources, including CRM, Call Center, Marketing, Offline, and Point of Sale systems.
- Data Modeling: Drive the creation and maintenance of advanced data models, including Relational, Dimensional, Columnar, and Big Data models, to support analytical and operational needs.
- Query Optimization: Develop, optimize, and troubleshoot complex SQL and NoSQL queries to ensure efficient data retrieval and manipulation.
- Data Warehouse Management: Apply advanced data warehousing concepts to build and manage high-performing, scalable data warehouse solutions.
- Tool Evaluation & Implementation: Evaluate, recommend, and implement industry-leading ETL tools such as Informatica and Unifi, ensuring best practices are followed.
- Business Requirements & Analysis: Lead business requirements definition and management, structured analysis, process design, and use case documentation to translate business needs into technical specifications.
- Reporting & Analytics Support: Collaborate with reporting teams, providing architectural guidance and support for reporting technologies such as Tableau and Power BI.
- Software Development Practices: Apply professional software development principles and best practices to data solution delivery.
- Stakeholder Collaboration: Interface effectively with sales teams and engage directly with customers to understand their data challenges and lead them to successful outcomes.
- Project Management & Multi-tasking: Demonstrate exceptional organizational skills, with the ability to manage and prioritize multiple simultaneous customer projects effectively.
- Strategic Thinking & Leadership: Act as a self-managed, proactive, and customer-focused leader, driving innovation and continuous improvement in data architecture.

Position Requirements:
- Strong experience with data transformation & ETL on large data sets.
- Experience designing customer-centric datasets (e.g., CRM, Call Center, Marketing, Offline, Point of Sale).
- 5+ years of data modeling experience (e.g., Relational, Dimensional, Columnar, Big Data).
- 5+ years of complex SQL or NoSQL experience.
- Extensive experience with advanced data warehouse concepts.
- Proven experience with industry ETL tools (e.g., Informatica, Unifi).
- Solid experience with business requirements definition and management, structured analysis, process design, and use case documentation.
- Experience with reporting technologies (e.g., Tableau, Power BI).
- Demonstrated experience in professional software development.
- Exceptional organizational skills and the ability to manage multiple simultaneous customer projects.
- Strong verbal and written communication skills to interface with sales teams and lead customers to successful outcomes.
- Must be self-managed, proactive, and customer-focused.

Technical Skills:
- Cloud Platforms: Microsoft Azure
- Data Warehousing: Snowflake
- ETL Methodologies: Extensive experience with ETL processes and tools
- Data Transformation: Large-scale data transformation
- Data Modeling: Relational, Dimensional, Columnar, Big Data
- Query Languages: Complex SQL, NoSQL
- ETL Tools: Informatica, Unifi (or similar enterprise-grade tools)
- Reporting & BI: Tableau, Power BI

Posted 2 days ago

Apply

5.0 - 8.0 years

9 - 13 Lacs

Hyderabad

Remote

This remote role, based in India, offers the opportunity to work on high-impact projects while collaborating with a diverse team of experts.

Role & Responsibilities:
- Design, implement, and optimize scalable data warehousing solutions using Snowflake Cortex.
- Develop robust ETL pipelines that ensure data quality, reliability, and efficient integration across platforms.
- Collaborate with cross-functional teams to translate business requirements into innovative cloud data solutions.
- Monitor system performance, analyze query execution, and implement optimization strategies to drive efficiency.
- Troubleshoot and resolve data integration issues, ensuring continuity and minimizing downtime.
- Mentor junior team members on best practices and the latest trends in Snowflake and cloud technologies.

Skills & Qualifications:
Must-Have:
- 5+ years of hands-on experience with Snowflake Cortex and cloud-based data warehousing solutions.
- Proficiency in SQL and extensive experience in building, managing, and optimizing ETL pipelines.
- Proven track record of integrating large-scale data solutions in a dynamic consulting environment.
Preferred:
- Familiarity with major cloud platforms such as AWS, Azure, or Google Cloud.
- Experience working remotely and within agile development frameworks.
- Excellent problem-solving, communication, and team collaboration skills.

Benefits & Culture Highlights:
- Competitive salary paired with performance-based incentives.
- Flexible remote work opportunities fostering a healthy work-life balance.
- A dynamic, inclusive, and collaborative work culture committed to continuous learning and professional growth.

Posted 3 days ago

Apply

10.0 - 12.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Key Responsibilities:
As an Enterprise Data Architect, you will:
- Lead Data Architecture: Design, develop, and implement comprehensive enterprise data architectures, primarily leveraging the Azure and Snowflake platforms.
- Data Transformation & ETL: Oversee and guide complex data transformation and ETL processes for large and diverse datasets, ensuring data integrity, quality, and performance.
- Customer-Centric Data Design: Specialize in designing and optimizing customer-centric datasets from various sources, including CRM, Call Center, Marketing, Offline, and Point of Sale systems.
- Data Modeling: Drive the creation and maintenance of advanced data models, including Relational, Dimensional, Columnar, and Big Data models, to support analytical and operational needs.
- Query Optimization: Develop, optimize, and troubleshoot complex SQL and NoSQL queries to ensure efficient data retrieval and manipulation.
- Data Warehouse Management: Apply advanced data warehousing concepts to build and manage high-performing, scalable data warehouse solutions.
- Tool Evaluation & Implementation: Evaluate, recommend, and implement industry-leading ETL tools such as Informatica and Unifi, ensuring best practices are followed.
- Business Requirements & Analysis: Lead business requirements definition and management, structured analysis, process design, and use case documentation to translate business needs into technical specifications.
- Reporting & Analytics Support: Collaborate with reporting teams, providing architectural guidance and support for reporting technologies such as Tableau and Power BI.
- Software Development Practices: Apply professional software development principles and best practices to data solution delivery.
- Stakeholder Collaboration: Interface effectively with sales teams and engage directly with customers to understand their data challenges and lead them to successful outcomes.
- Project Management & Multi-tasking: Demonstrate exceptional organizational skills, with the ability to manage and prioritize multiple simultaneous customer projects effectively.
- Strategic Thinking & Leadership: Act as a self-managed, proactive, and customer-focused leader, driving innovation and continuous improvement in data architecture.

Position Requirements:
- Strong experience with data transformation & ETL on large data sets.
- Experience designing customer-centric datasets (e.g., CRM, Call Center, Marketing, Offline, Point of Sale).
- 5+ years of data modeling experience (e.g., Relational, Dimensional, Columnar, Big Data).
- 5+ years of complex SQL or NoSQL experience.
- Extensive experience with advanced data warehouse concepts.
- Proven experience with industry ETL tools (e.g., Informatica, Unifi).
- Solid experience with business requirements definition and management, structured analysis, process design, and use case documentation.
- Experience with reporting technologies (e.g., Tableau, Power BI).
- Demonstrated experience in professional software development.
- Exceptional organizational skills and the ability to manage multiple simultaneous customer projects.
- Strong verbal and written communication skills to interface with sales teams and lead customers to successful outcomes.
- Must be self-managed, proactive, and customer-focused.

Technical Skills:
- Cloud Platforms: Microsoft Azure
- Data Warehousing: Snowflake
- ETL Methodologies: Extensive experience with ETL processes and tools
- Data Transformation: Large-scale data transformation
- Data Modeling: Relational, Dimensional, Columnar, Big Data
- Query Languages: Complex SQL, NoSQL
- ETL Tools: Informatica, Unifi (or similar enterprise-grade tools)
- Reporting & BI: Tableau, Power BI

Posted 3 days ago

Apply

5.0 - 7.0 years

5 - 15 Lacs

Chennai

Work from Office

Experience: 5+ years of relevant experience

We are seeking a highly skilled and experienced Snowflake Lead responsible for leading the design, development, and implementation of Snowflake-based data warehousing solutions. You will leverage your deep understanding of ETL and data warehousing concepts to build robust and scalable data pipelines. A key aspect of this role involves direct interaction with business users to gather and clarify requirements, ensuring that the delivered solutions meet their analytical needs.

Responsibilities:

Leadership & Delivery:
- Lead a module or a team of developers in the design, development, and deployment of Snowflake solutions.
- Take ownership of the end-to-end delivery of Snowflake modules, ensuring adherence to timelines and quality standards.
- Provide technical guidance and mentorship to team members, fostering a collaborative and high-performing environment.
- Contribute to project planning, estimation, and risk management activities.

Snowflake Expertise:
- Utilize in-depth knowledge of Snowflake architecture, features, and best practices to design efficient and scalable data models and ETL/ELT processes.
- Develop and optimize complex SQL queries and Snowflake scripting for data manipulation and transformation.
- Implement Snowflake utilities such as SnowSQL, Snowpipe, Tasks, Streams, Time Travel, and Cloning as needed (see the sketch after this listing).
- Ensure data security and implement appropriate access controls within the Snowflake environment.
- Monitor and optimize the performance of Snowflake queries and data pipelines.
- Integrate PySpark with Snowflake for data ingestion and processing.
- Understand and apply PySpark best practices and performance-tuning techniques.
- Experience with Spark architecture and its components (e.g., Spark Core, Spark SQL, DataFrames).

ETL & Data Warehousing:
- Apply a strong understanding of ETL/ELT concepts, data warehousing principles (including dimensional modeling and star/snowflake schemas), and data integration techniques.
- Design and develop data pipelines to extract data from various source systems, transform it according to business rules, and load it into Snowflake.
- Work with both structured and semi-structured data, including JSON and XML.
- Experience with ETL tools (e.g., Informatica, Talend, PySpark) is a plus, particularly in the context of integrating with Snowflake.

Requirements Gathering & Clarification:
- Actively participate in requirement gathering sessions with business users and stakeholders.
- Translate business requirements into clear and concise technical specifications and design documents.
- Collaborate with business analysts and users to clarify ambiguities and ensure a thorough understanding of data and reporting needs.
- Validate proposed solutions with users to ensure they meet expectations.

Collaboration & Communication:
- Work closely with other development teams, data engineers, and business intelligence analysts to ensure seamless integration of Snowflake solutions with other systems.
- Communicate effectively with both technical and non-technical stakeholders.
- Provide regular updates on progress and any potential roadblocks.

Best Practices & Continuous Improvement:
- Adhere to and promote best practices in Snowflake development, data warehousing, and ETL processes.
- Stay up to date with the latest Snowflake features and industry trends.
- Identify opportunities for process improvement and optimization.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 5 years of relevant experience in data warehousing and ETL development, with a significant focus on Snowflake.
- Strong proficiency in SQL and experience working with large datasets.
- Solid understanding of data modeling concepts (dimensional modeling, star/snowflake schemas).
- Experience designing and developing ETL or ELT pipelines.
- Proven ability to gather and document business and technical requirements.
- Excellent communication, interpersonal, and problem-solving skills.
- Snowflake certifications (e.g., SnowPro Core) are a plus.
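To make the Snowflake utilities named above concrete, here is a minimal, illustrative sketch (not part of the posting) of a Streams/Tasks change-capture pipeline plus Time Travel and Cloning, issued from Python via snowflake-connector-python; the object names (raw.orders, analytics.orders_clean, etl_wh) and the schedule are hypothetical assumptions.

```python
# Hedged sketch of Snowflake Streams, Tasks, Time Travel, and Cloning.
# Connection arguments and all object names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(account="...", user="...", password="...")
cur = conn.cursor()

# A stream records row-level changes on the source table (CDC).
cur.execute("CREATE OR REPLACE STREAM orders_stream ON TABLE raw.orders")

# A task drains the stream on a schedule, but only when changes exist.
cur.execute("""
    CREATE OR REPLACE TASK load_orders
        WAREHOUSE = etl_wh
        SCHEDULE = '5 MINUTE'
        WHEN SYSTEM$STREAM_HAS_DATA('orders_stream')
    AS
        INSERT INTO analytics.orders_clean
        SELECT order_id, amount FROM orders_stream
        WHERE METADATA$ACTION = 'INSERT'
""")
cur.execute("ALTER TASK load_orders RESUME")   # tasks are created suspended

# Time Travel: query the table as it looked one hour ago.
cur.execute("SELECT COUNT(*) FROM analytics.orders_clean AT(OFFSET => -3600)")

# Zero-copy clone for a dev/test copy of the table.
cur.execute("CREATE TABLE analytics.orders_clean_dev CLONE analytics.orders_clean")
conn.close()
```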

Posted 3 days ago

Apply

5.0 - 7.0 years

5 - 15 Lacs

Pune

Work from Office

Experience: 5+ years of relevant experience

We are seeking a highly skilled and experienced Snowflake Lead responsible for leading the design, development, and implementation of Snowflake-based data warehousing solutions. You will leverage your deep understanding of ETL and data warehousing concepts to build robust and scalable data pipelines. A key aspect of this role involves direct interaction with business users to gather and clarify requirements, ensuring that the delivered solutions meet their analytical needs.

Responsibilities:

Leadership & Delivery:
- Lead a module or a team of developers in the design, development, and deployment of Snowflake solutions.
- Take ownership of the end-to-end delivery of Snowflake modules, ensuring adherence to timelines and quality standards.
- Provide technical guidance and mentorship to team members, fostering a collaborative and high-performing environment.
- Contribute to project planning, estimation, and risk management activities.

Snowflake Expertise:
- Utilize in-depth knowledge of Snowflake architecture, features, and best practices to design efficient and scalable data models and ETL/ELT processes.
- Develop and optimize complex SQL queries and Snowflake scripting for data manipulation and transformation.
- Implement Snowflake utilities such as SnowSQL, Snowpipe, Tasks, Streams, Time Travel, and Cloning as needed.
- Ensure data security and implement appropriate access controls within the Snowflake environment.
- Monitor and optimize the performance of Snowflake queries and data pipelines.
- Integrate PySpark with Snowflake for data ingestion and processing.
- Understand and apply PySpark best practices and performance-tuning techniques.
- Experience with Spark architecture and its components (e.g., Spark Core, Spark SQL, DataFrames).

ETL & Data Warehousing:
- Apply a strong understanding of ETL/ELT concepts, data warehousing principles (including dimensional modeling and star/snowflake schemas), and data integration techniques.
- Design and develop data pipelines to extract data from various source systems, transform it according to business rules, and load it into Snowflake.
- Work with both structured and semi-structured data, including JSON and XML.
- Experience with ETL tools (e.g., Informatica, Talend, PySpark) is a plus, particularly in the context of integrating with Snowflake.

Requirements Gathering & Clarification:
- Actively participate in requirement gathering sessions with business users and stakeholders.
- Translate business requirements into clear and concise technical specifications and design documents.
- Collaborate with business analysts and users to clarify ambiguities and ensure a thorough understanding of data and reporting needs.
- Validate proposed solutions with users to ensure they meet expectations.

Collaboration & Communication:
- Work closely with other development teams, data engineers, and business intelligence analysts to ensure seamless integration of Snowflake solutions with other systems.
- Communicate effectively with both technical and non-technical stakeholders.
- Provide regular updates on progress and any potential roadblocks.

Best Practices & Continuous Improvement:
- Adhere to and promote best practices in Snowflake development, data warehousing, and ETL processes.
- Stay up to date with the latest Snowflake features and industry trends.
- Identify opportunities for process improvement and optimization.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 5 years of relevant experience in data warehousing and ETL development, with a significant focus on Snowflake.
- Strong proficiency in SQL and experience working with large datasets.
- Solid understanding of data modeling concepts (dimensional modeling, star/snowflake schemas).
- Experience designing and developing ETL or ELT pipelines.
- Proven ability to gather and document business and technical requirements.
- Excellent communication, interpersonal, and problem-solving skills.
- Snowflake certifications (e.g., SnowPro Core) are a plus.

Posted 3 days ago

Apply

5.0 - 7.0 years

5 - 15 Lacs

Bengaluru

Work from Office

Experience: 5+ years of relevant experience

We are seeking a highly skilled and experienced Snowflake Lead responsible for leading the design, development, and implementation of Snowflake-based data warehousing solutions. You will leverage your deep understanding of ETL and data warehousing concepts to build robust and scalable data pipelines. A key aspect of this role involves direct interaction with business users to gather and clarify requirements, ensuring that the delivered solutions meet their analytical needs.

Responsibilities:

Leadership & Delivery:
- Lead a module or a team of developers in the design, development, and deployment of Snowflake solutions.
- Take ownership of the end-to-end delivery of Snowflake modules, ensuring adherence to timelines and quality standards.
- Provide technical guidance and mentorship to team members, fostering a collaborative and high-performing environment.
- Contribute to project planning, estimation, and risk management activities.

Snowflake Expertise:
- Utilize in-depth knowledge of Snowflake architecture, features, and best practices to design efficient and scalable data models and ETL/ELT processes.
- Develop and optimize complex SQL queries and Snowflake scripting for data manipulation and transformation.
- Implement Snowflake utilities such as SnowSQL, Snowpipe, Tasks, Streams, Time Travel, and Cloning as needed.
- Ensure data security and implement appropriate access controls within the Snowflake environment.
- Monitor and optimize the performance of Snowflake queries and data pipelines.
- Integrate PySpark with Snowflake for data ingestion and processing.
- Understand and apply PySpark best practices and performance-tuning techniques.
- Experience with Spark architecture and its components (e.g., Spark Core, Spark SQL, DataFrames).

ETL & Data Warehousing:
- Apply a strong understanding of ETL/ELT concepts, data warehousing principles (including dimensional modeling and star/snowflake schemas), and data integration techniques.
- Design and develop data pipelines to extract data from various source systems, transform it according to business rules, and load it into Snowflake.
- Work with both structured and semi-structured data, including JSON and XML.
- Experience with ETL tools (e.g., Informatica, Talend, PySpark) is a plus, particularly in the context of integrating with Snowflake.

Requirements Gathering & Clarification:
- Actively participate in requirement gathering sessions with business users and stakeholders.
- Translate business requirements into clear and concise technical specifications and design documents.
- Collaborate with business analysts and users to clarify ambiguities and ensure a thorough understanding of data and reporting needs.
- Validate proposed solutions with users to ensure they meet expectations.

Collaboration & Communication:
- Work closely with other development teams, data engineers, and business intelligence analysts to ensure seamless integration of Snowflake solutions with other systems.
- Communicate effectively with both technical and non-technical stakeholders.
- Provide regular updates on progress and any potential roadblocks.

Best Practices & Continuous Improvement:
- Adhere to and promote best practices in Snowflake development, data warehousing, and ETL processes.
- Stay up to date with the latest Snowflake features and industry trends.
- Identify opportunities for process improvement and optimization.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 5 years of relevant experience in data warehousing and ETL development, with a significant focus on Snowflake.
- Strong proficiency in SQL and experience working with large datasets.
- Solid understanding of data modeling concepts (dimensional modeling, star/snowflake schemas).
- Experience designing and developing ETL or ELT pipelines.
- Proven ability to gather and document business and technical requirements.
- Excellent communication, interpersonal, and problem-solving skills.
- Snowflake certifications (e.g., SnowPro Core) are a plus.

Posted 3 days ago

Apply

6.0 - 8.0 years

8 - 13 Lacs

Hyderabad

Work from Office

About The Opportunity:
Operating at the forefront of the consulting and technology services sector, our client delivers transformative data management and business analytics solutions to a global clientele. Specializing in innovative approaches to data architecture and information management, they empower organizations to make data-driven decisions. We are seeking a seasoned professional to drive the evolution of our data infrastructure. This is an exciting chance to join a high-caliber team that values precision, efficiency, and strategic vision in a fully remote setting from anywhere in India.

Role & Responsibilities:
As a key member of our team, you will be instrumental in shaping our data landscape. Your responsibilities will include:
- Designing, developing, and maintaining comprehensive logical and physical data models that meet stringent business and regulatory requirements.
- Collaborating with cross-functional teams, including data engineers, analysts, and business stakeholders, to align data architecture with evolving business needs.
- Translating complex business requirements into robust and scalable data models, ensuring optimal performance and data integrity.
- Optimizing and refining existing data structures and frameworks to enhance data access, reporting, and analytics capabilities.
- Driving governance and data quality initiatives by establishing best practices and documentation standards for data modeling.
- Mentoring and guiding junior team members, fostering a culture of continuous improvement and technical excellence.

Skills & Qualifications:
Must-Have:
- Experience: 6+ years of experience in data modeling and related disciplines.
- Data Modeling: Extensive expertise in conceptual, logical, and physical models, including star/snowflake schema design, normalization, and denormalization techniques.
- Snowflake: Proven experience with Snowflake schema design, performance tuning, Time Travel, Streams & Tasks, and secure & materialized views.
- SQL & Scripting: Advanced proficiency in SQL (including CTEs and window functions), with a strong focus on automation and optimization.

Key Skills:
- Data Modeling (conceptual, logical, physical)
- Schema Design (Star Schema, Snowflake Schema)
- Optimization & Performance Tuning
- Snowflake (schema design, Time Travel, Streams and Tasks, Secure Views, Materialized Views)
- Advanced SQL (CTEs, Window Functions)
- Normalization & Denormalization
- Automation
- NoSQL (preferred, but not mandatory)
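As an illustration of the "Advanced SQL (CTEs, Window Functions)" item above, here is a minimal, self-contained sketch (not part of the posting) using SQLite's in-memory engine; the sales table and its values are hypothetical, and the same query shape works in Snowflake.

```python
# Hedged sketch: a CTE feeding a window function. SQLite (>= 3.25) supports
# both, so this runs anywhere Python does; the mini-dataset is hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, product TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('north', 'widget', 120.0), ('north', 'gadget', 90.0),
        ('south', 'widget', 200.0), ('south', 'gadget', 310.0);
""")

# The CTE aggregates revenue per region/product; the window function then
# ranks products within each region without a second round of grouping.
query = """
WITH region_totals AS (
    SELECT region, product, SUM(amount) AS revenue
    FROM sales
    GROUP BY region, product
)
SELECT region, product, revenue,
       RANK() OVER (PARTITION BY region ORDER BY revenue DESC) AS rnk
FROM region_totals
ORDER BY region, rnk;
"""
for row in conn.execute(query):
    print(row)
```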

Posted 3 days ago

Apply

5.0 - 10.0 years

15 - 25 Lacs

Noida, Pune, Bengaluru

Hybrid

Experience: 4 to 12 years
Location: Pan India

Key Responsibilities:
- Administer and manage the Snowflake Data Warehouse environment (setup, configuration, upgrades, monitoring).
- Create, configure, and maintain Snowflake accounts, warehouses, databases, schemas, roles, and users.
- Implement and manage RBAC (Role-Based Access Control) and security policies to ensure compliance and data governance.
- Monitor and optimize Snowflake performance, including query tuning, warehouse sizing, clustering, and caching.

Interested candidates, share your CV at himani.girnar@alikethoughts.com with the details below:
- Candidate's name
- Email and alternate email ID
- Contact and alternate contact number
- Total experience
- Relevant experience
- Current organization
- Notice period
- CCTC
- ECTC
- Current location
- Preferred location
- PAN card number
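For context on the RBAC responsibility above, here is a minimal, illustrative sketch (not part of the posting) of a role-based grant chain issued as Snowflake DDL from Python; the role, warehouse, database, and user names are hypothetical assumptions.

```python
# Hedged sketch of Snowflake RBAC setup. All object names are hypothetical;
# connection arguments are elided.
import snowflake.connector

conn = snowflake.connector.connect(account="...", user="...", password="...",
                                   role="SECURITYADMIN")
cur = conn.cursor()

# A functional role plus a right-sized warehouse that suspends when idle.
cur.execute("CREATE ROLE IF NOT EXISTS analyst_role")
cur.execute("""
    CREATE WAREHOUSE IF NOT EXISTS reporting_wh
        WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 60 AUTO_RESUME = TRUE
""")

# Grant only what the role needs: usage plus read-only access.
for stmt in [
    "GRANT USAGE ON WAREHOUSE reporting_wh TO ROLE analyst_role",
    "GRANT USAGE ON DATABASE sales_db TO ROLE analyst_role",
    "GRANT USAGE ON SCHEMA sales_db.public TO ROLE analyst_role",
    "GRANT SELECT ON ALL TABLES IN SCHEMA sales_db.public TO ROLE analyst_role",
]:
    cur.execute(stmt)

# Attach the role to a user; access flows through roles, not direct grants.
cur.execute("CREATE USER IF NOT EXISTS jdoe DEFAULT_ROLE = analyst_role")
cur.execute("GRANT ROLE analyst_role TO USER jdoe")
conn.close()
```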

Posted 3 days ago

Apply

7.0 - 10.0 years

5 - 8 Lacs

Bengaluru

Work from Office

Job Title: Data Engineer / Data Modeler
Location: Remote (India)
Employment Type: Contract (Remote)
Experience Required: 7+ Years

Job Summary:
We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good To Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.
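For candidates curious about the DBT workflow named above, here is a minimal, illustrative sketch (not part of the posting) of driving dbt from Python using dbt-core's programmatic invocation API, assuming dbt-core 1.5 or later; the project directory and model name (stg_orders) are hypothetical.

```python
# Hedged sketch: programmatic dbt invocation (dbt-core >= 1.5).
# The project path and model selection are hypothetical.
from dbt.cli.main import dbtRunner, dbtRunnerResult

dbt = dbtRunner()

# Equivalent to running `dbt run --select stg_orders` from the CLI.
res: dbtRunnerResult = dbt.invoke(
    ["run", "--select", "stg_orders", "--project-dir", "./analytics_project"]
)

if res.success:
    print("dbt models built successfully")
else:
    # res.exception carries the failure raised during the invocation, if any.
    print("dbt run failed:", res.exception)
```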

Posted 3 days ago

Apply

6.0 - 8.0 years

8 - 13 Lacs

Bengaluru

Work from Office

Role & Responsibilities:
As a key member of our team, you will be instrumental in shaping our data landscape. Your responsibilities will include:
- Designing, developing, and maintaining comprehensive logical and physical data models that meet stringent business and regulatory requirements.
- Collaborating with cross-functional teams, including data engineers, analysts, and business stakeholders, to align data architecture with evolving business needs.
- Translating complex business requirements into robust and scalable data models, ensuring optimal performance and data integrity.
- Optimizing and refining existing data structures and frameworks to enhance data access, reporting, and analytics capabilities.
- Driving governance and data quality initiatives by establishing best practices and documentation standards for data modeling.
- Mentoring and guiding junior team members, fostering a culture of continuous improvement and technical excellence.

Skills & Qualifications:
Must-Have:
- Experience: 6+ years of experience in data modeling and related disciplines.
- Data Modeling: Extensive expertise in conceptual, logical, and physical models, including star/snowflake schema design, normalization, and denormalization techniques.
- Snowflake: Proven experience with Snowflake schema design, performance tuning, Time Travel, Streams & Tasks, and secure & materialized views.
- SQL & Scripting: Advanced proficiency in SQL (including CTEs and window functions), with a strong focus on automation and optimization.

Key Skills:
- Data Modeling (conceptual, logical, physical)
- Schema Design (Star Schema, Snowflake Schema)
- Optimization & Performance Tuning
- Snowflake (schema design, Time Travel, Streams and Tasks, Secure Views, Materialized Views)
- Advanced SQL (CTEs, Window Functions)
- Normalization & Denormalization
- Automation
- NoSQL (preferred, but not mandatory)

Posted 3 days ago

Apply

10.0 - 12.0 years

20 - 25 Lacs

Chennai

Work from Office

Key Responsibilities:
As an Enterprise Data Architect, you will:
- Lead Data Architecture: Design, develop, and implement comprehensive enterprise data architectures, primarily leveraging the Azure and Snowflake platforms.
- Data Transformation & ETL: Oversee and guide complex data transformation and ETL processes for large and diverse datasets, ensuring data integrity, quality, and performance.
- Customer-Centric Data Design: Specialize in designing and optimizing customer-centric datasets from various sources, including CRM, Call Center, Marketing, Offline, and Point of Sale systems.
- Data Modeling: Drive the creation and maintenance of advanced data models, including Relational, Dimensional, Columnar, and Big Data models, to support analytical and operational needs.
- Query Optimization: Develop, optimize, and troubleshoot complex SQL and NoSQL queries to ensure efficient data retrieval and manipulation.
- Data Warehouse Management: Apply advanced data warehousing concepts to build and manage high-performing, scalable data warehouse solutions.
- Tool Evaluation & Implementation: Evaluate, recommend, and implement industry-leading ETL tools such as Informatica and Unifi, ensuring best practices are followed.
- Business Requirements & Analysis: Lead business requirements definition and management, structured analysis, process design, and use case documentation to translate business needs into technical specifications.
- Reporting & Analytics Support: Collaborate with reporting teams, providing architectural guidance and support for reporting technologies such as Tableau and Power BI.
- Software Development Practices: Apply professional software development principles and best practices to data solution delivery.
- Stakeholder Collaboration: Interface effectively with sales teams and engage directly with customers to understand their data challenges and lead them to successful outcomes.
- Project Management & Multi-tasking: Demonstrate exceptional organizational skills, with the ability to manage and prioritize multiple simultaneous customer projects effectively.
- Strategic Thinking & Leadership: Act as a self-managed, proactive, and customer-focused leader, driving innovation and continuous improvement in data architecture.

Position Requirements:
- Strong experience with data transformation & ETL on large data sets.
- Experience designing customer-centric datasets (e.g., CRM, Call Center, Marketing, Offline, Point of Sale).
- 5+ years of data modeling experience (e.g., Relational, Dimensional, Columnar, Big Data).
- 5+ years of complex SQL or NoSQL experience.
- Extensive experience with advanced data warehouse concepts.
- Proven experience with industry ETL tools (e.g., Informatica, Unifi).
- Solid experience with business requirements definition and management, structured analysis, process design, and use case documentation.
- Experience with reporting technologies (e.g., Tableau, Power BI).
- Demonstrated experience in professional software development.
- Exceptional organizational skills and the ability to manage multiple simultaneous customer projects.
- Strong verbal and written communication skills to interface with sales teams and lead customers to successful outcomes.
- Must be self-managed, proactive, and customer-focused.

Technical Skills:
- Cloud Platforms: Microsoft Azure
- Data Warehousing: Snowflake
- ETL Methodologies: Extensive experience with ETL processes and tools
- Data Transformation: Large-scale data transformation
- Data Modeling: Relational, Dimensional, Columnar, Big Data
- Query Languages: Complex SQL, NoSQL
- ETL Tools: Informatica, Unifi (or similar enterprise-grade tools)
- Reporting & BI: Tableau, Power BI

Posted 3 days ago

Apply

10.0 - 12.0 years

18 - 22 Lacs

Mumbai

Work from Office

Key Responsibilities:
As an Enterprise Data Architect, you will:
- Lead Data Architecture: Design, develop, and implement comprehensive enterprise data architectures, primarily leveraging the Azure and Snowflake platforms.
- Data Transformation & ETL: Oversee and guide complex data transformation and ETL processes for large and diverse datasets, ensuring data integrity, quality, and performance.
- Customer-Centric Data Design: Specialize in designing and optimizing customer-centric datasets from various sources, including CRM, Call Center, Marketing, Offline, and Point of Sale systems.
- Data Modeling: Drive the creation and maintenance of advanced data models, including Relational, Dimensional, Columnar, and Big Data models, to support analytical and operational needs.
- Query Optimization: Develop, optimize, and troubleshoot complex SQL and NoSQL queries to ensure efficient data retrieval and manipulation.
- Data Warehouse Management: Apply advanced data warehousing concepts to build and manage high-performing, scalable data warehouse solutions.
- Tool Evaluation & Implementation: Evaluate, recommend, and implement industry-leading ETL tools such as Informatica and Unifi, ensuring best practices are followed.
- Business Requirements & Analysis: Lead business requirements definition and management, structured analysis, process design, and use case documentation to translate business needs into technical specifications.
- Reporting & Analytics Support: Collaborate with reporting teams, providing architectural guidance and support for reporting technologies such as Tableau and Power BI.
- Software Development Practices: Apply professional software development principles and best practices to data solution delivery.
- Stakeholder Collaboration: Interface effectively with sales teams and engage directly with customers to understand their data challenges and lead them to successful outcomes.
- Project Management & Multi-tasking: Demonstrate exceptional organizational skills, with the ability to manage and prioritize multiple simultaneous customer projects effectively.
- Strategic Thinking & Leadership: Act as a self-managed, proactive, and customer-focused leader, driving innovation and continuous improvement in data architecture.

Position Requirements:
- Strong experience with data transformation & ETL on large data sets.
- Experience designing customer-centric datasets (e.g., CRM, Call Center, Marketing, Offline, Point of Sale).
- 5+ years of data modeling experience (e.g., Relational, Dimensional, Columnar, Big Data).
- 5+ years of complex SQL or NoSQL experience.
- Extensive experience with advanced data warehouse concepts.
- Proven experience with industry ETL tools (e.g., Informatica, Unifi).
- Solid experience with business requirements definition and management, structured analysis, process design, and use case documentation.
- Experience with reporting technologies (e.g., Tableau, Power BI).
- Demonstrated experience in professional software development.
- Exceptional organizational skills and the ability to manage multiple simultaneous customer projects.
- Strong verbal and written communication skills to interface with sales teams and lead customers to successful outcomes.
- Must be self-managed, proactive, and customer-focused.

Technical Skills:
- Cloud Platforms: Microsoft Azure
- Data Warehousing: Snowflake
- ETL Methodologies: Extensive experience with ETL processes and tools
- Data Transformation: Large-scale data transformation
- Data Modeling: Relational, Dimensional, Columnar, Big Data
- Query Languages: Complex SQL, NoSQL
- ETL Tools: Informatica, Unifi (or similar enterprise-grade tools)
- Reporting & BI: Tableau, Power BI

Posted 3 days ago

Apply

6.0 - 8.0 years

5 - 9 Lacs

Mumbai

Work from Office

About The Opportunity:
Operating at the forefront of the consulting and technology services sector, our client delivers transformative data management and business analytics solutions to a global clientele. Specializing in innovative approaches to data architecture and information management, they empower organizations to make data-driven decisions. We are seeking a seasoned professional to drive the evolution of our data infrastructure. This is an exciting chance to join a high-caliber team that values precision, efficiency, and strategic vision in a fully remote setting from anywhere in India.

Role & Responsibilities:
As a key member of our team, you will be instrumental in shaping our data landscape. Your responsibilities will include:
- Designing, developing, and maintaining comprehensive logical and physical data models that meet stringent business and regulatory requirements.
- Collaborating with cross-functional teams, including data engineers, analysts, and business stakeholders, to align data architecture with evolving business needs.
- Translating complex business requirements into robust and scalable data models, ensuring optimal performance and data integrity.
- Optimizing and refining existing data structures and frameworks to enhance data access, reporting, and analytics capabilities.
- Driving governance and data quality initiatives by establishing best practices and documentation standards for data modeling.
- Mentoring and guiding junior team members, fostering a culture of continuous improvement and technical excellence.

Skills & Qualifications:
Must-Have:
- Experience: 6+ years of experience in data modeling and related disciplines.
- Data Modeling: Extensive expertise in conceptual, logical, and physical models, including star/snowflake schema design, normalization, and denormalization techniques.
- Snowflake: Proven experience with Snowflake schema design, performance tuning, Time Travel, Streams & Tasks, and secure & materialized views.
- SQL & Scripting: Advanced proficiency in SQL (including CTEs and window functions), with a strong focus on automation and optimization.

Key Skills:
- Data Modeling (conceptual, logical, physical)
- Schema Design (Star Schema, Snowflake Schema)
- Optimization & Performance Tuning
- Snowflake (schema design, Time Travel, Streams and Tasks, Secure Views, Materialized Views)
- Advanced SQL (CTEs, Window Functions)
- Normalization & Denormalization
- Automation
- NoSQL (preferred, but not mandatory)

Posted 3 days ago

Apply

5.0 - 8.0 years

4 - 8 Lacs

Gurugram

Remote

About The Opportunity:
Join a pioneering consulting firm in the Data Analytics and Cloud Solutions sector, where transformative data architectures empower global enterprises. We specialize in leveraging cutting-edge Snowflake technologies and innovative cloud solutions to drive real-time insights and business intelligence. This remote role, based in India, offers the opportunity to work on high-impact projects while collaborating with a diverse team of experts.

Role & Responsibilities:
- Design, implement, and optimize scalable data warehousing solutions using Snowflake Cortex.
- Develop robust ETL pipelines that ensure data quality, reliability, and efficient integration across platforms.
- Collaborate with cross-functional teams to translate business requirements into innovative cloud data solutions.
- Monitor system performance, analyze query execution, and implement optimization strategies to drive efficiency.
- Troubleshoot and resolve data integration issues, ensuring continuity and minimizing downtime.
- Mentor junior team members on best practices and the latest trends in Snowflake and cloud technologies.

Skills & Qualifications:
Must-Have:
- 5+ years of hands-on experience with Snowflake Cortex and cloud-based data warehousing solutions.
- Proficiency in SQL and extensive experience in building, managing, and optimizing ETL pipelines.
- Proven track record of integrating large-scale data solutions in a dynamic consulting environment.
Preferred:
- Familiarity with major cloud platforms such as AWS, Azure, or Google Cloud.
- Experience working remotely and within agile development frameworks.
- Excellent problem-solving, communication, and team collaboration skills.

Benefits & Culture Highlights:
- Competitive salary paired with performance-based incentives.
- Flexible remote work opportunities fostering a healthy work-life balance.
- A dynamic, inclusive, and collaborative work culture committed to continuous learning and professional growth.

Posted 3 days ago

Apply

8.0 - 13.0 years

5 - 10 Lacs

Bengaluru

Work from Office

About the Job:
Role: Senior Databricks Engineer / Databricks Technical Lead / Data Architect
Experience: 8-15 years

Primary Roles And Responsibilities:
- Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack.
- Provide forward-thinking solutions in the data engineering and analytics space.
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements.
- Triage issues to find gaps in existing pipelines and fix them.
- Work with the business to understand reporting-layer needs and develop data models to fulfill them.
- Help junior team members resolve issues and technical challenges.
- Drive technical discussions with client architects and team members.
- Orchestrate data pipelines in a scheduler via Airflow.

Skills And Qualifications:
- Bachelor's and/or master's degree in computer science or equivalent experience.
- 6+ years of total IT experience, with 3+ years of experience in data warehouse/ETL projects.
- Deep understanding of Star and Snowflake dimensional modelling.
- Strong knowledge of data management principles.
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture.
- Hands-on experience in SQL, Python, and Spark (PySpark); a minimal example follows this listing.
- Experience with the AWS/Azure stack is required.
- ETL with batch and streaming (Kinesis) is desirable.
- Experience building ETL / data warehouse transformation processes.
- Experience with Apache Kafka for streaming / event-based data.
- Experience with other open-source big data products, including Hadoop (Hive, Pig, Impala).
- Experience with open-source non-relational / NoSQL data repositories (including MongoDB, Cassandra, Neo4j).
- Experience working with structured and unstructured data, including imaging and geospatial data.
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting.
- Databricks Certified Data Engineer Associate/Professional certification (desirable).
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects.
- Experience working in Agile methodology.
- Strong verbal and written communication skills.
- Strong analytical and problem-solving skills with high attention to detail.
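As promised above, a minimal, illustrative PySpark sketch (not part of the posting) of the Delta Lake pattern this role involves: read raw data, clean it, and write a partitioned Delta table. The paths and column names are hypothetical; on Databricks, a SparkSession and the Delta format are available by default.

```python
# Hedged PySpark sketch: raw JSON to a curated Delta table.
# All paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

raw = spark.read.json("/mnt/raw/orders/")          # hypothetical landing path

# Basic cleansing: drop incomplete rows, normalize types, derive a date key.
clean = (
    raw.filter(F.col("status").isNotNull())
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_ts"))
)

# Write as a Delta table, partitioned for downstream reporting queries.
(clean.write.format("delta")
      .mode("overwrite")
      .partitionBy("order_date")
      .save("/mnt/curated/orders/"))
```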

Posted 3 days ago

Apply

7.0 - 10.0 years

5 - 8 Lacs

Mumbai

Remote

Employment Type: Contract (Remote)

Job Summary:
We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good To Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.

Posted 3 days ago

Apply

8.0 - 13.0 years

5 - 10 Lacs

Pune

Work from Office

Primary Roles And Responsibilities:
- Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack.
- Provide forward-thinking solutions in the data engineering and analytics space.
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements.
- Triage issues to find gaps in existing pipelines and fix them.
- Work with the business to understand reporting-layer needs and develop data models to fulfill them.
- Help junior team members resolve issues and technical challenges.
- Drive technical discussions with client architects and team members.
- Orchestrate data pipelines in a scheduler via Airflow.

Skills And Qualifications:
- Bachelor's and/or master's degree in computer science or equivalent experience.
- 6+ years of total IT experience, with 3+ years of experience in data warehouse/ETL projects.
- Deep understanding of Star and Snowflake dimensional modelling.
- Strong knowledge of data management principles.
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture.
- Hands-on experience in SQL, Python, and Spark (PySpark).
- Experience with the AWS/Azure stack is required.
- ETL with batch and streaming (Kinesis) is desirable.
- Experience building ETL / data warehouse transformation processes.
- Experience with Apache Kafka for streaming / event-based data.
- Experience with other open-source big data products, including Hadoop (Hive, Pig, Impala).
- Experience with open-source non-relational / NoSQL data repositories (including MongoDB, Cassandra, Neo4j).
- Experience working with structured and unstructured data, including imaging and geospatial data.
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting.
- Databricks Certified Data Engineer Associate/Professional certification (desirable).
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects.
- Experience working in Agile methodology.
- Strong verbal and written communication skills.
- Strong analytical and problem-solving skills with high attention to detail.

Posted 3 days ago

Apply

7.0 - 10.0 years

5 - 8 Lacs

Gurugram

Remote

Employment Type: Contract (Remote)

Job Summary:
We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good To Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.

Posted 3 days ago

Apply

5.0 - 8.0 years

9 - 13 Lacs

Mumbai

Work from Office

Role & Responsibilities:
- Design, implement, and optimize scalable data warehousing solutions using Snowflake Cortex.
- Develop robust ETL pipelines that ensure data quality, reliability, and efficient integration across platforms.
- Collaborate with cross-functional teams to translate business requirements into innovative cloud data solutions.
- Monitor system performance, analyze query execution, and implement optimization strategies to drive efficiency.
- Troubleshoot and resolve data integration issues, ensuring continuity and minimizing downtime.
- Mentor junior team members on best practices and the latest trends in Snowflake and cloud technologies.

Skills & Qualifications:
Must-Have:
- 5+ years of hands-on experience with Snowflake Cortex and cloud-based data warehousing solutions.
- Proficiency in SQL and extensive experience in building, managing, and optimizing ETL pipelines.
- Proven track record of integrating large-scale data solutions in a dynamic consulting environment.
Preferred:
- Familiarity with major cloud platforms such as AWS, Azure, or Google Cloud.
- Experience working remotely and within agile development frameworks.
- Excellent problem-solving, communication, and team collaboration skills.

Posted 4 days ago

Apply

6.0 - 8.0 years

8 - 13 Lacs

Gurugram

Remote

Role & Responsibilities:
As a key member of our team, you will be instrumental in shaping our data landscape. Your responsibilities will include:
- Designing, developing, and maintaining comprehensive logical and physical data models that meet stringent business and regulatory requirements.
- Collaborating with cross-functional teams, including data engineers, analysts, and business stakeholders, to align data architecture with evolving business needs.
- Translating complex business requirements into robust and scalable data models, ensuring optimal performance and data integrity.
- Optimizing and refining existing data structures and frameworks to enhance data access, reporting, and analytics capabilities.
- Driving governance and data quality initiatives by establishing best practices and documentation standards for data modeling.
- Mentoring and guiding junior team members, fostering a culture of continuous improvement and technical excellence.

Skills & Qualifications:
Must-Have:
- Experience: 6+ years of experience in data modeling and related disciplines.
- Data Modeling: Extensive expertise in conceptual, logical, and physical models, including star/snowflake schema design, normalization, and denormalization techniques.
- Snowflake: Proven experience with Snowflake schema design, performance tuning, Time Travel, Streams & Tasks, and secure & materialized views.
- SQL & Scripting: Advanced proficiency in SQL (including CTEs and window functions), with a strong focus on automation and optimization.

Key Skills:
- Data Modeling (conceptual, logical, physical)
- Schema Design (Star Schema, Snowflake Schema)
- Optimization & Performance Tuning
- Snowflake (schema design, Time Travel, Streams and Tasks, Secure Views, Materialized Views)
- Advanced SQL (CTEs, Window Functions)
- Normalization & Denormalization
- Automation
- NoSQL (preferred, but not mandatory)

Posted 4 days ago

Apply

8.0 - 13.0 years

5 - 10 Lacs

Chennai

Work from Office

Primary Roles And Responsibilities:
- Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack.
- Provide forward-thinking solutions in the data engineering and analytics space.
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements.
- Triage issues to find gaps in existing pipelines and fix them.
- Work with the business to understand reporting-layer needs and develop data models to fulfill them.
- Help junior team members resolve issues and technical challenges.
- Drive technical discussions with client architects and team members.
- Orchestrate data pipelines in a scheduler via Airflow.

Skills And Qualifications:
- Bachelor's and/or master's degree in computer science or equivalent experience.
- 6+ years of total IT experience, with 3+ years of experience in data warehouse/ETL projects.
- Deep understanding of Star and Snowflake dimensional modelling.
- Strong knowledge of data management principles.
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture.
- Hands-on experience in SQL, Python, and Spark (PySpark).
- Experience with the AWS/Azure stack is required.
- ETL with batch and streaming (Kinesis) is desirable.
- Experience building ETL / data warehouse transformation processes.
- Experience with Apache Kafka for streaming / event-based data.
- Experience with other open-source big data products, including Hadoop (Hive, Pig, Impala).
- Experience with open-source non-relational / NoSQL data repositories (including MongoDB, Cassandra, Neo4j).
- Experience working with structured and unstructured data, including imaging and geospatial data.
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting.
- Databricks Certified Data Engineer Associate/Professional certification (desirable).
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects.
- Experience working in Agile methodology.
- Strong verbal and written communication skills.
- Strong analytical and problem-solving skills with high attention to detail.

Posted 4 days ago

Apply

1.0 - 3.0 years

3 - 8 Lacs

Hyderabad

Work from Office

Position: SQL Developer (Immediate Joiner)
Location: Hyderabad
Experience Required: 1-3 Years
Education: Bachelor's degree

Key Responsibilities:
- Typically requires 1-3 years of SQL experience.
- Some knowledge of application development, procedures, utilities, and job control languages is preferred.
- Awareness of the functional impact on work processes and other functions.
- Able to work in teams using software control repository tools (TFS, DevOps).

Experience with the following tools may be required:
- Oracle / SQL Server / Snowflake databases and tools
- Reporting tools
- Visualization tools
- Data virtualization tools

Interested candidates can apply at hr@akvmsolutions.com.

Posted 4 days ago

Apply

5.0 - 10.0 years

6 - 16 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Responsibilities:
A day in the life of an Infoscion:
- As part of the Infosys delivery team, your primary role would be to ensure effective design, development, validation, and support activities, to assure that our clients are satisfied with the high levels of service in the technology domain.
- You will gather the requirements and specifications to understand client requirements in detail and translate them into system requirements.
- You will play a key role in the overall estimation of work requirements, providing the right information on project estimations to Technology Leads and Project Managers.
- You will be a key contributor to building efficient programs and systems.
If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Knowledge of design principles and fundamentals of architecture
- Understanding of performance engineering
- Knowledge of quality processes and estimation techniques
- Basic understanding of the project domain
- Ability to translate functional and nonfunctional requirements into system requirements
- Ability to design and code complex programs
- Ability to write test cases and scenarios based on the specifications
- Good understanding of SDLC and agile methodologies
- Awareness of the latest technologies and trends
- Logical thinking and problem-solving skills, along with an ability to collaborate

Technical and Professional Requirements:
- Primary skills: Technology->Data on Cloud-DataStore->Snowflake

Preferred Skills:
- Technology->Data on Cloud-DataStore->Snowflake

Posted 4 weeks ago

Apply

8.0 - 13.0 years

15 - 20 Lacs

Pune

Work from Office

8+ years of experience in ETL/DW projects, including migration experience and team management with delivery responsibility. Proven expertise in Snowflake data warehousing, ETL, and data governance. Experience with cloud ETL and ETL migration tools.

Posted 1 month ago

Apply