
14 BI Projects Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

11.0 - 20.0 years

12 - 16 Lacs

Pune

Hybrid

Roles & Responsibilities: The Senior Tech Lead specializing in Analytics and Visualization leads the design, development, and optimization of data visualization solutions.
- Lead the design and implementation of analytics and visualization solutions using tools like Power BI and Tableau
- Architect and optimize dashboards and reports for performance, scalability, and user experience
- Provide technical leadership and mentorship to a team of data analysts and visualization experts
- Collaborate with stakeholders to define project requirements and ensure alignment with business goals
- Ensure best practices in data visualization, governance, and security
- Troubleshoot and resolve complex technical issues in analytics and visualization environments
- Strong experience in presales, RFP responses, and customer connects

Requirements:
- Strong experience in data preparation and BI projects: understand business requirements in a BI context and understand the data model to transform raw data into meaningful insights in Power BI
- Design, develop, test, and deploy reports in Power BI
- Strong exposure to visualization, transformation, and data analysis
- Connecting to data sources, importing data, and transforming data for Business Intelligence
- Good exposure to DAX queries in Power BI Desktop
- Experience in publishing and scheduling Power BI/Tableau reports
- Knowledge and experience in prototyping, designing, and requirement analysis
- Sound knowledge of database management, SQL querying, data warehousing, business intelligence, and OLAP (Online Analytical Processing)
- Must have solutioning and presales experience

Primary skill: Power BI; secondary skill: Tableau
Total experience range: 11 - 20 years

Our Offering:
- Global cutting-edge IT projects that shape the future of digital and have a positive impact on the environment
- Wellbeing programs and work-life balance, with integration and passion-sharing events
- Attractive salary and company initiative benefits
- Courses and conferences
- Hybrid work culture
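As a purely illustrative aside (not part of the posting), the data-preparation work this role lists could look like the following minimal pandas sketch, which shapes a raw extract into a reporting-friendly summary before it is imported into Power BI. The file and column names are hypothetical assumptions.

```python
# Illustrative only: a minimal data-preparation step of the kind described above,
# shaping raw sales data before it is imported into Power BI.
# File names and column names are hypothetical.
import pandas as pd


def prepare_sales_extract(raw_csv: str, out_csv: str) -> pd.DataFrame:
    """Clean a raw extract and aggregate it into a reporting-friendly shape."""
    raw = pd.read_csv(raw_csv, parse_dates=["order_date"])

    # Basic cleansing: drop exact duplicates and rows missing key fields.
    cleaned = raw.drop_duplicates().dropna(subset=["order_id", "amount"])

    # Derive a reporting attribute the dashboards will slice on.
    cleaned["order_month"] = cleaned["order_date"].dt.to_period("M").astype(str)

    # Aggregate to the grain the report consumes (month x region).
    summary = (
        cleaned.groupby(["order_month", "region"], as_index=False)
        .agg(total_sales=("amount", "sum"), orders=("order_id", "nunique"))
    )
    summary.to_csv(out_csv, index=False)
    return summary


if __name__ == "__main__":
    prepare_sales_extract("raw_orders.csv", "sales_summary.csv")
```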

Posted 5 days ago

Apply

8.0 - 13.0 years

2 - 5 Lacs

Bengaluru, Karnataka, India

On-site

Position Description:
- 8+ years of experience
- Ability to quickly adapt to a very fast-moving environment with weekly deliverables
- Preferably have experience as a BI PM in a Fortune 500 organization
- Must have strong leadership skills and an unyielding ownership and accountability to produce quality results in a challenging atmosphere
- Strong and proven project management skills: managing schedule and effort amidst a dynamic environment
- Strong analytical skills and understanding of the SDLC as it relates to BI solutions
- Very strong communication skills - written and verbal
- Positive attitude; ability to work in a high-pace and ambiguous environment required
- Solid foundation and experience managing BI projects
- Experience in OBIEE and Oracle-based DW is a plus
- Experience in the Consumer Goods industry is a strong plus
- Experience in service support is a strong plus

Architect:
- Solid foundation and experience in BI architecture; equal functional and technical skill

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a Dynatrace Developer/Consultant, you will be responsible for setting up and maintaining monitoring systems to track the health and performance of data pipelines. Your role will involve configuring alerts and notifications to promptly identify and respond to issues or anomalies in data pipelines. You will develop procedures and playbooks for incident response and resolution, collaborating with data engineers to optimize data flows and processing. Your experience working with data, ETL, data warehouse, and BI projects will be invaluable as you continuously monitor and analyze pipeline performance to identify bottlenecks and areas for improvement.

Implementing logging mechanisms and error-handling strategies will be crucial to capture and analyze pipeline failures for quick detection and troubleshooting. Working closely with data engineers and data analysts, you will monitor data quality metrics, detect data anomalies, and develop processes to address data quality issues. Forecasting resource requirements based on data growth and usage patterns will ensure that pipelines can handle increasing data volumes without performance degradation. Developing and maintaining dashboards and reports to visualize key pipeline performance metrics will provide stakeholders with insights into system health and data flow. Automating monitoring tasks and developing tools for streamlined management and observability of data pipelines will also be part of your responsibilities.

Ensuring that data pipeline observability aligns with security and compliance standards, such as data privacy regulations and industry best practices, will be crucial. You will document monitoring processes, best practices, and system configurations, sharing knowledge with team members to improve overall data pipeline reliability and efficiency. Collaborating with cross-functional teams, including data engineers, data scientists, and IT operations, you will troubleshoot issues and implement improvements. Keeping abreast of the latest developments in data pipeline monitoring and observability technologies and practices will enable you to recommend and implement advancements.

Knowledge of AWS Glue, S3, and Athena is a nice-to-have, along with experience in JIRA and knowledge of a programming language such as Python, Java, or Scala. This is a full-time position with a Monday-to-Friday schedule and an in-person work location.
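For illustration only (not part of the posting), the kind of pipeline health check this role automates could look like the following Python sketch, which uses boto3 to inspect the latest run of an AWS Glue job and flag failures for alerting. The job name and the alerting mechanism are hypothetical assumptions.

```python
# Illustrative sketch only: poll the most recent run of a (hypothetical) AWS Glue job
# and surface failures, the kind of health check a pipeline-monitoring role automates.
import boto3


def check_glue_job_health(job_name: str) -> str:
    """Return the state of the latest run of `job_name`, printing an alert on failure."""
    glue = boto3.client("glue")
    runs = glue.get_job_runs(JobName=job_name, MaxResults=1)["JobRuns"]
    if not runs:
        print(f"[WARN] No runs found for Glue job '{job_name}'")
        return "NO_RUNS"

    latest = runs[0]
    state = latest["JobRunState"]  # e.g. SUCCEEDED, FAILED, RUNNING, TIMEOUT
    if state in ("FAILED", "ERROR", "TIMEOUT"):
        # In a real setup this would raise an SNS/Dynatrace/PagerDuty alert instead of printing.
        print(f"[ALERT] Glue job '{job_name}' run {latest['Id']} ended in {state}: "
              f"{latest.get('ErrorMessage', 'no error message')}")
    else:
        print(f"[OK] Glue job '{job_name}' latest run state: {state}")
    return state


if __name__ == "__main__":
    check_glue_job_health("nightly_orders_etl")  # hypothetical job name
```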

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

You are an experienced Data Engineer who will be responsible for leading the end-to-end migration of the data analytics and reporting environment to Looker at Frequence. Your role will involve designing scalable data models, translating business logic into LookML, and empowering teams across the organization with self-service analytics and actionable insights. You will collaborate closely with stakeholders from data, engineering, and business teams to ensure a smooth transition to Looker and establish best practices for data modeling, governance, and dashboard development.

Your responsibilities will include:
- Leading the migration of existing BI tools, dashboards, and reporting infrastructure to Looker
- Designing, developing, and maintaining scalable LookML data models, dimensions, measures, and explores
- Creating intuitive, actionable, and visually compelling Looker dashboards and reports
- Collaborating with data engineers and analysts to ensure consistency across data sources
- Translating business requirements into technical specifications and LookML implementations
- Optimizing SQL queries and LookML models for performance and scalability
- Implementing and managing Looker's security settings, permissions, and user roles in alignment with data governance standards
- Troubleshooting issues and supporting end users in their Looker adoption
- Maintaining version control of LookML projects using Git
- Advocating for best practices in BI development, testing, and documentation

You should have:
- Proven experience with Looker and deep expertise in LookML syntax and functionality
- Hands-on experience building and maintaining LookML data models, explores, dimensions, and measures
- Strong SQL skills, including complex joins, aggregations, and performance tuning
- Experience working with semantic layers and data modeling for analytics
- Solid understanding of data analysis and visualization best practices
- Ability to create clear, concise, and impactful dashboards and visualizations
- Strong problem-solving skills and attention to detail in debugging Looker models and queries
- Familiarity with Looker's security features and data governance principles
- Experience using version control systems, preferably Git
- Excellent communication skills and the ability to work cross-functionally
- Familiarity with modern data warehousing platforms (e.g., Snowflake, BigQuery, Redshift)
- Experience migrating from legacy BI tools (e.g., Tableau, Power BI) to Looker
- Experience working in agile data teams and managing BI projects
- Familiarity with dbt or other data transformation frameworks

At Frequence, you will be part of a dynamic, diverse, innovative, and friendly work environment that values creativity and collaboration. The company embraces differences and believes they drive creativity and innovation. The team consists of individuals from varied backgrounds who are all trail-blazing team players, thinking big and aiming to make a significant impact.

Please note that third-party recruiting agencies will not be involved in this search.
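Purely as an illustration of the migration validation work described above (and not part of the posting), a post-migration smoke test might use the looker_sdk Python package roughly as sketched below. The SDK calls (init40, all_dashboards, run_look) reflect my understanding of the official Looker Python SDK and should be verified against its documentation; the dashboard titles and Look ID are hypothetical.

```python
# Illustrative sketch only: a small post-migration smoke test using the looker_sdk
# Python package (credentials come from a looker.ini file or environment variables).
# Dashboard titles and the Look id are hypothetical; verify exact SDK behaviour
# against the Looker API docs for your instance.
import looker_sdk


def smoke_test_migration(expected_titles: set, sample_look_id: str) -> None:
    sdk = looker_sdk.init40()  # API 4.0 client

    # 1. Confirm the expected dashboards exist after the migration.
    dashboards = {d.title for d in sdk.all_dashboards(fields="id,title")}
    missing = expected_titles - dashboards
    print("Missing dashboards:", missing or "none")

    # 2. Confirm a representative Look still returns data.
    result = sdk.run_look(look_id=sample_look_id, result_format="json")
    print(f"Look {sample_look_id} returned {len(result)} characters of JSON")


if __name__ == "__main__":
    smoke_test_migration({"Revenue Overview", "Pipeline Health"}, sample_look_id="42")
```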

Posted 1 week ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Pune

Work from Office

Snowflake Data Engineer

Overall experience: 5+ years of experience in Snowflake and Python, and 5+ years in data preparation and BI projects to understand business requirements in a BI context and understand the data model to transform raw data into meaningful data using Snowflake and Python.

Responsibilities and skills:
- Designing and creating data models that define the structure and relationships of various data elements within the organization, including conceptual, logical, and physical data models, which help ensure data accuracy, consistency, and integrity
- Designing data integration solutions that allow different systems and applications to share and exchange data seamlessly; this may involve selecting appropriate integration technologies, developing ETL (Extract, Transform, Load) processes, and ensuring data quality during integration
- Create and maintain optimal data pipeline architecture
- Good knowledge of cloud platforms like AWS/Azure/GCP
- Good hands-on knowledge of Snowflake is a must
- Experience with various data ingestion methods (Snowpipe and others), time travel, data sharing, and other Snowflake capabilities
- Good knowledge of Python/PySpark and advanced features of Python
- Support business development efforts (proposals and client presentations)
- Ability to thrive in a fast-paced, dynamic, client-facing role where delivering solid work products to exceed high expectations is a measure of success
- Excellent leadership and interpersonal skills; eager to contribute to a team-oriented environment
- Strong prioritization and multi-tasking skills with a track record of meeting deadlines
- Ability to be creative and analytical in a problem-solving environment
- Effective verbal and written communication skills
- Adaptable to new environments, people, technologies, and processes
- Ability to manage ambiguity and solve undefined problems
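As a purely illustrative aside (not part of the posting), a Snowflake-plus-Python transformation step of the kind this role describes might look like the sketch below, using the snowflake-connector-python package. The account, credentials, warehouse, and table names are hypothetical placeholders.

```python
# Illustrative sketch only: a minimal Snowflake + Python transformation step,
# using the snowflake-connector-python package. Account, credentials, and
# table names are hypothetical placeholders.
import os
import snowflake.connector

TRANSFORM_SQL = """
CREATE OR REPLACE TABLE analytics.monthly_sales AS
SELECT DATE_TRUNC('month', order_date) AS order_month,
       region,
       SUM(amount)                      AS total_sales,
       COUNT(DISTINCT order_id)         AS orders
FROM raw.orders
GROUP BY 1, 2
"""


def run_transform() -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="TRANSFORM_WH",
        database="SALES_DB",
    )
    try:
        cur = conn.cursor()
        cur.execute(TRANSFORM_SQL)  # build the reporting table from raw data
        cur.execute("SELECT COUNT(*) FROM analytics.monthly_sales")
        print("Rows in monthly_sales:", cur.fetchone()[0])
    finally:
        conn.close()


if __name__ == "__main__":
    run_transform()
```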

Posted 2 weeks ago

Apply

15.0 - 20.0 years

35 - 40 Lacs

Dubai, Bengaluru, Mumbai (All Areas)

Work from Office

Greetings! This is in regard to a job opportunity for Technical Program Manager - Data Warehousing & BI with Datamatics Global Services Ltd.

Position: PeopleSoft Solution Architect
Website: https://www.datamatics.com/
Job Location: Dubai (Onsite) / Mumbai / Bangalore

Job Description: We are looking for a Technical Program Manager to lead Data Warehouse and BI projects.

Essential Duties and Responsibilities:
- Understand the business requirement and project in detail
- Define the project scope, estimation of resources and timelines, goals, deliverables, and measures of success
- Ensure the team is thoroughly trained on the application/technology
- Guide the team in creating a detailed project work plan and timeline
- Conduct and oversee project implementation and evaluate progress regularly
- Ensure all project information is appropriately documented
- Track deliverables; monitor and report on project progress to clients/stakeholders
- Ensure execution and quality of the work against set benchmarks
- Manage team and client issues/escalations/impediments to effective closure
- Implement and manage change when necessary to meet project outputs
- Contribute to continuous improvement in terms of productivity
- Ensure compliance with process/product quality parameters
- Manage budget and allocate project resources
- Maintain project margins and profitability, ensuring zero revenue leakage
- People, client, and stakeholder management
- Support presales and create/review estimations
- Act as a mentor for technical solutions as and when required
- Provide Data Warehouse and Business Intelligence subject matter expertise and leadership

Qualifications:
- 10 to 15 years of experience with 5 to 8 years of project/program management experience
- 5+ years of hands-on experience managing and leading data warehouse projects, preferably with banking experience
- Ability to establish and maintain effective working relationships with clients, project team members, supervisors, and employees from other departments
- Strong knowledge and hands-on experience developing data warehouses
- Excellent conflict resolution skills are required
- Strong interpersonal skills
- Must be able to learn, understand, and apply new technologies
- Ability to effectively prioritize and execute tasks in a high-pressure environment is crucial
- Hands-on experience with a Tier I BI tool is desired (experience with Power BI/Business Objects is a plus)
- Bachelor's degree in computer science, information systems, computer engineering, or equivalent
- PMI/PMP or PRINCE2 certification is required

Posted 2 weeks ago

Apply

5.0 - 9.0 years

10 - 15 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

5-9 years of experience in pipeline monitoring and working with data, ETL, data warehouse & BI projects. Coding experience; develop and maintain dashboards. Knowledge of AWS Glue, S3, Athena, JIRA, and programming in Python, Java, Scala, etc. Locations: Bangalore / Pune / Hyderabad / Greater Noida.

Posted 1 month ago

Apply

8.0 - 13.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Urgent opening for BI Project Manager - New Jersey
Posted On: 27th Oct 2015 06:18 AM
Location: New Jersey
Role / Position: BI Project Manager
Experience (required): 8 plus years

Description: Our client is a data analytics startup looking for a BI Project Manager for an existing project in New Jersey.

About the company: Our capabilities range from Data Visualization and Data Management to Advanced Analytics, Big Data, and Machine Learning. Our uniqueness is in bringing the right mix of technology and business analytics to create sustainable white-box solutions that are transitioned to our clients at the end of the engagement. We do this cost-effectively using a global execution model leveraging our clients' existing technology and data assets. We also come in with some strong IP and pre-built analytics solutions in data mining, BI, and Big Data. We are looking for a full-time hire for a Business Intelligence Project Manager role based in Parsippany, New Jersey.

Position Description:
- 8+ years of experience
- Ability to quickly adapt to a very fast-moving environment with weekly deliverables
- Preferably have experience as a BI PM in a Fortune 500 organization
- Must have strong leadership skills and an unyielding ownership and accountability to produce quality results in a challenging atmosphere
- Strong and proven project management skills: managing schedule and effort amidst a dynamic environment
- Strong analytical skills and understanding of the SDLC as it relates to BI solutions
- Very strong communication skills - written and verbal
- Positive attitude; ability to work in a high-pace and ambiguous environment required
- Solid foundation and experience managing BI projects
- Experience in OBIEE and Oracle-based DW is a plus
- Experience in the Consumer Goods industry is a strong plus
- Experience in service support is a strong plus

Architect:
- Solid foundation and experience in BI architecture; equal functional and technical skill

If interested, please share your updated profile. Send resumes to ananth@expertiz.in

Posted 1 month ago

Apply

1.0 - 4.0 years

1 - 4 Lacs

Indore, Madhya Pradesh, India

On-site

Contract Duration: 6 Months

Responsibilities:
- Snowflake administration and development in Data Warehouse, ETL, and BI projects
- End-to-end implementation of Snowflake cloud data warehouse and on-premise data warehouse solutions (Oracle/SQL Server)
- Expertise in Snowflake: data modeling, ELT using Snowflake SQL, complex stored procedures, and standard DWH/ETL concepts
- Advanced features: resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning
- Zero-copy clone, time travel, and data sharing deployment
- Hands-on experience with Snowflake utilities: SnowSQL, Snowpipe, Big Data model techniques using Python
- Data migration from RDBMS to Snowflake cloud data warehouse
- Deep understanding of relational and NoSQL data stores, including star and snowflake dimensional modeling
- Data security and access controls design expertise
- Experience with AWS/Azure data storage and management technologies (S3, Blob)
- Process development for data transformation, metadata, dependency, and workload management
- Proficiency in RDBMS: complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Problem resolution for complex data pipeline issues, proactively and as they arise
- Agile development methodologies experience

Good-to-Have Skills:
- CI/CD in Talend using Jenkins and Nexus
- TAC configuration with LDAP, Job servers, Log servers, and databases
- Job Conductor, scheduler, and monitoring expertise
- GIT repository management, including user roles and access control
- Agile methodology and 24/7 Admin & Platform support
- Effort estimation based on requirements
- Strong written communication skills; effective and persuasive in both written and oral communication
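As a purely illustrative aside (not part of the posting), a few of the Snowflake administration features named above, such as resource monitors, zero-copy clone, and time travel, could be exercised from Python roughly as sketched below via the snowflake-connector-python package. The monitor name, warehouse, database, quota, and role are hypothetical placeholders, and the exact privileges required (resource monitors generally need ACCOUNTADMIN) should be confirmed against Snowflake's documentation.

```python
# Illustrative sketch only: issue a few Snowflake admin statements (resource monitor,
# zero-copy clone, time travel) through snowflake-connector-python.
# All object names and quotas are hypothetical placeholders.
import os
import snowflake.connector

ADMIN_STATEMENTS = [
    # Cap monthly credit usage and suspend the warehouse when the quota is hit.
    """CREATE OR REPLACE RESOURCE MONITOR etl_monitor
       WITH CREDIT_QUOTA = 100
       TRIGGERS ON 90 PERCENT DO NOTIFY
                ON 100 PERCENT DO SUSPEND""",
    "ALTER WAREHOUSE etl_wh SET RESOURCE_MONITOR = etl_monitor",

    # Zero-copy clone: instant, storage-free copy of a database for testing.
    "CREATE DATABASE IF NOT EXISTS sales_db_dev CLONE sales_db",

    # Time travel: inspect a table as it looked one hour ago.
    "SELECT COUNT(*) FROM sales_db.raw.orders AT (OFFSET => -3600)",
]


def run_admin_tasks() -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        role="ACCOUNTADMIN",  # resource monitors typically require this role
    )
    try:
        cur = conn.cursor()
        for stmt in ADMIN_STATEMENTS:
            cur.execute(stmt)
        print("Orders one hour ago:", cur.fetchone()[0])
    finally:
        conn.close()


if __name__ == "__main__":
    run_admin_tasks()
```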

Posted 1 month ago

Apply

1.0 - 4.0 years

1 - 4 Lacs

Hyderabad / Secunderabad, Telangana, Telangana, India

On-site

Contract Duration: 6 Months

Responsibilities:
- Snowflake administration and development in Data Warehouse, ETL, and BI projects
- End-to-end implementation of Snowflake cloud data warehouse and on-premise data warehouse solutions (Oracle/SQL Server)
- Expertise in Snowflake: data modeling, ELT using Snowflake SQL, complex stored procedures, and standard DWH/ETL concepts
- Advanced features: resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning
- Zero-copy clone, time travel, and data sharing deployment
- Hands-on experience with Snowflake utilities: SnowSQL, Snowpipe, Big Data model techniques using Python
- Data migration from RDBMS to Snowflake cloud data warehouse
- Deep understanding of relational and NoSQL data stores, including star and snowflake dimensional modeling
- Data security and access controls design expertise
- Experience with AWS/Azure data storage and management technologies (S3, Blob)
- Process development for data transformation, metadata, dependency, and workload management
- Proficiency in RDBMS: complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Problem resolution for complex data pipeline issues, proactively and as they arise
- Agile development methodologies experience

Good-to-Have Skills:
- CI/CD in Talend using Jenkins and Nexus
- TAC configuration with LDAP, Job servers, Log servers, and databases
- Job Conductor, scheduler, and monitoring expertise
- GIT repository management, including user roles and access control
- Agile methodology and 24/7 Admin & Platform support
- Effort estimation based on requirements
- Strong written communication skills; effective and persuasive in both written and oral communication

Posted 1 month ago

Apply

15.0 - 18.0 years

30 - 40 Lacs

Mohali, Bengaluru

Work from Office

Job Summary: We are seeking an experienced and detail-oriented Technical Project Manager with strong interpersonal skills to lead and manage Data, Business Intelligence (BI), and Analytics initiatives across single and multiple client engagements. The ideal candidate will have a solid background in data project delivery, knowledge of modern cloud platforms, and familiarity with tools like Snowflake, Tableau, and Power BI. Understanding of AI and machine learning projects is a strong plus. This role requires strong communication and leadership skills, with the ability to translate complex technical requirements into actionable plans and ensure successful, timely, high-quality delivery with attention to detail.

Key Responsibilities:

Project & Program Delivery
- Manage end to end the full lifecycle of data engineering and analytics projects, including data platform migrations, dashboard/report development, and advanced analytics initiatives
- Define project scope, timelines, milestones, resource needs, and deliverables in alignment with stakeholder objectives
- Manage budgets, resource allocation, and risk mitigation strategies to ensure successful program delivery
- Use Agile, Scrum, or hybrid methodologies to ensure iterative delivery and continuous improvement
- Monitor performance, track KPIs, and adjust plans to maintain scope, schedule, and quality
- Drive excellence in execution and ensure client satisfaction

Client & Stakeholder Engagement
- Serve as the primary point of contact for clients and internal teams across all data initiatives
- Translate business needs into actionable technical requirements and facilitate alignment across teams
- Conduct regular status meetings, monthly and quarterly reviews, executive updates, and retrospectives

Manage Large Teams
- Ability to manage 50+ resources working on different projects for different clients
- Work with practice and talent acquisition teams for resourcing needs

Manage P&L
- Manage allocation, gross margin, utilization, etc. effectively

Team Coordination
- Lead and coordinate cross-functional teams including data engineers, BI developers, analysts, and QA testers
- Ensure appropriate allocation of resources across concurrent projects and clients
- Foster collaboration, accountability, and a results-oriented team culture

Data, AI and BI Technology Oversight
- Manage project delivery using modern cloud data platforms
- Oversee BI development using Tableau and/or Power BI, ensuring dashboards meet user needs and follow visualization best practices
- Conduct UATs
- Manage initiatives involving ETL/ELT processes, data modeling, and real-time analytics pipelines
- Ensure compatibility with data governance, security, and privacy requirements
- Manage AI/ML projects

Data & Cloud Understanding
- Oversee delivery of solutions involving cloud data platforms (e.g., Azure, AWS, GCP), data lakes, and modern data stacks
- Support planning for data migrations, ETL processes, data modeling, and analytics pipelines
- Be conversant in tools such as Power BI, Tableau, Snowflake, Databricks, Azure Synapse, or BigQuery

Risk, Quality & Governance
- Identify and mitigate risks related to data quality, project timelines, and resource availability
- Ensure adherence to governance, compliance, and data privacy standards (e.g., GDPR, HIPAA)
- Maintain thorough project documentation including charters, RACI matrices, RAID logs, and retrospectives

Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Business, or a related field

Certifications (Preferred):
- PMP, PRINCE2, or Certified ScrumMaster (CSM)
- Cloud certifications (e.g., AWS Cloud Practitioner, Azure Fundamentals, Google Cloud Certified)
- BI/analytics certifications (e.g., Tableau Desktop Specialist, Power BI Data Analyst Associate, DA-100)

Must-Have Skills:
- Strong communication skills
- Strong interpersonal skills and the ability to work collaboratively
- Excellent organizing skills
- Stakeholder management, customer management, and people management
- Contract management
- Risk & compliance management
- C-suite reporting
- Team management and resourcing
- Experience using tools like JIRA, MS Plan, etc.

Desirable Skills:
- 15 years of IT experience with 8+ years of proven project management experience delivering in data, AI/ML, and BI/analytics-focused environments
- Experience delivering projects on cloud platforms (e.g., Azure, AWS, GCP) and data platforms like Snowflake
- Proficiency in managing BI projects, preferably with Tableau and/or Power BI; knowledge of or hands-on experience with legacy tools is a plus
- Solid understanding of the data lifecycle including ingestion, transformation, visualization, and reporting
- Comfortable using PM tools like Jira, Azure DevOps, Monday.com, or Smartsheet
- Experience managing projects involving data governance, metadata management, or master data management (MDM)

Posted 1 month ago

Apply

1.0 - 4.0 years

4 - 7 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Locations: Pune, Bangalore, Hyderabad, Indore
Contract duration: 6 months

Responsibilities:
- Must have experience working as a Snowflake Admin/Developer in Data Warehouse, ETL, and BI projects
- Must have prior experience with end-to-end implementation of Snowflake cloud data warehouse and end-to-end on-premise data warehouse implementations, preferably on Oracle/SQL Server
- Expertise in Snowflake: data modelling, ELT using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL concepts
- Expertise in Snowflake advanced concepts like setting up resource monitors, RBAC controls, virtual warehouse sizing, and query performance tuning
- Zero-copy clone and time travel, and understanding of how to use these features
- Expertise in deploying Snowflake features such as data sharing
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, and Big Data model techniques using Python
- Experience in data migration from RDBMS to Snowflake cloud data warehouse
- Deep understanding of relational as well as NoSQL data stores, methods, and approaches (star and snowflake, dimensional modelling)
- Experience with data security and data access controls and design
- Experience with AWS or Azure data storage and management technologies such as S3 and Blob
- Build processes supporting data transformation, data structures, metadata, dependency, and workload management
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Provide resolution to an extensive range of complicated data pipeline related problems, proactively and as issues surface
- Must have experience of Agile development methodologies

Good to have:
- CI/CD in Talend using Jenkins and Nexus
- TAC configuration with LDAP, Job servers, Log servers, and databases
- Job Conductor, scheduler, and monitoring
- GIT repository management: creating users and roles and providing access to them
- Agile methodology and 24/7 Admin and Platform support
- Estimation of effort based on the requirement
- Strong written communication skills; effective and persuasive in both written and oral communication

Posted 1 month ago

Apply

6.0 - 10.0 years

4 - 7 Lacs

Bengaluru

Work from Office

Job Information:
- Job Opening ID: ZR_2393_JOB
- Date Opened: 09/11/2024
- Industry: IT Services
- Job Type:
- Work Experience: 6-10 years
- Job Title: Snowflake Engineer - Database Administration
- City: Bangalore South
- Province: Karnataka
- Country: India
- Postal Code: 560066
- Number of Positions: 1

Locations: Pune, Bangalore, Hyderabad, Indore
Contract duration: 6 months

Responsibilities:
- Must have experience working as a Snowflake Admin/Developer in Data Warehouse, ETL, and BI projects
- Must have prior experience with end-to-end implementation of Snowflake cloud data warehouse and end-to-end on-premise data warehouse implementations, preferably on Oracle/SQL Server
- Expertise in Snowflake: data modelling, ELT using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL concepts
- Expertise in Snowflake advanced concepts like setting up resource monitors, RBAC controls, virtual warehouse sizing, and query performance tuning
- Zero-copy clone and time travel, and understanding of how to use these features
- Expertise in deploying Snowflake features such as data sharing
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, and Big Data model techniques using Python
- Experience in data migration from RDBMS to Snowflake cloud data warehouse
- Deep understanding of relational as well as NoSQL data stores, methods, and approaches (star and snowflake, dimensional modelling)
- Experience with data security and data access controls and design
- Experience with AWS or Azure data storage and management technologies such as S3 and Blob
- Build processes supporting data transformation, data structures, metadata, dependency, and workload management
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Provide resolution to an extensive range of complicated data pipeline related problems, proactively and as issues surface
- Must have experience of Agile development methodologies

Good to have:
- CI/CD in Talend using Jenkins and Nexus
- TAC configuration with LDAP, Job servers, Log servers, and databases
- Job Conductor, scheduler, and monitoring
- GIT repository management: creating users and roles and providing access to them
- Agile methodology and 24/7 Admin and Platform support
- Estimation of effort based on the requirement
- Strong written communication skills; effective and persuasive in both written and oral communication

Posted 1 month ago

Apply

6.0 - 10.0 years

4 - 7 Lacs

Bengaluru

Work from Office

Job Information:
- Job Opening ID: ZR_2384_JOB
- Date Opened: 23/10/2024
- Industry: IT Services
- Job Type:
- Work Experience: 6-10 years
- Job Title: Snowflake DBA
- City: Bangalore South
- Province: Karnataka
- Country: India
- Postal Code: 560066
- Number of Positions: 1

Contract duration: 6 months
Locations: Pune, Bangalore, Hyderabad, Indore

Responsibilities:
- Must have experience working as a Snowflake Admin/Developer in Data Warehouse, ETL, and BI projects
- Must have prior experience with end-to-end implementation of Snowflake cloud data warehouse and end-to-end on-premise data warehouse implementations, preferably on Oracle/SQL Server
- Expertise in Snowflake: data modelling, ELT using Snowflake SQL, implementing complex stored procedures, and standard DWH and ETL concepts
- Expertise in Snowflake advanced concepts like setting up resource monitors, RBAC controls, virtual warehouse sizing, and query performance tuning
- Zero-copy clone and time travel, and understanding of how to use these features
- Expertise in deploying Snowflake features such as data sharing
- Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, and Big Data model techniques using Python
- Experience in data migration from RDBMS to Snowflake cloud data warehouse
- Deep understanding of relational as well as NoSQL data stores, methods, and approaches (star and snowflake, dimensional modelling)
- Experience with data security and data access controls and design
- Experience with AWS or Azure data storage and management technologies such as S3 and Blob
- Build processes supporting data transformation, data structures, metadata, dependency, and workload management
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Provide resolution to an extensive range of complicated data pipeline related problems, proactively and as issues surface
- Must have experience of Agile development methodologies

Good to have:
- CI/CD in Talend using Jenkins and Nexus
- TAC configuration with LDAP, Job servers, Log servers, and databases
- Job Conductor, scheduler, and monitoring
- GIT repository management: creating users and roles and providing access to them
- Agile methodology and 24/7 Admin and Platform support
- Estimation of effort based on the requirement
- Strong written communication skills; effective and persuasive in both written and oral communication

Posted 1 month ago

Apply