
1082 Snowflake Jobs - Page 34

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

10.0 - 14.0 years

37 - 45 Lacs

Indore, Hyderabad, Ahmedabad

Work from Office


Role: Technical Architect / Solution Architect / Data Architect (Data Analytics)

Job Summary: We are looking for a highly technical and experienced Data Architect / Solution Architect / Technical Architect with expertise in Data Analytics. The candidate should have strong hands-on experience in solutioning, architecture, and cloud technologies to drive data-driven decisions.

Key Responsibilities:
- Design, develop, and implement end-to-end data architecture solutions.
- Provide technical leadership in Azure, Databricks, Snowflake, and Microsoft Fabric.
- Architect scalable, secure, and high-performing data solutions.
- Work on data strategy, governance, and optimization.
- Implement and optimize Power BI dashboards and SQL-based analytics.
- Collaborate with cross-functional teams to deliver robust data solutions.

Primary Skills Required:
- Data Architecture & Solutioning
- Azure Cloud (Data Services, Storage, Synapse, etc.)
- Databricks & Snowflake (Data Engineering & Warehousing)
- Power BI (Visualization & Reporting)
- Microsoft Fabric (Data & AI Integration)
- SQL (Advanced Querying & Optimization)

Send Resume To: navaneetha@suzva.com
Contact: 9032956160
Looking for immediate to 15-day joiners! Apply now!

Posted 3 weeks ago

Apply

5.0 - 10.0 years

10 - 15 Lacs

Chennai, Delhi / NCR, Bengaluru

Work from Office


We are looking for an experienced Data Engineer with a strong background in data engineering, storage, and cloud technologies. The role involves designing, building, and optimizing scalable data pipelines, ETL/ELT workflows, and data models for efficient analytics and reporting. The ideal candidate must have strong SQL expertise, including complex joins, stored procedures, and certificate-auth-based queries. Experience with NoSQL databases such as Firestore, DynamoDB, or MongoDB is required, along with proficiency in data modeling and warehousing solutions like BigQuery (preferred), Redshift, or Snowflake. The candidate should have hands-on experience working with ETL/ELT pipelines using Airflow, dbt, Kafka, or Spark. Proficiency in scripting languages such as PySpark, Python, or Scala is essential. Strong hands-on experience with Google Cloud Platform (GCP) is a must. Additionally, experience with visualization tools such as Google Looker Studio, LookerML, Power BI, or Tableau is preferred. Good-to-have skills include exposure to Master Data Management (MDM) systems and an interest in Web3 data and blockchain analytics.
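For illustration only (not part of the listing), a minimal sketch of the kind of orchestrated ELT step this role describes: an Airflow DAG that runs a dbt project and then a simple quality gate. The DAG name, dbt project path, and check logic are hypothetical placeholders.

# Minimal Airflow DAG sketch: run dbt transformations, then a simple quality check.
# Paths, IDs, and the check itself are assumptions made for this example.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def check_row_count():
    # Placeholder data-quality gate; a real task would query the warehouse
    # (e.g. BigQuery or Snowflake) and fail the run if the count is zero.
    row_count = 42  # stand-in value for the sketch
    if row_count == 0:
        raise ValueError("fact_orders is empty after the dbt run")


with DAG(
    dag_id="daily_analytics_refresh",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    run_dbt = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --project-dir /opt/dbt/analytics",  # assumed path
    )
    validate = PythonOperator(
        task_id="validate_row_counts",
        python_callable=check_row_count,
    )
    run_dbt >> validate  # dbt models first, then the check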

Posted 3 weeks ago

Apply

5.0 - 7.0 years

12 - 13 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office


We are looking for an experienced Data Engineer with a strong background in data engineering, storage, and cloud technologies. The role involves designing, building, and optimizing scalable data pipelines, ETL/ELT workflows, and data models for efficient analytics and reporting. The ideal candidate must have strong SQL expertise, including complex joins, stored procedures, and certificate-auth-based queries. Experience with NoSQL databases such as Firestore, DynamoDB, or MongoDB is required, along with proficiency in data modeling and warehousing solutions like BigQuery (preferred), Redshift, or Snowflake. The candidate should have hands-on experience working with ETL/ELT pipelines using Airflow, dbt, Kafka, or Spark. Proficiency in scripting languages such as PySpark, Python, or Scala is essential. Strong hands-on experience with Google Cloud Platform (GCP) is a must. Additionally, experience with visualization tools such as Google Looker Studio, LookerML, Power BI, or Tableau is preferred. Good-to-have skills include exposure to Master Data Management (MDM) systems and an interest in Web3 data and blockchain analytics. Location: Remote, Hyderabad, Ahmedabad, Pune, Chennai, Kolkata.

Posted 3 weeks ago

Apply

8.0 - 13.0 years

16 - 20 Lacs

Noida

Work from Office


Who We Are
Build a brighter future while learning and growing with a Siemens company at the intersection of technology, community and sustainability. Our global team of innovators is always looking to create meaningful solutions to some of the toughest challenges facing our world. Find out how far your passion can take you.

About the Job
At Brightly, our dedication to innovation drives our product management team to create products that address our customers' evolving needs. As a Senior Product Manager for Data, you will work in a collaborative, energetic, dynamic, and creative environment to drive the product strategy of market-leading data products for our emerging and existing vertical-focused products and services. Reporting to the Director of Product Management, you will play a crucial role in assisting with a forward-thinking Data & AI strategy aligned with market needs.

What you'll be doing
Your key responsibilities are to develop and execute a comprehensive product strategy aligned with market demands and business goals through:
- Drive monetization: Build new high-value offers on our Snowflake Data Cloud. Build data-as-a-product to drive revenue. Enable the Reporting, Analytics and AI roadmap.
- Data-Driven Decision Making: Utilize analytics and data to drive product decisions, measure success, and iterate on features.
- Market Analysis: Stay up to date with industry trends, competitor products, and emerging technologies to ensure our data products remain competitive.
- Stakeholder Management: Collaborate with stakeholders across the organization to align on product goals, priorities, financials, and timelines. Exposure working with Legal to ensure compliance, data governance, data integrity and data retention.
- Customer Focus: Deep empathy for users and a passion for creating delightful product experiences.
- User Research and Insights: Conduct user research and gather insights to inform product decisions and ensure the data products meet user needs and expectations.
- Resource Management: Identify value-driven opportunities (including establishing TAM, SAM, SOM), understand and share the financial outlook, and align resources effectively to drive success for your domain.

What you'll need
- Education: Bachelor's degree and an advanced degree in a technical field, or an MBA from a top business school (IIM, XLRI, ISB), or equivalent experience.
- Experience: Overall 8+ years of experience with at least 2 years in a business-facing role, preferably in SaaS / PaaS.
- Adaptability: Comfortable navigating and prioritizing in situations of ambiguity, especially in the early stages of discovery and product development. Motivated self-starter with the ability to learn and adapt.
- Communication Skills: Strong communication and social skills; ability to work across teams with geographically remote team members, including the ability to frame complex concepts for a non-technical audience.
- Influence: Demonstrated ability to influence without authority and communicate to multi-level audiences, including growing and mentoring more junior product peers.

Who we are
Brightly, the global leader in intelligent asset management solutions, enables organizations to transform the performance of their assets. Brightly's sophisticated cloud-based platform leverages more than 20 years of data to deliver predictive insights that help users through the key phases of the entire asset lifecycle. More than 12,000 clients of every size worldwide depend on Brightly's complete suite of intuitive software, including CMMS, EAM, Strategic Asset Management, IoT Remote Monitoring, Sustainability and Community Engagement. Paired with award-winning training, support, and consulting services, Brightly helps light the way to a bright future with smarter assets and sustainable communities.

The Brightly culture
Service. Ingenuity. Integrity. Together. These values are core to who we are and help us make the best decisions, manage change, and provide the foundations for our future. These guiding principles help us innovate, flourish, and make a real impact in the businesses and communities we help to thrive. We are committed to the great experiences that nurture our employees and the people we serve while protecting the environments in which we live. Together we are Brightly.

Posted 3 weeks ago

Apply

5.0 - 9.0 years

16 - 22 Lacs

Hyderabad

Hybrid


End Client: Hitachi. Job: On payroll with Horizontal India. Work Location: Hyderabad.

Data Engineer: Strong in complex SQL queries and Python, with ETL, data warehouse and Snowflake experience. Should have a basic understanding of AWS services (any). 5 to 9 years of experience. ONLY HYDERABAD (no relocation). Should be available for an F2F interview as well. Immediate joiners up to 10 days' notice.

Note: If interested, please share your resume at smoluguri@horizontal.com
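For illustration only, a minimal sketch of running a complex SQL query (a CTE plus a window function) against Snowflake from Python. The credential environment variables, warehouse, and the orders table are hypothetical assumptions, not details from the listing.

# Sketch: CTE + window-function query executed via the Snowflake Python connector.
import os

import snowflake.connector

QUERY = """
    WITH ranked_orders AS (
        SELECT
            customer_id,
            order_id,
            order_total,
            ROW_NUMBER() OVER (
                PARTITION BY customer_id ORDER BY order_total DESC
            ) AS rn
        FROM analytics.public.orders
    )
    SELECT customer_id, order_id, order_total
    FROM ranked_orders
    WHERE rn = 1  -- largest order per customer
"""

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],   # placeholder credentials from env
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",                  # assumed warehouse name
)
try:
    cur = conn.cursor()
    cur.execute(QUERY)
    for customer_id, order_id, order_total in cur.fetchall():
        print(customer_id, order_id, order_total)
finally:
    conn.close()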

Posted 3 weeks ago

Apply

6.0 - 11.0 years

5 - 9 Lacs

Gurugram

Work from Office


POSITION SUMMARY:
We are looking for a highly skilled Python Backend Developer with 6-12 years of backend development experience. The candidate will drive projects independently while ensuring high code quality and efficiency. The role requires expertise in Python frameworks (Django, Flask), cloud platforms (AWS), and database management (Snowflake), with a strong emphasis on software best practices, problem-solving, and stakeholder collaboration.

EXPERIENCE AND REQUIRED SKILLS:
- 6-12 years of backend development experience with Python.
- Understanding of cloud platforms, particularly AWS.
- Proficiency in using Snowflake for database management and optimization.
- Experience working with data-intensive applications.
- Demonstrated ability to build dynamic and static reports using Python libraries such as Pandas, Matplotlib, or Plotly.
- Strong understanding of RESTful APIs and microservices architecture.
- Proficiency with Python frameworks like Django, Flask, or Tornado, including the skills required to develop and maintain applications using these frameworks.
- Knowledge of both relational and non-relational databases.
- Proficiency with version control systems, especially Git.

RESPONSIBILITIES:
- Backend Development: Design, develop, and maintain scalable and resilient backend services using Python, ensuring optimal performance and reliability.
- Data-Intensive Applications: Develop and manage data-intensive applications, ensuring efficient data processing and handling.
- Report Generation: Create dynamic and static reports utilizing common Python libraries (e.g., Pandas, Matplotlib, Plotly) to deliver actionable insights.
- Python Frameworks: Utilize frameworks such as Django, Flask, or Tornado to build and maintain robust backend systems, ensuring best practices in application architecture.
- Cloud Platforms: Deploy and manage applications on cloud development platforms such as AWS and Beacon, leveraging their full capabilities to support our solutions.
- Database Management: Architect, implement, and optimize database solutions using Snowflake to ensure data integrity and performance.
- Stakeholder Collaboration: Engage directly with Tech Owners and Business Owners to gather requirements, provide progress updates, and ensure alignment with business objectives.
- Ownership & Initiative: Take full ownership of projects, driving them from conception through to completion with minimal supervision.
- Software Best Practices: Implement and uphold software development best practices, including version control, automated testing, code reviews, and CI/CD pipelines.
- GenAI Tools Utilization: Utilize GenAI tools such as GitHub Copilot to enhance coding efficiency, streamline workflows, and maintain high code quality.
- Problem-Solving: Proactively identify, troubleshoot, and resolve technical issues, ensuring timely delivery of solutions.
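As a hedged illustration of the reporting side of this role, the sketch below exposes a small Pandas-built report through a Flask REST endpoint. The orders.csv input file and its region/revenue columns are made-up placeholders; in practice the data would come from Snowflake or another store.

# Sketch: Flask endpoint returning a Pandas-aggregated report as JSON.
import pandas as pd
from flask import Flask, jsonify

app = Flask(__name__)


@app.route("/reports/revenue-by-region")
def revenue_by_region():
    # In a real service this would be read from the warehouse instead of a CSV.
    df = pd.read_csv("orders.csv")                       # assumed input file
    report = (
        df.groupby("region", as_index=False)["revenue"]  # assumed columns
          .sum()
          .sort_values("revenue", ascending=False)
    )
    return jsonify(report.to_dict(orient="records"))


if __name__ == "__main__":
    app.run(port=8080, debug=True)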

Posted 3 weeks ago

Apply

5.0 - 7.0 years

6 - 11 Lacs

Pune

Work from Office


About The Role: We are seeking a highly skilled and experienced MSTR (MicroStrategy) Developer to join our Business Intelligence team. In this role, you will be responsible for the design, development, implementation, and maintenance of robust and scalable BI solutions using the MicroStrategy platform. Your primary focus will be on leveraging your deep understanding of MicroStrategy architecture and strong SQL skills to deliver insightful and actionable data to our stakeholders. This is an excellent opportunity to contribute to critical business decisions by providing high-quality BI solutions.

Responsibilities
- Design, develop, and deploy MicroStrategy objects including reports, dashboards, cubes (Intelligent Cubes, OLAP Cubes), documents, and visualizations.
- Utilize various MicroStrategy features and functionalities such as Freeform SQL, Query Builder, MDX connectivity, and data blending.
- Optimize MicroStrategy schema objects (attributes, facts, hierarchies) for performance and usability.
- Implement security models within MicroStrategy, including user and group management, object-level security, and data-level security.
- Perform performance tuning and optimization of MicroStrategy reports and dashboards.
- Participate in the administration and maintenance of the MicroStrategy environment, including metadata management, project configuration, and user support.
- Troubleshoot and resolve issues related to MicroStrategy reports, dashboards, and the overall platform.
- Write complex and efficient SQL queries to extract, transform, and load data from various data sources.
- Understand database schema design and data modeling principles.
- Optimize SQL queries for performance within the MicroStrategy environment.
- Work with different database platforms (e.g., Oracle, SQL Server, Teradata, Snowflake) and understand their specific SQL dialects.
- Develop and maintain database views and stored procedures to support MicroStrategy development.
- Collaborate with business analysts and end-users to understand their reporting and analytical requirements.
- Translate business requirements into technical specifications for MicroStrategy development.
- Participate in the design and prototyping of BI solutions.
- Develop and execute unit tests and integration tests for MicroStrategy objects.
- Participate in user acceptance testing (UAT) and provide support to end-users during the testing phase.
- Ensure the accuracy and reliability of data presented in MicroStrategy reports and dashboards.
- Create and maintain technical documentation for MicroStrategy solutions, including design documents, user guides, and deployment instructions.
- Provide training and support to end-users on how to effectively use MicroStrategy reports and dashboards.
- Adhere to MicroStrategy best practices and development standards.
- Stay updated with the latest MicroStrategy features and functionalities.
- Proactively identify opportunities to improve existing MicroStrategy solutions and processes.

Required Skills and Expertise
- Strong proficiency in MicroStrategy development (5+ years of hands-on experience is essential). This includes a deep understanding of the MicroStrategy architecture, object creation, report development, dashboard design, and administration.
- Excellent SQL skills (5+ years of experience writing complex queries, optimizing performance, and working with various database systems).
- Experience in data modeling and understanding of dimensional modeling concepts (e.g., star schema, snowflake schema).
- Solid understanding of BI concepts, data warehousing principles, and ETL processes.
- Experience in performance tuning and optimization of MicroStrategy reports and SQL queries.
- Ability to gather and analyze business requirements and translate them into technical specifications.
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills, with the ability to work effectively with both technical and business stakeholders.
- Experience with version control systems (e.g., Git).
- Ability to work independently and as part of a team.

Posted 3 weeks ago

Apply

3.0 - 5.0 years

3 - 7 Lacs

Mumbai

Work from Office


Job Summary
We are seeking a highly analytical and detail-oriented Data Specialist with deep expertise in SQL, Python, statistics, and automation. The ideal candidate will be responsible for designing robust data pipelines, analyzing large datasets, driving insights through statistical methods, and automating workflows to enhance data accessibility and business decision-making.

Key Responsibilities
- Write and optimize complex SQL queries for data extraction, transformation, and reporting.
- Develop and maintain Python scripts for data analysis, ETL processes, and automation tasks.
- Conduct statistical analysis to identify trends, anomalies, and actionable insights.
- Build and manage automated dashboards and data pipelines using tools such as Airflow, Pandas, or Apache Spark.
- Collaborate with cross-functional teams (product, engineering, business) to understand data needs and deliver scalable solutions.
- Implement data quality checks and validation procedures to ensure accuracy and consistency.
- Support machine learning model deployment and performance tracking (if applicable).
- Document data flows, models, and processes for internal knowledge sharing.

Key Requirements
- Strong proficiency in SQL (joins, CTEs, window functions, performance tuning).
- Solid experience with Python (data manipulation using Pandas, NumPy, scripting, and automation).
- Applied knowledge of statistics (hypothesis testing, regression, probability, distributions).
- Experience with data automation tools (Airflow, dbt, or equivalent).
- Familiarity with data visualization tools (Tableau, Power BI, or Plotly) is a plus.
- Understanding of data warehousing concepts (e.g., Snowflake, BigQuery, Redshift).
- Strong problem-solving skills and the ability to work independently.

Preferred Qualifications
- Bachelor's or Master's degree in Computer Science, Data Science, Statistics, or a related field.
- Exposure to cloud platforms like AWS, GCP, or Azure.
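For illustration, a minimal sketch of the kind of data-quality checks this listing mentions, written with Pandas. The order_id/amount columns and the 1% null threshold are assumptions made only for the example.

# Sketch: simple, composable data-quality checks over a Pandas DataFrame.
import pandas as pd


def run_quality_checks(df: pd.DataFrame) -> list[str]:
    """Return human-readable failures; an empty list means all checks passed."""
    failures = []
    if df["order_id"].duplicated().any():
        failures.append("duplicate order_id values found")
    if df["amount"].isna().mean() > 0.01:          # allow at most 1% nulls
        failures.append("more than 1% of amount values are null")
    if (df["amount"] < 0).any():
        failures.append("negative amounts found")
    return failures


if __name__ == "__main__":
    sample = pd.DataFrame(
        {"order_id": [1, 2, 2], "amount": [100.0, None, -5.0]}
    )
    for problem in run_quality_checks(sample):
        print("CHECK FAILED:", problem)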

Posted 3 weeks ago

Apply

5.0 - 7.0 years

10 - 14 Lacs

Gurugram

Work from Office


Role: Power BI Developer. Experience: 5+ years. Location: Bangalore, Pune, Gurgaon, Noida & Hyderabad. Work Model: Hybrid. Notice: Immediate.

Must-have skills:
- Experience in the Investments domain.
- 5+ years' experience with data analysis and visualization.
- 5+ years' hands-on experience with Power BI development.
- 5+ years' hands-on experience with SQL/Snowflake.
- Working knowledge of Power BI Report Builder.
- Power BI modeling and SQL querying skills.
- Design Power BI models and metadata for reporting.
- Share and collaborate within the Power Platform ecosystem.
- Experienced in tuning and troubleshooting issues.
- Ability to connect to multiple data sources including file systems, databases, and cloud sources: SQL, Oracle, Snowflake, SharePoint.
- High-level understanding of Scrum methodologies, ceremonies, and metrics.
- Create technical stories and convert business needs and inputs into technical solutions and designs.
- Experience with end-to-end delivery of reporting solutions.

Good-to-have skills:
- Experience with other BI / Reporting / Visualization tools like Tableau, Cognos.
- Scrum / SAFe / agile methodologies; task and story estimation.
- Experience with Power Apps.
- Data integration skills.
- ETL, data mapping, massaging and transformation.

Experience: 5-7 years.

Posted 3 weeks ago

Apply

15.0 - 24.0 years

40 - 50 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid


Role: Tech Engineering Manager. Location: Bangalore, Chennai, Hyderabad (Hybrid).
Looking for someone whose recent project, at least over the last year, has been on AWS data and Snowflake. We need someone strong enough to talk the data & analytics language fluently.
Project Management: Has managed a team of at least 10 members on data projects.
Communication: Very good communication skills working with client stakeholders, along with people management skills.

Posted 3 weeks ago

Apply

10.0 - 18.0 years

20 - 35 Lacs

Kolkata, Pune, Chennai

Work from Office


Job Summary: We are seeking a Snowflake Solution Architect to drive technical sales, solution design, and client engagement for Snowflake-based data solutions. This role requires a deep understanding of Snowflake's architecture, data engineering, cloud ecosystems (AWS/Azure/GCP), and analytics workflows. The ideal candidate will work closely with sales teams, customers, and partners to showcase Snowflake's capabilities, provide technical guidance, and ensure successful adoption.

Key Responsibilities:
1. Solution Architecture: Act as a trusted advisor to clients, understanding business challenges and recommending Snowflake-based solutions. Design and present Snowflake architecture, data pipelines, and integration strategies tailored to customer needs. Develop proofs of concept (POCs) and demos to showcase Snowflake's capabilities.
2. Technical Consultation & Client Engagement: Conduct technical discovery sessions with clients to assess their data architecture, workflows, and pain points. Provide best practices for performance optimization, security, data governance, and cost efficiency in Snowflake. Assist in RFPs, RFIs, and technical documentation for customer proposals.
3. Collaboration & Enablement: Work closely with sales, product, engineering, and customer success teams to drive Snowflake adoption. Conduct workshops, webinars, and training sessions for clients and partners. Stay updated with Snowflake's latest features, roadmap, and industry trends.
4. Integration & Ecosystem Expertise: Provide guidance on integrating Snowflake with ETL tools (dbt, Matillion, Informatica, Fivetran), BI tools (Tableau, Power BI), and AI/ML frameworks (Databricks, Python, TensorFlow). Understand multi-cloud strategies and data migration best practices from legacy systems to Snowflake.

Required Skills & Qualifications:
- Experience: 10+ years in data architecture, pre-sales, or solution consulting, with 3+ years of hands-on Snowflake expertise.
- Technical Expertise: Deep knowledge of Snowflake's architecture, SnowSQL, Snowpipe, Streams, Tasks, and Stored Procedures. Strong understanding of cloud platforms (AWS, Azure, GCP). Proficiency in SQL, Python, or scripting languages for data operations. Experience with ETL/ELT tools, data integration, and performance tuning. Familiarity with data security, governance, and compliance standards (GDPR, HIPAA, SOC 2).
- Soft Skills: Excellent communication, presentation, and client engagement skills. Ability to translate complex technical concepts into business value propositions. Strong problem-solving and consultative approach.

Preferred Locations: Offshore [Kochi, Trivandrum, Chennai, Pune, Kolkata]
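Purely as an illustrative sketch (not the employer's design), the snippet below shows a small Snowflake Streams-and-Tasks proof of concept driven from Python, of the kind a solution architect might demo. All object names, the schedule, and the connection parameters are hypothetical.

# Sketch: create a stream on a raw table and a scheduled task that merges changes.
import os

import snowflake.connector

DDL_STATEMENTS = [
    # Capture row-level changes on a raw table.
    "CREATE OR REPLACE STREAM raw.orders_stream ON TABLE raw.orders",
    # Periodically move captured changes into a curated table.
    """
    CREATE OR REPLACE TASK raw.merge_orders_task
      WAREHOUSE = ANALYTICS_WH
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('RAW.ORDERS_STREAM')
    AS
      INSERT INTO curated.orders
      SELECT order_id, customer_id, order_total
      FROM raw.orders_stream
    """,
    # Tasks are created suspended; resume to start the schedule.
    "ALTER TASK raw.merge_orders_task RESUME",
]

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],   # placeholder credentials from env
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    database="DEMO_DB",                        # assumed database
)
try:
    cur = conn.cursor()
    for stmt in DDL_STATEMENTS:
        cur.execute(stmt)
finally:
    conn.close()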

Posted 3 weeks ago

Apply

2.0 - 7.0 years

7 - 17 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office


Hiring for Snowflake Developer with experience range 2 years & above Mandatory Skills: Snowflake Developer Education: BE/B.Tech/MCA/M.Tech/MSc./MS Interview Mode-F2

Posted 3 weeks ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Kolkata, Mumbai, New Delhi

Work from Office


Category: Technology

Shuru is a technology-consulting company that embeds senior product and engineering teams into fast-growing companies worldwide to accelerate growth and de-risk strategy. Our work is global, high-stakes, and unapologetically business-first.

Role Overview
You'll join a lean, senior-only business intelligence team as a Senior Data Analyst who will sit shoulder-to-shoulder with our clients, operating as their in-house analytics brain trust. Your mandate: design the data questions worth asking, own the pipelines that answer them, and convert findings into clear, bottom-line actions. If you need daily direction, this isn't for you. If you see a vague brief as oxygen, read on.

Key Responsibilities
- Frame the right questions: Translate ambiguous product or commercial goals into testable hypotheses, selecting the metrics that truly explain user behaviour and unit economics.
- Own data end-to-end: Model, query, and transform data in SQL and dbt, pushing to cloud warehouses such as Snowflake/BigQuery, with zero babysitting.
- Build self-service BI: Deliver dashboards in Metabase/Looker that non-technical stakeholders can tweak without coming back to you every week.
- Tell unforgettable stories: Turn complex analyses into visuals and narratives that drive decisions in the C-suite and on the sprint board.
- Guard the data moat: Champion data governance, privacy, and quality controls that scale across multiple client engagements.
- Mentor & multiply: Level up engineers and product managers on analytical thinking, setting coding and insight standards for future analysts.

Requirements
Must-Have Skills & Experience:
- Minimum experience of 3 years.
- Core Analytics: Expert SQL; comfort with Python or R for advanced analysis; solid grasp of statistical inference and experimentation.
- Modern Data Stack: Hands-on with dbt, Snowflake/BigQuery/Redshift, and at least one orchestration tool (Airflow, Dagster, or similar).
- BI & Visualisation: Proven delivery in Metabase, Looker, or Tableau (including performance tuning for big data models).
- Product & Growth Metrics: Demonstrated ability to define retention, activation, and LTV/payback KPIs for SaaS or consumer-tech products.
- Communication: Relentless clarity; you can defend an insight to both engineers and the CFO, and change course when the data disproves you.
- Independence: History of thriving with "figure it out" briefs and distributed teams across time zones.

Bonus Points:
- Feature-flag experimentation at scale (e.g., Optimizely, LaunchDarkly).
- Familiarity with privacy-enhancing tech (differential privacy, data clean rooms).

Benefits:
- Work on international projects: Execute with founders and execs from around the globe, stacking your playbook fast.
- Regular team outings: We fund quarterly off-sites and virtual socials to keep the remote vibe human.
- Collaborative & growth-oriented: Learn directly from CXOs, leads, and seasoned PMs; no silos, no artificial ceilings.
- Competitive salary & benefits: Benchmarked around the 90th percentile for similar-stage firms, plus performance upside.
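As an illustration of the retention and activation metrics mentioned above, here is a minimal Pandas sketch of a monthly retention cohort. The tiny events DataFrame and its columns are invented for the example.

# Sketch: monthly retention cohort from a user-event table.
import pandas as pd

events = pd.DataFrame(
    {
        "user_id": [1, 1, 2, 2, 3],
        "event_date": pd.to_datetime(
            ["2024-01-05", "2024-02-03", "2024-01-20", "2024-03-11", "2024-02-14"]
        ),
    }
)

# Assign each user to the month of their first activity (their cohort).
events["event_month"] = events["event_date"].dt.to_period("M")
first_seen = events.groupby("user_id")["event_month"].min().rename("cohort")
events = events.join(first_seen, on="user_id")

# Distinct active users per cohort per month, normalised by cohort size.
active = (
    events.groupby(["cohort", "event_month"])["user_id"]
    .nunique()
    .unstack(fill_value=0)
)
cohort_sizes = first_seen.value_counts()
retention = active.div(cohort_sizes, axis="index").round(2)
print(retention)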

Posted 3 weeks ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Chennai

Work from Office


Job Title: Data Engineer
Experience: 6-7 Years
Location: Chennai (Hybrid)
Key Skills: Python, PySpark, AWS (S3, Lambda, Glue, EMR, Redshift), SQL, Snowflake, DBT, MongoDB, Kafka, Airflow

Job Description: Virtusa is hiring a Senior Data Engineer with expertise in building scalable data pipelines using Python, PySpark, and AWS services. The role involves data modeling in Snowflake, ETL development with DBT, and orchestration via Airflow. Experience with MongoDB, Kafka, and data streaming is essential.

Responsibilities:
- Develop and optimize data pipelines using PySpark & Python
- Leverage AWS for data ingestion and processing
- Manage Snowflake data models and transformations via DBT
- Work with SQL across multiple databases
- Integrate streaming and NoSQL sources (Kafka, MongoDB)
- Support analytics and ML workflows
- Maintain data quality, lineage, and governance
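For illustration only, a minimal PySpark transform of the kind this pipeline work implies. The input/output paths, column names, and aggregation are assumptions; in this stack the paths would typically be S3 URIs (s3a://...) rather than local directories.

# Sketch: read raw orders, aggregate daily revenue, write curated parquet.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_agg").getOrCreate()

orders = spark.read.parquet("/data/raw/orders")            # assumed input path

daily_revenue = (
    orders.filter(F.col("status") == "COMPLETED")          # assumed columns
          .withColumn("order_date", F.to_date("created_at"))
          .groupBy("order_date")
          .agg(F.sum("order_total").alias("revenue"),
               F.countDistinct("customer_id").alias("customers"))
)

daily_revenue.write.mode("overwrite").parquet("/data/curated/daily_revenue")
spark.stop()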

Posted 3 weeks ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Kotagiri

Work from Office


Position Overview: As a Data Analytics Engineer, you will operate at the intersection of data engineering and analytics, contributing to both the development of scalable data infrastructure and the generation of actionable insights. This role involves building and maintaining data pipelines, ensuring data accuracy, and supporting business decision-making through data analysis and visualization. You will collaborate closely with analysts, product teams, and other stakeholders to enable data-driven strategies and initiatives.

Key Responsibilities:
- Data Pipeline Development: Design, build, and maintain efficient data pipelines for extracting, transforming, and loading (ETL) data.
- Analytics Support: Collaborate with analysts to deliver actionable insights through reports, dashboards, and ad hoc analyses.
- REST API Integration: Develop and manage REST API integrations to ensure seamless data flow between systems.
- Workflow Orchestration: Use Apache Airflow to manage and automate workflows and tasks.
- Python Scripting: Write Python scripts for data processing, automation, and analytics.
- Snowflake Expertise: Leverage Snowflake to manage and optimize data pipelines and provide datasets for analysis.
- Data Visualization: Assist in creating dashboards and visualizations using tools like Tableau, Power BI, or Looker.
- Data Quality Assurance: Ensure the accuracy and integrity of data through validations and error-handling mechanisms.
- Optimization: Continuously optimize data pipelines and workflows for performance and scalability.
- Collaboration: Work closely with cross-functional teams, including product managers and stakeholders, to address data requirements and provide analytics solutions.

Qualifications:
- Bachelor's degree in Computer Science, Data Science, or a related field.
- Experience in both data engineering and analytics roles.
- Proficiency in Python scripting and SQL.
- Hands-on experience with Apache Airflow and Snowflake.
- Familiarity with data visualization tools (e.g., Tableau, Power BI, Looker).
- Strong problem-solving and communication skills.
- Ability to work collaboratively in cross-functional teams.
- Willingness to relocate to Nilgiris.

Job Category: Data Science
Job Type: Full Time
Job Location: Nilgiris
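As a hedged sketch of the REST-API-to-Snowflake integration described above: pull records with requests, flatten them with Pandas, and land them via write_pandas. The endpoint URL, credentials, and target table are hypothetical, and the target table is assumed to already exist.

# Sketch: REST API ingestion into a Snowflake landing table.
import os

import pandas as pd
import requests
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

resp = requests.get("https://api.example.com/v1/users", timeout=30)  # placeholder endpoint
resp.raise_for_status()
df = pd.json_normalize(resp.json())        # flatten nested JSON into columns

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    database="RAW",                         # assumed database and schema
    schema="LANDING",
)
try:
    success, _, nrows, _ = write_pandas(conn, df, "API_USERS")  # assumed table
    print(f"loaded {nrows} rows, success={success}")
finally:
    conn.close()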

Posted 3 weeks ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Hubli, Mangaluru, Mysuru

Work from Office


About the Team
You will be joining the newly formed AI, Data & Analytics team, primarily responsible as a Data Engineer leading various projects within the new Data Platform team. The new team is focused on driving increased value from the data InvestCloud captures to enable a smarter financial future for our clients, in particular focused on "enhanced intelligence". Ensuring we have fit-for-purpose modern capabilities is a key goal for the team.

Key Responsibilities
- Design, develop, and maintain scalable data pipelines to support diverse analytics and machine learning needs.
- Optimize and manage data architectures for reliability, scalability, and performance.
- Implement and support data integration solutions from our data partners, including ETL/ELT processes, ensuring seamless data flow across platforms.
- Collaborate with Data Scientists, Analysts, and Product Teams to define and support data requirements.
- Manage and maintain data platforms such as Oracle, Snowflake, and/or Databricks, ensuring high availability and performance, whilst optimizing for cost.
- Ensure data security and compliance with company policies and relevant regulations.
- Monitor and troubleshoot data systems to identify and resolve performance issues.
- Develop and maintain datasets and data pipelines to support Machine Learning model training and deployment.
- Analyze large datasets to identify patterns, trends, and insights that can inform business decisions.
- Work with 3rd-party providers of Data and Data Platform products to evaluate and implement solutions achieving InvestCloud's business objectives.
- Lead a small team, as part of the global team, based in India and working closely with co-located data scientists as well as the broader global team.

Required Skills
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field, or equivalent practical experience.
- Minimum of 5 years of professional experience in data engineering or a related role.
- Proficiency in database technologies, including Oracle and PostgreSQL.
- Hands-on experience with Snowflake and/or Databricks, with a solid understanding of their ecosystems.
- Expertise in programming languages such as Python or SQL.
- Strong knowledge of ETL/ELT tools and data integration frameworks.
- Experience with cloud platforms such as AWS, GCP, or Azure.
- Familiarity with containerization and CI/CD tools (e.g., Docker, Git).
- Excellent problem-solving skills and the ability to handle complex datasets.
- Outstanding communication skills to collaborate with technical and non-technical stakeholders globally.
- Knowledge of data preprocessing, feature engineering, and model evaluation metrics.
- Excellent proficiency in English.
- Ability to work in a fast-paced environment across multiple projects simultaneously.
- Ability to lead a small team, ensuring a highly productive, collaborative and positive environment.
- Ability to collaborate effectively as a team player, fostering a culture of open communication and mutual respect.

Preferred Skills
- Experience with real-time data processing and streaming platforms (e.g., Apache Kafka).
- Knowledge of data warehousing and data lake architectures.
- Familiarity with governance frameworks for data management and security.
- Knowledge of Machine Learning frameworks (TensorFlow, PyTorch, Scikit-learn) and LLM frameworks (e.g., LangChain).

What do we offer
Join our diverse and international cross-functional team, comprising data scientists, product managers, business analysts and software engineers. As a key member of our team, you will have the opportunity to implement cutting-edge technology to create a next-generation advisor and client experience.

Location and Travel
The ideal candidate will be expected to work from the office (with some flexibility). Occasional travel may be required.

Posted 3 weeks ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Hubli, Mangaluru, Mysuru

Work from Office


About The Team
You will be joining the newly formed AI, Data & Analytics team, primarily responsible as a Data Engineer leading various projects within the new Data Platform team. The new team is focused on driving increased value from the data InvestCloud captures to enable a smarter financial future for our clients, in particular focused on "enhanced intelligence". Ensuring we have fit-for-purpose modern capabilities is a key goal for the team.

Key Responsibilities
- Assist in the design, development, and maintenance of scalable data pipelines to support diverse analytics and machine learning needs.
- Manage data architectures for reliability, scalability, and performance.
- Support data integration solutions from our data partners, including ETL/ELT processes, ensuring seamless data flow across platforms.
- Collaborate with Data Scientists, Analysts, and Product Teams to define and support data requirements.
- Manage and maintain data platforms such as Oracle, Snowflake, and/or Databricks, ensuring high availability and performance, whilst optimizing for cost.
- Ensure data security and compliance with company policies and relevant regulations.
- Monitor and troubleshoot data systems to identify and resolve performance issues.
- Develop and maintain datasets and data pipelines to support Machine Learning model training and deployment.
- Analyze large datasets to identify patterns, trends, and insights that can inform business decisions.
- Work with 3rd-party providers of Data and Data Platform products to evaluate and implement solutions achieving InvestCloud's business objectives.

Required Skills
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field, or equivalent practical experience.
- Minimum of 4 years of professional experience in data engineering or a related role.
- Proficiency in database technologies, including Oracle and PostgreSQL.
- Hands-on experience with Snowflake and/or Databricks, with a solid understanding of their ecosystems.
- Experience with programming languages such as Python or SQL.
- Familiarity with ETL/ELT tools and data integration frameworks.
- Experience with cloud platforms such as AWS, GCP, or Azure.
- Familiarity with containerization and CI/CD tools (e.g., Docker, Git).
- Strong problem-solving skills and the ability to handle complex datasets.
- Good communication skills to collaborate with global technical and non-technical stakeholders.
- Knowledge of data preprocessing, feature engineering, and model evaluation metrics.
- Excellent proficiency in English.
- Ability to work in a fast-paced environment across multiple projects simultaneously.
- Ability to collaborate effectively as a team player, fostering a culture of open communication and mutual respect.

Preferred Skills
- Knowledge of data warehousing and data lake architectures.
- Familiarity with governance frameworks for data management and security.
- Familiarity with Machine Learning frameworks (TensorFlow, PyTorch, Scikit-learn) and LLM frameworks (e.g., LangChain).

Posted 3 weeks ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Bengaluru

Work from Office


Job Title: Data Engineer (3-5 Years Experience)
Location: Gurgaon, Pune, Bangalore, Hyderabad

Job Summary: We are seeking a skilled and motivated Data Engineer with 3 to 5 years of experience to join our growing team. The ideal candidate will have hands-on expertise in building robust, scalable data pipelines, working with modern data platforms, and enabling data-driven decision-making across the organization. You'll work closely with data scientists, analysts, and engineering teams to build and maintain efficient data infrastructure and tooling.

Key Responsibilities:
- Design, develop, and maintain scalable ETL/ELT pipelines to support analytics and product use cases
- Collaborate with data analysts, scientists, and business stakeholders to gather requirements and translate them into data solutions
- Manage data integrations from various internal and external data sources
- Optimize data workflows for performance, cost-efficiency, and reliability
- Build and maintain data models and data warehouses using industry best practices
- Monitor, troubleshoot, and improve existing data pipelines
- Implement data quality frameworks and ensure data governance standards are followed
- Contribute to documentation, code reviews, and knowledge sharing within the team

Required Skills & Qualifications:
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field
- 3-5 years of experience as a Data Engineer or in a similar data-focused role
- Strong command of SQL and proficiency in Python
- Good engineering practices
- Experience with data pipeline orchestration tools such as Apache Airflow or equivalent
- Hands-on experience with cloud data platforms (AWS/GCP/Azure) and services such as S3, Redshift, BigQuery, or Azure Data Lake
- Experience with data warehousing concepts and tools like Snowflake, Redshift, Databricks
- Familiarity with version control tools such as Git
- Strong analytical and communication skills

Preferred Qualifications:
- Exposure to big data tools and frameworks such as Spark, Hadoop, or Kafka
- Experience with containerization (Docker/Kubernetes)
- Familiarity with CI/CD pipelines and automation in data engineering
- Awareness of data security, privacy, and compliance principles

What We Offer:
- A collaborative and inclusive work environment
- Opportunities for continuous learning and career growth
- Competitive compensation and benefits
- Flexibility to work from any of our offices in Gurgaon, Pune, Bangalore, or Hyderabad

Posted 3 weeks ago

Apply

7.0 - 13.0 years

9 - 15 Lacs

Bengaluru

Work from Office


About Us
At ExxonMobil, our vision is to lead in energy innovations that advance modern living and a net-zero future. As one of the world's largest publicly traded energy and chemical companies, we are powered by a unique and diverse workforce fueled by pride in what we do and what we stand for.
The success of our Upstream, Product Solutions and Low Carbon Solutions businesses is the result of the talent, curiosity and drive of our people. They bring solutions every day to optimize our strategy in energy, chemicals, lubricants and lower-emissions technologies.
We invite you to bring your ideas to ExxonMobil to help create sustainable solutions that improve quality of life and meet society's evolving needs. Learn more about our What and our Why and how we can work together.

ExxonMobil's affiliates in India
ExxonMobil's affiliates have offices in India in Bengaluru, Mumbai and the National Capital Region.
ExxonMobil's affiliates in India supporting the Product Solutions business engage in the marketing, sales and distribution of performance as well as specialty products across chemicals and lubricants businesses. The India planning teams are also embedded with global business units for business planning and analytics.
ExxonMobil's LNG affiliate in India supporting the upstream business provides consultant services for other ExxonMobil upstream affiliates and conducts LNG market-development activities.
The Global Business Center Technology Center provides a range of technical and business support services for ExxonMobil's operations around the globe.
ExxonMobil strives to make a positive contribution to the communities where we operate, and its affiliates support a range of education, health and community-building programs in India. Read more about our Corporate Responsibility Framework.
To know more about ExxonMobil in India, visit ExxonMobil India and the Energy Factor India.

What Role You Will Play In Team
ExxonMobil's Planning & Stewardship (P&S DIGITAL) has a mission to develop enterprise data sets in support of the corporate strategy to treat data as an asset to maximize business value. The Data Architect may work independently or work closely with other Data Architects, subject matter experts (SMEs), data engineers and developers to design enterprise data sets and consult with projects to ensure data architecture is aligned with enterprise data principles and standards. The position will report to the Data Architect Manager.
The job will be based at the Bangalore, Whitefield office (work from office) for 5 days a week.

What You Will Do
The P&S DIGITAL Data Architect will work towards building and/or providing knowledge in support of the overall P&S digital mission and data strategy. To achieve these goals, the Data Architect will be required to analyze current-state data architectures, conceive desired future-state data architectures, and identify the activities needed to close the gap to achieve the future state. Some examples of these activities/deliverables are:
- Deliver Data Artifacts: Drive the creation of data reference architectures, models, and structures for business domains.
- Strategic Guidance: Provide strategic direction on data components and endorse data architecture assessments.
- Design & Implementation: Design and implement data models, schemas, and structures for planning and stewardship.
- Develop & Test Pipelines: Develop and test data pipelines, ETL processes, and system integration with IT.
- Documentation Coordination: Ensure completion of data definitions, ownership, standards, policies, and procedures.

About You
Required Skills and Qualifications:
- Educational Background: Master's or bachelor's degree in business, computer science, engineering, systems analysis, or a related field.
- Leadership Experience: Experience leading initiatives in these domains is desirable.
- Data Concepts: Familiarity with data governance, modeling, and integration.
- Technical Skills: Understanding of databases, data warehouses (e.g., Snowflake), SQL, and table/view design.
- ETL & Cloud Solutions: Ability to design/implement data connections (ETL) to/from cloud solutions (APIs, Azure Data Factory, Qlik, Fivetran).
- Experience: Minimum 3 years in data design/architecture with a strong willingness to continue learning.
- Recent Experience: Developing reference data architecture, data modeling (conceptual, logical, physical), data profiling, data quality analysis, and building business data glossaries and catalogs.
- Data Governance: Knowledge of data governance and master/reference data management programs.
- Tool Proficiency: Experience with SQL Server, the SQL query language, and the E/R Studio data modeling tool.
- Agile Experience: Experience working with agile delivery teams.
- Soft Skills: Effective planning, communication, collaboration, and persuasion skills to drive change.
- Communication Skills: Expert written and verbal communication skills; familiarity with SharePoint for collaboration.
- Self-Starter: Takes initiative and can work in a fast-paced environment.

Your Benefits
An ExxonMobil career is one designed to last. Our commitment to you runs deep: our employees grow personally and professionally, with benefits built on our core categories of health, security, finance and life. We offer you:
- Competitive compensation
- Medical plans, maternity leave and benefits, life, accidental death and dismemberment benefits
- Retirement benefits
- Global networking & cross-functional opportunities
- Annual vacations & holidays
- Day care assistance program
- Training and development program
- Tuition assistance program
- Workplace flexibility policy
- Relocation program
- Transportation facility
Please note benefits may change from time to time without notice, subject to applicable laws. The benefits programs are based on the Company's eligibility guidelines.

Stay connected with us
Learn more about ExxonMobil in India: visit ExxonMobil India and Energy Factor India. Follow us on LinkedIn and Instagram. Like us on Facebook. Subscribe to our channel on YouTube.

EEO Statement
ExxonMobil is an Equal Opportunity Employer: All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, age, national origin or disability status.

Business solicitation and recruiting scams
ExxonMobil does not use recruiting or placement agencies that charge candidates an advance fee of any kind (e.g., placement fees, immigration processing fees, etc.). Follow the LINK to understand more about recruitment scams in the name of ExxonMobil.

Nothing herein is intended to override the corporate separateness of local entities. Working relationships discussed herein do not necessarily represent a reporting connection, but may reflect a functional guidance, stewardship, or service relationship. Exxon Mobil Corporation has numerous affiliates, many with names that include ExxonMobil, Exxon, Esso and Mobil. For convenience and simplicity, those terms and terms like corporation, company, our, we and its are sometimes used as abbreviated references to specific affiliates or affiliate groups. Abbreviated references describing global or regional operational organizations and global or regional business lines are also sometimes used for convenience and simplicity. Similarly, ExxonMobil has business relationships with thousands of customers, suppliers, governments, and others. For convenience and simplicity, words like venture, joint venture, partnership, co-venturer, and partner are used to indicate business relationships involving common activities and interests, and those words may not indicate precise legal relationships.

Posted 3 weeks ago

Apply

4.0 - 8.0 years

20 - 30 Lacs

Pune

Hybrid


Job Summary: We are seeking a skilled developer with strong expertise in Snowflake and Python to join our data engineering team. The ideal candidate will be responsible for building scalable data pipelines, optimizing queries, and integrating Snowflake with various data sources using Python. Key Responsibilities: Develop and optimize ETL/ELT pipelines using Snowflake and Python. Work with large datasets to design and implement efficient data models in Snowflake. Write complex SQL queries and stored procedures. Integrate APIs and external data sources with Snowflake using Python scripts. Collaborate with data analysts, engineers, and business teams to meet data needs. Requirements: 3+ years of hands-on experience with Snowflake and Python. Strong SQL skills and understanding of cloud data warehousing concepts. Experience in performance tuning and query optimization. Familiarity with tools like Airflow, DBT, or similar orchestration tools is a plus. Knowledge of cloud platforms (AWS/Azure/GCP) preferred.

Posted 3 weeks ago

Apply

4.0 - 5.0 years

6 - 10 Lacs

Kochi, Bengaluru

Work from Office


4+ years of experience. Work from Office (1st preference: Kochi, 2nd preference: Bangalore). Good experience in any ETL tool. Good knowledge of Python. Integration experience. Good attitude and cross-skilling ability.

Posted 3 weeks ago

Apply

1.0 - 3.0 years

11 - 15 Lacs

Mumbai

Work from Office


Overview
The Data Technology team at MSCI is responsible for meeting the data requirements across various business areas, including Index, Analytics, and Sustainability. Our team collates data from multiple sources such as vendors (e.g., Bloomberg, Reuters), website acquisitions, and web scraping (e.g., financial news sites, company websites, exchange websites, filings). This data can be in structured or semi-structured formats. We normalize the data, perform quality checks, assign internal identifiers, and release it to downstream applications.

Responsibilities
As data engineers, we build scalable systems to process data in various formats and volumes, ranging from megabytes to terabytes. Our systems perform quality checks, match data across various sources, and release it in multiple formats. We leverage the latest technologies, sources, and tools to process the data. Some of the exciting technologies we work with include Snowflake, Databricks, and Apache Spark.

Qualifications
Core Java, Spring Boot, Apache Spark, Spring Batch, Python. Exposure to SQL databases like Oracle, MySQL, or Microsoft SQL Server is a must. Any experience, knowledge, or certification in cloud technology, preferably Microsoft Azure or Google Cloud Platform, is good to have. Exposure to non-SQL databases like Neo4j or document databases is also good to have.

What we offer you
Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing. Flexible working arrangements, advanced technology, and collaborative workspaces. A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results. A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients. Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development. Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles. We actively nurture an environment that builds a sense of inclusion, belonging and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women’s Leadership Forum. At MSCI we are passionate about what we do, and we are inspired by our purpose – to power better investment decisions. You’ll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer.
It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries. To all recruitment agencies MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes. Note on recruitment scams We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com

Posted 3 weeks ago

Apply

0.0 - 2.0 years

4 - 7 Lacs

Navi Mumbai

Work from Office


Title Our corporate activities are growing rapidly, and we are currently seeking a full-time, office-based Data Engineerto join our Information Technology team. This position will work on a team to accomplish tasks and projects that are instrumental to the company’s success. If you want an exciting career where you use your previous expertise and can develop and grow your career even further, then this is the opportunity for you. Overview Medpace is a full-service clinical contract research organization (CRO). We provide Phase I-IV clinical development services to the biotechnology, pharmaceutical and medical device industries. Our mission is to accelerate the global development of safe and effective medical therapeutics through its scientific and disciplined approach. We leverage local regulatory and therapeutic expertise across all major areas including oncology, cardiology, metabolic disease, endocrinology, central nervous system, anti-viral and anti-infective. Headquartered in Cincinnati, Ohio, employing more than 5,000 people across 40+ countries. Responsibilities Utilize skills in development areas including data warehousing, business intelligence, and databases (Snowflake, ANSI SQL, SQL Server, T-SQL); Support programming/software development using Extract, Transform, and Load (ETL) and Extract, Load and Transform (ELT) tools, (dbt, Azure Data Factory, SSIS); Design, develop, enhance and support business intelligence systems primarily using Microsoft Power BI; Collect, analyze and document user requirements; Participate in software validation process through development, review, and/or execution of test plan/cases/scripts; Create software applications by following software development lifecycle process, which includes requirements gathering, design, development, testing, release, and maintenance; Communicate with team members regarding projects, development, tools, and procedures; and Provide end-user support including setup, installation, and maintenance for applications Qualifications Bachelor's Degree in Computer Science, Data Science, or a related field; 5+ years of experience in Data Engineering; Knowledge of developing dimensional data models and awareness of the advantages and limitations of Star Schema and Snowflake schema designs; Solid ETL development, reporting knowledge based off intricate understanding of business process and measures; Knowledge of Snowflake cloud data warehouse, Fivetran data integration and dbt transformations is preferred; Knowledge of Python is preferred; Knowledge of REST API; Basic knowledge of SQL Server databases is required; Knowledge of C#, Azure development is a bonus; and Excellent analytical, written and oral communication skills. People. Purpose. Passion. Make a Difference Tomorrow. Join Us Today. The work we’ve done over the past 30+ years has positively impacted the lives of countless patients and families who face hundreds of diseases across all key therapeutic areas. The work we do today will improve the lives of people living with illness and disease in the future. 
Medpace Perks Flexible work environment Competitive compensation and benefits package Competitive PTO packages Structured career paths with opportunities for professional growth Company-sponsored employee appreciation events Employee health and wellness initiatives Awards Recognized by Forbes as one of America's Most Successful Midsize Companies in 2021, 2022, 2023 and 2024 Continually recognized with CRO Leadership Awards from Life Science Leader magazine based on expertise, quality, capabilities, reliability, and compatibility What to Expect Next A Medpace team member will review your qualifications and, if interested, you will be contacted with details for next steps. EO/AA Employer M/F/Disability/Vets
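For candidates brushing up on the dbt and Snowflake stack this posting highlights, here is the brief dbt model sketch referenced above. It is illustrative only: the model, source, and column names (dim_site, raw_clinical.sites, site_id) are hypothetical and not drawn from any Medpace system.

    -- models/marts/dim_site.sql: a hypothetical dbt model that builds a dimension table in Snowflake
    -- The config block tells dbt to materialize the compiled SELECT as a table.
    {{ config(materialized='table') }}

    select
        site_id,                                   -- business key from the raw source
        site_name,
        country,
        current_timestamp() as loaded_at           -- load-audit column
    from {{ source('raw_clinical', 'sites') }}     -- raw data landed by an ingestion tool such as Fivetran
    where site_id is not null                      -- basic data-quality filter

Running dbt run against a model like this compiles the Jinja references into fully qualified Snowflake table names and executes a CREATE TABLE AS SELECT, which is roughly the ELT pattern the posting alludes to.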

Posted 3 weeks ago

Apply

0.0 - 1.0 years

3 - 6 Lacs

Navi Mumbai

Work from Office

Naukri logo

Our corporate activities are growing rapidly, and we are currently seeking a full-time, office-based Data Engineer to join our Information Technology team. This position will work on a team to accomplish tasks and projects that are instrumental to the company’s success. If you want an exciting career where you use your previous expertise and can develop and grow your career even further, then this is the opportunity for you. Overview Medpace is a full-service clinical contract research organization (CRO). We provide Phase I-IV clinical development services to the biotechnology, pharmaceutical and medical device industries. Our mission is to accelerate the global development of safe and effective medical therapeutics through our scientific and disciplined approach. We leverage local regulatory and therapeutic expertise across all major areas including oncology, cardiology, metabolic disease, endocrinology, central nervous system, anti-viral and anti-infective. Headquartered in Cincinnati, Ohio, Medpace employs more than 5,000 people across 40+ countries. Responsibilities Utilize skills in development areas including data warehousing, business intelligence, and databases (Snowflake, ANSI SQL, SQL Server, T-SQL); Support programming/software development using Extract, Transform, and Load (ETL) and Extract, Load, and Transform (ELT) tools (dbt, Azure Data Factory, SSIS); Design, develop, enhance and support business intelligence systems primarily using Microsoft Power BI; Collect, analyze and document user requirements; Participate in the software validation process through development, review, and/or execution of test plans/cases/scripts; Create software applications by following the software development lifecycle process, which includes requirements gathering, design, development, testing, release, and maintenance; Communicate with team members regarding projects, development, tools, and procedures; and Provide end-user support including setup, installation, and maintenance for applications. Qualifications Bachelor's Degree in Computer Science, Data Science, or a related field; 3+ years of experience in Data Engineering; Knowledge of developing dimensional data models and awareness of the advantages and limitations of Star Schema and Snowflake schema designs (a brief star-schema sketch follows this posting); Solid ETL development and reporting knowledge based on an intricate understanding of business processes and measures; Knowledge of the Snowflake cloud data warehouse, Fivetran data integration, and dbt transformations is preferred; Knowledge of Python is preferred; Knowledge of REST APIs; Basic knowledge of SQL Server databases is required; Knowledge of C# and Azure development is a bonus; and Excellent analytical, written and oral communication skills. People. Purpose. Passion. Make a Difference Tomorrow. Join Us Today. The work we’ve done over the past 30+ years has positively impacted the lives of countless patients and families who face hundreds of diseases across all key therapeutic areas. The work we do today will improve the lives of people living with illness and disease in the future.
Medpace Perks Flexible work environment Competitive compensation and benefits package Competitive PTO packages Structured career paths with opportunities for professional growth Company-sponsored employee appreciation events Employee health and wellness initiatives Awards Recognized by Forbes as one of America's Most Successful Midsize Companies in 2021, 2022, 2023 and 2024 Continually recognized with CRO Leadership Awards from Life Science Leader magazine based on expertise, quality, capabilities, reliability, and compatibility What to Expect Next A Medpace team member will review your qualifications and, if interested, you will be contacted with details for next steps. EO/AA Employer M/F/Disability/Vets
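The qualifications above call out dimensional modeling and the trade-offs between Star Schema and Snowflake schema designs; the brief star-schema sketch referenced in the posting follows. It is a hedged illustration only, and the table and column names (dim_study, fact_enrollment, and their columns) are hypothetical.

    -- A minimal star schema in Snowflake SQL: one dimension table and one fact table.
    create table if not exists dim_study (
        study_key        integer identity,    -- surrogate key
        study_code       string,
        therapeutic_area string
    );

    create table if not exists fact_enrollment (
        study_key         integer,            -- foreign key to dim_study
        enrollment_date   date,
        subjects_enrolled integer
    );

    -- Analytical queries join the narrow fact table to the descriptive dimension.
    select d.therapeutic_area,
           sum(f.subjects_enrolled) as total_subjects
    from fact_enrollment f
    join dim_study d
      on d.study_key = f.study_key
    group by d.therapeutic_area;

A snowflake schema would further normalize the dimension (for example, moving therapeutic_area into its own table), trading the simpler joins of a star schema for reduced redundancy.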

Posted 3 weeks ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Naukri logo

Responsibilities A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to lead the engagement effort of providing high-quality and value-adding consulting solutions to customers at different stages, from problem definition to diagnosis to solution design, development and deployment. You will review the proposals prepared by consultants, provide guidance, and analyze the solutions defined for the client business problems to identify any potential risks and issues. You will identify change management requirements and propose a structured approach to the client for managing the change using multiple communication mechanisms. You will also coach and create a vision for the team, provide subject matter training for your focus areas, and motivate and inspire team members through effective and timely feedback and recognition for high performance. You would be a key contributor to unit-level and organizational initiatives with the objective of providing high-quality, value-adding consulting solutions to customers while adhering to the guidelines and processes of the organization. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! Technical and Professional Requirements: Primary skills: Technology->Data on Cloud-DataStore->Snowflake Preferred Skills: Technology->Data on Cloud-DataStore->Snowflake Additional Responsibilities: Ability to develop value-creating strategies and models that enable clients to innovate, drive growth and increase their business profitability; good knowledge of software configuration management systems; awareness of the latest technologies and industry trends; logical thinking and problem-solving skills along with an ability to collaborate; understanding of the financial processes for various types of projects and the various pricing models available; ability to assess current processes, identify improvement areas and suggest technology solutions; knowledge of one or two industry domains; client interfacing skills; project and team management. Educational Requirements: Bachelor of Engineering, MCA, BCA, B.Tech, B.Sc, MS/M.Sc. Location: PAN India

Posted 3 weeks ago

Apply

Exploring Snowflake Jobs in India

Snowflake has become one of the most sought-after skills in the tech industry, with a growing demand for professionals who are proficient in handling data warehousing and analytics using this cloud-based platform. In India, the job market for Snowflake roles is flourishing, offering numerous opportunities for job seekers with the right skill set.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Chennai

These cities are known for their thriving tech industries and have a high demand for Snowflake professionals.

Average Salary Range

The average salary range for Snowflake professionals in India varies based on experience levels:

  • Entry-level: INR 6-8 lakhs per annum
  • Mid-level: INR 10-15 lakhs per annum
  • Experienced: INR 18-25 lakhs per annum

Career Path

A typical career path in Snowflake may include roles such as:

  • Junior Snowflake Developer
  • Snowflake Developer
  • Senior Snowflake Developer
  • Snowflake Architect
  • Snowflake Consultant
  • Snowflake Administrator

Related Skills

In addition to expertise in Snowflake, professionals in this field are often expected to have knowledge of:

  • SQL
  • Data warehousing concepts
  • ETL tools
  • Cloud platforms (AWS, Azure, GCP)
  • Database management

Interview Questions

  • What is Snowflake and how does it differ from traditional data warehousing solutions? (basic)
  • Explain how Snowflake handles data storage and compute resources in the cloud. (medium)
  • How do you optimize query performance in Snowflake? (medium)
  • Can you explain how data sharing works in Snowflake? (medium)
  • What are the different stages in the Snowflake architecture? (advanced)
  • How do you handle data encryption in Snowflake? (medium)
  • Describe a challenging project you worked on using Snowflake and how you overcame obstacles. (advanced)
  • How does Snowflake ensure data security and compliance? (medium)
  • What are the benefits of using Snowflake over traditional data warehouses? (basic)
  • Explain the concept of virtual warehouses in Snowflake. (medium) (see the SQL sketch after this list)
  • How do you monitor and troubleshoot performance issues in Snowflake? (medium)
  • Can you discuss your experience with Snowflake's semi-structured data handling capabilities? (advanced)
  • What are Snowflake's data loading options and best practices? (medium)
  • How do you manage access control and permissions in Snowflake? (medium)
  • Describe a scenario where you had to optimize a Snowflake data pipeline for efficiency. (advanced)
  • How do you handle versioning and change management in Snowflake? (medium)
  • What are the limitations of Snowflake and how would you work around them? (advanced)
  • Explain how Snowflake supports semi-structured data formats like JSON and XML. (medium)
  • What are the considerations for scaling Snowflake for large datasets and high concurrency? (advanced)
  • How do you approach data modeling in Snowflake compared to traditional databases? (medium)
  • Discuss your experience with Snowflake's time travel and data retention features. (medium)
  • How would you migrate an on-premise data warehouse to Snowflake in a production environment? (advanced)
  • What are the best practices for data governance and metadata management in Snowflake? (medium)
  • How do you ensure data quality and integrity in Snowflake pipelines? (medium)
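
Several of the questions above (virtual warehouses, data loading, semi-structured data, and time travel) can be illustrated with a few lines of Snowflake SQL. The sketch below, referenced from the virtual warehouses question, is a hedged example: the object names (demo_wh, my_stage, raw_events, orders) are hypothetical, and raw_events is assumed to be a table with a single VARIANT column named payload.

    -- Virtual warehouse: an independently sized compute cluster that can auto-suspend and auto-resume.
    create warehouse if not exists demo_wh
      warehouse_size = 'XSMALL'
      auto_suspend   = 60       -- suspend after 60 seconds of inactivity
      auto_resume    = true;

    -- Data loading: COPY INTO reads staged JSON files into the table's VARIANT column.
    copy into raw_events
      from @my_stage/events/
      file_format = (type = 'JSON');

    -- Semi-structured data: path notation plus casts extract fields from the VARIANT column.
    select
        payload:user_id::string    as user_id,
        payload:event_type::string as event_type
    from raw_events;

    -- Time travel: query a table as it existed one hour ago (within the data retention period).
    select count(*) from orders at(offset => -3600);

In an interview, explaining why each statement works (the separation of storage and compute, staged file loading, the VARIANT data type, and retained table versions behind time travel) usually matters more than reproducing the exact syntax.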

Closing Remark

As you explore opportunities in the Snowflake job market in India, remember to showcase your expertise in handling data analytics and warehousing using this powerful platform. Prepare thoroughly for interviews, demonstrate your skills confidently, and keep abreast of the latest developments in Snowflake to stay competitive in the tech industry. Good luck with your job search!

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
