5.0 - 10.0 years
11 - 15 Lacs
Gurugram
Work from Office
Position Summary
This is the requisition for an Employee Referrals Campaign, and the JD is generic. We are looking for Associates with 5+ years of experience in delivering solutions around Data Engineering, big data analytics and data lakes, MDM, BI, and data visualization. Experienced in integrating and standardizing structured and unstructured data to enable faster insights using cloud technology, enabling data-driven insights across the enterprise.

Job Responsibilities
He/she should be able to design, implement, and deliver complex Data Warehousing/Data Lake, Cloud Data Management, and Data Integration project assignments.
Technical Design and Development - expertise in any of the following skills:
- Any ETL tool (Informatica, Talend, Matillion, DataStage); hosting technologies like the AWS stack (Redshift, EC2) are mandatory
- Any BI tool among Tableau, Qlik, Power BI, and MSTR
- Informatica MDM, Customer Data Management
- Expert knowledge of SQL, with the ability to performance-tune complex SQL queries in traditional and distributed RDBMS systems, is a must
- Experience across Python, PySpark, and Unix/Linux shell scripting
Project management is a must-have. Should be able to create simple to complex project plans in Microsoft Project and think in advance about potential risks and mitigation plans.
Task Management - should be able to onboard the team on the project plan and delegate tasks to accomplish milestones as per plan. Should be comfortable discussing and prioritizing work items with team members in an onshore-offshore model.
Handle Client Relationship - manage client communication and client expectations independently or with the support of the reporting manager. Should be able to deliver results back to the client as per plan. Should have excellent communication skills.

Education
Bachelor of Technology; Master's Equivalent - Engineering

Work Experience
Overall, 5-7 years of relevant experience in Data Warehousing and Data Management projects, with some experience in the Pharma domain. We are hiring for the following roles across Data Management tech stacks:
- ETL tools among Informatica, IICS/Snowflake, Python, Matillion, and other cloud ETL tools
- BI tools among Power BI and Tableau
- MDM: Informatica/Reltio, Customer Data Management
- Azure cloud developer using Data Factory and Databricks
- Data Modeler: modelling of data, understanding source data, creating data models for landing and integration
- Python/PySpark: Spark/PySpark design, development, and deployment
Posted 1 week ago
4.0 - 8.0 years
13 - 18 Lacs
Noida
Work from Office
Position Summary
To be a technology expert, architecting solutions and mentoring people in BI/Reporting processes, with prior expertise in the Pharma domain.

Job Responsibilities
Independently drive and deliver complex reporting and BI project assignments in Power BI on AWS/Azure Cloud. Should be able to design and deliver across Power BI services, Power Query, DAX, and data modelling concepts. Should be able to write complex SQL focusing on data aggregation and the analytic calculations used in reporting KPIs. Be able to analyse the data and understand the requirements, directly from the customer or from project teams, across pharma commercial data sets. Should be able to drive the team on day-to-day tasks in alignment with the project plan and collaborate with the team to accomplish milestones as per plan. Should be comfortable discussing and prioritizing work items in an onshore-offshore model. Able to think analytically and use a systematic and logical approach to analyse data, problems, and situations. Manage client communication and client expectations independently. Should be able to deliver results back to the client as per plan. Should have excellent communication skills.

Education
BE/B.Tech; Master of Computer Application

Work Experience
Should have 4-8 years of experience developing Power BI reports. Must have proficiency in Power BI services, Power Query, DAX, and data modelling concepts. Should have experience in design techniques such as UI design and creating mock-ups/intuitive visualizations for a seamless user experience. Should have expertise in writing complex SQL focusing on data aggregation and the analytic calculations used for deriving reporting KPIs. Strong understanding of data integration, ETL processes, and data warehousing, preferably on AWS Redshift and/or Snowflake. Excellent problem-solving skills, with the ability to troubleshoot and resolve technical issues. Strong communication and interpersonal skills, with the ability to collaborate effectively with cross-functional teams. Good to have experience with Pharma commercial data sets and related KPIs for Sales Performance, Managed Markets, Customer 360, Patient Journey, etc. Good to have experience and additional know-how of other reporting tools.

Behavioural Competencies: Teamwork & Leadership, Motivation to Learn and Grow, Ownership, Cultural Fit, Talent Management
Technical Competencies: Problem Solving, Lifescience Knowledge, Communication, Capability Building / Thought Leadership, Power BI, SQL, Business Intelligence (BI), Snowflake
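To make the SQL expectation above concrete, here is a minimal, hedged sketch of the kind of aggregation-plus-analytic-calculation query such roles involve, run through the Snowflake Python connector. The table and column names (rx_sales, brand, sales_amt) and the connection details are invented for illustration, not taken from the posting.

```python
# Illustrative only: a monthly sales KPI with a rolling three-month average,
# the style of "data aggregation + analytic calculation" SQL the role describes.
# Table/column names and credentials are assumptions, not from the posting.
import snowflake.connector  # pip install snowflake-connector-python

KPI_SQL = """
SELECT
    brand,
    DATE_TRUNC('month', sale_date) AS sales_month,
    SUM(sales_amt)                 AS total_sales,
    AVG(SUM(sales_amt)) OVER (
        PARTITION BY brand
        ORDER BY DATE_TRUNC('month', sale_date)
        ROWS BETWEEN 2 PRECEDING AND CURRENT ROW
    )                              AS rolling_3m_avg
FROM rx_sales
GROUP BY brand, DATE_TRUNC('month', sale_date)
ORDER BY brand, sales_month
"""

with snowflake.connector.connect(
    account="my_account", user="analyst", password="***",
    warehouse="ANALYTICS_WH", database="COMMERCIAL", schema="PUBLIC",
) as conn:
    for brand, month, total, rolling in conn.cursor().execute(KPI_SQL):
        print(brand, month, total, rolling)
```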
Posted 1 week ago
10.0 - 15.0 years
13 - 18 Lacs
Bengaluru
Work from Office
Position Summary
Looking for a Salesforce Data Cloud Engineer to design, implement, and manage data integrations and solutions using Salesforce Data Cloud (formerly Salesforce CDP). This role is essential for building a unified, 360-degree view of the customer by integrating and harmonizing data across platforms.

Job Responsibilities
Consolidate customer data to create a unified customer profile. Design and implement data ingestion pipelines into Salesforce Data Cloud from internal and third-party systems. Work with stakeholders to define Customer 360 data model requirements, identity resolution rules, and calculated insights. Configure and manage the Data Cloud environment, including data streams, data bundles, and harmonization. Implement identity resolution, micro-segmentation, and activation strategies. Collaborate with Salesforce Marketing Cloud to enable real-time personalization and journey orchestration. Ensure data governance and platform security. Monitor data quality, ingestion jobs, and overall platform performance.

Education
BE/B.Tech; Master of Computer Application

Work Experience
Overall experience of minimum 10 years in Data Management and Data Engineering roles, with a minimum of 3 years as a Salesforce Data Cloud Data Engineer. Hands-on experience with Salesforce Data Cloud (CDP), including data ingestion, harmonization, and segmentation. Proficient in working with large datasets, data modeling, and ETL/ELT processes. Understanding of Salesforce core clouds (Sales, Service, Marketing) and how they integrate with Data Cloud. Experience with Salesforce tools such as Marketing Cloud. Strong knowledge of SQL, JSON, Apache Iceberg, and data transformation logic. Familiarity with identity resolution and Customer 360 data unification concepts. Salesforce certifications (e.g., Salesforce Data Cloud Accredited Professional, Salesforce Administrator, Platform App Builder). Experience with CDP platforms other than Salesforce, e.g., Segment, Adobe Experience Platform (good to have). Experience with cloud data storage and processing tools (Azure, Snowflake, etc.).

Behavioural Competencies: Teamwork & Leadership, Motivation to Learn and Grow, Ownership, Cultural Fit, Talent Management
Technical Competencies: Problem Solving, Azure Data Factory, Azure DevOps, Azure SQL
Posted 1 week ago
5.0 - 7.0 years
12 - 18 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
We are hiring an experienced Integration Engineer with deep expertise in Dell Boomi and proven skills in Python, AWS, and automation frameworks. This role focuses on building and maintaining robust integration pipelines between enterprise systems like Salesforce, Snowflake, and EDI platforms, enabling seamless data flow and test automation.

Key Responsibilities:
Design, develop, and maintain integration workflows using Dell Boomi. Build and enhance backend utilities and services using Python to support Boomi integrations. Integrate test frameworks with AWS services such as Lambda, API Gateway, CloudWatch, etc. Develop utilities for EDI document automation (e.g., generating and validating EDI 850 purchase orders). Perform data syncing and transformation between systems like Salesforce, Boomi, and Snowflake. Automate post-test data cleanup and validation within Salesforce using Boomi and Python. Implement infrastructure-as-code using Terraform to manage cloud resources. Create and execute API tests using Postman, and automate test cases using Cucumber and Gherkin. Integrate test results into Jira and X-Ray for traceability and reporting.

Must-Have Qualifications:
5 to 7 years of professional experience in software or integration development. Strong hands-on experience with Dell Boomi (Atoms, Integration Processes, Connectors, APIs). Solid programming experience with Python. Experience working with AWS services: Lambda, API Gateway, CloudWatch, S3, etc. Working knowledge of Terraform for cloud infrastructure automation. Familiarity with SQL and modern data platforms (e.g., Snowflake). Experience working with Salesforce and writing SOQL queries. Understanding of EDI document standards and related integration use cases. Test automation experience using Cucumber, Gherkin, and Postman. Integration of QA/test reports with Jira, X-Ray, or similar platforms. Familiarity with CI/CD tools like GitHub Actions, Jenkins, or similar.

Tools & Technologies:
Integration: Dell Boomi, REST/SOAP APIs. Languages: Python, SQL. Cloud: AWS (Lambda, API Gateway, CloudWatch, S3). Infrastructure: Terraform. Data Platforms: Snowflake, Salesforce. Automation & Testing: Cucumber, Gherkin, Postman. DevOps: Git, GitHub Actions. Tracking/Reporting: Jira, X-Ray.

Location: Remote; Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad
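To make the EDI piece concrete: a toy Python sketch (not the actual utility) of building and sanity-checking a simplified X12 850 body. The segment layout is abbreviated and all identifiers are made-up test data.

```python
# Hypothetical helper in the spirit of the EDI utilities described above:
# assembling and validating a minimal X12 EDI 850 (purchase order) body.
# Segment layout is simplified; PO number and SKUs are invented test data.
def build_edi_850(po_number: str, lines: list[tuple[str, int, float]]) -> str:
    """Assemble the transaction-set portion of a simplified X12 850."""
    segs = [
        "ST*850*0001",
        f"BEG*00*SA*{po_number}**20240101",  # 00 = original, SA = stand-alone order
    ]
    for i, (sku, qty, unit_price) in enumerate(lines, start=1):
        segs.append(f"PO1*{i}*{qty}*EA*{unit_price:.2f}**VP*{sku}")
    segs.append(f"CTT*{len(lines)}")         # line-item count
    segs.append(f"SE*{len(segs) + 1}*0001")  # segment count, ST through SE
    return "~".join(segs) + "~"

def validate_segment_count(edi: str) -> bool:
    """Cross-check the count declared in SE against the actual segments."""
    segs = [s for s in edi.split("~") if s]
    declared = int(segs[-1].split("*")[1])
    return declared == len(segs)

doc = build_edi_850("PO-1001", [("SKU-1", 10, 4.99), ("SKU-2", 5, 12.50)])
assert validate_segment_count(doc)
print(doc)
```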
Posted 1 week ago
6.0 - 8.0 years
10 - 20 Lacs
Noida, Hyderabad, Pune
Work from Office
3-4 years of hands-on experience with the Snowflake database
Strong SQL, PL/SQL, and Snowflake functionality experience
Strong exposure to Oracle, SQL Server, etc.
Exposure to cloud storage services like AWS S3
2-3 years of Informatica PowerCenter experience
Posted 1 week ago
8.0 - 13.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Job Overview: We are looking for a BI & Visualization Developer who will be part of our Analytics Practice and will be expected to actively work in a multi-disciplinary, fast-paced environment. This role requires a broad range of skills and the ability to step into different roles depending on the size and scope of the project; its primary responsibility is to support the design, development, and maintenance of business intelligence and analytics solutions.

Responsibilities:
Develop reports, dashboards, and advanced visualizations. Work closely with product managers, business analysts, clients, etc. to understand the needs/requirements and develop the visualizations needed. Provide support to new or existing applications while recommending best practices and leading projects to implement new functionality. Learn and develop new visualization techniques as required to keep up with contemporary visualization design and presentation. Review the solution requirements and architecture to ensure selection of appropriate technology, efficient use of resources, and integration of multiple systems and technologies. Collaborate in design reviews and code reviews to ensure standards are met. Recommend new standards for visualizations. Build and reuse templates/components/web services across multiple dashboards. Support presentations to customers and partners. Advise on new technology trends and possible adoption to maintain competitive advantage. Mentor Associates.

Experience Needed:
8+ years of related experience is required. A Bachelor's or Master's degree in Computer Science or a related technical discipline is required. Highly skilled in data visualization tools like Power BI, Tableau, QlikView, etc. Very good understanding of the Power BI Tabular Model/Azure Analysis Services using large datasets. Strong SQL coding experience, with performance optimization experience for data queries. Understands different data models: normalized, de-normalized, star, and snowflake models. Worked in big data environments, cloud data stores, and different RDBMS and OLAP solutions. Experience in design, development, and deployment of BI systems. Candidates with ETL experience preferred. Familiar with the principles and practices involved in development and maintenance of software solutions and architectures and in service delivery. Has a strong technical background and remains evergreen with technology and industry developments.

Additional:
Demonstrated ability to have successfully completed multiple complex technical projects. Prior experience with application delivery using an onshore/offshore model. Experience with business processes across multiple master data domains in a services-based company. Demonstrates a rational and organized approach to the tasks undertaken and an awareness of the need to achieve quality. Demonstrates high standards of professional behavior in dealings with clients, colleagues, and staff. Strong written communication skills; effective and persuasive in both written and oral communication. Experience with gathering end-user requirements and writing technical documentation. Time management and multitasking skills to effectively meet deadlines under time-to-market pressure. May require occasional travel.
Posted 1 week ago
5.0 - 10.0 years
20 - 35 Lacs
Bengaluru
Work from Office
Senior Data Engineer

Our Mission
SPAN is enabling electrification for all. We are a mission-driven company designing, building, and deploying products that electrify the built environment, reduce carbon emissions, and slow the effects of climate change. Decarbonization is the process of reducing or removing greenhouse gas emissions, especially carbon dioxide, from entering our atmosphere. Electrification is the process of replacing fossil fuel appliances that run on gas or oil with all-electric upgrades for a cleaner way to power our lives. At SPAN, we believe in: enabling homes and vehicles powered by clean energy; making electrification upgrades possible; building more resilient homes with reliable backup; designing a flexible and distributed electrical grid.

The Role
As a Data Engineer you will design, build, test, and create the infrastructure necessary for real-time and batch analytics pipelines. You will work with multiple teams within the org to provide analysis and insights on the data. You will also be involved in writing ETL processes that support data ingestion, and will guide and enforce best practices for data management, governance, and security. You will build infrastructure to monitor these data pipelines/ETL jobs/tasks and create tooling for providing visibility into them.

Responsibilities
We are looking for a Data Engineer with a passion for building data pipelines, working with product, data science, and business intelligence teams, and delivering great solutions. As a part of the team you will: acquire deep business understanding of how SPAN data flows from IoT device to cloud through the system, and build scalable and optimized data solutions that impact many stakeholders. Be an advocate for data quality and excellence of our platform. Build tools that help streamline the management and operation of our data ecosystem. Ensure best practices and standards in our data ecosystem are shared across teams. Work with teams within the company to build close relationships with our partners, to understand the value our platform can bring and how we can make it better. Improve data discovery by creating data exploration processes and promoting adoption of data sources across the company. Have a desire to write tools and applications to automate work rather than do everything by hand. Assist internal teams in building out data logging, alerting, and monitoring for their applications. Be passionate about the CI/CD process. Design, develop, and establish KPIs to monitor analysis and provide strategic insights to drive growth and performance.

About You
Required Qualifications
Bachelor's Degree in a quantitative discipline: computer science, statistics, operations research, informatics, engineering, applied mathematics, economics, etc. 5+ years of relevant work experience in data engineering, business intelligence, research, or related fields. Expert-level, production-grade programming experience in at least one of these languages (Python, Kotlin, or other JVM-based languages). Experience writing clean, concise, and well-structured code in one of the above languages. Experience working with infrastructure-as-code tools: Pulumi, Terraform, etc. Experience working with CI/CD systems: CircleCI, GitHub Actions, Argo CD, etc. Experience managing data engineering infrastructure through Docker and Kubernetes. Experience working with low-latency data processing solutions like Flink, Prefect, AWS Kinesis, Kafka, Spark stream processing, etc.
Experience with SQL/relational databases and OLAP databases like Snowflake. Experience working in AWS: S3, Glue, Athena, MSK, EMR, ECR, etc.

Bonus Qualifications
Experience with the energy industry. Experience with building IoT and/or hardware products. Understanding of electrical systems and residential loads. Experience with data visualization using Tableau. Experience with data loading tools like Fivetran, as well as data debugging tools such as Datadog.

Life at SPAN
Our Bengaluru team plays a pivotal role in SPAN's continued growth and expansion. Together, we're driving engineering, product development, and operational excellence to shape the future of home energy solutions. As part of our team in India, you'll have the opportunity to collaborate closely with our teams in the US and across the globe. This international collaboration fosters innovation, learning, and growth, while helping us achieve our bold mission of electrifying homes and advancing clean energy solutions worldwide. Our in-office culture offers the chance for dynamic interactions and hands-on teamwork, making SPAN a truly collaborative environment where every team member's contribution matters. Our climate-focused culture is driven by a team of forward-thinkers, engineers, and problem-solvers who push boundaries every day. Do mission-driven work: every role at SPAN directly advances clean energy adoption. Bring powerful ideas to life: we encourage diverse ideas and perspectives to drive stronger products. Nurture an innovation-first mindset: we encourage big thinking and bold action. Deliver exceptional customer value: we value hard work and the ability to deliver exceptional customer value.

Benefits at SPAN India
Generous paid leave. Comprehensive insurance & health benefits. Centrally located office in Bengaluru with easy access to public transit, dining, and city amenities.

Interested in joining our team? Apply today and we'll be in touch with the next steps!
Posted 1 week ago
5.0 - 10.0 years
10 - 15 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
We are looking for a skilled Data Analyst with excellent communication skills and deep expertise in SQL, Tableau, and modern data warehousing technologies. This role involves designing data models, building insightful dashboards, ensuring data quality, and extracting meaningful insights from large datasets to support strategic business decisions.

Key Responsibilities:
Write advanced SQL queries to retrieve and manipulate data from cloud data warehouses such as Snowflake, Redshift, or BigQuery. Design and develop data models that support analytics and reporting needs. Build dynamic, interactive dashboards and reports using tools like Tableau, Looker, or Domo. Perform advanced analytics techniques, including cohort analysis, time series analysis, scenario analysis, and predictive analytics. Validate data accuracy and perform thorough data QA to ensure high-quality output. Investigate and troubleshoot data issues; perform root cause analysis in collaboration with BI or data engineering teams. Communicate analytical insights clearly and effectively to stakeholders.

Required Skills & Qualifications:
Excellent communication skills are mandatory for this role. 5+ years of experience in data analytics, BI analytics, or BI engineering roles. Expert-level skills in SQL, with experience writing complex queries and building views. Proven experience using data visualization tools like Tableau, Looker, or Domo. Strong understanding of data modeling principles and best practices. Hands-on experience working with cloud data warehouses such as Snowflake, Redshift, BigQuery, SQL Server, or Oracle. Intermediate-level proficiency with spreadsheet tools like Excel, Google Sheets, or Power BI, including functions, pivots, and lookups. Bachelor's or advanced degree in a relevant field such as Data Science, Computer Science, Statistics, Mathematics, or Information Systems. Ability to collaborate with cross-functional teams, including BI engineers, to optimize reporting solutions. Experience in handling large-scale enterprise data environments. Familiarity with data governance, data cataloging, and metadata management tools (a plus but not required).

Location: Mumbai, Delhi/NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
Posted 1 week ago
6.0 - 11.0 years
10 - 20 Lacs
Hyderabad
Hybrid
Business Data Analyst - HealthCare

Position Description / Job Summary
We are seeking an experienced and results-driven Business Data Analyst with 5+ years of hands-on experience in data analytics, visualization, and business insight generation. This role is ideal for someone who thrives at the intersection of business and data, translating complex data sets into compelling insights, dashboards, and strategies that support decision-making across the organization. You will collaborate closely with stakeholders across departments to identify business needs, design and build analytical solutions, and tell compelling data stories using advanced visualization tools.

Key Responsibilities
• Data Analytics & Insights: Analyze large and complex data sets to identify trends, anomalies, and opportunities that help drive business strategy and operational efficiency.
• Dashboard Development & Data Visualization: Design, develop, and maintain interactive dashboards and visual reports using tools like Power BI, Tableau, or Looker to enable data-driven decisions.
• Business Stakeholder Engagement: Collaborate with cross-functional teams to understand business goals, define metrics, and convert ambiguous requirements into concrete analytical deliverables.
• KPI Definition & Performance Monitoring: Define, track, and report key performance indicators (KPIs), ensuring alignment with business objectives and consistent measurement across teams.
• Data Modeling & Reporting Automation: Work with data engineering and BI teams to create scalable, reusable data models and automate recurring reports and analysis processes.
• Storytelling with Data: Communicate findings through clear narratives supported by data visualizations and actionable recommendations to both technical and non-technical audiences.
• Data Quality & Governance: Ensure accuracy, consistency, and integrity of data through validation, testing, and documentation practices.

Required Qualifications
• Bachelor's or Master's degree in Business, Economics, Statistics, Computer Science, Information Systems, or a related field.
• 5+ years of professional experience in a data analyst or business analyst role with a focus on data visualization and analytics.
• Proficiency in data visualization tools: Power BI, Tableau, Looker (at least one).
• Strong experience in SQL and working with relational databases to extract, manipulate, and analyze data.
• Deep understanding of business processes, KPIs, and analytical methods.
• Excellent problem-solving skills with attention to detail and accuracy.
• Strong communication and stakeholder management skills, with the ability to explain technical concepts in a clear and business-friendly manner.
• Experience working in Agile or fast-paced environments.

Preferred Qualifications
• Experience working with cloud data platforms (e.g., Snowflake, BigQuery, Redshift).
• Exposure to Python or R for data manipulation and statistical analysis.
• Knowledge of data warehousing, dimensional modeling, or ELT/ETL processes.
• Domain experience in Healthcare is a plus.
Posted 1 week ago
5.0 - 10.0 years
5 - 10 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
We are seeking a skilled Snowflake Developer to design, develop, and manage scalable data solutions using the Snowflake cloud data platform. The ideal candidate will have deep experience in data warehousing, SQL development, ETL processes, and cloud data architecture. Understanding of Control-M and Tableau will be an added advantage.

1. Snowflake (Cloud Data Warehouse):
1. Good understanding of the Snowflake ecosystem
2. Good experience with data modeling and dimensional modeling techniques, and able to drive technical discussions with IT & Business and with Architects/Data Modelers
3. Need to guide the team and provide technical solutions
4. Need to prepare technical solutions and architectures as part of project requirements
5. Virtual Warehouse (Compute): good understanding of warehouse creation and management
6. Data Modeling & Storage: strong knowledge of LDM/PDM design
7. Data Loading/Unloading and Data Sharing: should have good knowledge
8. SnowSQL (CLI): expertise and excellent understanding of Snowflake internals and integration
9. Strong hands-on experience with SnowSQL queries, stored procedures, and performance tuning techniques
10. Good knowledge of preparing SnowSQL scripts for data validation and audits
11. SnowPipe: good knowledge of SnowPipe implementation
12. Expertise and excellent understanding of S3 internal data copy/movement
13. Good knowledge of security, and of Reader and Consumer accounts
14. Good knowledge and hands-on experience with query performance tuning implementation techniques

2. SQL Knowledge:
1. Advanced SQL knowledge and hands-on experience writing complex queries using analytical functions
2. Strong knowledge of stored procedures
3. Troubleshooting, problem solving, and performance tuning of SQL queries accessing the data warehouse
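As a rough illustration of the loading pattern items 7, 11, and 12 above refer to (an external stage over S3 plus COPY INTO, the same statements one would run from SnowSQL), here is a minimal sketch driven through the Snowflake Python connector. Stage, storage-integration, file-format, and table names are assumptions for the example, not details from the job description.

```python
# Sketch only: one-time stage setup plus a repeatable COPY INTO load.
# All object names (csv_fmt, raw_orders_stage, my_s3_int, landing.orders)
# are illustrative assumptions.
import snowflake.connector  # pip install snowflake-connector-python

LOAD_STEPS = [
    # One-time setup: file format and external stage over an S3 prefix
    """CREATE FILE FORMAT IF NOT EXISTS csv_fmt
       TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1""",
    """CREATE STAGE IF NOT EXISTS raw_orders_stage
       URL = 's3://my-bucket/orders/'
       STORAGE_INTEGRATION = my_s3_int
       FILE_FORMAT = csv_fmt""",
    # Repeatable load: COPY only picks up files it has not loaded before
    """COPY INTO landing.orders
       FROM @raw_orders_stage
       ON_ERROR = 'ABORT_STATEMENT'""",
]

with snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="LOAD_WH", database="EDW", schema="LANDING",
) as conn:
    cur = conn.cursor()
    for step in LOAD_STEPS:
        cur.execute(step)
    # COPY returns one row per file with its load status
    print(cur.fetchall())
```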
Posted 1 week ago
2.0 - 7.0 years
12 - 16 Lacs
Hyderabad
Work from Office
India-based candidates only. We're primarily a NYC-based team, but have a growing international team.

ROLE OVERVIEW: As a Senior Data Analyst, you will be pivotal in driving strategic decision-making for the Codecademy consumer and enterprise business lines. Reporting to the Senior Manager, Strategy and Business Operations, this role will work cross-functionally to tackle the business' highest priorities. You will utilize your technical expertise in data analytics, financial modeling, and executive communication to create actionable business strategies that drive growth.

OPPORTUNITY HIGHLIGHTS and KEY RESPONSIBILITIES:
Collaborate with a diverse array of stakeholders (Product, Marketing, Curriculum, etc.) to analyze key business drivers, identify new opportunities and risks, and influence strategic roadmaps. Own marketing analytics insights: evaluate promotion, campaign, and performance marketing results, make recommendations, and contribute to future strategy. Structure complex, ambiguous strategic problems into clear hypotheses, build financial models, and create investment cases to drive growth. Define and set business goals and success criteria. Become an expert in our competitive landscape and analyze market trends.

SKILLS & QUALIFICATIONS:
5 years of relevant professional experience in business operations, analytics, strategy, or a related industry/discipline. Strong analytical ability, with experience in data analysis, business/financial modeling, and forecasting. Demonstrated ability to distill complex issues into structured frameworks and develop concrete action plans. Experience building relationships and communicating with a broad array of stakeholders across all levels and functions. Outstanding written and verbal communication skills, with the ability to operate independently with senior stakeholders and manage relationships, deliverables, and timelines with minimal supervision. Data self-sufficiency: proficiency with SQL or Python and data visualization tools (tech stack: Snowflake, Looker, Python, dbt).
Posted 1 week ago
3.0 - 5.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Role Purpose
The purpose of this role is to design, test, and maintain software programs for operating systems or applications which need to be deployed at a client end, and to ensure they meet 100% quality assurance parameters.

Do
1. Be instrumental in understanding the requirements and design of the product/software. Develop software solutions by studying information needs, systems flow, data usage, and work processes; investigate problem areas through the software development life cycle; facilitate root cause analysis of system issues and problem statements; identify ideas to improve system performance and impact availability; analyze client requirements and convert requirements to feasible designs; collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements; confer with project managers to obtain information on software capabilities.
2. Perform coding and ensure optimal software/module development. Determine operational feasibility by evaluating analysis, problem definition, requirements, software development, and proposed software; develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases and executing these cases; modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces; analyze information to recommend and plan the installation of new systems or modifications of an existing system; ensure that code is error-free, with no bugs or test failures; prepare reports on programming project specifications, activities, and status; ensure all issues are raised as per the norms defined for the project/program/account, with clear descriptions and replication patterns; compile timely, comprehensive, and accurate documentation and reports as requested; coordinate with the team on daily project status and progress and document it; provide feedback on usability and serviceability, trace results to quality risks, and report them to the concerned stakeholders.
3. Status reporting and customer focus on an ongoing basis with respect to the project and its execution. Capture all requirements and clarifications from the client for better quality work; take feedback on a regular basis to ensure smooth and on-time delivery; participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members; consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements; document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code; formally document the necessary details and reports for proper understanding of the software, from client proposal to implementation; ensure good quality of interaction with the customer w.r.t. e-mail content, fault report tracking, voice calls, business etiquette, etc.; respond to customer requests in a timely manner, with no instances of complaints either internally or externally.

Deliver
No. | Performance Parameter | Measure
1. | Continuous integration, deployment & monitoring of software | 100% error-free onboarding & implementation, throughput %, adherence to the schedule/release plan
2. | Quality & CSAT | On-time delivery, manage software, troubleshoot queries, customer experience, completion of assigned certifications for skill upgradation
3. | MIS & Reporting | 100% on-time MIS & report generation

Mandatory Skills: Snowflake. Experience: 3-5 Years.
Posted 1 week ago
4.0 - 9.0 years
12 - 15 Lacs
Pune
Work from Office
Position: Data Engineer
Location: Remote
Duration: 10-12 Months (Contract)
Experience: 4-10 Years
Shift Time: Australian shift (5 AM to 1 PM IST)
Key Requirements: Strong SQL skills, Snowflake, Azure Data Factory, Power BI, SSIS (nice to have)
Posted 1 week ago
3.0 - 8.0 years
11 - 21 Lacs
Pune
Work from Office
Hiring for a Denodo Admin with 3+ years of experience and the below skills:
Must Have:
- Denodo administration: logical data models, views & caching
- ETL pipelines (Informatica/Talend) for EDW/data lakes, including performance issues
- SQL, Informatica, Talend, Big Data, Hive
Required Candidate profile:
- Design, develop & maintain ETL pipelines using Informatica PowerCenter or Talend to extract, transform, and load data into Hive
- Optimize & troubleshoot complex SQL queries
- Immediate joiner is a plus
- Work from office is a must
Posted 1 week ago
2.0 - 4.0 years
6 - 10 Lacs
Bengaluru
Work from Office
As a TechOps Engineer at the L1 level, your role will involve working in collaboration with other teams to ensure smooth and efficient operations of the software development process. Your primary responsibility will be to develop/configure and maintain automated systems for building, testing, deploying, and monitoring software applications.

Key responsibilities and duties for this role may include:
Should have knowledge of Windows/Linux operating systems. Basic automation skills such as shell scripting and SQL queries will be an added advantage. Excellent written and verbal communication skills in English. Knowledge of the ITILv4 process, VMware, and networking basics. Must be willing to work in a 24/7 environment. Server monitoring & support on heterogeneous infrastructure domains comprising server, storage, and network. Level 1 support is responsible for initial triage, troubleshooting, and escalation as per service levels and documentation. Follow appropriate departmental and company procedures and policies (i.e. change control, security and auditing, release, configuration, problem and incident management). Generate reports on availability, performance, and capacity bottlenecks at desired intervals as per operational requirements. Log incidents and events based on appropriate categories using the ServiceNow & JIRA ticketing tools and assign them to the appropriate stakeholders for resolution. Interact with internal teams and external third-party vendors to troubleshoot and resolve complex problems. Maintain and enhance KB (Knowledge Base) articles. Handle a shift alone without any dependencies. Knowledge of best practices and IT operations in an always-up, always-available service. Develop and provide system performance and status reports as required by the business unit.

Qualifications:
Bachelor's degree in computer science, information technology, or a related field. Familiarity with production monitoring tools; a strong troubleshooter who can help with investigation and data gathering within both Windows and Unix environments. Prior experience in application monitoring setup (SolarWinds experience is preferred). Middleware application hands-on experience with Tomcat, JBoss, and Apache. Scripting knowledge of Shell/Ansible is preferred. Good understanding of production support processes, SOPs, and working with SLAs. Work in 24/7 shifts, coordinate with production support leads, and monitor production jobs. Understand SOPs and execute jobs and troubleshooting procedures. Adhere to the SLAs set by the production control lead. Knowledge of cloud computing environments such as AWS, GCP, or Azure is an added advantage.
Posted 1 week ago
4.0 - 7.0 years
5 - 16 Lacs
Hyderabad, Bengaluru
Work from Office
Roles and Responsibilities:
Design, develop, test, deploy, and maintain Snowflake data warehouses for clients. Collaborate with cross-functional teams to gather requirements and deliver high-quality solutions. Develop ETL processes using Python scripts to extract data from various sources and load it into Snowflake tables. Troubleshoot issues related to Snowflake performance tuning, query optimization, and data quality.
Job Requirements:
4-7 years of experience in developing large-scale data warehouses on AWS using Snowflake. Strong understanding of lambda expressions in the Snowflake SQL language. Experience with the Python programming language for ETL development.
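A minimal sketch of the Python-based ETL flow this posting describes: extract from a source, apply a light transform, and bulk-load into a Snowflake table. File paths, credentials, and table names are placeholders; write_pandas comes from the official connector's pandas extras.

```python
# Sketch only: extract -> transform -> load into Snowflake with write_pandas.
# Requires: pip install "snowflake-connector-python[pandas]" (plus s3fs for s3:// paths).
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

def extract() -> pd.DataFrame:
    # Stand-in for any source extract (API, file drop, operational DB)
    return pd.read_csv("s3://my-bucket/exports/customers.csv")

def transform(df: pd.DataFrame) -> pd.DataFrame:
    df = df.dropna(subset=["customer_id"]).copy()
    df["email"] = df["email"].str.lower().str.strip()  # basic standardization
    return df

def load(df: pd.DataFrame) -> None:
    with snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",
        warehouse="LOAD_WH", database="EDW", schema="STAGING",
    ) as conn:
        # auto_create_table spares us a separate DDL step for the demo
        ok, _chunks, nrows, _ = write_pandas(
            conn, df, "CUSTOMERS", auto_create_table=True
        )
        print(f"loaded={ok} rows={nrows}")

if __name__ == "__main__":
    load(transform(extract()))
```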
Posted 1 week ago
10.0 - 15.0 years
30 - 35 Lacs
Hyderabad, Delhi / NCR
Hybrid
- Develop and manage SSIS ETL packages for a large financial firm's onboarding data sets
- Debugging and performance tuning of the data warehouse
- Develop stored procedures for the application team
Required Candidate profile:
5+ years of development experience in SQL, SSIS, and ETL. Some background in Python will be a plus. Proven experience with designing and managing a data warehouse. Financial industry experience preferred.
Posted 1 week ago
5.0 - 10.0 years
15 - 25 Lacs
Hyderabad
Work from Office
Job Summary: We are seeking a highly skilled Informatica IDMC & Data Governance Specialist to join our dynamic team. This role requires hands-on expertise in various IDMC modules, including Snowflake, Cloud Data Quality (CDQ), Cloud Application Integration (CAI), Cloud Data Governance & Catalog (CDGC), and Cloud Data Marketplace (CDMP). The ideal candidate will have a strong background in data governance and data quality, with a proven ability to create and configure data profiles and quality rules. If you have a passion for cloud technologies and data management, this is an exciting opportunity to contribute to the ongoing success of our data initiatives.

Key Responsibilities:
IDMC Modules Management: Manage and support the following IDMC modules: Snowflake (including RBAC), Cloud Data Quality (CDQ), Cloud Application Integration (CAI), Cloud Data Governance & Catalog (CDGC), and Cloud Data Marketplace (CDMP).
Data Profiling & Quality: Create and configure data profiles with appropriate quality rules to ensure data meets required standards.
Data Governance & Compliance: Collaborate with data governance teams to ensure compliance with data governance policies and standards.
Platform Optimization & Troubleshooting: Continuously monitor the performance of IDMC platforms, resolving issues, optimizing configurations, and supporting ongoing integrations and data workflows.
Collaboration & Support: Work cross-functionally with infrastructure, application, and data teams to ensure efficient use of the IDMC platform and that business data needs are met.

Mandatory Skills: Extensive hands-on experience is required in the areas below, including the ability to configure everything from scratch for any complex requirement: CDI (Cloud Data Integration), CDQ (Cloud Data Quality), CAI (Cloud Application Integration), EDC (on-prem Informatica Enterprise Data Catalog).

Nice-to-Have Skills: Knowledge of Data Integration (IPC) and Git. Prior experience in data governance initiatives, including the use of governance frameworks and best practices.

Qualifications: Strong analytical and problem-solving abilities. Excellent communication skills, both written and verbal, to explain technical concepts to non-technical stakeholders. Ability to prioritize tasks, manage multiple projects, and work in a fast-paced environment.
Posted 1 week ago
5.0 - 9.0 years
7 - 17 Lacs
Pune
Work from Office
Job Overview: Diacto is seeking an experienced and highly skilled Data Architect to lead the design and development of scalable and efficient data solutions. The ideal candidate will have strong expertise in Azure Databricks, Snowflake (with dbt, GitHub, and Airflow), and Google BigQuery. This is a full-time, on-site role based out of our Baner, Pune office.

Qualifications: B.E./B.Tech in Computer Science, IT, or a related discipline; MCS/MCA or equivalent preferred.

Key Responsibilities: Design, build, and optimize robust data architecture frameworks for large-scale enterprise solutions. Architect and manage cloud-based data platforms using Azure Databricks, Snowflake, and BigQuery. Define and implement best practices for data modeling, integration, governance, and security. Collaborate with engineering and analytics teams to ensure data solutions meet business needs. Lead development using tools such as dbt, Airflow, and GitHub for orchestration and version control. Troubleshoot data issues and ensure system performance, reliability, and scalability. Guide and mentor junior data engineers and developers.

Experience and Skills Required: 5 to 12 years of experience in data architecture, engineering, or analytics roles. Hands-on expertise in Databricks, especially Azure Databricks. Proficient in Snowflake, with working knowledge of dbt, Airflow, and GitHub. Experience with Google BigQuery and cloud-native data processing workflows. Strong knowledge of modern data architecture, data lakes, warehousing, and ETL pipelines. Excellent problem-solving, communication, and analytical skills.

Nice to Have: Certifications in Azure, Snowflake, or GCP. Experience with containerization (Docker/Kubernetes). Exposure to real-time data streaming and event-driven architecture.

Why Join Diacto Technologies? Collaborate with experienced data professionals and work on high-impact projects. Exposure to a variety of industries and enterprise data ecosystems. Competitive compensation, learning opportunities, and an innovation-driven culture. Work from our collaborative office space in Baner, Pune.

How to Apply:
Option 1 (Preferred): Copy and paste the following link into your browser and submit your application for the automated interview process: https://app.candidhr.ai/app/candidate/gAAAAABoRrTQoMsfqaoNwTxsE_qwWYcpcRyYJk7NzSUmO3LKb6rM-8FcU58CUPYQKc65n66feHor-TGdCEfyouj0NmKdgYcNbA==/
Option 2: 1. Please visit our website's career section at https://www.diacto.com/careers/ 2. Scroll down to the "Who are we looking for?" section 3. Find the listing for "Data Architect (Data Bricks)" 4. Proceed with the virtual interview by clicking on "Apply Now."
Posted 1 week ago
8.0 - 13.0 years
17 - 27 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Job Description for QA Engineer: 7+ years of experience in ETL testing, Snowflake, and DWH concepts. Strong SQL knowledge and debugging skills are a must. Experience with Azure and Snowflake testing is a plus. Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus. Strong data warehousing concepts and ETL tools like Talend Cloud Data Integration and Pentaho/Kettle. Experience with JIRA and the Xray defect management tool is good to have. Exposure to financial domain knowledge is considered a plus. Testing the data readiness (data quality) and addressing code or data issues. Demonstrated ability to rationalize problems and use judgment and innovation to define clear and concise solutions. Demonstrated strong collaborative experience across regions (APAC, EMEA, and NA) to effectively and efficiently identify the root cause of code/data issues and come up with a permanent solution. Prior experience with State Street and Charles River Development (CRD) is considered a plus. Experience with tools such as PowerPoint, Excel, and SQL. Exposure to third-party data providers such as Bloomberg, Reuters, MSCI, and other rating agencies is a plus.
Posted 1 week ago
6.0 - 11.0 years
9 - 19 Lacs
Hyderabad
Work from Office
Location: Hyderabad (Preferred) / Bangalore (if strong candidates are available)
Type: Contractual

Key Responsibilities: Use data mappings and models provided by the data modeling team to build robust Snowflake data pipelines. Design and implement pipelines adhering to 2NF/3NF normalization standards. Develop and maintain ETL processes for integrating data from multiple ERP and source systems. Build scalable and secure Snowflake data architecture supporting Data Quality (DQ) needs. Raise CAB requests via Carrier's change process and manage production deployments. Provide UAT support and ensure smooth transition of finalized pipelines to support teams. Create and maintain comprehensive technical documentation for traceability and handover. Collaborate with data modelers, business stakeholders, and governance teams to enable DQ integration. Optimize complex SQL queries, perform performance tuning, and ensure DataOps best practices.

Requirements: Strong hands-on experience with Snowflake. Expert-level SQL skills and a deep understanding of data transformation. Solid grasp of data architecture and 2NF/3NF normalization techniques. Experience with cloud-based data platforms and modern data pipeline design. Exposure to AWS data services like S3, Glue, Lambda, and Step Functions (preferred). Proficiency with ETL tools and working in Agile environments. Familiarity with the Carrier CAB process or similar structured deployment frameworks. Proven ability to debug complex pipeline issues and enhance pipeline scalability. Strong communication and collaboration skills.
Posted 1 week ago
3.0 - 8.0 years
7 - 17 Lacs
Hyderabad
Work from Office
Job Title: Database Engineer Analytics – L

Responsibilities
As a Database Engineer supporting the bank's Analytics platforms, you will be part of a centralized team of database engineers who are responsible for the maintenance and support of Citizens' most critical databases. A Database Engineer will be responsible for:
• Conceptual knowledge of database practices and procedures such as DDL, DML, and DCL.
• Basic SQL skills, including SELECT, FROM, WHERE, and ORDER BY.
• Ability to code SQL joins, subqueries, and aggregate functions (AVG, SUM, COUNT), and to use data manipulation techniques (UPDATE, DELETE).
• Understanding basic data relationships and schemas.
• Developing basic entity-relationship diagrams.
• Conceptual understanding of cloud computing.
• Solving routine problems using existing procedures and standard practices.
• Looking up error codes and opening tickets with vendors.
• Ability to execute EXPLAIN plans and identify poorly written queries.
• Reviewing data structures to ensure they adhere to database design best practices.
• Developing a comprehensive backup plan.
• Understanding the different cloud models (IaaS, PaaS, SaaS), service models, and deployment options (public, private, hybrid).
• Solving standard problems by analyzing possible solutions using experience, judgment, and precedents.
• Troubleshooting database issues, such as integrity issues, blocking/deadlocking issues, log shipping issues, connectivity issues, security issues, memory issues, disk space, etc.
• Understanding cloud security concepts, including data protection, access control, and compliance.
• Managing risks associated with the use of information technology; identifying, assessing, and treating risks that might affect the confidentiality, integrity, and availability of the organization's assets.
• Ability to design and implement highly performing databases, using partitioning and indexing, that meet or exceed the business requirements.
• Documenting a complex software system design as an easily understood diagram, using text and symbols to represent the way data needs to flow.
• Ability to code complex SQL.
• Performing effective backup management and periodic database restoration testing.
• General DB cloud networking skills: VPCs, SGs, KMS keys, private links.
• Ability to develop stored procedures and use at least one scripting language for reusable code and improved performance; knowing how to import and export data into and out of databases using ETL tools, code, or migration tools like DMS or scripts.
• Knowledge of DevOps principles and tools, such as CI/CD.
• Attention to detail and a customer-centric approach.
• Solving complex problems by taking a new perspective on existing solutions; exercising judgment based on the analysis of multiple sources of information.
• Ability to optimize queries for performance and resource efficiency.
• Reviewing database metrics to identify performance issues.

Required Qualifications
• 2-10+ years of experience with database management/administration: Redshift, Snowflake, or Neo4j.
• 2-10+ years of experience working with incident, change, and problem management processes and procedures.
• Experience maintaining and supporting large-scale critical database systems in the cloud.
• 2+ years of experience working with AWS cloud-hosted databases.
• An understanding of one programming language, including at least one front-end framework (Angular/React/Vue), such as Python3, Java, JavaScript, Ruby, Golang, C, C++, etc.
• Experience with cloud computing, ETL, and streaming technologies: OpenShift, DataStage, Kafka.
• Experience with agile development methodology.
• Strong SQL performance and tuning skills.
• Excellent communication and client-interfacing skills.
• Strong team collaboration skills and capacity to prioritize tasks efficiently.

Desired Qualifications
• Experience working in an agile development environment.
• Experience working in the banking industry.
• Experience working in cloud environments such as AWS, Azure, or Google.
• Experience with CI/CD pipelines (Jenkins, Liquibase, or equivalent).

Education and Certifications
• Bachelor's degree in computer science or a related discipline.
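To ground the baseline SQL skills listed above (joins, aggregate functions, reading query plans), here is a tiny self-contained illustration using SQLite from the Python standard library. The schema is invented for the demo, and EXPLAIN syntax varies by engine (e.g., Redshift and Snowflake each have their own EXPLAIN output).

```python
# A self-contained demo (standard library only) of a join, the AVG/SUM/COUNT
# aggregates, and inspecting a query plan - the first step in spotting
# poorly written queries. The accounts/txns schema is made up for the demo.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE accounts (id INTEGER PRIMARY KEY, branch TEXT);
    CREATE TABLE txns (id INTEGER PRIMARY KEY, account_id INTEGER, amt REAL);
    INSERT INTO accounts VALUES (1, 'North'), (2, 'South');
    INSERT INTO txns VALUES (1, 1, 100.0), (2, 1, 250.0), (3, 2, 75.0);
""")

# JOIN plus aggregate functions, grouped per branch
query = """
    SELECT a.branch, COUNT(*) AS n_txns, SUM(t.amt) AS total, AVG(t.amt) AS avg_amt
    FROM txns t
    JOIN accounts a ON a.id = t.account_id
    GROUP BY a.branch
    ORDER BY total DESC
"""
print(conn.execute(query).fetchall())

# Reading the plan shows how the engine intends to execute the query
for row in conn.execute("EXPLAIN QUERY PLAN " + query):
    print(row)
```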
Posted 1 week ago
3.0 - 8.0 years
20 - 30 Lacs
Ahmedabad
Hybrid
Compare and match data between systems; investigate and fix mismatches. Help build dashboards, support audits, and maintain clear documentation. Manage new data entries, ensure accuracy, and oversee smooth data processes.
Required Candidate profile:
- 3 to 5 years of experience
- Experience of Data Governance and DG-related systems
- Confidence in using applications; some systems experience (SAP, HFM, Oracle, Snowflake)
- Autonomy to review and research
Posted 1 week ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
We are looking for a Data Engineer to join our team and help us improve the platform that supports one of the best experimentation tools in the world. You will work side by side with other data engineers and site reliability engineers to improve the reliability, scalability, maintenance, and operations of all the data products that are part of the experimentation tool at Booking.com. Your day-to-day work includes, but is not limited to: maintenance and operations of data pipelines and products that handle data at big scale; the development of capabilities for monitoring, alerting, testing, and troubleshooting of the data ecosystem of the experiment platform; and the delivery of data products that produce metrics for experimentation at scale. You will collaborate with colleagues in Amsterdam to achieve results the right way. This will include engineering managers, product managers, engineers, and data scientists.

Key Responsibilities and Duties
Take ownership of multiple data pipelines and products and provide innovative solutions to reduce the operational workload required to maintain them. Rapidly develop next-generation scalable, flexible, and high-performance data pipelines. Contribute to the development of data platform capabilities such as testing, monitoring, debugging, and alerting to improve the development environment of data products. Solve issues with data and data pipelines, prioritizing based on customer impact. Take end-to-end ownership of data quality in complex datasets and data pipelines. Experiment with new tools and technologies, driving innovative engineering solutions to meet business requirements regarding performance, scaling, and data quality. Provide self-organizing tools that help the analytics community discover data, assess quality, explore usage, and find peers with relevant expertise. Serve as the main point of contact for technical and business stakeholders regarding data engineering issues, such as pipeline failures and data quality concerns.

Role Requirements
Minimum 5 years of hands-on experience in data engineering as a Data Engineer, or as a Software Engineer developing data pipelines and products. Bachelor's degree in Computer Science, Computer or Electrical Engineering, Mathematics, or a related field, or 5 years of progressively responsible experience in the specialty as an equivalent. Solid experience in at least one programming language; we use Java and Python. Experience building production data pipelines in the cloud, setting up data lakes and serverless solutions. Hands-on experience with schema design and data modeling. Experience designing systems end-to-end and knowledge of basic concepts (load balancing, databases, caching, NoSQL, etc.). Knowledge of Flink, CDC, Kafka, Airflow, Snowflake, dbt, or equivalent tools. Practical experience building data platform capabilities like testing, alerting, monitoring, debugging, and security. Experience working with big data. Experience working with teams located in different time zones is a plus. Experience with experimentation, statistics, and A/B testing is a plus.
Posted 1 week ago
6.0 - 8.0 years
6 - 12 Lacs
Hyderabad
Work from Office
Key Responsibilities:
Design, develop, and maintain scalable data pipelines using Snowflake. Develop and optimize complex SQL queries, views, and stored procedures. Migrate data from legacy systems to Snowflake using ETL tools like Informatica, Talend, dbt, or Matillion. Implement data modeling techniques (star, snowflake schemas) and maintain the data dictionary. Ensure performance tuning, data quality, and security across all Snowflake objects. Integrate Snowflake with BI tools like Tableau, Power BI, or Looker. Collaborate with data analysts, data scientists, and business teams to understand requirements and deliver solutions. Monitor and manage Snowflake environments using tools like Snowsight, SnowSQL, or CloudWatch. Participate in code reviews and enforce best practices for data governance and security. Develop automation scripts using Python, Shell, or Airflow for data workflows.

Required Skills:
6+ years of experience in data engineering/data warehousing. 3+ years of hands-on experience with the Snowflake Cloud Data Platform. Strong expertise in SQL, performance tuning, data modeling, and query optimization. Experience with ETL tools like Informatica, Talend, Apache NiFi, or dbt. Proficient in cloud platforms: AWS/Azure/GCP (preferably AWS). Good understanding of DevOps/CI-CD principles for Snowflake deployments. Hands-on experience with scripting languages: Python, Bash, etc. Knowledge of RBAC, masking policies, and row access policies in Snowflake.
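As a hedged sketch of the "automation scripts using Python, Shell, or Airflow" responsibility, here is a minimal Airflow DAG chaining two Snowflake statements through the Snowflake provider's operator. The connection ID, schedule, stage, and table names are illustrative assumptions, not details from the posting (and newer Airflow releases favor SQLExecuteQueryOperator over SnowflakeOperator).

```python
# Minimal Airflow DAG sketch: nightly COPY into a raw table, then rebuild a
# small mart. Requires apache-airflow-providers-snowflake; all object names
# and the "snowflake_default" connection are assumptions for the example.
from datetime import datetime
from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="nightly_orders_refresh",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",  # 02:00 daily; Airflow 2.4+ (older: schedule_interval)
    catchup=False,
) as dag:
    load_raw = SnowflakeOperator(
        task_id="copy_raw_orders",
        snowflake_conn_id="snowflake_default",
        sql="COPY INTO raw.orders FROM @raw_orders_stage;",
    )
    build_mart = SnowflakeOperator(
        task_id="rebuild_orders_mart",
        snowflake_conn_id="snowflake_default",
        sql="""
            CREATE OR REPLACE TABLE mart.daily_orders AS
            SELECT order_date, COUNT(*) AS n_orders, SUM(amount) AS revenue
            FROM raw.orders GROUP BY order_date;
        """,
    )
    load_raw >> build_mart  # run the load before the mart rebuild
```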
Posted 1 week ago
Snowflake has become one of the most sought-after skills in the tech industry, with a growing demand for professionals who are proficient in handling data warehousing and analytics using this cloud-based platform. In India, the job market for Snowflake roles is flourishing, offering numerous opportunities for job seekers with the right skill set.
India's major tech hubs are known for their thriving tech industries and have a high demand for Snowflake professionals.
The average salary range for Snowflake professionals in India varies based on experience levels:
- Entry-level: INR 6-8 lakhs per annum
- Mid-level: INR 10-15 lakhs per annum
- Experienced: INR 18-25 lakhs per annum
A typical career path in Snowflake may include roles such as:
- Junior Snowflake Developer
- Snowflake Developer
- Senior Snowflake Developer
- Snowflake Architect
- Snowflake Consultant
- Snowflake Administrator
In addition to expertise in Snowflake, professionals in this field are often expected to have knowledge in:
- SQL
- Data warehousing concepts
- ETL tools
- Cloud platforms (AWS, Azure, GCP)
- Database management
As you explore opportunities in the Snowflake job market in India, remember to showcase your expertise in handling data analytics and warehousing using this powerful platform. Prepare thoroughly for interviews, demonstrate your skills confidently, and keep abreast of the latest developments in Snowflake to stay competitive in the tech industry. Good luck with your job search!