7.0 - 10.0 years
20 - 35 Lacs
Pune
Hybrid
At Medtronic you can begin a life-long career of exploration and innovation, while helping champion healthcare access and equity for all. You'll lead with purpose, breaking down barriers to innovation in a more connected, compassionate world. A Day in the Life: Our Global Diabetes Capability Center in Pune is expanding to serve more people living with diabetes globally. Our state-of-the-art facility is dedicated to transforming diabetes management through innovative solutions and technologies that reduce the burden of living with diabetes. We're a mission-driven leader in medical technology and solutions with a legacy of integrity and innovation. Join our new Minimed India Hub as a Senior Digital Engineer. Responsibilities may include the following, and other duties may be assigned: Expertise in translating conceptual needs and business requirements into finalized architectural designs. Ability to manage large projects or processes that span other collaborative teams both within and beyond Digital Technology. Operate autonomously to define, describe, diagram, and document the role and interaction of the high-level technological and human components that combine to provide cost-effective and innovative solutions to meet evolving business needs. Promote, guide, and govern good architectural practice through the application of well-defined, proven technology and human interaction patterns and through architecture mentorship. Design, develop, and maintain scalable data pipelines, preferably using PySpark. Work with structured and unstructured data from various sources. Optimize and tune PySpark applications for performance and scalability. Deep experience supporting the full lifecycle management of the entire IT portfolio, including the selection, appropriate usage, enhancement, and replacement of information technology applications, infrastructure, and services. Implement data quality checks and ensure data integrity. Monitor and troubleshoot data pipeline issues and ensure timely resolution. Document technical specifications and maintain comprehensive documentation for data pipelines. The ideal candidate is exposed to the fast-paced world of Big Data technology and has experience in building ETL/ELT data solutions using new and emerging technologies while maintaining platform stability. Required Knowledge and Experience: Strong programming knowledge in Java, Scala, Python, or PySpark, plus SQL. 4-8 years of experience in data engineering, with a focus on PySpark. Proficiency in Python and Spark, with strong coding and debugging skills. Experience in designing and building enterprise data solutions on AWS, Azure, or Google Cloud Platform (GCP). Experience with big data technologies such as Hadoop, Hive, and Kafka. Strong knowledge of SQL and experience with relational databases (e.g., PostgreSQL, MySQL, SQL Server). Experience with data warehousing solutions like Redshift, Snowflake, Databricks, or Google BigQuery. Familiarity with data lake architectures and data storage solutions. Knowledge of CI/CD pipelines and version control systems (e.g., Git). Excellent problem-solving skills and the ability to troubleshoot complex issues. Strong communication and collaboration skills, with the ability to work effectively in a team environment.
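By way of illustration only, here is a minimal PySpark sketch of the kind of pipeline work this posting describes: ingest, transform, a simple data quality check, and a partitioned write. The paths, table names, and columns are hypothetical assumptions, not details from the role.

```python
# Minimal PySpark pipeline sketch: ingest, transform, run a data quality
# check, and write out. Paths, tables, and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("device_readings_pipeline").getOrCreate()

# Ingest structured source data (hypothetical S3 path).
raw = spark.read.parquet("s3://example-bucket/raw/device_readings/")

# Transform: standardize types and derive a daily aggregate.
daily = (
    raw.withColumn("reading_date", F.to_date("event_ts"))
       .groupBy("device_id", "reading_date")
       .agg(F.avg("glucose_mg_dl").alias("avg_glucose"),
            F.count("*").alias("reading_count"))
)

# Data quality check: fail fast if required keys are missing.
null_keys = daily.filter(F.col("device_id").isNull()).count()
if null_keys > 0:
    raise ValueError(f"Data quality check failed: {null_keys} rows with null device_id")

# Write partitioned output; repartitioning by date helps downstream scans.
daily.repartition("reading_date").write.mode("overwrite") \
     .partitionBy("reading_date").parquet("s3://example-bucket/curated/daily_readings/")
```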
Physical Job Requirements: The above statements are intended to describe the general nature and level of work being performed by employees assigned to this position, but they are not an exhaustive list of all the required responsibilities and skills of this position. Regards, Ashwini Ukekar, Sourcing Specialist
Posted 2 weeks ago
5.0 - 10.0 years
13 - 22 Lacs
Bengaluru
Work from Office
Job Opportunity: Senior Data Analyst (Bangalore). Location: Bangalore, India. Company: GSPANN Technologies. Apply: Send your resume to heena.ruchwani@gspann.com. GSPANN is hiring a Senior Data Analyst with 5-7 years of experience to join our dynamic team in Bangalore! What We're Looking For: Education: Bachelor's degree in Computer Science, MIS, or a related field. Experience: 5-7 years in data analysis, with a strong ability to translate business strategy into actionable insights. Advanced SQL expertise. Proficiency in Tableau, Power BI, or Domo. Experience with AWS, Hive, Snowflake, Presto. Ability to define and track KPIs across domains like Sales, Consumer Behavior, and Supply Chain. Strong problem-solving skills and attention to detail. Excellent communication and collaboration abilities. Experience working in Agile environments. Retail or eCommerce domain experience is a plus. If this sounds like the right fit for you, don't wait; send your updated resume to heena.ruchwani@gspann.com today!
Posted 2 weeks ago
5.0 - 10.0 years
12 - 27 Lacs
Hyderabad
Work from Office
Proven hands-on experience with Matillion ETL and Snowflake. Strong proficiency in SQL and experience with data modelling best practices. Experience working with international teams and environments. Willingness to work hours with a daily CET overlap.
Posted 2 weeks ago
2.0 - 7.0 years
4 - 9 Lacs
Bengaluru
Work from Office
Diverse Lynx is looking for a Snowflake Professional to join our dynamic team and embark on a rewarding career journey. Design and Development: Create and implement data warehouse solutions using Snowflake, including data modeling, schema design, and ETL (Extract, Transform, Load) processes. Performance Optimization: Optimize queries, performance-tune databases, and ensure efficient use of Snowflake resources for faster data retrieval and processing. Data Integration: Integrate data from various sources, ensuring compatibility, consistency, and accuracy. Security and Compliance: Implement security measures and ensure compliance with data governance and regulatory requirements, including access control and data encryption. Monitoring and Maintenance: Monitor system performance, troubleshoot issues, and perform routine maintenance tasks to ensure system health and reliability. Collaboration: Collaborate with other teams, such as data engineers, analysts, and business stakeholders, to understand requirements and deliver effective data solutions. Skills and Qualifications: Snowflake Expertise: In-depth knowledge and hands-on experience working with Snowflake's architecture, features, and functionalities. SQL and Database Skills: Proficiency in SQL querying and database management, with a strong understanding of relational databases and data warehousing concepts. Data Modeling: Experience in designing and implementing effective data models for optimal performance and scalability. ETL Tools and Processes: Familiarity with ETL tools and processes to extract, transform, and load data into Snowflake. Performance Tuning: Ability to identify and resolve performance bottlenecks, optimize queries, and improve overall system performance. Data Security and Compliance: Understanding of data security best practices, encryption methods, and compliance standards (such as GDPR, HIPAA, etc.). Problem-Solving and Troubleshooting: Strong analytical and problem-solving skills to diagnose and resolve issues within the Snowflake environment. Communication and Collaboration: Good communication skills to interact with cross-functional teams and effectively translate business requirements into technical solutions. Scripting and Automation: Knowledge of scripting languages (like Python) and experience in automating processes within Snowflake.
Posted 2 weeks ago
10.0 - 20.0 years
30 - 45 Lacs
Hyderabad
Work from Office
Data Architect - Microsoft Fabric, Snowflake & Modern Data Platforms. Description: Be a part of our success story. Launch offers talented and motivated people the opportunity to do the best work of their lives in a dynamic and growing company. Through competitive salaries, outstanding benefits, internal advancement opportunities, and recognized community involvement, you will have the chance to create a career you can be proud of. Your new trajectory starts here at Launch. What are we looking for: We are seeking a seasoned Data Architect with strong consulting experience to lead the design and delivery of modern data solutions across global clients. This role emphasizes hands-on architecture and engineering using Microsoft Fabric and Snowflake, while also contributing to internal capability development and practice growth. The ideal candidate will bring deep expertise in data modeling, modern data architecture, and data engineering, with a passion for innovation and client impact. Role: Data Architect. Location: Hyderabad. Shift Timings: US overlapping hours. Years of Experience: 10+ years. Key Responsibilities: Serve as the lead architect for client engagements, designing scalable, secure, and high-performance data solutions using Microsoft Fabric and Snowflake. Apply modern data architecture principles including data lakehouse, ELT/ETL pipelines, and real-time streaming. Collaborate with cross-functional teams (data engineers, analysts, architects) to deliver end-to-end solutions. Translate business requirements into technical strategies with measurable outcomes. Ensure best practices in data governance, quality, and security are embedded in all solutions. Deliver scalable data modeling solutions for various use cases leveraging a modern data platform. Practice & Capability Development: Contribute to the development of reusable assets, accelerators, and reference architectures. Support internal knowledge sharing and mentoring across the India-based consulting team. Stay current with emerging trends in data platforms, AI/ML integration, and cloud-native architectures. Collaborate with global teams to align on delivery standards and innovation initiatives. Qualifications: 10+ years of experience in data architecture and engineering, preferably in a consulting environment. Proven experience with Microsoft Fabric and Snowflake platforms. Strong skills in data modeling, data pipeline development, and performance optimization. Familiarity with Azure Synapse, Azure Data Factory, Power BI, and related Azure services. Excellent communication and stakeholder management skills. Experience working with global delivery teams and agile methodologies. Preferred Certifications: SnowPro Core Certification (preferred but not required); Microsoft Certified: Fabric Analytics Engineer Associate; Microsoft Certified: Azure Solutions Architect Expert. We are Navigators in the Age of Transformation: We use sophisticated technology to transform clients into the digital age, but our top priority is our positive impact on human experience. We ease anxiety and fear around digital transformation and replace it with opportunity. Launch IT is an equal opportunity employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws.
Launch IT is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. About Company: Launch IT India is a wholly owned subsidiary of The Planet Group (http://www.launchcg.com; http://theplanetgroup.com), a US company, and offers attractive compensation and a work environment for prospective employees. Launch is an entrepreneurial business and technology consultancy. We help businesses and people navigate from current state to future state. Technology, tenacity, and creativity fuel our solutions, with offices in Bellevue, Sacramento, Dallas, San Francisco, Hyderabad & Washington D.C. https://www.linkedin.com/company/launch-consulting-group-india/
Posted 2 weeks ago
2.0 - 4.0 years
6 - 11 Lacs
Bengaluru
Work from Office
Zeta Global is looking for an experienced Machine Learning Engineer with industry-proven hands-on experience of delivering machine learning models to production to solve business problems. To be a good fit to join our AI/ML team, you should ideally: Be a thought leader that can work with cross-functional partners to foster a data-driven organisation. Be a strong team player, have experience contributing to a large project as part of a collaborative team effort. Have extensive knowledge and expertise with machine learning engineering best-practices and industry standards. Empower the product and engineering teams to make data-driven decisions. What you need to succeed: 2 to 4 years of proven experience as a Machine Learning Engineer in a professional setting. Proficiency in any programming language (Python preferable). Prior experience in building and deploying Machine learning systems. Experience with containerization: Docker & Kubernetes. Experience with AWS cloud services like EKS, ECS, EMR, Lambda, and others. Fluency with workflow management tools like Airflow or dbt. Familiarity with distributed batch compute technologies such as Spark. Experience with modern data warehouses like Snowflake or BigQuery. Knowledge of MLFlow, Feast, and Terraform is a plus.
Posted 2 weeks ago
4.0 - 9.0 years
7 - 17 Lacs
Hyderabad
Hybrid
Mega Walk-in Drive for Senior Software Engineer - Informatica, Teradata, SQL. Your future duties and responsibilities: Job Summary: CGI is seeking a skilled and detail-oriented Informatica Developer to join our data engineering team. The ideal candidate will be responsible for designing, developing, and implementing ETL (Extract, Transform, Load) workflows using Informatica PowerCenter (or Informatica Cloud), as well as optimizing data pipelines and ensuring data quality and integrity across systems. Key Responsibilities: Develop, test, and deploy ETL processes using Informatica PowerCenter or Informatica Cloud. Work with business analysts and data architects to understand data requirements and translate them into technical solutions. Integrate data from various sources including relational databases, flat files, APIs, and cloud-based platforms. Create and maintain technical documentation for ETL processes and data flows. Optimize existing ETL workflows for performance and scalability. Troubleshoot and resolve ETL and data-related issues in a timely manner. Implement data validation, transformation, and cleansing techniques. Collaborate with QA teams to support data testing and verification. Ensure compliance with data governance and security policies. Required qualifications to be successful in this role: Minimum 4 years of experience with Informatica PowerCenter or Informatica Cloud. Proficiency in SQL and experience with databases like Oracle, SQL Server, Snowflake, or Teradata. Strong understanding of ETL best practices and data integration concepts. Experience with job scheduling tools like Autosys, Control-M, or equivalent. Knowledge of data warehousing concepts and dimensional modeling. Strong problem-solving skills and attention to detail. Excellent communication and teamwork abilities. Good to have: Python or other programming knowledge. Bachelor's degree in Computer Science, Information Systems, or a related field. Preferred Qualifications: Experience with cloud platforms like AWS, Azure, or GCP. Familiarity with Big Data/Hadoop tools (e.g., Spark, Hive) and modern data architectures. Informatica certification is a plus. Experience with Agile methodologies and DevOps practices. Shift Timings: General Shift (5 days WFO for the initial 8 weeks). Skills: Data Engineering, Hadoop, Hive, Python, SQL, Teradata. Notice Period: 0-45 days. Prerequisites: Aadhaar card copy, PAN card copy, UAN. Disclaimer: The selected candidates will initially be required to work from the office for 8 weeks before transitioning to a hybrid model with 2 days of work from the office each week.
Posted 2 weeks ago
2.0 - 4.0 years
4 - 6 Lacs
Hyderabad
Work from Office
The CDP ETL & Database Engineer will specialize in architecting, designing, and implementing solutions that are sustainable and scalable. The ideal candidate will understand CRM methodologies, with an analytical mindset and a background in relational modeling in a hybrid architecture. The candidate will help drive the business towards specific technical initiatives and will work closely with the Solutions Management, Delivery, and Product Engineering teams. The candidate will join a team of developers across the US, India & Costa Rica. Responsibilities: ETL Development: The CDP ETL & Database Engineer will be responsible for building pipelines to feed downstream data processes. They will be able to analyze data, interpret business requirements, and establish relationships between data sets. The ideal candidate will be familiar with different encoding formats and file layouts such as JSON and XML. Implementations & Onboarding: Will work with the team to onboard new clients onto the ZMP/CDP+ platform. The candidate will solidify business requirements, perform ETL file validation, establish users, perform complex aggregations, and syndicate data across platforms. The hands-on engineer will take a test-driven approach towards development and will be able to document processes and workflows. Incremental Change Requests: The CDP ETL & Database Engineer will be responsible for analyzing change requests and determining the best approach towards implementation and execution of the request. This requires the engineer to have a deep understanding of the platform's overall architecture. Change requests will be implemented and tested in a development environment to ensure their introduction will not negatively impact downstream processes. Change Data Management: The candidate will adhere to change data management procedures and actively participate in CAB meetings where change requests will be presented and approved. Prior to introducing change, the engineer will ensure that processes are running in a development environment. The engineer will be asked to do peer-to-peer code reviews and solution reviews before production code deployment. Collaboration & Process Improvement: The engineer will be asked to participate in knowledge-share sessions where they will engage with peers, discuss solutions, best practices, overall approach, and process. The candidate will be able to look for opportunities to streamline processes with an eye towards building a repeatable model to reduce implementation duration. Job Requirements: The CDP ETL & Database Engineer will be well versed in the following areas: relational data modeling; ETL and FTP concepts; advanced analytics using SQL functions; cloud technologies (AWS, Snowflake). Able to decipher requirements, provide recommendations, and implement solutions within predefined timeframes. The ability to work independently, but at the same time, the individual will be called upon to contribute in a team setting. The engineer will be able to confidently communicate status, raise exceptions, and voice concerns to their direct manager. Participate in internal client project status meetings with the Solution/Delivery management teams. When required, collaborate with the Business Solutions Analyst (BSA) to solidify requirements. Ability to work in a fast-paced, agile environment; the individual will be able to work with a sense of urgency when escalated issues arise.
Strong communication and interpersonal skills, with the ability to multitask and prioritize workload based on client demand. Familiarity with Jira for workflow management and time allocation. Familiarity with the Scrum framework: backlog, planning, sprints, story points, retrospectives, etc. Required Skills: ETL: ETL tools such as Talend (preferred, not required); DMExpress (nice to have); Informatica (nice to have). Database: hands-on experience with the following database technologies: Snowflake (required); MySQL/PostgreSQL (nice to have); familiarity with NoSQL DB methodologies (nice to have). Programming Languages: can demonstrate knowledge of any of the following: PL/SQL; JavaScript (strong plus); Python (strong plus); Scala (nice to have). AWS: knowledge of the following AWS services: S3, EMR (concepts), EC2 (concepts), Systems Manager / Parameter Store. Understands JSON data structures and key-value pairs. Working knowledge of code repositories such as Git, WinCVS, SVN. Workflow management tools such as Apache Airflow, Kafka, Automic/Appworx. Jira. Minimum Qualifications: Bachelor's degree or equivalent; 2-4 years' experience; excellent verbal and written communication skills; self-starter, highly motivated; analytical mindset.
Posted 2 weeks ago
3.0 - 8.0 years
4 - 8 Lacs
Pune
Work from Office
Minimum 3 years of experience in developing application programs to implement ETL workflows by creating ETL jobs and data models in datamarts using Snowflake, DBT, Unix, and SQL technologies. Redesign Control-M batch processing for the ETL job build to run efficiently in production. Study existing systems to evaluate effectiveness and develop new systems to improve efficiency and workflow. Responsibilities: Perform requirements identification; conduct business program analysis, testing, and system enhancements while providing production support. The developer should have a good understanding of working in an Agile environment and of JIRA and SharePoint tools. Good written and verbal communication skills are a MUST as the candidate is expected to work directly with the client counterpart. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Intuitive individual with an ability to manage change and proven time management. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Up-to-date technical knowledge by attending educational workshops and reviewing publications. Preferred technical and professional experience: Responsible for developing triggers, functions, and stored procedures to support this effort. Assist with impact analysis of changing upstream processes to Data Warehouse and Reporting systems. Assist with design, testing, support, and debugging of new and existing ETL and reporting processes. Perform data profiling and analysis using a variety of tools. Troubleshoot and support production processes. Create and maintain documentation.
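For illustration only, here is a minimal sketch of how an ETL step of the kind described above might invoke a Snowflake transformation from Python; in practice DBT models and Control-M jobs would own most of the orchestration. The connection parameters and table names are hypothetical assumptions.

```python
# Minimal sketch of a Snowflake ETL step run from Python, of the kind a
# batch scheduler might invoke. Connection details and object names are
# hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="etl_user",
    password="********",
    warehouse="ETL_WH",
    database="SALES_DM",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Incremental merge from a staging table into a datamart fact table.
    cur.execute("""
        MERGE INTO SALES_DM.MART.FCT_ORDERS tgt
        USING SALES_DM.STAGING.STG_ORDERS src
          ON tgt.order_id = src.order_id
        WHEN MATCHED THEN UPDATE SET tgt.amount = src.amount
        WHEN NOT MATCHED THEN INSERT (order_id, amount) VALUES (src.order_id, src.amount)
    """)
    print(f"Rows affected: {cur.rowcount}")
finally:
    conn.close()
```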
Posted 2 weeks ago
3.0 - 6.0 years
2 - 6 Lacs
Pune, Greater Noida
Work from Office
The Apex Group was established in Bermuda in 2003 and is now one of the world's largest fund administration and middle office solutions providers. Our business is unique in its ability to reach globally, service locally, and provide cross-jurisdictional services. With our clients at the heart of everything we do, our hard-working team has successfully delivered on an unprecedented growth and transformation journey, and we are now represented by circa 13,000 employees across 112 offices worldwide. Your career with us should reflect your energy and passion. That's why, at Apex Group, we will do more than simply empower you. We will work to supercharge your unique skills and experience. Take the lead and we'll give you the support you need to be at the top of your game. And we offer you the freedom to be a positive disrupter and turn big ideas into bold, industry-changing realities. For our business, for clients, and for you. Apex Fund Services. Position: SQL Developer. Employment Type: Full Time. Location: Pune, India. Salary: TBC. Work Experience: The applicant for this position should have 4+ years working as a SQL developer. Project Overview: The project will use a number of Microsoft SQL Server technologies and include development and maintenance of reports, APIs, and other integrations with external financial systems. The successful applicant will liaise with other members of the team and will be expected to work on projects where they are the sole developer as well as part of a team on larger projects. The applicant will report to the SQL Development Manager. Responsibilities: Ability to understand requirements clearly and communicate technical ideas to both technical stakeholders and business end users. Investigate and resolve issues quickly. Communication with end users. Working closely with other team members to understand business requirements. Complete structural analysis and systematic testing of the data. Skills: Microsoft SQL Server 2016-2022. T-SQL programming (4+ years) experience. Query/stored procedure performance tuning. SQL Server Integration Services. SQL Server Reporting Services. Experience in database design. Experience with source control. Knowledge of the software engineering life cycle. Previous experience in designing, developing, testing, implementing, and supporting software. 3rd-level IT qualification. SQL MCSA or MCSE preferable. Knowledge of data technologies such as Snowflake, Airflow, ADF desirable. Other skills: Ability to work on own initiative and as part of a team. Excellent time management and decision-making skills. Excellent communication skills in both written and verbal English. Background in the financial industry preferable. Academic Qualification: Any graduation or post-graduation. Any specialization in IT. Company Website: www.apexfundservices.com. Disclaimer: Unsolicited CVs sent to Apex (Talent Acquisition Team or Hiring Managers) by recruitment agencies will not be accepted for this position. Apex operates a direct sourcing model and where agency assistance is required, the Talent Acquisition team will engage directly with our exclusive recruitment partners.
Posted 2 weeks ago
7.0 - 10.0 years
14 - 22 Lacs
Bengaluru
Hybrid
Position Description: The Spend Management Technology organization is responsible for delivering innovative enterprise technology solutions that enable the business processes for firm-wide controls including Sourcing, Contract Management, Accounts Payable, Asset Management, and Third Party Oversight. Spend Management Technology (SMT) is seeking a Backend/Database Developer who will play an integral role in designing, implementing, and supporting data integration with new systems, data warehouse, and data extraction solutions across SMT functional areas. The candidate must have hands-on experience working on Sybase, DB2, shell scripting, Python, and database procedural languages; Postgres will be a plus. In addition, the candidate must have experience working independently. The candidate must also be well versed in the full software development lifecycle and methodologies. The candidate must be a self-starter. Skills Required: 5-7 years of experience working as a hands-on developer in Sybase, DB2, and ETL technologies. Worked extensively on data integration, designing, and developing reusable interfaces. Advanced experience in Python, DB2, Sybase, shell scripting, Unix, Perl scripting, DB platforms, database design, and modeling. Expert-level understanding of data warehouse and core database concepts and relational database design. Experience in writing stored procedures, optimization, and performance tuning. Strong technology acumen and a deep strategic mindset. Proven track record of delivering results. Proven analytical skills and experience making decisions based on hard and soft data. A desire and openness to learning and continuous improvement, both of yourself and your team members. Hands-on experience in development of APIs is a plus. Good to have experience with Business Intelligence tools, Source-to-Pay applications such as SAP Ariba, and Accounts Payable systems. Additional skills: Familiarity with Postgres and Python is a plus. Location: Bangalore. Core skills: Database (DB2 & Postgres preferred, but any relational DB is fine), Unix/Linux scripting, Python, SQL queries. Nice to have: Java, Informatica.
Posted 2 weeks ago
4.0 - 8.0 years
12 - 20 Lacs
Bengaluru
Work from Office
Develop backend services using Python, FastAPI/Flask, integrate SQL databases, build Elasticsearch solutions, deploy to Azure/AWS, manage CI/CD, and mentor juniors. Optimize performance and ensure clean, scalable architecture Required Candidate profile 4–8 years of Python experience with strong backend skills, FastAPI/Flask, SQL, Elasticsearch, and Azure/AWS exposure.
Posted 2 weeks ago
5.0 - 7.0 years
12 - 16 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
We are seeking a highly skilled Python Developer with strong AWS and Terraform experience. The ideal candidate must possess strong Python development capabilities, robust hands-on experience with AWS, and a working knowledge of Terraform. This role also requires foundational SQL skills and the ability to integrate and automate various backend and cloud services. Requirements and Qualifications: 5-7+ years of overall experience in software development. Strong proficiency in Python development. Extensive experience working with AWS services (Lambda, API Gateway, CloudWatch, etc.). Hands-on experience with Terraform. Basic understanding of SQL. Experience with REST APIs and Salesforce SOQL. Familiarity with tools and platforms such as Git, GitHub Actions, Dell Boomi, Snowflake. Knowledge of QA automation using Python, Cucumber, Gherkin, Postman. Roles and Responsibilities: Integrate automation frameworks with AWS, X-Ray, and Boomi services. Develop backend automation scripts for Boomi processes. Build utility tools for Salesforce data cleanup and EDI document generation. Create and manage automated triggers in the test framework using AWS services (Lambda, API Gateway, etc.). Develop utilities for internal EDI processes integrating third-party applications (Salesforce, Dell Boomi, AWS, Snowflake, X-Ray). Integrate utilities into the Cucumber automation framework. Connect the automation framework with Jira and X-Ray for test reporting. Automate test cases for various EDI processes. Collaborate on development and integration using Python, REST APIs, AWS, and other modern tools. Locations: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote.
Posted 2 weeks ago
2.0 - 7.0 years
5 - 15 Lacs
Hyderabad, Bengaluru
Work from Office
Role & responsibilities Actively participate in pricing decisions by providing data-driven insights, formulating optimization proposals, setting up experiments, and evaluating results. Support leadership in making strategic and tactical decisions based on extensive data analysis. Perform analysis of pricing and competitive data and provide insights and recommendations. Help manage the collection of pricing and competitive market data by working directly with external data miners. Participate in developing methodology for defining key metrics, related tables, and needed calculations. Preferred candidate profile Technical mindset and a Bachelor's degree in Technical Science (Applied Mathematics, Computer Science, or related field). Strong knowledge of mathematics and understanding of mathematical statistics. 2+ years of experience in quantitative modeling, development, or implementation. Working experience in data manipulation and advanced data analysis. Experience with SAS, R, Python, and proficiency working with large datasets. Applied experience with Logistic Regression, Linear Regression, Survival Analysis, Time Series Analysis, Decision Trees, and Cluster Analysis. Experience as an analyst/data scientist preferably in B2B/B2C marketplaces. Hands-on experience with SQL and Python for processing and analysis. Critical thinking ability to break down complex requirements into smaller components. Demonstrated ability to work independently and be a self-starter. Ability to work effectively in a team. Willingness to work in a startup environment with evolving goals. Proficiency in English at a level not lower than B2.
Posted 2 weeks ago
5.0 - 9.0 years
1 - 1 Lacs
Bengaluru
Remote
Hi, Synergy Technologies is a leader in technology services and consulting. We enable clients across the world to create and execute strategies. We help our clients find the right problems to solve, and to solve these effectively. We bring our expertise and innovation to every project we undertake. Position: Business Intelligence Developer. Duration: Contract to Full Time. Location: Remote work. Note: Consultants located in Bangalore are given first preference, followed by other states and cities. Note: Strong experience with DBT, Snowflake & Azure is needed. JD / Required qualifications include: Business Intelligence Developer Opportunity. Our mission is clear: to enhance the safety and well-being of workers across the globe. As a trailblazer in software solutions, we empower businesses and their suppliers with a platform that champions safety, sustainability, and risk management within supply chains. Join our close-knit team of Data Systems and Internal Business Intelligence experts, where you can live out our core values daily and contribute to impactful projects that further the company's vision. About the Role: As a Business Intelligence Developer, you will play a critical role in developing impactful business intelligence solutions that empower internal teams with data-driven insights for strategic decision-making. Working closely with business analysts, data engineers, and stakeholders, you'll design and build data models, interactive reports, and dashboards to transform complex data into clear, actionable insights. Your efforts will ensure data quality, accuracy, and governance while enhancing accessibility for business users. Key Responsibilities: Develop BI Solutions: Design, develop, and implement data models, dashboards, and reports using Power BI to support data-driven initiatives. Data Modeling & Integration: Collaborate with data engineers and analysts to create optimized data models that aggregate data from multiple sources, ensuring scalability and alignment with business needs. Enhance Data Accuracy: Continuously improve data accuracy, standardize key metrics, and refine reporting processes to drive operational efficiency. Ensure Data Governance: Adhere to the company's data governance policies, ensuring that all BI solutions comply with data security standards, especially for sensitive information. Optimize BI Performance: Monitor BI solutions to ensure performance and reliable data access, implementing enhancements as needed. Documentation & User Support: Maintain comprehensive documentation of dashboards, data models, and processes; provide end-user training to maximize tool effectiveness. Adapt and Innovate: Stay informed on BI best practices and emerging technologies to proactively enhance BI capabilities. Qualifications: Education: Bachelor's Degree in Data Science, Business Analytics, Computer Science, or a related field. Experience: Minimum of 5 years in business intelligence development, including data modeling, reporting, and dashboard creation. Power BI Expertise: Strong experience with Power BI, including advanced DAX calculations, data modeling, and creating visually engaging, actionable dashboards. dbt Labs Cloud IDE: At least 1 year of hands-on experience with dbt Labs Cloud IDE is required. Technical Skills: Proficiency in SQL and modern cloud-based data warehousing concepts, with experience in Snowflake, SQL Server, or Redshift.
Cloud and ERP/CRM Proficiency: Familiarity with platforms such as NetSuite, Salesforce, Fivetran, and API integrations; experience with SaaS systems like Zuora Billing, Churn Zero, Marketo, and Qualtrics is a plus. Communication Skills: Ability to translate technical insights into business-friendly language. Preferred Skills Certifications: Power BI, Snowflake, or similar BI tools. Portfolio: Ability to provide redacted samples of Power BI dashboards. SaaS Experience: Background in SaaS organizations is beneficial.
Posted 2 weeks ago
4.0 - 9.0 years
15 - 25 Lacs
Hyderabad, Chennai
Work from Office
Interested candidates can also apply via sanjeevan.natarajan@careernet.in. Role & responsibilities: Technical Leadership: Lead a team of data engineers and developers; define technical strategy, best practices, and architecture for data platforms. End-to-End Solution Ownership: Architect, develop, and manage scalable, secure, and high-performing data solutions on AWS and Databricks. Data Pipeline Strategy: Oversee the design and development of robust data pipelines for ingestion, transformation, and storage of large-scale datasets. Data Governance & Quality: Enforce data validation, lineage, and quality checks across the data lifecycle; define standards for metadata, cataloging, and governance. Orchestration & Automation: Design automated workflows using Airflow, Databricks Jobs/APIs, and other orchestration tools for end-to-end data operations. Cloud Cost & Performance Optimization: Implement performance tuning strategies, cost optimization best practices, and efficient cluster configurations on AWS/Databricks. Security & Compliance: Define and enforce data security standards, IAM policies, and compliance with industry-specific regulatory frameworks. Collaboration & Stakeholder Engagement: Work closely with business users, analysts, and data scientists to translate requirements into scalable technical solutions. Migration Leadership: Drive strategic data migrations from on-prem/legacy systems to cloud-native platforms with minimal risk and downtime. Mentorship & Growth: Mentor junior engineers, contribute to talent development, and ensure continuous learning within the team. Preferred candidate profile: Python, SQL, PySpark, Databricks, AWS (mandatory). Leadership experience in data engineering/architecture. Added advantage: experience in Life Sciences / Pharma.
Posted 2 weeks ago
4.0 - 8.0 years
8 - 18 Lacs
Nagpur, Pune, Bengaluru
Work from Office
Expertise in Python programming, with hands-on experience using Python libraries for data ingestion and DWH technologies like Snowflake and AWS. Expertise in data modeling, especially OMOP, will be an added advantage. Willingness to work on any of the ETL tools/technologies as per the needs of the project/organization. Understand the existing ecosystem and come up with unique solutions to improve it. Communicate with the Product Owner or Business Analyst regarding requirements to understand them. Strong analytical, problem-solving, and product management skills.
Posted 2 weeks ago
7.0 - 12.0 years
15 - 30 Lacs
Hyderabad
Remote
Lead Data Engineer with Health Care Domain. Role & responsibilities: Position: Lead Data Engineer. Experience: 7+ years. Location: Hyderabad | Chennai | Remote. SUMMARY: The Data Engineer will be responsible for ETL and documentation in building data warehouse and analytics capabilities. Additionally, maintain existing systems/processes and develop new features, along with reviewing, presenting, and implementing performance improvements. Duties and Responsibilities: Build ETL (extract, transform, and load) jobs using Fivetran and dbt for our internal projects and for customers that use various platforms like Azure, Salesforce, and AWS technologies. Monitor active ETL jobs in production. Build out data lineage artifacts to ensure all current and future systems are properly documented. Assist with the build-out of design/mapping documentation to ensure development is clear and testable for QA and UAT purposes. Assess current and future data transformation needs to recommend, develop, and train on new data integration tool technologies. Discover efficiencies with shared data processes and batch schedules to help ensure no redundancy and smooth operations. Assist the Data Quality Analyst to implement checks and balances across all jobs to ensure data quality throughout the entire environment for current and future batch jobs. Hands-on experience in developing and implementing large-scale data warehouses, Business Intelligence, and MDM solutions, including Data Lakes/Data Vaults. Required Skills: This job has no supervisory responsibilities. Need strong experience with Snowflake and Azure Data Factory (ADF). Bachelor's Degree in Computer Science, Math, Software Engineering, Computer Engineering, or a related field AND 6+ years of experience in business analytics, data science, software development, data modeling, or data engineering work. 5+ years of experience with strong proficiency in SQL query/development skills. Develop ETL routines that manipulate and transfer large volumes of data and perform quality checks. Hands-on experience with ETL tools (e.g., Informatica, Talend, dbt, Azure Data Factory). Experience working in the healthcare industry with PHI/PII. Creative, lateral, and critical thinker. Excellent communicator. Well-developed interpersonal skills. Good at prioritizing tasks and time management. Ability to describe, create, and implement new solutions. Experience with related or complementary open source software platforms and languages (e.g., Java, Linux, Apache, Perl/Python/PHP, Chef). Knowledge / hands-on experience with BI tools and reporting software (e.g., Cognos, Power BI, Tableau). Big Data stack (e.g., Snowflake (Snowpark), Spark, MapReduce, Hadoop, Sqoop, Pig, HBase, Hive, Flume).
Posted 2 weeks ago
4.0 - 9.0 years
5 - 12 Lacs
New Delhi, Gurugram, Delhi / NCR
Work from Office
Role & responsibilities: Key Responsibilities: Understand and analyze ETL requirements, data mapping documents, and business rules. Design, develop, and execute test cases, test scripts, and test plans for ETL processes. Perform data validation, source-to-target data mapping, and data integrity checks. Write complex SQL queries for data verification and backend testing. Conduct regression, integration, and system testing for ETL pipelines and data warehouse environments. Work with BI tools to validate reports and dashboards if applicable. Collaborate with developers, business analysts, and data engineers to ensure testing coverage and resolve issues. Document defects, test results, and provide detailed bug reports and testing status. Required Skills and Experience: 4+ years of experience in ETL testing or data warehouse testing. Strong proficiency in SQL for data validation and analysis. Hands-on experience with ETL tools like Informatica, Talend, SSIS, or similar. Knowledge of data warehousing concepts, star/snowflake schemas, and data modeling. Experience with test management tools (e.g., JIRA, HP ALM, TestRail). Understanding of automation in data testing is a plus (e.g., Python, Selenium with databases). Familiarity with cloud platforms (e.g., AWS Redshift, Google BigQuery, Azure Data Factory) is a plus. Preferred candidate profile: please share a skill matrix covering years of experience in ETL testing, SQL, ETL tools, testing tools (e.g., Jira), cloud, and data warehouse.
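As a hedged illustration of the source-to-target checks this role describes, here is a small Python sketch of an automated reconciliation test. The connection strings, table names, and columns are assumptions, and it presumes SQLAlchemy plus the relevant database drivers (e.g., snowflake-sqlalchemy) are available.

```python
# Sketch of a source-to-target reconciliation check an ETL tester might
# automate. Connection strings, tables, and columns are hypothetical.
import sqlalchemy as sa

source = sa.create_engine("postgresql://user:pwd@source-db/sales")
target = sa.create_engine("snowflake://user:pwd@account/SALES_DW/PUBLIC")

def scalar(engine, query):
    # Run a query and return its single scalar result.
    with engine.connect() as conn:
        return conn.execute(sa.text(query)).scalar()

# Row-count reconciliation between source and target for today's load.
src_count = scalar(source, "SELECT COUNT(*) FROM orders WHERE order_date = CURRENT_DATE")
tgt_count = scalar(target, "SELECT COUNT(*) FROM FCT_ORDERS WHERE ORDER_DATE = CURRENT_DATE")
assert src_count == tgt_count, f"Row count mismatch: source={src_count}, target={tgt_count}"

# Aggregate (checksum-style) comparison on a measure column.
src_sum = scalar(source, "SELECT COALESCE(SUM(amount), 0) FROM orders")
tgt_sum = scalar(target, "SELECT COALESCE(SUM(AMOUNT), 0) FROM FCT_ORDERS")
assert src_sum == tgt_sum, f"Amount mismatch: source={src_sum}, target={tgt_sum}"

print("Source-to-target validation passed")
```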
Posted 2 weeks ago
5.0 - 10.0 years
4 - 9 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Role & responsibilities: We are looking for candidates with 5+ years of experience in Snowflake development who are willing to relocate to the preferred location. Opening locations: Chennai, Bangalore.
Posted 2 weeks ago
5.0 - 10.0 years
4 - 9 Lacs
Nagpur, Chennai, Bengaluru
Work from Office
Role & responsibilities: We are looking for candidates with 5+ years of experience in Snowflake development who are willing to relocate to the preferred location. Opening locations: Chennai, Bangalore & Nagpur.
Posted 2 weeks ago
4.0 - 8.0 years
15 - 30 Lacs
Pune, Bengaluru
Hybrid
Job Overview: We seek a highly skilled and motivated Data Engineer with strong expertise in Python, PySpark, AWS, Databricks, and Snowflake to join our dynamic team. The ideal candidate should have hands-on experience with Spark optimization, SQL data processing, and AWS tools and technologies. Exposure to Informatica and data streaming tools is a plus. Key Responsibilities: Design, develop, and optimize scalable data processing pipelines using Spark and PySpark. Implement Spark optimization techniques to enhance performance and efficiency. Work with complex SQL queries for data manipulation and transformation. Collaborate with data scientists, analysts, and engineering teams to meet data needs. Work on AWS Cloud services like Redshift, AWS Glue, SQL Server, and Databricks. Utilize Python for data processing tasks, ensuring modularization and packaging. Basic knowledge of Informatica. Engage in real-time data streaming and integration using tools such as NiFi, Kafka, and EventHub (optional but preferred). Hands-on experience with Snowflake for data warehousing and analytics. Must-Have Experience: Python, PySpark, SQL, AWS, Spark. Required Skills: Spark Expertise: Strong experience in Spark and PySpark, including optimization techniques. Complex SQL: Advanced SQL knowledge and data processing skills. Cloud Exposure: Familiarity with AWS services and Databricks. Python Proficiency: Experience with Python for data processing and scripting. Data Streaming: Experience with NiFi, Kafka, and EventHub is desirable. Snowflake: Hands-on experience with Snowflake for data warehousing and analytics. Qualifications: Bachelor's or Master's degree in Computer Science, Information Technology, or related fields. 4-8 years of professional experience in Data Engineering, Data Analytics, or related roles.
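To illustrate the Spark optimization techniques this role calls for, here is a small PySpark sketch showing a broadcast join, shuffle-partition tuning, and selective caching. The dataset paths and column names are hypothetical assumptions.

```python
# Illustrative PySpark optimization sketch: broadcast join, shuffle-partition
# tuning, and caching. Paths and columns are assumptions, not project specifics.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.functions import broadcast

spark = (
    SparkSession.builder.appName("spark_optimization_demo")
    # Tune shuffle partitions to the data volume instead of the 200 default.
    .config("spark.sql.shuffle.partitions", "64")
    .getOrCreate()
)

orders = spark.read.parquet("s3://example-bucket/orders/")          # large fact table
dim_customer = spark.read.parquet("s3://example-bucket/customers/")  # small dimension

# Broadcast the small dimension to avoid shuffling the large side of the join.
enriched = orders.join(broadcast(dim_customer), "customer_id")

# Cache only when the result is reused by multiple downstream actions.
enriched.cache()

summary = (
    enriched.groupBy("customer_segment")
            .agg(F.sum("order_amount").alias("total_amount"))
)
summary.write.mode("overwrite").parquet("s3://example-bucket/summary/")
```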
Posted 2 weeks ago
8.0 - 13.0 years
15 - 25 Lacs
Hyderabad, Bengaluru
Hybrid
Looking for a Snowflake developer for a US client. The candidate should be strong with Snowflake & DBT and should be able to do impact analysis on the current ETLs (Informatica / DataStage) and provide solutions based on the analysis. Exp: 7-12 yrs.
Posted 2 weeks ago
10.0 - 15.0 years
20 - 35 Lacs
Hyderabad
Work from Office
Job Role: Database Architect. Experience: 10-15 years. Work Location: Hyderabad. Job Description: Key Responsibilities: Design and architect complex database systems, ensuring alignment with business needs and technical requirements. Develop and optimize SQL queries, stored procedures, and functions for high-performance solutions. Lead the team in the design, optimization, and maintenance of database architecture. Provide guidance in query optimization, performance tuning, and database indexing strategies. Collaborate with clients to understand business requirements and deliver tailored solutions. Oversee ETL processes (SSIS) and ensure effective data integration. Utilize Power BI and DAX queries for data reporting and analysis. Mentor junior developers and lead the team in adopting best practices for database design and development. Ensure alignment with database design principles, normalization, and industry standards. Qualifications: Proven experience in Database Design & Architecture, with a focus on relational database systems (SQL Server, Oracle, PostgreSQL, MySQL). Hands-on experience in Duck Creek Insights and the insurance domain is a plus. Strong proficiency in SQL, stored procedures, and function development. Experience with query optimization, performance tuning, and indexing. Expertise in leading teams, managing client relationships, and delivering high-quality solutions. Familiarity with ETL tools like SSIS and experience in data integration. Knowledge of Power BI and DAX queries for reporting and analytics. Excellent communication, problem-solving, and leadership skills.
Posted 2 weeks ago
5.0 - 10.0 years
20 - 25 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Snowflake DB Developer + Python + SQL. Location: PAN India. NP: 0 to 30 days. Exp: 5 to 10 years. Skill set: Snowflake, Python, SQL, and Power BI development. Understand and implement data security and data modelling. Write complex SQL queries; write JavaScript and Python stored procedure code in Snowflake. Use ETL (Extract, Transform, Load) tools to move and transform data into Snowflake and from Snowflake to other systems. Understand cloud architecture. Can develop and design Power BI dashboards, reports, and data visualizations. Communication skills. Relevant candidates can drop a mail to roshini.k@wipro.com with an updated resume and the below details: TEX, REX, Current company, CCTC, ECTC, Notice Period (LWD if serving), Counter offer CTC if any, Location.
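For context on what Python stored procedure code in Snowflake typically involves, below is a hedged sketch of a Snowpark Python handler and the DDL that would register it. The procedure name, table, and retention logic are hypothetical assumptions.

```python
# Hedged sketch of a Python stored procedure handler for Snowflake (Snowpark).
# It would be registered with DDL along these lines:
#   CREATE OR REPLACE PROCEDURE purge_stale_rows(days INT)
#     RETURNS STRING LANGUAGE PYTHON RUNTIME_VERSION = '3.10'
#     PACKAGES = ('snowflake-snowpark-python') HANDLER = 'run';
from snowflake.snowpark import Session


def run(session: Session, days: int) -> str:
    # Cutoff expression for the retention window (hypothetical table and column).
    cutoff = f"DATEADD('day', -{days}, CURRENT_TIMESTAMP())"

    # Count the stale rows first, then delete them and report the count.
    stale = session.sql(
        f"SELECT COUNT(*) AS N FROM ANALYTICS.PUBLIC.EVENT_LOG WHERE EVENT_TS < {cutoff}"
    ).collect()[0]["N"]

    session.sql(
        f"DELETE FROM ANALYTICS.PUBLIC.EVENT_LOG WHERE EVENT_TS < {cutoff}"
    ).collect()

    return f"Deleted {stale} rows older than {days} days"
```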
Posted 2 weeks ago
Snowflake has become one of the most sought-after skills in the tech industry, with a growing demand for professionals who are proficient in handling data warehousing and analytics using this cloud-based platform. In India, the job market for Snowflake roles is flourishing, offering numerous opportunities for job seekers with the right skill set.
These cities are known for their thriving tech industries and have a high demand for Snowflake professionals.
The average salary range for Snowflake professionals in India varies based on experience levels: - Entry-level: INR 6-8 lakhs per annum - Mid-level: INR 10-15 lakhs per annum - Experienced: INR 18-25 lakhs per annum
A typical career path in Snowflake may include roles such as: - Junior Snowflake Developer - Snowflake Developer - Senior Snowflake Developer - Snowflake Architect - Snowflake Consultant - Snowflake Administrator
In addition to expertise in Snowflake, professionals in this field are often expected to have knowledge in: - SQL - Data warehousing concepts - ETL tools - Cloud platforms (AWS, Azure, GCP) - Database management
As you explore opportunities in the Snowflake job market in India, remember to showcase your expertise in handling data analytics and warehousing using this powerful platform. Prepare thoroughly for interviews, demonstrate your skills confidently, and keep abreast of the latest developments in Snowflake to stay competitive in the tech industry. Good luck with your job search!