Jobs
Interviews

314 Advanced SQL Jobs

Set Up a Job Alert
JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

5.0 - 10.0 years

15 - 30 Lacs

Gurugram, Bengaluru

Hybrid

Exciting opportunity for a Tableau and BI expert to join a leading analytics team in Gurugram. The role involves designing and delivering impactful dashboards and reports, collaborating with cross-functional teams, and driving decision-making through advanced data visualization.

Location: Gurugram (Hybrid)
Notice Period: Immediate joiners, or candidates who can join in September

Your Future Employer: Our client is a globally recognized analytics organization known for its data-driven strategies and innovative approach to problem-solving. They foster a dynamic work culture built on collaboration, learning, and continuous improvement.

Responsibilities:
- Developing and publishing Tableau dashboards and KPI scorecards aligned with business needs
- Translating functional specifications into technical solutions for data visualization
- Managing data from multiple sources and transforming it for reporting
- Collaborating with stakeholders and contributing as an individual contributor
- Working with SQL, ETL pipelines, and RDBMS concepts for backend integration
- Building automated data workflows and maintaining visualization best practices

Requirements:
- Bachelor's/Master's degree in Computer Science, Analytics, Mathematics, or a related field
- 3+ years of hands-on experience in Tableau dashboard development
- Proficiency in SQL, data modeling, and ETL processes
- Experience with Figma integration is preferred
- Familiarity with Power BI is a plus
- Strong communication and analytical problem-solving skills
- Ability to work independently in a fast-paced, evolving environment

What's in it for you:
- Opportunity to work on high-impact analytics projects for global clients
- Collaborative work culture and mentorship-driven learning
- Access to the latest BI and visualization technologies
- Competitive compensation and career advancement opportunities
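As a hedged illustration of the SQL-to-dashboard work this role describes, here is a minimal sketch of a monthly KPI query that could feed a Tableau scorecard; the orders table and its columns are hypothetical.

```sql
-- Hypothetical monthly revenue KPI with month-over-month change, the kind of
-- aggregate that typically backs a Tableau scorecard.
-- Table and column names (orders, order_date, amount) are illustrative.
WITH monthly AS (
    SELECT
        DATE_TRUNC('month', order_date) AS month,
        SUM(amount)                     AS revenue
    FROM orders
    GROUP BY DATE_TRUNC('month', order_date)
)
SELECT
    month,
    revenue,
    LAG(revenue) OVER (ORDER BY month) AS prev_revenue,
    ROUND(100.0 * (revenue - LAG(revenue) OVER (ORDER BY month))
          / NULLIF(LAG(revenue) OVER (ORDER BY month), 0), 2) AS mom_change_pct
FROM monthly
ORDER BY month;
```

Publishing a pre-aggregated query like this, rather than raw rows, keeps the dashboard extract small and the month-over-month logic in one governed place.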

Posted 2 hours ago

Apply

7.0 - 12.0 years

18 - 20 Lacs

Pune, Chennai, Coimbatore

Work from Office

Skills / Competencies:
1. Strong background in software engineering with advanced SQL skills (5+ years)
2. Design and development using .NET and C#, including extensive experience with .NET Core and the broader .NET ecosystem (7+ years)
3. ETL pipeline development experience
4. Experience with cloud solutions, ideally MS Azure
5. Knowledge of software engineering best practices across the development lifecycle, including agile methodologies, coding standards, code reviews, source management, build processes, testing, and operations
6. Experience working in large-scale data warehouse environments
7. Proficiency in working with structured data formats such as JSON/XML, including parsing, transformation, and integration into APIs and data pipelines
8. Exposure to Azure Data Factory or similar tools
9. API design and development, including RESTful services, authentication (OAuth2, JWT), versioning, documentation (e.g., Swagger/OpenAPI), and performance optimization
10. Experience with CI/CD tools such as GitHub Actions and Harness pipelines
11. Understanding of containerization technologies, including Docker and Kubernetes (AKS)
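A brief, hedged sketch of item 7 above, assuming the SQL Server that typically accompanies this .NET stack: OPENJSON (SQL Server 2016+) shreds a JSON payload into relational rows before it enters a pipeline. The payload shape and column names are invented for illustration.

```sql
-- Minimal T-SQL sketch: shredding a JSON payload into typed rows, e.g. as a
-- staging step in an ETL pipeline. Payload structure is assumed.
DECLARE @payload NVARCHAR(MAX) = N'[
  {"orderId": 1001, "customer": "Acme",   "total": 250.75},
  {"orderId": 1002, "customer": "Globex", "total": 99.00}
]';

SELECT j.orderId, j.customer, j.total
FROM OPENJSON(@payload)
WITH (
    orderId  INT           '$.orderId',
    customer NVARCHAR(100) '$.customer',
    total    DECIMAL(10,2) '$.total'
) AS j;
```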

Posted 2 hours ago

Apply

2.0 - 6.0 years

10 - 18 Lacs

Hyderabad, Bengaluru

Hybrid

Indium is hiring Data Analysts and Data Engineers! Join our dynamic team and work on cutting-edge data projects. We're conducting a Walk-In Drive on 20th September 2025 at our Hyderabad and Chennai offices.

Open Roles:

Data Analyst (Experience: 2 to 5 Years)
Key Skills:
- Advanced SQL
- Tableau / Looker / GDS / Power BI
- Python (NumPy, Pandas)
- Strong data visualization and communication skills
- Understanding of business metrics and KPIs

Data Engineer (Experience: 2 to 6 Years)
Key Skills:
- Python, SQL
- ETL pipelines, data warehousing
- Cloud platforms (GCP or any other cloud)
- Experience with Big Data tools

How to Apply / Participate: Please share your details in the form: https://forms.gle/ECPdpDo6yxUvoZxX8 If your profile is shortlisted after review, we will send you an invitation to the walk-in drive. Looking forward to your responses!
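As a flavor of the "Advanced SQL" screened for in analyst roles like these, a minimal, hypothetical example combining an aggregate with a window function; the sales table and its columns are assumptions.

```sql
-- Illustrative exercise: top three customers by revenue within each region.
-- Table and columns (sales, region, customer_id, amount) are hypothetical.
SELECT region, customer_id, total_revenue
FROM (
    SELECT
        region,
        customer_id,
        SUM(amount) AS total_revenue,
        DENSE_RANK() OVER (
            PARTITION BY region
            ORDER BY SUM(amount) DESC
        ) AS revenue_rank
    FROM sales
    GROUP BY region, customer_id
) ranked
WHERE revenue_rank <= 3
ORDER BY region, revenue_rank;
```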

Posted 5 hours ago

Apply

4.0 - 6.0 years

20 - 24 Lacs

Bengaluru

Work from Office

Overview

We are an integral part of Annalect Global and Omnicom Group, one of the largest media and advertising agency holding companies in the world. Omnicom's branded networks and numerous specialty firms provide advertising, strategic media planning and buying, digital and interactive marketing, direct and promotional marketing, public relations, and other specialty communications services. Our agency brands are consistently recognized as being among the world's creative best. Annalect India plays a key role for our group companies and global agencies by providing stellar products and services in the areas of Creative Services, Technology, Marketing Science (data & analytics), Market Research, Business Support Services, Media Services, and Consulting & Advisory Services. We currently have 2500+ awesome colleagues (in Annalect India) who are committed to solving our clients' pressing business issues. We are growing rapidly and looking for talented professionals like you to be part of this journey. Let us build this, together.

Responsibilities:
- Requirement gathering and evaluation of clients' business situations in order to implement appropriate analytic solutions
- Design, generate, and manage reporting frameworks that provide insight into the performance of clients' marketing activities across multiple channels
- Be the single point of contact for anything data & analytics related to the project
- QA process: create, maintain, and review QA plans for deliverables to align with requirements, identify any discrepancies, and troubleshoot issues
- Prioritize tasks and proactively manage workload to ensure timely delivery with high accuracy
- Contribute actively to project planning and scheduling
- Create and maintain project-specific documents such as process, quality, and learning documents
- Drive conversations with the team, clients, and business stakeholders
- Manage global clients, drawing on a strong account management background
- Build strong relationships and apply excellent project and resource management skills
- Maintain positive client and vendor relationships

Qualifications:
- 10+ years of experience in media/marketing services or relevant domains with strong problem-solving ability
- Strong knowledge of Advanced SQL, Redshift, Alteryx, Tableau, media, data modeling, and Advanced Excel (mandatory); Adverity and Python are good to have
- Ability to identify and help determine key performance indicators for clients
- Strong written and verbal communication skills
- Experience leading delivery teams and projects to successful implementations
- Familiarity with large data sets and creating cohesive stories
- Able to work with and lead teams successfully, handling multiple projects and meeting timelines
- Presentation skills using MS PowerPoint or any presentation platform
- Resourceful and self-motivated
- Strong project management and administrative skills
- Strong analytical skills, superior attention to detail, and strong problem-solving and troubleshooting skills

Posted 2 days ago

Apply

8.0 - 13.0 years

30 - 35 Lacs

Bengaluru

Work from Office

About The Role: Data Engineer-1 (Experience: 0-2 years)

What we offer: Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team: DEX (Kotak's Data Exchange) is the central data org for Kotak Bank and manages the bank's entire data experience. The org comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists a great opportunity to build things from scratch and deliver one of the best-in-class data lakehouse solutions. The primary skills for this team are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and Advanced SQL and data modelling for analytics. The org is expected to be a 100+ member team, primarily based in Bangalore, comprising ~10 sub-teams independently driving their charters. As a member of this team, you get the opportunity to learn the fintech space, be an early member in Kotak's digital transformation journey, learn and leverage technology to build complex data platform solutions (real-time, micro-batch, batch, and analytics) in a programmatic way, and build futuristic systems that can be operated by machines using AI technologies.

The data platform org is divided into 3 key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; managed compute and orchestration frameworks, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak's data platform. The team will also be the center for data engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. It will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.

Data Governance: This will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform. If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you.

Your day-to-day role will include:
- Drive business decisions with technical input and lead the team
- Design, implement, and support a data infrastructure from scratch
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA
- Extract, transform, and load data from various sources using SQL and AWS big data technologies
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers
- Build data platforms, data pipelines, and data management and governance tools

BASIC QUALIFICATIONS (Data Engineer / SDE in Data):
- Bachelor's degree in Computer Science, Engineering, or a related field
- Experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

PREFERRED QUALIFICATIONS:
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills
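A small, illustrative example of one ETL pattern common in the stack described above (both Spark SQL and Redshift accept this syntax): keeping only the latest record per business key when a source delivers duplicates. Table and column names are assumptions.

```sql
-- Deduplication on load: retain the newest row per customer_id.
-- customer_raw / customer_latest and their columns are hypothetical.
INSERT INTO customer_latest
SELECT customer_id, name, email, updated_at
FROM (
    SELECT
        customer_id, name, email, updated_at,
        ROW_NUMBER() OVER (
            PARTITION BY customer_id
            ORDER BY updated_at DESC
        ) AS rn
    FROM customer_raw
) d
WHERE rn = 1;
```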

Posted 2 days ago

Apply

4.0 - 9.0 years

15 - 30 Lacs

Bengaluru

Work from Office

About the Team: You'll be part of the Analytics Team at Meesho. We are trying to revolutionize the way our consumers shop online by experimenting with a video-based shopping format.

About the Role: As a Lead Business Analyst, you will work on improving the reporting tools, methods, and processes of the team you are assigned to. You will also create and deliver weekly, monthly, and quarterly metrics critical for tracking and managing the business. You will manage numerous requests concurrently and strategically, prioritising them when necessary. You will actively engage with internal partners throughout the organisation to meet and exceed customer service levels and transport-related KPIs. You will brainstorm simple, scalable solutions to difficult problems and seamlessly manage projects under your purview. You will maintain excellent relationships with our users and, in fact, advocate for them while keeping in mind the business goals of your team.

What you will do:
- Create various algorithms for optimizing demand and supply data
- Conduct analysis and solution-building based on insights captured from data
- Give insights to management and help in strategic planning
- Analyze metrics, key indicators, and other available data sources to discover root causes of process defects
- Support business development and help create efficient designs and solution processes
- Determine efficient utilization of resources
- Research and implement cost reduction opportunities

What you will need:
- B.Tech/M.Tech/MBA in any discipline
- 4+ years of experience as a Business Analyst
- Proficiency in Advanced Excel and Advanced SQL (must-have); Python is a plus
- Understanding of basic statistics and probability concepts
- Proven problem-solving skills

Posted 3 days ago

Apply

2.0 - 7.0 years

15 - 30 Lacs

Bengaluru

Work from Office

About the Team: As Business Analysts, it's on us to dive into data and derive insights from it. These then become actionable solutions in the form of changes, improvements, upgrades, and new features. As a Business Analyst at Meesho, you will play a crucial role in identifying, improving, and developing technology solutions that drive our strategic goals. This is a tremendous opportunity to learn about high-priority initiatives and collaborate with colleagues throughout the firm and across teams. We work at the intersection of business and technology, continuously developing our leadership, management, and communication skills in the process. The exact team you will be working with will be decided during or after the hiring process. Regardless, you are sure to learn and grow and have fun doing so too. Each of our teams at Meesho has its own fun rituals, from casual catch-ups to bar hopping, movie nights, and games.

About the Role: As a Senior Business Analyst, you will work on improving the reporting tools, methods, and processes of the team you are assigned to. You will also create and deliver weekly, monthly, and quarterly metrics critical for tracking and managing the business. You will manage numerous requests concurrently and strategically, prioritising them when necessary. You will actively engage with internal partners throughout the organisation to meet and exceed customer service levels and transport-related KPIs. You will brainstorm simple, scalable solutions to difficult problems and seamlessly manage projects under your purview. You will maintain excellent relationships with our users and, in fact, advocate for them while keeping in mind the business goals of your team.

What you will do:
- Create various algorithms for optimizing demand and supply data
- Conduct analysis and solution-building based on insights captured from data
- Give insights to management and help in strategic planning
- Analyze metrics, key indicators, and other available data sources to discover root causes of process defects
- Support business development and help create efficient designs and solution processes
- Determine efficient utilization of resources
- Research and implement cost reduction opportunities

Must-have skills:
- B.Tech/M.Tech/MBA in any discipline
- 2+ years of experience as a Business Analyst
- Proficiency in Advanced Excel, Advanced SQL, and Python (all must-have)
- Understanding of basic statistics and probability concepts
- Proven problem-solving skills

Posted 3 days ago

Apply

8.0 - 13.0 years

2 - 2 Lacs

Hyderabad

Work from Office

SUMMARY: Senior Data Engineer

We are looking for an experienced and highly skilled Senior Data Engineer to join our data governance team. This role is pivotal in driving our data strategy forward, with a primary focus on leading the migration of our data infrastructure from PostgreSQL to Snowflake. The ideal candidate is a hands-on expert in building and managing large-scale data pipelines and possesses deep, practical experience in complex database migrations. As a technical expert, you will be responsible for executing critical projects, mentoring team members, and ensuring the highest standards of quality and accountability.

Key Responsibilities:
- Plan and execute the end-to-end migration of large-scale datasets and data pipelines from PostgreSQL to Snowflake, ensuring minimal downtime and data integrity
- Design, build, and optimize robust, scalable, and automated ETL/ELT data pipelines using Python and modern data engineering technologies
- Guide and mentor other data engineers, fostering a culture of technical excellence, collaboration, and knowledge sharing; provide code reviews and architectural oversight
- Take full ownership of data engineering projects from conception through deployment and ongoing maintenance; be accountable for the quality, reliability, and timeliness of deliverables
- Work closely with the team and business stakeholders to understand their data needs and deliver high-quality data solutions that drive business value
- Tackle complex data challenges, troubleshoot production issues, and implement performance optimizations within our data warehouse and pipeline infrastructure

Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related quantitative field
- 5+ years of relevant professional experience in a data engineering role
- Proven, hands-on experience leading at least one significant, large-scale data migration from PostgreSQL to Snowflake
- Expert-level proficiency in Python for data processing and pipeline orchestration (e.g., using libraries like Pandas and SQLAlchemy, and frameworks like Airflow or Dagster)
- Deep expertise in advanced SQL, data modeling, and data warehousing concepts
- Strong understanding of Snowflake architecture, features, and best practices
- Familiarity with cloud services (AWS, GCP, or Azure) and their data-related offerings
- Excellent problem-solving skills, meticulous attention to detail, and a proven ability to manage multiple projects with tight deadlines
- Strong communication and teamwork skills, with a collaborative mindset and a genuine willingness to help others succeed in a fast-paced, innovative environment

Other key expectations: The candidate is expected to work on-site (Hyderabad or Gurugram) 12 days per month, i.e., 3 days per week.
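A minimal sketch of one validation step in a PostgreSQL-to-Snowflake migration like the one described: run the same aggregate on both engines and compare the results side by side. Both engines accept this syntax; the payments table and its columns are hypothetical.

```sql
-- Reconciliation check: identical per-day row counts and amount totals on
-- source (PostgreSQL) and target (Snowflake) indicate a clean copy.
SELECT
    DATE_TRUNC('day', created_at) AS load_day,
    COUNT(*)                      AS row_count,
    SUM(amount)                   AS amount_total
FROM payments
GROUP BY DATE_TRUNC('day', created_at)
ORDER BY load_day;
```

Partitioning the check by day (rather than comparing a single global count) narrows any discrepancy to the specific load window that needs re-running.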

Posted 3 days ago

Apply

3.0 - 7.0 years

6 - 10 Lacs

Noida

Work from Office

Core skills required for the role:
- Databricks (Advanced)
- SQL (MS SQL Server): joins, SQL optimization, basic knowledge of stored procedures and functions
- PySpark (Advanced)
- Azure Delta Lake
- Python (Basic)

Mandatory Competencies:
- Big Data: PySpark
- Data Science and Machine Learning: Databricks, Python
- Cloud (Azure): Azure Data Factory (ADF), Azure Databricks, Azure Data Lake Storage, Event Hubs, HDInsight
- Database: SQL Server (DBA)
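A minimal T-SQL sketch touching the MS SQL Server skills listed above (a join wrapped in a stored procedure); all object names are hypothetical.

```sql
-- Illustrative stored procedure: orders with their line items for one customer.
-- dbo.Orders / dbo.OrderDetails are invented for the example.
CREATE PROCEDURE dbo.usp_GetOrdersByCustomer
    @CustomerId INT
AS
BEGIN
    SET NOCOUNT ON;

    SELECT o.OrderId, o.OrderDate, od.ProductId, od.Quantity
    FROM dbo.Orders AS o
    INNER JOIN dbo.OrderDetails AS od
        ON od.OrderId = o.OrderId
    WHERE o.CustomerId = @CustomerId;
END;
```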

Posted 3 days ago

Apply

4.0 - 6.0 years

12 - 16 Lacs

Bengaluru

Work from Office

Overview

We are an integral part of Annalect Global and Omnicom Group, one of the largest media and advertising agency holding companies in the world. Omnicom's branded networks and numerous specialty firms provide advertising, strategic media planning and buying, digital and interactive marketing, direct and promotional marketing, public relations, and other specialty communications services. Our agency brands are consistently recognized as being among the world's creative best. Annalect India plays a key role for our group companies and global agencies by providing stellar products and services in the area of Marketing Science (data & analytics). We are growing rapidly and looking for talented professionals like you to be part of this journey. Let us build this, together.

Responsibilities:
- Requirement gathering and evaluation of clients' business situations in order to implement appropriate analytic solutions
- Design, generate, and manage reporting frameworks that provide insight into the performance of clients' marketing activities across multiple channels
- Be the single point of contact for anything data & analytics related to the project
- QA process: create, maintain, and review QA plans for deliverables to align with requirements, identify any discrepancies, and troubleshoot issues
- Prioritize tasks and proactively manage workload to ensure timely delivery with high accuracy
- Keep abreast of developments in, and answer questions on, data visualization and presentation, media/research/reporting tools, and systems; educate the team on the same
- Contribute actively to project planning and scheduling
- Create and maintain project-specific documents such as process, quality, and learning documents
- Drive conversations with the team, clients, and business stakeholders

Qualifications:
- 6-9 years' experience in data management, ETL/BI, or marketing analytics and analysis in media or relevant domains, with strong problem-solving ability
- Data extraction and manipulation: work with huge databases to meet the project's data needs and play an SME role, ensuring understanding of end-to-end data processes, systems, and architecture
- Strong working knowledge of Advanced SQL, Redshift, Alteryx, Tableau, media, data modeling, and Advanced Excel (mandatory); Adverity and Python are good to have
- Strong written and verbal communication skills
- Excellent project and resource management skills
- Strong problem-solving and troubleshooting skills, eagerness to learn, and strong analytical skills with superior attention to detail
- Resourceful and self-motivated
- Experience leading delivery teams (approx. 4-7 members) and projects to successful implementations, with a focus on coaching the team on domain and technology, undertaking their performance management, and providing guidance for their careers
- Familiarity with large data sets and creating cohesive stories
- Able to work with and lead teams successfully, handling multiple projects and meeting timelines
- Maintaining positive client and vendor relationships
- Presentation skills using MS PowerPoint or any presentation platform

Posted 4 days ago

Apply

8.0 - 11.0 years

15 - 22 Lacs

Pune

Work from Office

Roles and Responsibilities:
- Design, develop, and maintain large-scale data warehouses using Snowflake
- Collaborate with cross-functional teams to gather requirements and deliver high-quality solutions
- Develop complex ETL processes using SnowSQL, stored procedures, functions, and views
- Ensure scalability, performance, and security of the data warehouse environment
- Provide technical guidance and mentorship to junior team members

Desired Candidate Profile:
- 8-11 years of experience in data warehousing with expertise in Snowflake
- Strong understanding of SQL concepts, including joins, subqueries, and aggregations
- Experience working on large-scale projects involving big data processing and analytics
- Excellent problem-solving skills with the ability to work independently or as part of a team

Posted 4 days ago

Apply

3.0 - 6.0 years

15 - 25 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Salary: 20 to 35 LPA
Experience: 3 to 8 years
Location: Pune/Bangalore/Gurgaon (Hybrid)
Notice: Immediate joiners only
Key Skills: SQL, Advanced SQL, BI tools, etc.

Roles and Responsibilities:
- Extract, manipulate, and analyze large datasets from various sources such as Hive, SQL databases, and BI tools
- Develop and maintain dashboards using Tableau to provide insights on banking performance, market trends, and customer behavior
- Collaborate with cross-functional teams to identify key performance indicators (KPIs) and develop data visualizations to drive business decisions

Desired Candidate Profile:
- 3-8 years of experience in data analytics or a related field, with expertise in banking analytics, business intelligence, campaign analytics, marketing analytics, etc.
- Strong proficiency in tools like Tableau for data visualization; Advanced SQL knowledge preferred
- Experience working with big data technologies like the Hadoop ecosystem (Hive) and Spark; familiarity with the Python programming language required

Posted 4 days ago

Apply

5.0 - 10.0 years

25 - 40 Lacs

Hyderabad, Gurugram, Bengaluru

Hybrid

Salary: 25 to 40 LPA
Experience: 5 to 10 years
Location: Bangalore/Hyderabad
Notice: Immediate joiners only
Key Skills: SQL, Advanced SQL, BI tools, ETL, etc.

Roles and Responsibilities:
- Extract, manipulate, and analyze large datasets from various sources such as Hive, SQL databases, and BI tools
- Develop and maintain dashboards using Tableau to provide insights on banking performance, market trends, and customer behavior
- Collaborate with cross-functional teams to identify key performance indicators (KPIs) and develop data visualizations to drive business decisions

Desired Candidate Profile:
- 6-10 years of experience in data analytics or a related field, with expertise in banking analytics, business intelligence, campaign analytics, marketing analytics, etc.
- Strong proficiency in tools like Tableau for data visualization; Advanced SQL knowledge preferred
- Experience working with big data technologies like the Hadoop ecosystem (Hive) and Spark; familiarity with the Python programming language required

Posted 4 days ago

Apply

6.0 - 11.0 years

15 - 30 Lacs

Noida, Pune, Bengaluru

Hybrid

Experience: 6 to 11 years
Location: Coimbatore

Key Responsibilities:
- Design and build scalable ELT pipelines in Snowflake using DBT/SQL
- Develop efficient, well-tested DBT models (staging, intermediate, and marts layers)
- Implement data quality, testing, and monitoring frameworks to ensure data reliability and accuracy
- Optimize Snowflake queries, storage, and compute resources for performance and cost-efficiency
- Collaborate with cross-functional teams to gather data requirements and deliver data solutions

Required Qualifications:
- 6+ years of experience as a Data Engineer, with at least 5 years working with Snowflake
- Proficient with DBT (Data Build Tool), including Jinja templating, macros, and model dependency management
- Strong understanding of ELT patterns and modern data stack principles
- Advanced SQL skills and experience with performance tuning in Snowflake

Interested candidates, share your CV at himani.girnar@alikethoughts.com with the details below:
- Candidate's name
- Email and alternate email ID
- Contact and alternate contact number
- Total experience
- Relevant experience
- Current organization
- Notice period
- CCTC
- ECTC
- Current location
- Preferred location
- PAN card number
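A hedged sketch of the kind of DBT model these qualifications describe: an incremental Snowflake model using Jinja config, ref(), and is_incremental(). Model and column names are assumptions.

```sql
-- Illustrative DBT incremental model (a .sql file in a DBT project).
-- stg_orders and the columns below are hypothetical.
{{ config(materialized='incremental', unique_key='order_id') }}

SELECT
    order_id,
    customer_id,
    order_total,
    updated_at
FROM {{ ref('stg_orders') }}

{% if is_incremental() %}
  -- On incremental runs, only pick up rows newer than what the target holds.
  WHERE updated_at > (SELECT MAX(updated_at) FROM {{ this }})
{% endif %}
```

On the first run DBT builds the full table; on later runs it compiles the Jinja block so only new rows are merged, which is what keeps Snowflake compute costs down.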

Posted 5 days ago

Apply

5.0 - 8.0 years

4 - 8 Lacs

Kochi

Work from Office

We are seeking a highly skilled and experienced Senior Developer with 5-8 years of hands-on expertise in ETL, data warehousing, Advanced SQL, and Snowflake, ideally with a strong understanding of the pharmaceutical or healthcare domain. The selected candidate will play a key role in driving data initiatives, optimising data pipelines, and facilitating stakeholder collaboration across business and technical teams. You will work on end-to-end data solutions: extracting, transforming, and loading (ETL) data from various sources into centralised data platforms. This role also demands strong SQL expertise for advanced queries, performance tuning, and data validation. Experience in Snowflake is essential, as is the ability to understand complex data models in the pharma ecosystem. Your ability to communicate clearly with business stakeholders, technical leads, and senior leadership will be crucial. Acting as a bridge between the technical and business teams, you will ensure accurate translation of requirements into scalable, compliant data solutions.

Primary Responsibilities:
- Design, develop, and maintain ETL pipelines and data warehouse structures
- Write complex SQL queries for data transformation, validation, and reporting
- Develop and optimize Snowflake-based data solutions
- Collaborate with pharma domain experts to map business requirements
- Communicate effectively with business teams and technical stakeholders
- Act as a liaison between leadership, developers, and business analysts

Preferred Skills:
- Pharma/healthcare data domain experience is mandatory
- Veeva CRM data understanding is a plus
- Strong stakeholder management and communication skills
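An illustrative Snowflake query in the spirit of the validation work described above: the latest record per account via QUALIFY, Snowflake's shorthand for filtering on window functions. The hcp_accounts table and its columns are invented.

```sql
-- Latest version of each account record, e.g. when a pharma source system
-- delivers full-history extracts. Table/columns are hypothetical.
SELECT account_id, account_name, specialty, updated_at
FROM hcp_accounts
QUALIFY ROW_NUMBER() OVER (
    PARTITION BY account_id
    ORDER BY updated_at DESC
) = 1;
```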

Posted 5 days ago

Apply

4.0 - 8.0 years

6 - 16 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

We are seeking a highly skilled Zuora Billing Specialist with strong expertise in subscription billing, invoicing, customer account management, and payment processing. The ideal candidate will have proven experience in Zuora Billing, Zuora APIs, Zuora Workflows, and Advanced SQL, along with solid integration knowledge across Salesforce, Avalara, and multiple payment gateways. This role will focus on optimizing billing processes, enhancing Zuora platform capabilities, automating workflows, and supporting end-to-end subscription lifecycle management in a fast-paced consulting environment.

Key Responsibilities:
- Manage and configure Zuora Billing, including customer accounts, subscriptions, invoicing, billing rules, rate plans, and payment processing
- Design, develop, and optimize Zuora Workflows for automation and process efficiency
- Configure and administer Zuora platform objects, the product catalog, custom fields, and billing configurations
- Support and enhance the order lifecycle management process, including order harmonization, subscription changes, amendments, and renewals
- Develop and maintain Zuora API integrations with Salesforce, Avalara, payment gateways, and internal systems
- Leverage Advanced SQL for data analysis, reporting, reconciliation, and troubleshooting across Zuora and integrated platforms
- Collaborate with cross-functional teams to support business requirements, system enhancements, and issue resolution

Required Skills & Experience:
- 4+ years of hands-on experience with Zuora Billing (subscription management, invoicing, customer accounts)
- Strong expertise in Zuora APIs and integrations with enterprise systems (Salesforce, Avalara, payment gateways)
- Proven experience in order management and order harmonization within Zuora
- Demonstrated ability in creating and optimizing Zuora Workflows for process automation
- Strong knowledge of the Zuora object model, platform configurations, billing rules, and product catalog management
- Proficiency in Advanced SQL for querying, reporting, and troubleshooting (mandatory)
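A hedged example of the Advanced SQL reconciliation work described above: invoices carrying an outstanding balance with no matching payment record. Zuora does expose Invoice and Payment objects, but these staging table names are assumptions.

```sql
-- Anti-join reconciliation: open invoices with no payment on file.
-- stg_invoices / stg_payments are hypothetical extract tables.
SELECT i.invoice_number, i.account_id, i.balance
FROM stg_invoices AS i
LEFT JOIN stg_payments AS p
    ON p.invoice_number = i.invoice_number
WHERE p.invoice_number IS NULL
  AND i.balance > 0;
```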

Posted 5 days ago

Apply

2.0 - 5.0 years

7 - 11 Lacs

Mumbai

Work from Office

Responsibilities:
- Provide expertise in analysis, requirements gathering, design, coordination, customization, testing, and support of reports in the client's environment
- Develop and maintain a strong working relationship with business and technical members of the team
- Relentless focus on quality and continuous improvement
- Perform root cause analysis of report issues
- Development and evolutionary maintenance of the environment, performance, capability, and availability
- Assist in defining technical requirements and developing solutions
- Effective content and source-code management, troubleshooting, and debugging

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Tableau Desktop Specialist
- Strong understanding of SQL for querying databases; good to have: Python, Snowflake, statistics, ETL experience
- Extensive knowledge of creating impactful visualizations using Tableau
- Thorough understanding of SQL and advanced SQL (joins and relationships)
- Experience working with different databases, and blending them and creating relationships in Tableau
- Extensive knowledge of creating Custom SQL to pull desired data from databases
- Troubleshooting capabilities to debug data controls

Preferred technical and professional experience:
- Capable of converting business requirements into a workable model
- Good communication skills, willingness to learn new technologies, team player, self-motivated, positive attitude

Posted 6 days ago

Apply

6.0 - 11.0 years

10 - 15 Lacs

Gurugram

Work from Office

Position summary: Our proprietary software-as-a-service helps automotive dealerships and sales teams better understand and predict exactly which customers are ready to buy, the reasons why, and the key offers and incentives most likely to close the sale. Its micro-marketing engine then delivers the right message at the right time to those customers, ensuring higher conversion rates and a stronger ROI.

What You'll Do: You will be part of our Data Platform & Product Insights data engineering team. As part of this agile team, you will work in our cloud-native environment to:
- Build and support data ingestion and processing pipelines in the cloud. This entails extraction, load, and transformation of big data from a wide variety of sources, both batch and streaming, using the latest data frameworks and technologies
- Partner with the product team to assemble large, complex data sets that meet functional and non-functional business requirements; ensure the build-out of data dictionaries/data catalogues and detailed documentation and knowledge around these data assets, metrics, and KPIs
- Warehouse this data; build data marts, data aggregations, metrics, KPIs, and business logic that lead to actionable insights into our product efficacy, marketing platform, customer behaviour, retention, etc.
- Build real-time monitoring dashboards and alerting systems
- Coach and mentor other team members

Who you are:
- 6+ years of experience in big data and data engineering
- Strong knowledge of advanced SQL, data warehousing concepts, and data mart design
- Strong programming skills in SQL, Python, etc.
- Experience in the design and development of data pipelines and ETL/ELT processes, on-premises and in the cloud
- Experience with one of the cloud providers: GCP, Azure, AWS
- Experience with relational SQL and NoSQL databases, including Postgres
- Experience with workflow management tools: AWS Data Pipeline, Google Cloud Composer, etc.
- Experience with distributed version control environments such as Git and Azure DevOps
- Building Docker images, fetching/promoting and deploying to production; integrating Docker container orchestration frameworks, config maps, and deployments using Terraform
- Able to convert business queries into technical documentation
- Strong problem-solving and communication skills
- Bachelor's or an advanced degree in Computer Science or a related engineering discipline
- Good to have: exposure to Business Intelligence (BI) tools such as Tableau, Dundas, or Power BI; agile software development methodologies; working in multi-functional, multi-location teams

Grade: 10
Location: Gurugram
Hybrid Model: twice a week work from office
Shift Time: 12 pm to 9 pm IST

What You'll Love About Us (do ask us about these!):
- Total Rewards: monetary, beneficial, and developmental rewards!
- Work-Life Balance: you can't do a good job if your job is all you do!
- Prepare for the Future: Academy; we are all learners, we are all teachers!
- Employee Assistance Program: confidential and professional counselling and consulting
- Diversity & Inclusion: HeForShe!
- Internal Mobility: grow with us!

Posted 6 days ago

Apply

5.0 - 8.0 years

10 - 14 Lacs

Bengaluru

Work from Office

About The Role

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Microsoft Azure Databricks
Good-to-have skills: NA
Minimum experience required: 5 years
Educational Qualification: 15 years of full-time education

Summary: As a Senior Data Engineer, you bring experience in applying advanced concepts and technologies in production environments. Your expertise and skills make you an ideal candidate to lead and deliver cutting-edge data solutions.

Roles and Responsibilities:
- Experience with Infrastructure as Code (IaC), preferably using Terraform
- Proficiency in CI/CD pipelines, with a strong preference for Azure DevOps
- Familiarity with Azure Data Factory for seamless data integration and orchestration
- Hands-on experience with Apache Airflow for workflow automation
- Automation skills using PowerShell
- Nice to have: basic knowledge of Lakehouse Apps and frameworks like Angular.js, Node.js, or React.js

Professional & Technical Skills:
- Extensive hands-on experience with Azure Databricks and modern data architecture principles
- In-depth understanding of Lakehouse and Medallion architectures and their practical applications
- Advanced knowledge of Delta Lake, including data storage, schema evolution, and ACID transactions
- Comprehensive expertise in working with Parquet files, including handling challenges and designing effective solutions
- Working experience with Unity Catalog
- Knowledge of Azure cloud services
- Working experience with Azure DevOps
- Proficiency in writing clear, maintainable, and modular code using Python and PySpark
- Advanced SQL expertise, including query optimization and performance tuning

Additional Information: Azure certification is a plus.
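A minimal Databricks SQL sketch of the Delta Lake ACID pattern referenced above: MERGE INTO performs an atomic upsert against a Delta table. Schema and table names are hypothetical.

```sql
-- Upsert change records into a Delta table; the whole MERGE commits as one
-- ACID transaction. silver.customers / staging.customer_updates are invented.
MERGE INTO silver.customers AS t
USING staging.customer_updates AS s
    ON t.customer_id = s.customer_id
WHEN MATCHED THEN
    UPDATE SET t.email = s.email, t.updated_at = s.updated_at
WHEN NOT MATCHED THEN
    INSERT (customer_id, email, updated_at)
    VALUES (s.customer_id, s.email, s.updated_at);
```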

Posted 6 days ago

Apply

5.0 - 10.0 years

3 - 8 Lacs

Hyderabad, Gurugram, Bengaluru

Work from Office

The Team: The Institutions and People Profiles team is dedicated to gathering and maintaining comprehensive information on key attributes related to firms, including both buyside and sellside entities, as well as funds and individual profiles. This team plays a crucial role in enhancing the quality and depth of our data offerings, ensuring that stakeholders have access to accurate and up-to-date information. By leveraging a robust methodology for data collection and validation, the team supports informed decision-making processes across the organization, contributing to our overall mission of delivering exceptional insights and analytics. The team currently supports a diverse portfolio of products including, but not limited to, CIQ (Classic), CIQ Pro, Xpressfeed, BigDough Advanced, and Capital Access.

The Impact: As a Data Steward on the Institutions and People Profiles team, you will play a pivotal role in ensuring the quality, governance, and usability of the dataset. You will serve as a key point of contact for content-related matters across the team and with adjacent stakeholders. Your contributions will be vital in capturing, disseminating, and enhancing the quality of institutional and individual data, enabling its use for internal analytics and external customer applications.

What's in it for You?
- Develop expertise in the firms, funds, and people dataset and its applications across the financial landscape
- Engage in initiatives that enhance data quality, automation, and integration across workflows
- Gain exposure to advanced analytical tools and methodologies while contributing to impactful projects
- Collaborate with stakeholders in Operations, Technology, and Product teams to improve overall content delivery
- Leverage and develop communication skills to articulate complex concepts to audiences with varied expertise levels

Responsibilities:
- Data management & delivery: oversee the end-to-end delivery of complex, data-driven projects, including requirements gathering, analysis, and execution; define features, user stories, and technology requirements, ensuring alignment with user and client needs
- Collaboration: work with stakeholders across Operations, Technology, and Product teams to streamline workflows; support technology teams with user story creation, bug identification, and agile development processes
- Data quality & governance: analyse vendor data and develop effective ingestion strategies; identify and implement automation opportunities using Python, machine learning, or other data methodologies
- Visualization & insights: transform raw, complex data into actionable insights through effective visualization techniques; stay informed about market trends and address data gaps to maintain relevance and accuracy

What We're Looking For:
- Education: postgraduate degree in Economics, Finance, or a related field
- Experience: minimum of 5 years in data strategy, management, and governance; proficiency in the entity and people ecosystem, including expertise with CIQ, BigDough, DMS, and MMD backend infrastructure, tables, pipelines, and loaders
- Technical proficiency: advanced SQL skills (must-have) with hands-on experience in data mining and analysis; proficiency in writing features and user stories, preferably using Azure DevOps; strong understanding and application of data visualization tools and techniques
- Soft skills: excellent written and verbal communication skills; strong collaborative mindset with a structured approach to problem-solving; innovative thinker, capable of working effectively across teams and geographies

Posted 6 days ago

Apply

9.0 - 14.0 years

9 - 14 Lacs

Mumbai

Work from Office

Grade Level (for internal use): 12

The Team: You will be part of a global technology team comprising developers, QA, and BA teams, and will be responsible for analysis, design, development, and testing.

The Impact: You will be working on one of the core technology platforms responsible for the end-of-day calculation as well as dissemination of index values.

What's In It for You: You will have the opportunity to work on enhancements to the existing index calculation system as well as implement new methodologies as required.

Responsibilities:
- Design and development of Java applications for S&P Dow Jones Indices (SPDJI) web sites and their feeder systems
- Participate in multiple software development processes, including coding, testing, debugging, and documentation
- Develop software applications based on clear business specifications
- Work on new initiatives and support existing index applications
- Perform application and system performance tuning and troubleshoot performance issues
- Develop web-based applications and build rich front-end user interfaces
- Build applications with object-oriented concepts and apply design patterns
- Integrate in-house applications with various vendor software platforms
- Set up development environments/sandboxes for application development
- Check application code changes into the source repository
- Perform unit testing of application code and fix errors
- Interface with databases to extract information and build reports
- Effectively interact with customers, business users, and IT staff

Basic Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or Engineering is required, or, in lieu, a demonstrated equivalence in work experience
- Excellent communication and interpersonal skills are essential, with strong verbal and writing proficiency
- 9+ years of experience in application development and support
- Must have experience with AWS cloud (EC2, EMR, Lambda, S3, Glue, etc.)
- Strong hands-on experience with Java, J2EE, Java Messaging Service (JMS), and Enterprise JavaBeans (EJB)
- Strong hands-on experience with advanced SQL and PL/SQL programming
- Basic networking knowledge and Unix scripting
- Exposure to addressing vulnerabilities
- Minimum 2 years of experience in any three or more of the following: advanced Python; advanced Scala; infrastructure/CI-CD/DevOps/Ansible/Fortify/Jenkins; big data/microservices; QA automation (Cucumber, Selenium, Karate, etc.)

Preferred Qualifications:
- Experience working with large datasets in Equity, Commodities, Forex, Futures, and Options asset classes
- Experience with index/benchmark, asset management, or trading platforms
- Basic knowledge of user interface design and development using jQuery, HTML5, and CSS

Posted 1 week ago

Apply

5.0 - 9.0 years

7 - 11 Lacs

Navi Mumbai

Work from Office

As a member of the Enterprise Data Platform team and Enterprise Technology organization within Technology & Engineering at PitchBook, you will be part of a team of big thinkers, innovators, and problem solvers who strive to deepen the positive impact we have on our customers and our company every day. We value curiosity and drive to find better ways of doing things. We thrive on customer empathy, which remains our focus when creating excellent customer experiences through product innovation. We know that greatness is achieved through collaboration and diverse points of view, so we work closely with partners around the globe. As a team, we assume positive intent in each other’s words and actions, value constructive discussions and foster a respectful working environment built on integrity, growth, and business value. We invest heavily in our people, who are eager to learn and constantly improve. Join our team and grow with us! As a Senior Data Engineer on the Enterprise Data Platform team, you will be responsible for building data pipelines to ingest various source data from enterprise technologies and PitchBook Platform data, manipulating (cleanse, dedupe, normalize) data into well-constructed data models for data analysis, implementing business logic and standard calculations, governing (supporting, observing, documenting) the data, and making the data available to end consumers in the form of PitchBook data products built on top of our data warehouse/data lake (e.g. Snowflake). You’ll work with a range of data and reporting technologies (e.g. Python, Docker, Tableau, Power BI) to build upon a strong foundation of rigor, quantitative techniques, and efficient processing. You’ll join other Engineers and Analytics professionals as part of the team that develops data pipelines and insights for our internal stakeholders across Sales, Customer Success, Marketing, Research, Data Operations, Product, Finance, and Administration. Your team will rely on you to build your skills in data techniques and analytics to deliver accurate, timely, accessible, and secure data & insights to users. You’ll collaborate closely and effectively with internal and external stakeholders of different roles and technical backgrounds, who have varying understanding of data engineering. You’ll have the opportunity and ability to impact many different areas of analytics and operational thinking across enterprise technology and product engineering. You will exhibit a growth mindset, be willing to solicit feedback, engage others with empathy, and help create a culture of belonging, teamwork, and purpose. If you love building data-centric solutions, strive for excellence every day, are adaptable and focused, and believe work should be fun, come join us! 
Primary Job Responsibilities:
- Apply unified data technologies to support advanced and automated business analytics
- Design, develop, document, and maintain database and reporting structures used to compile insights
- Define, develop, and review extract, load, and transform (ELT) processes and data modeling solutions
- Consistently evolve data processes and techniques following industry best practices
- Build data models for reports and dashboards that translate business data into insights, identify and prioritize operational improvement opportunities, and measure business KPIs against objectives
- Contribute to the ongoing improvement of quality assurance standards and procedures
- Support the vision and values of the company through role modeling and encouraging desired behaviors
- Participate in various company initiatives and projects as requested

Skills and Qualifications:
- Bachelor's degree in a related field (Computer Science, Engineering, etc.)
- 5+ years of experience in data engineering roles, including creating and maintaining data pipelines, data modeling, and data architecture
- 5+ years of experience in advanced SQL, including expert-level skills in querying large datasets from multiple sources and developing automated reporting
- 3+ years of experience in Python, with skills for diverse components of data pipelines, including scripting, data manipulation, custom extract, transform, and loads, and statistical/regression analysis
- Expertise in extract, transform, and load (ETL) and extract, load, transform (ELT) processes and pipelines, platforms (e.g., Airflow), and distributed messaging (e.g., Kafka)
- Experience with tools that capture and control data modeling change management (e.g., SQLMesh)
- Proficiency in data storage solutions, data warehousing, and cloud-based data platforms (e.g., Snowflake)
- Knowledge and applicable working experience establishing and ensuring data governance, data quality, and compliance standards
- Exceptional problem-solving skills
- Excellent communication and collaboration skills, with the ability to engage non-technical stakeholders
- Experience working with enterprise technologies (CRM, ERP, marketing automation platforms, financial systems, etc.) is a plus

Working Conditions: The job conditions for this position are in a standard office setting. Employees in this position use a PC and phone on an ongoing basis throughout the day. Limited corporate travel may be required to remote offices or other business meetings and events.

Morningstar India is an equal opportunity employer.

Posted 1 week ago

Apply

2.0 - 7.0 years

7 - 11 Lacs

Jaipur

Work from Office

Job Title: Regulatory Reporting Team, NCT
Location: Jaipur, India

Role Description: The role is to perform a number of key functions that support and control the business in complying with a number of regulatory requirements, such as MiFID II. This role forms part of a team in Bangalore that supports regulatory reporting across all asset classes: Rates, Credit, Commodities, Equities, and FX. Responsibilities include day-to-day exception management, MIS compilation, and User Acceptance Testing (UAT). The role will also provide support in terms of building out reports, macros, etc.

What we'll offer you:
- 100% reimbursement under the child care assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities:
- Performing and/or managing various exception management functions across reporting for all asset classes, across multiple jurisdictions
- Ensuring accurate, timely, and complete reporting
- Working closely with our technology development teams to design system solutions, with the aim of automating as much of the exceptions process as possible
- Liaising with internal and external teams to propose developments to the current architecture in order to ensure greater compliance with regulatory requirements and improved STP processing of our reporting across all asset classes
- Performing root cause analysis of exceptions, with investigation and appropriate escalation of any significant issues found through testing and rejection streams to senior management, to ensure transparency exists in our controls
- Building and maintaining effective operational processes and prioritising activities based on risk
- Clear communication and escalation; recognizing high-risk situations and dealing with them in a prompt manner
- Documentation of BI deliverables
- Supporting the design of data models, reports, and visualizations to meet business needs
- Developing end-user reports and visualizations

Your skills and experience:
- 2-7 years of work experience in an Ops role within financial services
- Graduate in Science/Technology/Engineering/Mathematics
- Regulatory experience (MiFIR, EMIR, Dodd-Frank, Bank of England, etc.) is preferred
- Preferably experience in Middle Office/Back Office and reference data, and excellent knowledge of the trade life cycle (at least 2 asset classes: Equities, Credit, Commodities)
- Ability to work independently, as well as in a team environment
- Clear and concise communication and escalation; ability to recognise high-risk situations and deal with them in a prompt manner
- Ability to identify and prioritize multiple tasks that have potential operational risk and P&L impact in an often high-pressure environment
- Experience in data analysis with intermediate/advanced Microsoft Office Suite skills, including VBA
- Experience in building reports and BI analysis with tools such as SAP BusinessObjects, Tableau, QlikView, etc.
- Advanced SQL experience is preferred

How we'll support you
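As a hedged illustration of the exception-management checks this role describes, an anti-join that flags trades present in a trade store but missing from a regulatory submission feed; both table names are invented.

```sql
-- Exception report: today's trades with no corresponding submission record.
-- trades / mifir_submissions are hypothetical tables for the example.
SELECT t.trade_id, t.asset_class, t.trade_date
FROM trades AS t
LEFT JOIN mifir_submissions AS s
    ON s.trade_id = t.trade_id
WHERE s.trade_id IS NULL
  AND t.trade_date = CURRENT_DATE;
```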

Posted 1 week ago

Apply
Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
