3.0 - 8.0 years
15 - 30 Lacs
Noida, Gurugram, Delhi / NCR
Hybrid
Salary: 20 to 35 LPA
Experience: 3 to 8 years
Location: Gurgaon (Hybrid)
Notice: Immediate joiners only
Key Skills: SQL, Advanced SQL, BI tools

Roles and Responsibilities
- Extract, manipulate, and analyze large datasets from various sources such as Hive, SQL databases, and BI tools.
- Develop and maintain dashboards using Tableau to provide insights on banking performance, market trends, and customer behavior.
- Collaborate with cross-functional teams to identify key performance indicators (KPIs) and develop data visualizations that drive business decisions.

Desired Candidate Profile
- 3-8 years of experience in Data Analytics or a related field, with expertise in Banking Analytics, Business Intelligence, Campaign Analytics, Marketing Analytics, etc.
- Strong proficiency in tools like Tableau for data visualization; advanced SQL knowledge preferred.
- Experience working with big data technologies from the Hadoop ecosystem (Hive, Spark); familiarity with the Python programming language required.
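As a rough illustration of the kind of advanced SQL such dashboard work involves, here is a minimal sketch using Python's built-in SQLite driver; the table and column names are hypothetical, and a real deployment would query Hive or a warehouse instead:

```python
import sqlite3

# Hypothetical transactions table -- names are illustrative only.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE txns (branch TEXT, month TEXT, amount REAL);
INSERT INTO txns VALUES
  ('Noida','2024-01',100),('Noida','2024-02',150),
  ('Delhi','2024-01',200),('Delhi','2024-02',180);
""")

# Month-over-month revenue change per branch via a window function --
# a typical KPI feeding a Tableau dashboard.
rows = con.execute("""
SELECT branch, month, SUM(amount) AS revenue,
       SUM(amount) - LAG(SUM(amount)) OVER (
           PARTITION BY branch ORDER BY month) AS mom_change
FROM txns
GROUP BY branch, month
ORDER BY branch, month
""").fetchall()
for r in rows:
    print(r)
```

The `LAG(...) OVER (PARTITION BY ...)` pattern avoids a self-join when comparing each period to the previous one.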
Posted 1 month ago
8.0 - 13.0 years
15 - 30 Lacs
Noida, Gurugram, Delhi / NCR
Hybrid
Hiring - Sr. SQL DBA
Location: Gurugram (Hybrid), Full-time

Education and Experience
- Bachelor's or higher degree.
- 10+ years of experience working in an IT services organization.
- At least 8 years of SQL DBA experience covering implementation, production support, and project work.
- At least 5 years of exposure in an L3/Lead/SME support role.
- Knowledge and understanding of the ITIL process.
- At least 5 years in client-facing roles globally, with excellent communication skills.
- At least 5 years of on-call support experience.
- Excellent documentation skills in MS SharePoint 2010/2013.

Mandatory Skills - SQL Server and Cloud Administration
- Senior-level Microsoft SQL Server DBA with experience in large, critical environments.
- Excellent knowledge of performance tuning at both the server and query level; must know perfmon and SQL Server dynamic management views.
- Knowledge of SQL Server internals.
- Knowledge of SQL partitioning and compression.
- Hands-on experience with mirroring, replication, and Always On.
- Automation experience and good scripting knowledge: ability to write T-SQL scripts, PowerShell, or Python.
- Excellent knowledge of SSIS packages.
- Hands-on experience with SQL Server clustering.
- Hands-on experience with Integration Services, Reporting Services, and Analysis Services.
- Good knowledge of SQL Server permission/security policies.
- At least 6 months of experience with AWS Cloud RDS.

Good-to-Have Skills
- MySQL knowledge/experience.
- Linux knowledge.
- Experience with Snowflake.

Certifications Desired
- MS certifications on the latest MSSQL versions.
- AWS certifications.
Posted 1 month ago
6.0 - 11.0 years
8 - 13 Lacs
Pune
Work from Office
What You'll Do
The Global Analytics & Insights (GAI) team is looking for a Senior Data Engineer to lead the build of the data infrastructure for Avalara's core data assets, empowering accurate, data-backed decisions. As a Senior Data Engineer, you will help architect, implement, and maintain our data infrastructure using Snowflake, dbt (Data Build Tool), Python, Terraform, and Airflow. You will immerse yourself in our financial, marketing, and sales data to become an expert in Avalara's domain. You will have deep SQL experience, an understanding of modern data stacks and technology, and a desire to build things the right way using modern software principles.

What Your Responsibilities Will Be
- Architect repeatable, reusable solutions to keep our technology stack DRY.
- Conduct technical and architecture reviews with engineers, ensuring all contributions meet quality expectations.
- Develop scalable, reliable, and efficient data pipelines using dbt, Python, or other ELT tools.
- Implement and maintain scalable data orchestration and transformation, ensuring data accuracy and consistency.
- Collaborate with cross-functional teams to understand complex requirements and translate them into technical solutions.
- Build scalable, complex dbt models.
- Take ownership of complex projects and of calculations of core financial metrics and processes.
- Work with Data Engineering teams to define and maintain scalable data pipelines.
- Promote automation and optimization of reporting processes to improve efficiency.

You will report to a Senior Manager.

What You'll Need to Be Successful
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 6+ years of experience in data engineering, with advanced SQL knowledge.
- 4+ years working with Git, with demonstrated experience collaborating with other engineers across repositories.
- 4+ years working with Snowflake.
- 3+ years working with dbt (dbt Core).
- 3+ years working with Infrastructure as Code (Terraform).
- 3+ years working with CI/CD, with demonstrated ability to build and operate pipelines.
- AWS certified; Terraform certified; Snowflake and dbt certified.
- Experience working with complex Salesforce data.
Posted 1 month ago
1.0 - 5.0 years
3 - 5 Lacs
Gurugram
Work from Office
Role & Responsibilities
- As part of the Home Credit analytics team, develop, analyze, and execute ideas and initiatives designed to produce business reports.
- Learn the HCIN database, absorb current reporting, and create new reports per business requirements.
- A strong base in SQL and Power BI reporting is required.

Mandatory Skills
SQL coding, Power BI, Excel report preparation, dashboards, reporting, Advanced Excel, PowerPoint, data analysis

Preferred Candidate Profile
- SQL coding, MS Office, Power BI, PowerPoint, dashboards, Advanced Excel, and Outlook.
- Good communication, analytics, and decision-making skills.

Notice Period: Immediate to 30 days maximum
Posted 1 month ago
5.0 - 10.0 years
7 - 11 Lacs
Hyderabad
Work from Office
Responsibilities
- Provide expertise in analysis, requirements gathering, design, coordination, customization, testing, and support of reports in the client's environment.
- Develop and maintain a strong working relationship with business and technical members of the team.
- Maintain a relentless focus on quality and continuous improvement.
- Perform root cause analysis of report issues.
- Handle development and evolutionary maintenance of the environment, performance, capability, and availability.
- Assist in defining technical requirements and developing solutions.
- Manage content and source code effectively; troubleshoot and debug.

Required Education: Bachelor's Degree
Preferred Education: Master's Degree

Required Technical and Professional Expertise
- 5+ years of experience with BI tools, with expertise and/or certification in at least one major BI platform; Tableau preferred.
- Advanced knowledge of SQL, including the ability to write complex stored procedures, views, and functions.
- Proven capability in data storytelling and visualization, delivering actionable insights through compelling presentations.
- Excellent communication skills, with the ability to convey complex analytical findings to non-technical stakeholders in a clear, concise, and meaningful way.
- Ability to identify and analyze industry trends, geographic variations, competitor strategies, and emerging customer behavior.

Preferred Technical and Professional Experience
- Troubleshooting capabilities to debug data controls.
- Capable of converting business requirements into a workable model.
- Good communication skills, willingness to learn new technologies, team player, self-motivated, positive attitude.
- Thorough understanding of SQL and advanced SQL (joins and relationships).
Posted 1 month ago
5.0 - 8.0 years
18 - 22 Lacs
Bengaluru
Work from Office
Job Title: Sales Excellence - COE - Data Engineering Specialist
Management Level: 9 - Team Lead/Consultant
Location: Mumbai, MDC2C
Must-have skills: Sales
Good-to-have skills: Data Science, SQL, Automation, Machine Learning

Job Summary
Apply deep statistical tools and techniques to find relationships between variables.

Roles & Responsibilities
- Apply deep statistical tools and techniques to find relationships between variables.
- Develop intellectual property for analytical methodologies and optimization techniques.
- Identify data requirements and develop analytic solutions to solve business issues.

Job Title: Analytics & Modelling Specialist
Management Level: 9 - Specialist
Location: Bangalore/Gurgaon/Hyderabad/Mumbai
Must-have skills: Python, Data Analysis, Data Visualization, SQL
Good-to-have skills: Machine Learning

Job Summary
The Center of Excellence (COE) makes sure that the sales and pricing methods and offerings of Sales Excellence are effective. The COE supports salespeople through its business partners and its Analytics and Sales Operations teams. The Data Engineer helps manage data sources and environments, utilizing large data sets and maintaining their integrity to create models and apps that deliver insights to the organization.

Roles & Responsibilities
- Build and manage data models that bring together data from different sources.
- Help consolidate and cleanse data for use by the modeling and development teams.
- Structure data for use in analytics applications.
- Lead a team of Data Engineers effectively.

Professional & Technical Skills
- A bachelor's degree or equivalent.
- Total experience range: 5-8 years in the relevant field.
- A minimum of 3 years of GCP experience, with exposure to machine learning/data science and experience configuring machine learning workflows in GCP.
- A minimum of 5 years of advanced SQL knowledge and experience working with relational databases.
- A minimum of 3 years of familiarity and hands-on experience with SQL objects such as stored procedures, functions, and views.
- A minimum of 3 years building data flow components and processing systems to extract, transform, load, and integrate data from various sources.
- A minimum of 3 years of hands-on experience with advanced Excel topics such as cube functions, VBA automation, and Power Pivot.
- A minimum of 3 years of hands-on experience in Python.

Additional Information
- Understanding of sales processes and systems.
- Master's degree in a technical field.
- Experience with quality assurance processes.
- Experience in project management.

You May Also Need
- Ability to work flexible hours according to business needs.
- Good internet connectivity and a distraction-free environment for working at home, in accordance with local guidelines.

Professional & Technical Skills
- Relevant experience in the required domain.
- Strong analytical, problem-solving, and communication skills.
- Ability to work in a fast-paced, dynamic environment.

Additional Information
- Opportunity to work on innovative projects.
- Career growth and leadership exposure.

About Our Company | Accenture
Qualification Experience: 8 to 10 years
Educational Qualification: B.Com
Posted 1 month ago
4.0 - 9.0 years
15 - 30 Lacs
Gurugram, Chennai
Work from Office
Role & Responsibilities
- Assume ownership of data engineering projects from inception to completion.
- Implement fully operational Unified Data Platform solutions in production environments using technologies such as Databricks, Snowflake, and Azure Synapse.
- Demonstrate proficiency in data modelling and data architecture.
- Utilize modern data transformation tools such as dbt (Data Build Tool) to streamline and automate data pipelines (nice to have).
- Implement DevOps practices for continuous integration and deployment (CI/CD) to ensure robust and scalable data solutions (nice to have).
- Maintain code versioning and collaborate effectively within a version-controlled environment.
- Familiarity with data ingestion and orchestration tools such as Azure Data Factory, Azure Synapse, and AWS Glue.
- Set up processes for data management and templatized analytical modules/deliverables.
- Continuously improve processes with a focus on automation, and partner with different teams to develop system capability.
- Proactively help and mentor team members by sharing knowledge and expanding skills.
- Communicate effectively with internal and external stakeholders.
- Coordinate with cross-functional team members to ensure high quality in deliverables with no impact on timelines.

Preferred Candidate Profile
- Expertise in programming languages such as Python and advanced SQL.
- Working knowledge of data warehousing, data marts, and business intelligence, with hands-on experience implementing fully operational data warehouse solutions in production environments.
- 3+ years of working knowledge of big data tools (Hive, Spark), along with ETL tools and cloud platforms.
- 3+ years of relevant experience in either Snowflake or Databricks; certification in Snowflake or Databricks is highly recommended.
- Proficient in data modelling and ELT techniques.
- Experienced with ETL/data pipeline orchestration tools such as Azure Data Factory, AWS Glue, Azure Synapse, and Airflow.
- Experience ingesting data from different sources such as RDBMSs, ERP systems, and APIs.
- Knowledge of modern data transformation tools, particularly dbt, for streamlined and automated data pipelines (nice to have).
- Experience implementing DevOps practices for CI/CD to ensure robust and scalable data solutions (nice to have).
- Proficient in maintaining code versioning and collaborating effectively within a version-controlled environment.
- Ability to work effectively as an individual contributor and in small teams; experience mentoring junior team members.
- Excellent problem-solving and troubleshooting ability, with experience supporting and working with cross-functional teams in a dynamic environment.
- Strong verbal and written communication skills, with the ability to articulate results and issues to internal and client teams.
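To illustrate the extract-transform-load pattern such pipelines follow, here is a minimal self-contained sketch; the source and warehouse tables are hypothetical, SQLite stands in for a real RDBMS, and a production pipeline would run under an orchestrator like Airflow or Azure Data Factory:

```python
import sqlite3

# Hypothetical source system (e.g., an ERP export) -- illustrative only.
src = sqlite3.connect(":memory:")
src.executescript("""
CREATE TABLE orders (id INTEGER, region TEXT, amount REAL);
INSERT INTO orders VALUES (1,'north',10.0),(2,'north',20.0),(3,'south',5.0);
""")

# Hypothetical warehouse target.
dwh = sqlite3.connect(":memory:")
dwh.execute("CREATE TABLE sales_by_region (region TEXT, total REAL)")

# Extract: pull raw rows from the source.
rows = src.execute("SELECT region, amount FROM orders").fetchall()

# Transform: aggregate in Python (an ELT variant would push this down as SQL).
totals = {}
for region, amount in rows:
    totals[region] = totals.get(region, 0.0) + amount

# Load: write the aggregated result into the warehouse table.
dwh.executemany("INSERT INTO sales_by_region VALUES (?, ?)", sorted(totals.items()))
result = dwh.execute("SELECT * FROM sales_by_region ORDER BY region").fetchall()
print(result)  # [('north', 30.0), ('south', 5.0)]
```

The same three stages appear in any of the tools named above; only the engine and orchestration differ.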
Posted 1 month ago
5.0 - 8.0 years
18 - 25 Lacs
Noida, Hyderabad, Chennai
Hybrid
Hiring an SQL SSIS Developer for Hyderabad/Chennai/Noida/Pune locations. Looking for immediate joiners (0-1 week).

Primary Skills: Advanced SQL, SSIS
Note: PF deduction is mandatory for all companies.

Role & Responsibilities
- 5 years of practical experience with SSIS, advanced SQL, T-SQL, the Microsoft SQL Server platform, and Transact-SQL stored procedures and triggers.
- Working knowledge of Azure Synapse and Snowflake.
- Experience executing SSIS packages via SQL Server Agent jobs.
- Experience with complex queries, including use of CTEs, table variables, MERGE, and dynamic SQL.
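As a small illustration of the CTE-style queries mentioned above, here is a sketch using Python's built-in SQLite driver; the table is hypothetical, and T-SQL-specific features such as table variables and MERGE are not shown:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE employees (id INTEGER, name TEXT, manager_id INTEGER);
INSERT INTO employees VALUES (1,'Asha',NULL),(2,'Ravi',1),(3,'Meena',2);
""")

# Recursive CTE: walk the reporting chain from the top-level manager down.
chain = con.execute("""
WITH RECURSIVE reports(id, name, depth) AS (
    SELECT id, name, 0 FROM employees WHERE manager_id IS NULL
    UNION ALL
    SELECT e.id, e.name, r.depth + 1
    FROM employees e JOIN reports r ON e.manager_id = r.id
)
SELECT name, depth FROM reports ORDER BY depth
""").fetchall()
print(chain)  # [('Asha', 0), ('Ravi', 1), ('Meena', 2)]
```

The equivalent T-SQL differs only in syntax details (no `RECURSIVE` keyword is needed in SQL Server's `WITH` clause).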
Posted 1 month ago
10.0 - 15.0 years
22 - 27 Lacs
Gurugram
Work from Office
Position Summary
Core objectives in this role: drive revenue growth at a key account, in partnership with onshore Client Partners and offshore Delivery Leaders, by focusing on delivery excellence (leading to increased customer satisfaction), strategic account management, and business development.

Work Experience
- 10+ years of relevant delivery experience in a large/midsize IT services/consulting/analytics company.
- 10 years of delivery management: business information management (major) and commercial ops (minor).
- 5+ years of account management/business development/solutioning experience in the IT services/consulting industry.
- Extensive pharma/life sciences industry experience: commercial and clinical datasets.

Tech Skills
- Azure ADF
- Informatica Intelligent Cloud Services (IICS)
- PowerShell scripting
- Tidal for scheduling
- Azure cloud services
- Advanced SQL
- Databricks (second priority)

Job Responsibilities

Program Management and Delivery Governance
- Coordinate with delivery teams to represent the 'voice of the client'.
- Serve as the glue between onshore-offshore and cross-LOB stakeholders.
- Anticipate and mitigate risks.
- Uncover new areas to add value to the client's business.

Business Development (activities to grow existing accounts)
- Account strategy and tactics.
- Proposal writing (storyline, slides, documents).
- Bid management; drive to uncover cross-sell and up-sell opportunities.

Process (activities to ensure smooth financial and project operations)
- Track and follow up on CRM opportunities and active projects.
- Prepare reports/decks: pipeline, revenue, profitability, invoicing.

Being successful in this role requires an attitude of ownership of all aspects of a business. It entails working collaboratively in a matrix organization with the full range of stakeholders: onshore client-facing partners; finance, legal, and marketing; and delivery teams both onshore and offshore.

Education
MBA or PG Diploma in Management

Behavioural Competencies
Project management; client engagement and relationship building; attention to P&L impact; capability building/thought leadership; customer focus.

Technical Competencies
Account management; Azure Data Factory; data governance; pharma data analytics; delivery management (BIM/cloud information management); Informatica; Azure SQL.
Posted 1 month ago
1.0 - 3.0 years
3 - 5 Lacs
Hyderabad
Work from Office
ABOUT THE ROLE

Role Description
We are seeking an MDM Associate Analyst with 2-5 years of development experience to support and enhance our enterprise MDM (Master Data Management) platforms using Informatica/Reltio. This role is critical in delivering high-quality master data solutions across the organization, utilizing modern tools like Databricks and AWS to drive insights and ensure data reliability. The ideal candidate will have strong SQL and data profiling skills and experience working with cross-functional teams in a pharma environment. To succeed in this role, the candidate must have strong MDM experience in configuration (L3 configuration, asset creation, data modeling, etc.), ETL and data mappings (CAI, CDI), data mastering (match/merge and survivorship rules), and source and target integrations (REST API, batch integration, integration with Databricks tables, etc.).

Roles & Responsibilities
- Analyze and manage customer master data using Reltio or Informatica MDM solutions.
- Perform advanced SQL queries and data analysis to validate and ensure master data integrity.
- Leverage Python, PySpark, and Databricks for scalable data processing and automation.
- Collaborate with business and data engineering teams for continuous improvement in MDM solutions.
- Implement data stewardship processes and workflows, including approval and DCR mechanisms.
- Utilize AWS cloud services for data storage and compute processes related to MDM.
- Contribute to metadata and data modeling activities.
- Track and manage data issues using tools such as JIRA, and document processes in Confluence.
- Apply Life Sciences/Pharma industry context to ensure data standards and compliance.

Basic Qualifications and Experience
- Master's degree with 1-3 years of experience in Business, Engineering, IT, or a related field; OR
- Bachelor's degree with 2-5 years of experience in Business, Engineering, IT, or a related field; OR
- Diploma with 6-8 years of experience in Business, Engineering, IT, or a related field.

Functional Skills

Must-Have Skills
- Strong experience with Informatica or Reltio MDM platforms, building configurations from scratch (L3 configuration, data modeling, asset creation, setting up API integrations, orchestration).
- Strong experience in building data mappings and data profiling, and in creating and implementing business rules for data quality and data transformation.
- Strong experience implementing match and merge rules and survivorship of golden records.
- Expertise in integrating master data records with downstream systems.
- Very good understanding of DWH basics and good knowledge of data modeling.
- Experience with IDQ, data modeling, and approval workflow/DCR.
- Advanced SQL expertise and data wrangling.
- Exposure to Python and PySpark for data transformation workflows.
- Knowledge of MDM, data governance, stewardship, and profiling practices.

Good-to-Have Skills
- Familiarity with Databricks and AWS architecture.
- Background in Life Sciences/Pharma industries.
- Familiarity with project tools like JIRA and Confluence.
- Basics of data engineering concepts.

Professional Certifications
- Any ETL certification (e.g., Informatica).
- Any data analysis certification (SQL, Python, Databricks).
- Any cloud certification (AWS or Azure).

Soft Skills
- Strong analytical abilities to assess and improve master data processes and solutions.
- Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders.
- Effective problem-solving skills to address data-related issues and implement scalable solutions.
- Ability to work effectively with global, virtual teams.
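As a rough illustration of the match/merge and survivorship logic mentioned above, here is a small Python sketch; the match rule, fields, and records are hypothetical, and real MDM platforms like Informatica or Reltio configure this declaratively rather than in code:

```python
from datetime import date

# Hypothetical source records for the same customer, from two systems.
records = [
    {"email": "a@x.com", "name": "A. Kumar", "phone": None,
     "updated": date(2024, 1, 5)},
    {"email": "a@x.com", "name": "Arun Kumar", "phone": "555-0101",
     "updated": date(2024, 3, 2)},
]

def merge(matched):
    """Survivorship rule: for each field, keep the most recent non-null value."""
    golden = {}
    for rec in sorted(matched, key=lambda r: r["updated"]):
        for field, value in rec.items():
            if value is not None:
                golden[field] = value
    return golden

# A match rule (here: exact email) groups the records; merge then builds
# the golden record that downstream systems consume.
golden = merge(records)
print(golden["name"], golden["phone"])  # Arun Kumar 555-0101
```

Note how the older record still contributes a field (the phone would survive even if only the older record had one) because survivorship is decided per field, not per record.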
Posted 1 month ago
6.0 - 11.0 years
15 - 30 Lacs
Noida, Pune, Bengaluru
Hybrid
We are looking for a Snowflake Developer with deep expertise in Snowflake and dbt or SQL to help us build and scale our modern data platform.

Key Responsibilities
- Design and build scalable ELT pipelines in Snowflake using dbt/SQL.
- Develop efficient, well-tested dbt models (staging, intermediate, and marts layers).
- Implement data quality, testing, and monitoring frameworks to ensure data reliability and accuracy.
- Optimize Snowflake queries, storage, and compute resources for performance and cost-efficiency.
- Collaborate with cross-functional teams to gather data requirements and deliver data solutions.

Required Qualifications
- 5+ years of experience as a Data Engineer, with at least 4 years working with Snowflake.
- Proficiency with dbt (Data Build Tool), including Jinja templating, macros, and model dependency management.
- Strong understanding of ELT patterns and modern data stack principles.
- Advanced SQL skills and experience with performance tuning in Snowflake.

Interested candidates, share your CV at himani.girnar@alikethoughts.com with the following details:
- Candidate's name
- Email and alternate email ID
- Contact and alternate contact number
- Total experience
- Relevant experience
- Current organization
- Notice period
- CCTC
- ECTC
- Current location
- Preferred location
- PAN card number
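The staging-to-marts layering described above can be sketched with plain SQL views; dbt would generate something similar from its model files. The names here are hypothetical, and SQLite (via Python's standard library) stands in for Snowflake:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE raw_orders (ID INTEGER, AMT TEXT, STATUS TEXT);
INSERT INTO raw_orders VALUES (1,'10.5','done'),(2,'4.0','cancelled'),(3,'7.5','done');

-- Staging layer: rename and cast raw columns (dbt's stg_ models).
CREATE VIEW stg_orders AS
SELECT ID AS order_id, CAST(AMT AS REAL) AS amount, STATUS AS status
FROM raw_orders;

-- Mart layer: business-level aggregate built on staging (dbt's marts).
CREATE VIEW fct_completed_revenue AS
SELECT COUNT(*) AS orders, SUM(amount) AS revenue
FROM stg_orders WHERE status = 'done';
""")

row = con.execute("SELECT orders, revenue FROM fct_completed_revenue").fetchone()
print(row)  # (2, 18.0)
```

Keeping casts and renames in the staging layer means every downstream mart sees clean, consistently typed columns, which is the main point of the layering.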
Posted 1 month ago
6.0 - 11.0 years
8 - 13 Lacs
Gurugram
Work from Office
About the Role
Grade Level (for internal use): 10

Position Summary
Our proprietary software-as-a-service helps automotive dealerships and sales teams better understand and predict exactly which customers are ready to buy, the reasons why, and the key offers and incentives most likely to close the sale. Its micro-marketing engine then delivers the right message at the right time to those customers, ensuring higher conversion rates and a stronger ROI.

What You'll Do
You will be part of our Data Platform & Product Insights data engineering team. As part of this agile team, you will work in our cloud-native environment to:
- Build and support data ingestion and processing pipelines in the cloud. This entails extraction, load, and transformation of big data from a wide variety of sources, both batch and streaming, using the latest data frameworks and technologies.
- Partner with the product team to assemble large, complex data sets that meet functional and non-functional business requirements; ensure build-out of data dictionaries/data catalogues and detailed documentation and knowledge around these data assets, metrics, and KPIs.
- Warehouse this data; build data marts, data aggregations, metrics, KPIs, and business logic that lead to actionable insights into our product efficacy, marketing platform, customer behaviour, retention, etc.
- Build real-time monitoring dashboards and alerting systems.
- Coach and mentor other team members.

Who You Are
- 6+ years of experience in big data and data engineering.
- Strong knowledge of advanced SQL, data warehousing concepts, and data mart design.
- Strong programming skills in SQL, Python/PySpark, etc.
- Experience in design and development of data pipelines and ETL/ELT processes, on-premises and in the cloud.
- Experience with one of the cloud providers: GCP, Azure, AWS.
- Experience with relational SQL and NoSQL databases, including Postgres and MongoDB.
- Experience with workflow management tools: Airflow, AWS Data Pipeline, Google Cloud Composer, etc.
- Experience with distributed version control environments such as Git and Azure DevOps.
- Experience building Docker images, fetching/promoting and deploying to production; integrating Docker container orchestration using Kubernetes by creating pods, config maps, and deployments with Terraform.
- Ability to convert business queries into technical documentation.
- Strong problem-solving and communication skills.
- Bachelor's or an advanced degree in Computer Science or a related engineering discipline.

Good to Have
- Exposure to Business Intelligence (BI) tools like Tableau, Dundas, Power BI, etc.
- Agile software development methodologies.
- Working in multi-functional, multi-location teams.

Grade: 10
Location: Gurugram
Hybrid Model: twice a week work from office
Shift Time: 12 pm to 9 pm IST

What You'll Love About Us - Do Ask Us About These!
- Total Rewards: monetary, beneficial, and developmental rewards.
- Work-Life Balance: you can't do a good job if your job is all you do!
- Prepare for the Future: Academy - we are all learners; we are all teachers!
- Employee Assistance Program: confidential and professional counselling and consulting.
- Diversity & Inclusion: HeForShe!
- Internal Mobility: grow with us!

About automotiveMastermind

Who we are: Founded in 2012, automotiveMastermind is a leading provider of predictive analytics and marketing automation solutions for the automotive industry and believes that technology can transform data, revealing key customer insights to accurately predict automotive sales. Through its proprietary automated sales and marketing platform, Mastermind, the company empowers dealers to close more deals by predicting future buyers and consistently marketing to them. automotiveMastermind is headquartered in New York City. For more information, visit automotivemastermind.com.

At automotiveMastermind, we thrive on high energy at high speed. We're an organization in hyper-growth mode and have a fast-paced culture to match. Our highly engaged teams feel passionately about both our product and our people. This passion is what continues to motivate and challenge our teams to be best-in-class. Our cultural values of Drive and Help have been at the core of what we do and how we have built our culture through the years. This cultural framework inspires a passion for success while collaborating to win.

What we do: Through our proprietary automated sales and marketing platform, Mastermind, we empower dealers to close more deals by predicting future buyers and consistently marketing to them. In short, we help automotive dealerships generate success in their loyalty, service, and conquest portfolios through a combination of turnkey predictive analytics, proactive marketing, and dedicated consultative services.

What's In It For You

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People, Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you, and your career, need to thrive at S&P Global.
- Health & Wellness: health care coverage designed for the mind and body.
- Continuous Learning: access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: it's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: from retail discounts to referral incentive awards, small perks can make a big difference.

For more information on benefits by country, visit https://spgbenefits.com/benefit-summaries
Posted 1 month ago
8.0 - 12.0 years
20 - 22 Lacs
Chennai, Bengaluru
Work from Office
Experience working closely with other data scientists, data engineers, software engineers, data managers, and business partners. 7+ years in designing, planning, prototyping, productionizing, and maintaining systems, with knowledge of Python, Go, Java, and SQL.
Posted 1 month ago
4.0 - 7.0 years
4 - 9 Lacs
Pune
Hybrid
Role Overview
This hybrid role sits within the Distribution Data Stewardship Team and combines operational and technical responsibilities to ensure data accuracy, integrity, and process optimization across sales reporting functions.

Key Responsibilities
- Support sales reporting inquiries from sales staff at all levels.
- Reconcile omnibus activity with sales reporting systems.
- Analyze data flows to assess impact on commissions and reporting.
- Perform data audits and updates to ensure integrity.
- Lead process optimization and automation initiatives.
- Manage wholesaler commission processes, including adjustments and manual submissions.
- Oversee manual data integration from intermediaries.
- Execute territory alignment changes to meet business objectives.
- Contribute to team initiatives and other responsibilities as assigned.

Growth Opportunities
- Exposure to all facets of sales reporting and commission processes.
- Opportunities to develop project and relationship management skills.
- Potential to explore leadership or technical specialist roles within the firm.

Qualifications
- Bachelor's degree in Computer Engineering or a related field.
- 4-7 years of experience with Python programming and automation.
- Strong background in SQL and data analysis.
- Experience in relationship/customer management and leading teams.
- Experience working with Salesforce is a plus.

Required Skills
- Technical proficiency in Python and SQL.
- Strong communication skills and stakeholder engagement.
- High attention to data integrity and detail.
- Self-directed with excellent time management.
- Project coordination and documentation skills.
- Proficiency in MS Office, especially Excel.
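The reconciliation work described above can be illustrated with a small Python sketch comparing two hypothetical feeds; the trade identifiers and amounts are invented for illustration:

```python
# Hypothetical daily totals from an omnibus feed and the sales reporting system.
omnibus = {"TRADE-001": 1000.0, "TRADE-002": 250.0, "TRADE-003": 75.0}
reporting = {"TRADE-001": 1000.0, "TRADE-002": 240.0}

def reconcile(a, b):
    """Return trades whose amounts differ, plus trades missing from either side."""
    breaks = {}
    for trade in a.keys() | b.keys():
        left, right = a.get(trade), b.get(trade)
        if left != right:
            # Record both sides so an analyst can see the direction of the break.
            breaks[trade] = (left, right)
    return breaks

breaks = reconcile(omnibus, reporting)
print(sorted(breaks))  # ['TRADE-002', 'TRADE-003']
```

In practice the matched and broken items would feed an audit report; the key idea is comparing the union of keys so that records missing from either side surface as breaks rather than being silently skipped.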
Posted 1 month ago
8.0 - 10.0 years
30 - 35 Lacs
Pune, Chennai, Bengaluru
Work from Office
Role & Responsibilities: AWS Architect

Primary skills: AWS (Redshift, Glue, Lambda, ETL, and Aurora), advanced SQL and Python, PySpark
Note: Aurora database experience is a mandatory skill.
Experience: 8+ years
Notice period: immediate joiner
Location: any Brillio location (Bangalore preferred)

Job Description
- 8+ years of IT experience with deep expertise in S3, Redshift, Aurora, Glue, and Lambda services.
- At least one instance of proven experience developing a data platform end to end using AWS.
- Hands-on programming experience with DataFrames and Python, including unit testing of the Python and Glue code.
- Experience with orchestration mechanisms such as Airflow and Step Functions.
- Experience working with AWS Redshift is mandatory: writing stored procedures, understanding the Redshift Data API, and writing federated queries.
- Experience in Redshift performance tuning.
- Good communication, problem-solving, and stakeholder management skills.
Posted 1 month ago
2.0 - 5.0 years
7 - 11 Lacs
Pune
Work from Office
Provide expertise in analysis, requirements gathering, design, coordination, customization, testing and support of reports in the client's environment. Develop and maintain a strong working relationship with business and technical members of the team. Relentless focus on quality and continuous improvement. Perform root cause analysis of report issues. Development / evolutionary maintenance of the environment, performance, capability and availability. Assisting in defining technical requirements and developing solutions. Effective content and source-code management, troubleshooting and debugging. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Tableau Desktop Specialist; SQL – strong understanding of SQL for querying databases. Good to have: Python, Snowflake, statistics, ETL experience. Extensive knowledge of creating impactful visualizations using Tableau. Must have a thorough understanding of SQL and advanced SQL (joins and relationships). Must have experience working with different databases and how to blend and create relationships in Tableau. Must have extensive knowledge of creating custom SQL to pull desired data from databases. Troubleshooting capabilities to debug data controls. Preferred technical and professional experience: Capable of converting business requirements into a workable model. Good communication skills, willingness to learn new technologies, team player, self-motivated, positive attitude.
Posted 1 month ago
2.0 - 5.0 years
7 - 11 Lacs
Pune
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results. Expertise in designing and implementing scalable data warehouse solutions on Snowflake, including schema design, performance tuning, and query optimization. Strong experience in building data ingestion and transformation pipelines using Talend to process structured and unstructured data from various sources. Proficiency in integrating data from cloud platforms into Snowflake using Talend and native Snowflake capabilities. Hands-on experience with dimensional and relational data modelling techniques to support analytics and reporting requirements. Understanding of optimizing Snowflake workloads, including clustering keys, caching strategies, and query profiling.
Ability to implement robust data validation, cleansing, and governance frameworks within ETL processes. Proficiency in SQL and/or shell scripting for custom transformations and automation tasks. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Tableau Desktop Specialist; SQL – strong understanding of SQL for querying databases. Good to have: Python, Snowflake, statistics, ETL experience. Extensive knowledge of creating impactful visualizations using Tableau. Must have a thorough understanding of SQL and advanced SQL (joins and relationships). Preferred technical and professional experience: Must have experience working with different databases and how to blend and create relationships in Tableau. Must have extensive knowledge of creating custom SQL to pull desired data from databases. Troubleshooting capabilities to debug data controls. Capable of converting business requirements into a workable model. Good communication skills, willingness to learn new technologies, team player, self-motivated, positive attitude.
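The data validation and cleansing requirement mentioned above can be sketched in plain Python. The record layout and the three rules (unique id, non-empty email, non-negative amount) are invented for illustration and are not tied to any particular pipeline or tool.

```python
# Hypothetical staging records, as an ETL step might receive them.
records = [
    {"id": 1, "email": "a@example.com", "amount": 10.5},
    {"id": 2, "email": None,            "amount": 20.0},
    {"id": 1, "email": "c@example.com", "amount": -3.0},
]

def validate(rows):
    """Return (row_index, rule) pairs for every rule violation found."""
    errors = []
    seen = set()
    for i, r in enumerate(rows):
        if r["id"] in seen:
            errors.append((i, "duplicate id"))
        seen.add(r["id"])
        if not r["email"]:
            errors.append((i, "missing email"))
        if r["amount"] < 0:
            errors.append((i, "negative amount"))
    return errors

print(validate(records))
```

A real pipeline would typically push such checks into the warehouse or an orchestration tool, but the rule-per-record shape stays the same.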
Posted 1 month ago
1.0 - 5.0 years
9 - 13 Lacs
Mumbai, Gurugram, Bengaluru
Work from Office
We are seeking a talented and motivated Data Scientist with 1-3 years of experience to join our Data Science team. If you have a strong passion for data science, expertise in machine learning, and experience working with large-scale datasets, we want to hear from you. As a Data Scientist at RevX, you will play a crucial role in developing and implementing machine learning models to drive business impact. You will work closely with teams across data science, engineering, product, and campaign management to build predictive models, optimize algorithms, and deliver actionable insights. Your work will directly influence business strategy, product development, and campaign optimization. Major Responsibilities: Develop and implement machine learning models, particularly neural networks, decision trees, random forests, and XGBoost, to solve complex business problems. Work on deep learning models and other advanced techniques to enhance predictive accuracy and model performance. Analyze and interpret large, complex datasets using Python, SQL, and big data technologies to derive meaningful insights. Collaborate with cross-functional teams to design, build, and deploy end-to-end data science solutions, including data pipelines and model deployment frameworks. Utilize advanced statistical techniques and machine learning methodologies to optimize business strategies and outcomes. Evaluate and improve model performance, calibration, and deployment strategies for real-time applications. Perform clustering, segmentation, and other unsupervised learning techniques to discover patterns in large datasets. Conduct A/B testing and other experimental designs to validate model performance and business strategies. Create and maintain data visualizations and dashboards using tools such as matplotlib, seaborn, Grafana, and Looker to communicate findings. Provide technical expertise in handling big data, data warehousing, and cloud-based platforms like Google Cloud Platform (GCP). 
Required Experience/Skills: Bachelor's or Master's degree in Data Science, Computer Science, Statistics, Mathematics, or a related field. 1-3 years of experience in data science or machine learning roles. Strong proficiency in Python for machine learning, data analysis, and deep learning applications. Experience in developing, deploying, and monitoring machine learning models, particularly neural networks, and other advanced algorithms. Expertise in handling big data technologies, with experience in tools such as BigQuery and cloud platforms (GCP preferred). Advanced SQL skills for data querying and manipulation from large datasets. Experience in data visualization tools like matplotlib, seaborn, Grafana, and Looker. Strong understanding of A/B testing, statistical tests, experimental design, and methodologies. Experience in clustering, segmentation, and other unsupervised learning techniques. Strong problem-solving skills and the ability to work with complex datasets and machine learning pipelines. Excellent communication skills, with the ability to explain complex technical concepts to non-technical stakeholders. Preferred Skills: Experience with deep learning frameworks such as TensorFlow or PyTorch. Familiarity with data warehousing concepts and big data tools. Knowledge of MLOps practices, including model deployment, monitoring, and management. Experience with business intelligence tools and creating data-driven dashboards. Understanding of reinforcement learning, natural language processing (NLP), or other advanced AI techniques. Education: Bachelor of Engineering or similar degree from any reputed University.
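As a worked illustration of the A/B-testing skill listed above, here is a stdlib-only two-proportion z-test sketch. The conversion counts are made up, and in practice a library such as scipy or statsmodels would normally be used instead of hand-rolling the test.

```python
from math import sqrt, erf

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided pooled z-test for a difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF via erf.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Variant A: 120/1000 conversions; variant B: 150/1000 (invented numbers).
z, p = two_proportion_ztest(120, 1000, 150, 1000)
print(f"z = {z:.3f}, p = {p:.4f}")
```

Here the 3-point lift sits right at the conventional 5% significance boundary, which is exactly the kind of call an experiment framework has to surface to stakeholders.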
Posted 1 month ago
5.0 - 9.0 years
12 - 16 Lacs
Mumbai, Gurugram, Bengaluru
Work from Office
Research and Problem-Solving: Identify and frame business problems, conduct exploratory data analysis, and propose innovative data science solutions tailored to business needs. Leadership & Communication: Serve as a technical referent for the research team, driving high-impact, high-visibility initiatives. Effectively communicate complex scientific concepts to senior stakeholders, ensuring insights are actionable for both technical and non-technical audiences. Mentor and develop scientists within the team, fostering growth and technical excellence. Algorithm Development: Design, optimize, and implement advanced machine learning algorithms, including neural networks, ensemble models (XGBoost, random forests), and clustering techniques. End-to-End Project Ownership: Lead the development, deployment, and monitoring of machine learning models and data pipelines for large-scale applications. Model Optimization and Scalability: Focus on optimizing algorithms for performance and scalability, ensuring robust, well-calibrated models suitable for real-time environments. A/B Testing and Validation: Design and execute experiments, including A/B testing, to validate model effectiveness and business impact. Big Data Handling: Leverage tools like BigQuery, advanced SQL, and cloud platforms (e.g., GCP) to process and analyze large datasets. Collaboration and Mentorship: Work closely with engineering, product, and campaign management teams, while mentoring junior data scientists in best practices and advanced techniques. Data Visualization: Create impactful visualizations using tools like matplotlib, seaborn, Looker, and Grafana to communicate insights effectively to stakeholders. Required Experience/Skills: 5–8 years of hands-on experience in data science or machine learning roles. 2+ years leading data science projects in AdTech. Strong hands-on skills in Advanced Statistics, Machine Learning, and Deep Learning.
Demonstrated ability to implement and optimize neural networks and other advanced ML models. Proficiency in Python for developing machine learning models, with a strong grasp of TensorFlow or PyTorch. Expertise in handling large datasets using advanced SQL and big data tools like BigQuery. In-depth knowledge of MLOps pipelines, from data preprocessing to deployment and monitoring. Strong background in A/B testing, statistical analysis, and experimental design. Proven capability in clustering, segmentation, and unsupervised learning methods. Strong problem-solving and analytical skills with a focus on delivering business value. Education: A Master's in Data Science, Computer Science, Mathematics, Statistics, or a related field is preferred. A Bachelor's degree with exceptional experience will also be considered.
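The clustering and segmentation capability listed above can be sketched with a minimal, stdlib-only k-means implementation. This is a teaching sketch with invented 2-D points and a deterministic initialisation; real segmentation work would use a library such as scikit-learn with proper seeding and model selection.

```python
def kmeans(points, k, iters=20):
    """Plain k-means on 2-D points. Initialisation is deterministic
    (evenly spaced seed points) purely to keep the sketch reproducible."""
    step = max(1, len(points) // k)
    centroids = [points[i * step] for i in range(k)]
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        # Assign each point to its nearest centroid (squared distance).
        for p in points:
            j = min(range(k),
                    key=lambda c: (p[0] - centroids[c][0]) ** 2
                                + (p[1] - centroids[c][1]) ** 2)
            clusters[j].append(p)
        # Recompute each centroid as the mean of its assigned points.
        centroids = [
            (sum(p[0] for p in cl) / len(cl), sum(p[1] for p in cl) / len(cl))
            if cl else centroids[j]
            for j, cl in enumerate(clusters)
        ]
    return centroids, clusters

# Two well-separated blobs: segmentation should recover one centroid per blob.
pts = [(0.1, 0.0), (0.0, 0.2), (0.2, 0.1), (5.0, 5.1), (5.2, 4.9), (4.9, 5.0)]
centroids, clusters = kmeans(pts, k=2)
print(sorted(centroids))
```

On well-separated data like this, a single pass already lands one centroid near each blob mean; the interesting engineering problems (choosing k, scaling features, handling millions of rows) begin where this sketch ends.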
Posted 1 month ago
7.0 - 12.0 years
15 - 20 Lacs
Hyderabad
Work from Office
Overview We are seeking a strategic and hands-on Manager of Business Intelligence (BI) and Data Governance to lead the development and execution of our enterprise-wide data strategy. This role will oversee data governance frameworks, manage modern BI platforms, and ensure the integrity, availability, and usability of business-critical data. Reporting into senior leadership, this role plays a pivotal part in shaping data-informed decision-making across functions including Finance, Revenue Operations, Product, and more. The ideal candidate is a technically proficient and people-oriented leader with a deep understanding of data governance, cloud data architecture, and SaaS KPIs. They will drive stakeholder engagement, enablement, and adoption of data tools and insights, with a focus on building scalable, trusted, and observable data systems. Responsibilities Data Governance Leadership: Establish and maintain a comprehensive data governance framework that includes data quality standards, ownership models, data stewardship processes, and compliance alignment with regulations such as GDPR and SOC 2. Enterprise Data Architecture: Oversee data orchestration across Salesforce (SFDC), cloud-based data warehouses (e.g., Databricks, Snowflake, or equivalent), and internal systems. Collaborate with the data engineering team on the development and optimization of ETL pipelines to ensure data reliability and performance at scale. Team Management & Enablement: Lead and mentor a team of BI analysts and governance specialists. Foster a culture of collaboration, continuous learning, and stakeholder enablement to increase data adoption across the organization. BI Strategy & Tools Management: Own the BI toolset (with a strong emphasis on Tableau), and define standards for scalable dashboard design, self-service reporting, and analytics enablement. Evaluate and incorporate additional platforms (e.g., Power BI, Looker) as needed.
Stakeholder Engagement & Strategic Alignment: Partner with leaders in Finance, RevOps, Product, and other departments to align reporting and data strategy with business objectives. Translate business needs into scalable reporting solutions and drive enterprise-wide adoption through clear communication and training. Data Quality & Observability: Implement data quality monitoring, lineage tracking, and observability tools to proactively detect issues and ensure data reliability and trustworthiness. Documentation & Transparency: Create and maintain robust documentation for data processes, pipeline architecture, code repositories (via GitHub), and business definitions to support transparency and auditability for technical and non-technical users. Executive-Level Reporting & Insight: Design and maintain strategic dashboards that surface key SaaS performance indicators to senior leadership and the board. Deliver actionable insights to support company-wide strategic decisions. Continuous Improvement & Innovation: Stay current with trends in data governance, BI technologies, and AI. Proactively recommend and implement enhancements to tools, processes, and governance maturity. Qualifications Data Governance Expertise: Proven experience implementing data governance frameworks, compliance standards, and ownership models across cross-functional teams. SQL Expertise: Advanced SQL skills with a strong background in ETL/data pipeline development across systems like Salesforce and enterprise data warehouses. BI Tools Mastery: Expertise in Tableau for developing reports and dashboards. Experience driving adoption of BI best practices across a diverse user base. Salesforce Data Proficiency: Deep understanding of SFDC data structure, reporting, and integration with downstream systems. Version Control & Documentation: Hands-on experience with GitHub and best practices in code versioning and documentation of data pipelines. 
Leadership & Stakeholder Communication: 3+ years of people management experience with a track record of team development and stakeholder engagement. Analytics Experience: 8+ years of experience in analytics roles, working with large datasets to derive insights and support executive-level decision-making. Programming Knowledge: Proficiency in Python for automation, data manipulation, and integration tasks. SaaS Environment Acumen: Deep understanding of SaaS metrics, business models, and executive reporting needs. Cross-functional Collaboration: Demonstrated success in partnering with teams like Finance, Product, and RevOps to meet enterprise reporting and insight goals.
Posted 1 month ago
8.0 - 13.0 years
3 - 6 Lacs
Bengaluru
Work from Office
Location: Bengaluru. We are seeking a highly skilled and motivated Data Engineer to join our dynamic team. The ideal candidate will have extensive experience in data engineering, with a strong focus on Databricks, Python, and SQL. As a Data Engineer, you will play a crucial role in designing, developing, and maintaining our data infrastructure to support various business needs. Key Responsibilities: Develop and implement efficient data pipelines and ETL processes to migrate and manage client, investment, and accounting data in Databricks. Work closely with the investment management team to understand data structures and business requirements, ensuring data accuracy and quality. Monitor and troubleshoot data pipelines, ensuring high availability and reliability of data systems. Optimize database performance by designing scalable and cost-effective solutions. What's on offer: Competitive salary and benefits package. Opportunities for professional growth and development. A collaborative and inclusive work environment. The chance to work on impactful projects with a talented team. Candidate Profile – Experience: 8+ years of experience in data engineering or a similar role. Proficiency in Apache Spark and Databricks, including schema design, data partitioning, and query optimization. Exposure to Azure. Exposure to streaming technologies (e.g., Autoloader, DLT streaming). Advanced SQL, data modeling skills and data warehousing concepts tailored to investment management data (e.g., transaction, accounting, portfolio and reference data). Experience with ETL/ELT tools like SnapLogic and programming languages (e.g., Python, Scala, R). Familiarity with workload automation and job scheduling tools such as Control-M. Familiar with data governance frameworks and security protocols. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills. Education: Bachelor's degree in computer science, IT, or a related discipline.
Posted 1 month ago
3.0 - 6.0 years
6 - 11 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Job Title: Oracle Fusion Techno-Functional Consultant Location: Hyderabad/Bangalore/Pune Job Type: Full-time Experience Level: 8+ Years Domain: ERP – Oracle Fusion Cloud Must-Have Skills: - 8+ years of Oracle ERP experience, with a minimum of 3 years in Oracle Fusion Cloud - Strong knowledge and hands-on experience in Fusion Manufacturing or SCM, Fusion Cost Management, and Fusion EAM or Finance Modules - Proficiency in Advanced SQL / PL-SQL and BI Tools (OTBI, BI Publisher, FAW) - Familiarity with Fusion data models and REST/SOAP Web Services - Ability to troubleshoot complex functional and technical issues - Excellent communication and problem-solving skills - Bachelor's degree in Engineering, Computer Science, or related field
Posted 1 month ago
5.0 - 10.0 years
2 - 6 Lacs
Gurugram
Work from Office
Skills – Primary Skills: Enhancements, new development, defect resolution, and production support of ETL development using AWS native services. Integration of data sets using AWS services such as Glue and Lambda functions. Utilization of AWS SNS to send emails and alerts. Authoring ETL processes using Python and PySpark. ETL process monitoring using CloudWatch events. Connecting to different data sources like S3 and validating data using Athena. Experience in CI/CD using GitHub Actions. Proficiency in Agile methodology. Extensive working experience with advanced SQL and a deep understanding of complex SQL. Competencies / Experience: Deep technical skills in AWS Glue (Crawler, Data Catalog) – 5 years. Hands-on experience with Python and PySpark – 3 years. PL/SQL experience – 3 years. CloudFormation and Terraform – 2 years. CI/CD with GitHub Actions – 1 year. Experience with BI systems (Power BI, Tableau) – 1 year. Good understanding of AWS services like S3, SNS, Secrets Manager, Athena, and Lambda – 2 years.
Posted 1 month ago
3.0 - 8.0 years
2 - 4 Lacs
Pune, Greater Noida
Work from Office
The Apex Group was established in Bermuda in 2003 and is now one of the world's largest fund administration and middle office solutions providers. Our business is unique in its ability to reach globally, service locally and provide cross-jurisdictional services. With our clients at the heart of everything we do, our hard-working team has successfully delivered on an unprecedented growth and transformation journey, and we are now represented by circa 13,000 employees across 112 offices worldwide. Your career with us should reflect your energy and passion. That's why, at Apex Group, we will do more than simply empower you. We will work to supercharge your unique skills and experience. Take the lead and we'll give you the support you need to be at the top of your game. And we offer you the freedom to be a positive disrupter and turn big ideas into bold, industry-changing realities. For our business, for clients, and for you. Role purpose: Part of a team of Sales Technology specialists, the role is fundamental to supporting and advancing the usage of our Sales Compensation solution. The role will involve configuration and support of Xactly Incent and Connect. Role Responsibilities: Develop and support Xactly Incent and Connect. Design, develop and test reports & dashboards. Trace unexpected results back to their source and diagnose underlying issues.
If unable to resolve directly, own coordination and resolution with appropriate resources. Coordinate with the Xactly Data Warehouse and ETL teams to implement commission data changes, and output data for consumption by other business teams. Document system configuration and payment administrative processes. Provide guidance to the business, building domain knowledge, gathering requirements, providing solutions and impact analysis. Remain current with Xactly products and modules through regular engagement with and training through Xactly resources. Be mindful of changes to the business that may impact the current solution - new products and business lines, acquisitions, reorganizations, system changes, etc. Work with SOX auditors in providing necessary changes and documentation. Perform ad-hoc reporting and analysis to provide business insight. Serve as an escalation resource for Tier 2 & 3 issues. Provide input and knowledge sharing with Technology teams. Drive technology, business and Xactly adoption best practices. Participate in scheduled and ad-hoc training in order to improve policy and process acumen. Perform other duties as assigned. Skills Required: Proven experience supporting sales compensation (commissions, bonuses). Advanced SQL and ETL skills. Experience with Salesforce. 3 years' experience with Xactly Incent and Connect (preferred). Experience with the implementation process of Xactly Incent and Connect. Strong verbal and written communication skills to interact with users, cross-functional colleagues and IT. Ability to accurately collect information in order to understand and assess the needs and situation. Strong attention to detail. Familiarity with GDPR and data security. Familiarity with reporting/data mining methodologies. Ability to prioritise workload and provide timely follow-up and resolution. Ability to work effectively in a fast-paced environment and handle multiple projects. Strong problem solving, troubleshooting and analytical skills. Strong verbal and
written communication skills to interact with team members. Qualifications: Bachelor's Degree or equivalent experience. Xactly Admin qualified (preferred). ITIL qualification is a plus. What you will get in return: A genuinely unique opportunity to be part of an expanding large global business. Working with a strong and dynamic team. Training and development opportunities. Exposure to all aspects of the business, cross-jurisdiction and to working with senior management directly.
Posted 1 month ago