1.0 - 5.0 years
9 - 13 Lacs
Mumbai, Gurugram, Bengaluru
Work from Office
We are seeking a talented and motivated Data Scientist with 1-3 years of experience to join our Data Science team. If you have a strong passion for data science, expertise in machine learning, and experience working with large-scale datasets, we want to hear from you. As a Data Scientist at RevX, you will play a crucial role in developing and implementing machine learning models to drive business impact. You will work closely with teams across data science, engineering, product, and campaign management to build predictive models, optimize algorithms, and deliver actionable insights. Your work will directly influence business strategy, product development, and campaign optimization.

Major Responsibilities:
- Develop and implement machine learning models, particularly neural networks, decision trees, random forests, and XGBoost, to solve complex business problems.
- Work on deep learning models and other advanced techniques to enhance predictive accuracy and model performance.
- Analyze and interpret large, complex datasets using Python, SQL, and big data technologies to derive meaningful insights.
- Collaborate with cross-functional teams to design, build, and deploy end-to-end data science solutions, including data pipelines and model deployment frameworks.
- Utilize advanced statistical techniques and machine learning methodologies to optimize business strategies and outcomes.
- Evaluate and improve model performance, calibration, and deployment strategies for real-time applications.
- Perform clustering, segmentation, and other unsupervised learning techniques to discover patterns in large datasets.
- Conduct A/B testing and other experimental designs to validate model performance and business strategies.
- Create and maintain data visualizations and dashboards using tools such as matplotlib, seaborn, Grafana, and Looker to communicate findings.
- Provide technical expertise in handling big data, data warehousing, and cloud-based platforms such as Google Cloud Platform (GCP).

Required Experience/Skills:
- Bachelor's or Master's degree in Data Science, Computer Science, Statistics, Mathematics, or a related field.
- 1-3 years of experience in data science or machine learning roles.
- Strong proficiency in Python for machine learning, data analysis, and deep learning applications.
- Experience in developing, deploying, and monitoring machine learning models, particularly neural networks and other advanced algorithms.
- Expertise in handling big data technologies, with experience in tools such as BigQuery and cloud platforms (GCP preferred).
- Advanced SQL skills for querying and manipulating data from large datasets.
- Experience with data visualization tools such as matplotlib, seaborn, Grafana, and Looker.
- Strong understanding of A/B testing, statistical tests, experimental design, and related methodologies.
- Experience in clustering, segmentation, and other unsupervised learning techniques.
- Strong problem-solving skills and the ability to work with complex datasets and machine learning pipelines.
- Excellent communication skills, with the ability to explain complex technical concepts to non-technical stakeholders.

Preferred Skills:
- Experience with deep learning frameworks such as TensorFlow or PyTorch.
- Familiarity with data warehousing concepts and big data tools.
- Knowledge of MLOps practices, including model deployment, monitoring, and management.
- Experience with business intelligence tools and creating data-driven dashboards.
- Understanding of reinforcement learning, natural language processing (NLP), or other advanced AI techniques.
Education: Bachelor of Engineering or similar degree from any reputed University.
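The responsibilities above centre on building and evaluating supervised models such as XGBoost and random forests in Python. As a rough, hedged illustration of that kind of work (synthetic data and placeholder hyperparameters, nothing specific to RevX), a minimal training-and-evaluation sketch might look like this:

```python
# Hedged sketch: training and evaluating a gradient-boosted classifier with
# XGBoost on synthetic data. Dataset, features, and hyperparameters are
# placeholders, not details from the posting.
from sklearn.datasets import make_classification
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Synthetic stand-in for a large tabular dataset.
X, y = make_classification(n_samples=10_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = XGBClassifier(
    n_estimators=300,
    max_depth=6,
    learning_rate=0.05,
    eval_metric="auc",
)
model.fit(X_train, y_train)

# Evaluate with ROC AUC, a common metric for imbalanced business problems.
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Test AUC: {auc:.3f}")
```

In practice the features, target, and evaluation metric would come from the specific business problem, and calibration and monitoring would follow before deployment.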
Posted 2 weeks ago
5.0 - 9.0 years
12 - 16 Lacs
Mumbai, Gurugram, Bengaluru
Work from Office
Research and Problem-Solving: Identify and frame business problems, conduct exploratory data analysis, and propose innovative data science solutions tailored to business needs.
Leadership & Communication: Serve as a technical referent for the research team, driving high-impact, high-visibility initiatives. Effectively communicate complex scientific concepts to senior stakeholders, ensuring insights are actionable for both technical and non-technical audiences. Mentor and develop scientists within the team, fostering growth and technical excellence.
Algorithm Development: Design, optimize, and implement advanced machine learning algorithms, including neural networks, ensemble models (XGBoost, random forests), and clustering techniques.
End-to-End Project Ownership: Lead the development, deployment, and monitoring of machine learning models and data pipelines for large-scale applications.
Model Optimization and Scalability: Focus on optimizing algorithms for performance and scalability, ensuring robust, well-calibrated models suitable for real-time environments.
A/B Testing and Validation: Design and execute experiments, including A/B testing, to validate model effectiveness and business impact.
Big Data Handling: Leverage tools like BigQuery, advanced SQL, and cloud platforms (e.g., GCP) to process and analyze large datasets.
Collaboration and Mentorship: Work closely with engineering, product, and campaign management teams, while mentoring junior data scientists in best practices and advanced techniques.
Data Visualization: Create impactful visualizations using tools like matplotlib, seaborn, Looker, and Grafana to communicate insights effectively to stakeholders.

Required Experience/Skills:
- 5-8 years of hands-on experience in data science or machine learning roles.
- 2+ years leading data science projects in AdTech.
- Strong hands-on skills in advanced statistics, machine learning, and deep learning.
- Demonstrated ability to implement and optimize neural networks and other advanced ML models.
- Proficiency in Python for developing machine learning models, with a strong grasp of TensorFlow or PyTorch.
- Expertise in handling large datasets using advanced SQL and big data tools like BigQuery.
- In-depth knowledge of MLOps pipelines, from data preprocessing to deployment and monitoring.
- Strong background in A/B testing, statistical analysis, and experimental design.
- Proven capability in clustering, segmentation, and unsupervised learning methods.
- Strong problem-solving and analytical skills with a focus on delivering business value.

Education: A Master's degree in Data Science, Computer Science, Mathematics, Statistics, or a related field is preferred. A Bachelor's degree with exceptional experience will also be considered.
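The posting calls for designing and validating A/B tests. As a hedged illustration of the simplest such validation, a two-proportion z-test on conversion counts is sketched below; the counts and significance threshold are invented for illustration only:

```python
# Hedged sketch: validating a campaign or model change with a two-proportion
# z-test, the most basic form of A/B testing. All numbers are hypothetical.
from statsmodels.stats.proportion import proportions_ztest

conversions = [310, 355]        # successes in control and variant (hypothetical)
impressions = [10_000, 10_000]  # observations per arm (hypothetical)

z_stat, p_value = proportions_ztest(count=conversions, nobs=impressions)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")

# A small p-value (e.g. < 0.05) would suggest the variant's conversion rate
# differs from control; in practice effect size, power, and multiple-testing
# corrections would also be checked before acting on the result.
```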
Posted 2 weeks ago
7.0 - 12.0 years
15 - 20 Lacs
Hyderabad
Work from Office
Overview
We are seeking a strategic and hands-on Manager of Business Intelligence (BI) and Data Governance to lead the development and execution of our enterprise-wide data strategy. This role will oversee data governance frameworks, manage modern BI platforms, and ensure the integrity, availability, and usability of business-critical data. Reporting into senior leadership, this role plays a pivotal part in shaping data-informed decision-making across functions including Finance, Revenue Operations, Product, and more. The ideal candidate is a technically proficient and people-oriented leader with a deep understanding of data governance, cloud data architecture, and SaaS KPIs. They will drive stakeholder engagement, enablement, and adoption of data tools and insights, with a focus on building scalable, trusted, and observable data systems.

Responsibilities
- Data Governance Leadership: Establish and maintain a comprehensive data governance framework that includes data quality standards, ownership models, data stewardship processes, and compliance alignment with regulations such as GDPR and SOC 2.
- Enterprise Data Architecture: Oversee data orchestration across Salesforce (SFDC), cloud-based data warehouses (e.g., Databricks, Snowflake, or equivalent), and internal systems. Collaborate with the data engineering team on the development and optimization of ETL pipelines to ensure data reliability and performance at scale.
- Team Management & Enablement: Lead and mentor a team of BI analysts and governance specialists. Foster a culture of collaboration, continuous learning, and stakeholder enablement to increase data adoption across the organization.
- BI Strategy & Tools Management: Own the BI toolset (with a strong emphasis on Tableau), and define standards for scalable dashboard design, self-service reporting, and analytics enablement. Evaluate and incorporate additional platforms (e.g., Power BI, Looker) as needed.
- Stakeholder Engagement & Strategic Alignment: Partner with leaders in Finance, RevOps, Product, and other departments to align reporting and data strategy with business objectives. Translate business needs into scalable reporting solutions and drive enterprise-wide adoption through clear communication and training.
- Data Quality & Observability: Implement data quality monitoring, lineage tracking, and observability tools to proactively detect issues and ensure data reliability and trustworthiness.
- Documentation & Transparency: Create and maintain robust documentation for data processes, pipeline architecture, code repositories (via GitHub), and business definitions to support transparency and auditability for technical and non-technical users.
- Executive-Level Reporting & Insight: Design and maintain strategic dashboards that surface key SaaS performance indicators to senior leadership and the board. Deliver actionable insights to support company-wide strategic decisions.
- Continuous Improvement & Innovation: Stay current with trends in data governance, BI technologies, and AI. Proactively recommend and implement enhancements to tools, processes, and governance maturity.

Qualifications
- Data Governance Expertise: Proven experience implementing data governance frameworks, compliance standards, and ownership models across cross-functional teams.
- SQL Expertise: Advanced SQL skills with a strong background in ETL/data pipeline development across systems like Salesforce and enterprise data warehouses.
- BI Tools Mastery: Expertise in Tableau for developing reports and dashboards, with experience driving adoption of BI best practices across a diverse user base.
- Salesforce Data Proficiency: Deep understanding of SFDC data structure, reporting, and integration with downstream systems.
- Version Control & Documentation: Hands-on experience with GitHub and best practices in code versioning and documentation of data pipelines.
- Leadership & Stakeholder Communication: 3+ years of people management experience with a track record of team development and stakeholder engagement.
- Analytics Experience: 8+ years of experience in analytics roles, working with large datasets to derive insights and support executive-level decision-making.
- Programming Knowledge: Proficiency in Python for automation, data manipulation, and integration tasks.
- SaaS Environment Acumen: Deep understanding of SaaS metrics, business models, and executive reporting needs.
- Cross-functional Collaboration: Demonstrated success in partnering with teams like Finance, Product, and RevOps to meet enterprise reporting and insight goals.
Posted 2 weeks ago
8.0 - 13.0 years
3 - 6 Lacs
Bengaluru
Work from Office
Location: Bengaluru
We are seeking a highly skilled and motivated Data Engineer to join our dynamic team. The ideal candidate will have extensive experience in data engineering, with a strong focus on Databricks, Python, and SQL. As a Data Engineer, you will play a crucial role in designing, developing, and maintaining our data infrastructure to support various business needs.

Key Responsibilities
- Develop and implement efficient data pipelines and ETL processes to migrate and manage client, investment, and accounting data in Databricks.
- Work closely with the investment management team to understand data structures and business requirements, ensuring data accuracy and quality.
- Monitor and troubleshoot data pipelines, ensuring high availability and reliability of data systems.
- Optimize database performance by designing scalable and cost-effective solutions.

What's on offer
- Competitive salary and benefits package.
- Opportunities for professional growth and development.
- A collaborative and inclusive work environment.
- The chance to work on impactful projects with a talented team.

Candidate Profile
- Experience: 8+ years of experience in data engineering or a similar role.
- Proficiency in Apache Spark and the Databricks platform, including schema design, data partitioning, and query optimization.
- Exposure to Azure.
- Exposure to streaming technologies (e.g., Auto Loader, DLT streaming).
- Advanced SQL, data modeling skills, and data warehousing concepts tailored to investment management data (e.g., transaction, accounting, portfolio, and reference data).
- Experience with ETL/ELT tools such as SnapLogic and programming languages (e.g., Python, Scala, R).
- Familiarity with workload automation and job scheduling tools such as Control-M.
- Familiarity with data governance frameworks and security protocols.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.

Education: Bachelor's degree in computer science, IT, or a related discipline.
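Since the role revolves around building batch ETL pipelines for investment data in Databricks, here is a hedged PySpark sketch of such a pipeline; all paths, column names, and table names are hypothetical placeholders rather than details from the posting:

```python
# Hedged sketch of a simple batch pipeline on Databricks: read raw files,
# apply basic cleansing, and write a Delta table. Paths and table names
# are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("investment-data-pipeline").getOrCreate()

raw = (
    spark.read.format("csv")
    .option("header", "true")
    .option("inferSchema", "true")
    .load("/mnt/raw/transactions/")          # hypothetical landing path
)

cleaned = (
    raw.dropDuplicates(["transaction_id"])   # hypothetical key column
       .withColumn("trade_date", F.to_date("trade_date"))
       .filter(F.col("amount").isNotNull())
)

(
    cleaned.write.format("delta")
    .mode("overwrite")
    .saveAsTable("curated.transactions")     # hypothetical target table
)
```

A production version would typically add schema enforcement, incremental loads, and monitoring hooks rather than a full overwrite.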
Posted 2 weeks ago
3.0 - 6.0 years
6 - 11 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Job Title: Oracle Fusion Techno-Functional Consultant
Location: Hyderabad/Bangalore/Pune
Job Type: Full-time
Experience Level: 8+ Years
Domain: ERP – Oracle Fusion Cloud

Must-Have Skills:
- 8+ years of Oracle ERP experience, with a minimum of 3 years in Oracle Fusion Cloud
- Strong knowledge and hands-on experience in Fusion Manufacturing or SCM, Fusion Cost Management, and Fusion EAM or Finance modules
- Proficiency in advanced SQL / PL/SQL and BI tools (OTBI, BI Publisher, FAW)
- Familiarity with Fusion data models and REST/SOAP web services
- Ability to troubleshoot complex functional and technical issues
- Excellent communication and problem-solving skills
- Bachelor's degree in Engineering, Computer Science, or related field
Posted 2 weeks ago
2.0 - 5.0 years
7 - 11 Lacs
Pune
Work from Office
Provide expertise in analysis, requirements gathering, design, coordination, customization, testing, and support of reports in the client's environment. Develop and maintain a strong working relationship with business and technical members of the team. Maintain a relentless focus on quality and continuous improvement. Perform root cause analysis of report issues. Handle development and evolutionary maintenance of the environment, its performance, capability, and availability. Assist in defining technical requirements and developing solutions. Ensure effective content and source-code management, troubleshooting, and debugging.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- Tableau Desktop Specialist; strong understanding of SQL for querying databases.
- Good to have: Python, Snowflake, statistics, ETL experience.
- Extensive knowledge of creating impactful visualizations using Tableau.
- Thorough understanding of SQL and advanced SQL (joins and relationships).
- Experience working with different databases and blending data and creating relationships in Tableau.
- Extensive knowledge of writing custom SQL to pull desired data from databases.
- Troubleshooting capabilities to debug data controls.

Preferred technical and professional experience
- Capable of converting business requirements into a workable model.
- Good communication skills, willingness to learn new technologies, team player, self-motivated, positive attitude.
Posted 2 weeks ago
5.0 - 10.0 years
2 - 6 Lacs
Gurugram
Work from Office
Skills:

Primary Skills:
- Enhancements, new development, defect resolution, and production support of ETL development using AWS native services.
- Integration of data sets using AWS services such as Glue and Lambda functions.
- Use of AWS SNS to send emails and alerts.
- Authoring ETL processes using Python and PySpark.
- ETL process monitoring using CloudWatch events.
- Connecting to different data sources such as S3 and validating data using Athena.
- Experience in CI/CD using GitHub Actions.
- Proficiency in Agile methodology.
- Extensive working experience with advanced SQL and complex queries.

Competencies / Experience:
- Deep technical skills in AWS Glue (Crawler, Data Catalog) - 5 years.
- Hands-on experience with Python and PySpark - 3 years.
- PL/SQL experience - 3 years.
- CloudFormation and Terraform - 2 years.
- CI/CD with GitHub Actions - 1 year.
- Experience with BI systems (Power BI, Tableau) - 1 year.
- Good understanding of AWS services like S3, SNS, Secrets Manager, Athena, and Lambda - 2 years.
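The skills above describe Glue-based ETL authored in Python/PySpark. Below is a hedged sketch of a typical Glue job following the standard job boilerplate; the catalog database, table, and S3 bucket names are hypothetical:

```python
# Hedged sketch of an AWS Glue job: read a catalogued table, transform it
# with PySpark, and write curated Parquet output to S3. Names are placeholders.
import sys

from awsglue.context import GlueContext
from awsglue.dynamicframe import DynamicFrame
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a table registered by a Glue Crawler in the Data Catalog.
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="orders"      # hypothetical names
)

# Transform with plain PySpark, then convert back to a DynamicFrame.
df = dyf.toDF().filter(F.col("order_status") == "COMPLETED")
cleaned = DynamicFrame.fromDF(df, glue_context, "cleaned_orders")

# Write curated output to S3 as Parquet (bucket name is hypothetical).
glue_context.write_dynamic_frame.from_options(
    frame=cleaned,
    connection_type="s3",
    connection_options={"path": "s3://my-curated-bucket/orders/"},
    format="parquet",
)

job.commit()
```

Failure alerts via SNS and schedule triggers would normally be configured on the job itself (or through CloudFormation/Terraform) rather than in the script.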
Posted 3 weeks ago
3.0 - 8.0 years
2 - 4 Lacs
Pune, Greater Noida
Work from Office
The Apex Group was established in Bermuda in 2003 and is now one of the world's largest fund administration and middle office solutions providers. Our business is unique in its ability to reach globally, service locally, and provide cross-jurisdictional services. With our clients at the heart of everything we do, our hard-working team has successfully delivered on an unprecedented growth and transformation journey, and we are now represented by circa 13,000 employees across 112 offices worldwide. Your career with us should reflect your energy and passion. That's why, at Apex Group, we will do more than simply empower you. We will work to supercharge your unique skills and experience. Take the lead and we'll give you the support you need to be at the top of your game. And we offer you the freedom to be a positive disrupter and turn big ideas into bold, industry-changing realities. For our business, for clients, and for you.

Role purpose
Part of a team of Sales Technology specialists, the role is fundamental to supporting and advancing the usage of our Sales Compensation solution. The role will involve configuration and support of Xactly Incent and Connect.

Role Responsibilities:
- Develop and support Xactly Incent and Connect.
- Design, develop, and test reports and dashboards.
- Trace unexpected results back to their source and diagnose underlying issues. If unable to resolve directly, own coordination and resolution with appropriate resources.
- Coordinate with the Xactly Data Warehouse and ETL teams to implement commission data changes and output data for consumption by other business teams.
- Document system configuration and payment administration processes.
- Provide guidance to the business, building domain knowledge, gathering requirements, and providing solutions and impact analysis.
- Remain current with Xactly products and modules through regular engagement with and training through Xactly resources.
- Be mindful of changes to the business that may impact the current solution: new products and business lines, acquisitions, reorganizations, system changes, etc.
- Work with SOX auditors in providing necessary changes and documentation.
- Perform ad-hoc reporting and analysis to provide business insight.
- Serve as an escalation resource for Tier 2 and 3 issues.
- Provide input and knowledge sharing with Technology teams.
- Drive technology, business, and Xactly adoption best practices.
- Participate in scheduled and ad-hoc training in order to improve policy and process acumen.
- Perform other duties as assigned.

Skills Required:
- Proven experience supporting sales compensation (commissions, bonuses).
- Advanced SQL and ETL skills.
- Experience with Salesforce.
- 3 years' experience with Xactly Incent and Connect (preferred).
- Experience with the implementation process of Xactly Incent and Connect.
- Strong verbal and written communication skills to interact with users, team members, cross-functional colleagues, and IT.
- Ability to accurately collect information in order to understand and assess needs and situations.
- Strong attention to detail.
- Familiarity with GDPR and data security.
- Familiarity with reporting/data mining methodologies.
- Ability to prioritise workload and provide timely follow-up and resolution.
- Ability to work effectively in a fast-paced environment and handle multiple projects.
- Strong problem solving, troubleshooting, and analytical skills.

Qualifications:
- Bachelor's Degree or equivalent experience.
- Xactly Admin qualified (preferred).
- ITIL qualification is a plus.

What you will get in return:
- A genuinely unique opportunity to be part of an expanding large global business.
- Working with a strong and dynamic team.
- Training and development opportunities.
- Exposure to all aspects of the business, cross-jurisdiction, and to working with senior management directly.
Posted 3 weeks ago
6.0 - 11.0 years
15 - 30 Lacs
Noida, Pune, Bengaluru
Hybrid
We are looking for a Snowflake Data Engineer with deep expertise in Snowflake and DBT to help us build and scale our modern data platform.

Key Responsibilities:
- Design and build scalable ELT pipelines in Snowflake using DBT.
- Develop efficient, well-tested DBT models (staging, intermediate, and marts layers).
- Implement data quality, testing, and monitoring frameworks to ensure data reliability and accuracy.
- Optimize Snowflake queries, storage, and compute resources for performance and cost-efficiency.
- Collaborate with cross-functional teams to gather data requirements and deliver data solutions.

Required Qualifications:
- 5+ years of experience as a Data Engineer, with at least 4 years working with Snowflake.
- Proficient with DBT (Data Build Tool), including Jinja templating, macros, and model dependency management.
- Strong understanding of ELT patterns and modern data stack principles.
- Advanced SQL skills and experience with performance tuning in Snowflake.

Interested candidates, share your CV at himani.girnar@alikethoughts.com with the below details:
- Candidate's name
- Email and alternate email ID
- Contact and alternate contact no.
- Total exp
- Relevant experience
- Current org
- Notice period
- CCTC
- ECTC
- Current location
- Preferred location
- Pancard no.
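DBT models themselves are written in SQL and Jinja, but since the stack here pairs Snowflake with Python tooling, a hedged sketch of prototyping the kind of staging-layer SQL a DBT model would contain, run from Python with the official Snowflake connector, is shown below; credentials and object names are placeholders:

```python
# Hedged sketch: running a staging-style Snowflake query from Python using
# snowflake-connector-python. All credentials and object names are placeholders;
# in a real project this transformation would live in a dbt model.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",       # hypothetical
    user="my_user",             # hypothetical
    password="***",             # use a secrets manager in practice
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    cur.execute(
        """
        SELECT customer_id,
               DATE_TRUNC('month', order_date) AS order_month,
               SUM(amount)                     AS monthly_revenue
        FROM raw_orders
        GROUP BY 1, 2
        """
    )
    for row in cur.fetchmany(10):
        print(row)
finally:
    conn.close()
```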
Posted 3 weeks ago
4.0 - 8.0 years
10 - 20 Lacs
Kolkata, Gurugram, Bengaluru
Work from Office
Job Opportunity for GCP Data Engineer

Role: Data Engineer
Location: Gurugram/Bangalore/Kolkata (5 days work from office)
Experience: 4+ years

Key Skills:
- Data Analysis / Data Preparation - Expert
- Dataset Creation / Data Visualization - Expert
- Data Quality Management - Advanced
- Data Engineering - Advanced
- Programming / Scripting - Intermediate
- Data Storytelling - Intermediate
- Business Analysis / Requirements Analysis - Intermediate
- Data Dashboards - Foundation
- Business Intelligence Reporting - Foundation
- Database Systems - Foundation
- Agile Methodologies / Decision Support - Foundation

Technical Skills:
• Cloud - GCP - Expert
• Database systems (SQL and NoSQL / BigQuery / DBMS) - Expert
• Data warehousing solutions - Advanced
• ETL tools - Advanced
• Data APIs - Advanced
• Python, Java, Scala, etc. - Intermediate
• Basic understanding of distributed systems - Foundation
• Some knowledge of algorithms and optimal data structures for analytics - Foundation
• Soft skills and time management - Foundation
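Given that the role is GCP- and BigQuery-centric with intermediate Python, a hedged sketch of running an analytical query through the google-cloud-bigquery client follows; the project, dataset, and table names are hypothetical:

```python
# Hedged sketch: querying BigQuery with the official google-cloud-bigquery
# client and pulling the result into pandas. Project/dataset/table names are
# placeholders, not taken from the posting.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")   # hypothetical project

sql = """
    SELECT region,
           COUNT(DISTINCT user_id) AS active_users
    FROM `my-analytics-project.analytics.events`            -- hypothetical table
    WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY region
    ORDER BY active_users DESC
"""

# to_dataframe() requires pandas (and the db-dtypes package) to be installed.
df = client.query(sql).result().to_dataframe()
print(df.head())
```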
Posted 3 weeks ago
4.0 - 8.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Who Are We
At BCE Global Tech, immerse yourself in exciting projects that are shaping the future of both consumer and enterprise telecommunications. This involves building innovative mobile apps to enhance user experiences and enable seamless connectivity on the go. Thrive in diverse roles like Full Stack Developer, Backend Developer, UI/UX Designer, DevOps Engineer, Cloud Engineer, Data Science Engineer, and Scrum Master, at a workplace that encourages you to freely share your bold and different ideas. If you are passionate about technology and eager to make a difference, we want to hear from you! Apply now to join our dynamic team in Bengaluru.

The role demands strong leadership skills with a focus on risk management and the ability to deliver results-oriented outcomes. The candidate must understand business needs, maintain a customer-focused approach, and demonstrate the ability to adapt and work in a fast-paced environment, managing multiple projects and tolerating ambiguity effectively.

Key Responsibilities
- Experience in business intelligence related functions (testing, analysis, coding, etc.).
- Architect and provide guidance on building end-to-end systems optimized for speed and scale.
- Proficiency in programming languages including SAS, Oracle, and Teradata SQL.
- Strong knowledge of relational databases (MySQL, DB2, etc.).
- Autonomous and self-motivated, with an ability to prioritize.
- Experience working in Agile.

Technology Skills
- Advanced SQL skills, preferably Teradata.
- Working knowledge of SAS Viya, the new cloud product (good to have).
- Google Cloud experience, including BigQuery (GCP preferred; other clouds also considered).
- Working knowledge of MicroStrategy.
- Strong leadership skills with risk management.
- Results-oriented.
- Understanding of business needs/input, customer focus.
- Ability to adapt and work in a fast-paced environment, with good tolerance for ambiguity and the ability to manage multiple projects.

Required Qualifications To Be Successful In This Role
- Work with multiple teams to collect and document business requirements / user stories.
- Work within various database environments such as Teradata and SAS to design, develop, and deliver data solutions, ensuring accuracy and quality.
- Design, construct, modify, integrate, implement, and test data models and database management systems.
- Utilize advanced SQL to analyze data and perform data mining analysis.
- Effectively manage data mining analysis and report to various stakeholders.
- Establish and maintain effective relationships with cross-functional product teams, colleagues, and process teams, including in a virtual environment (i.e., conference calls, videoconference).

Additional Information
- Job Type: Full Time
- Work Profile: Hybrid (Work from Office / Remote)
- Years of Experience: 8-10 years
- Location: Bangalore

What We Offer
- Competitive salaries and comprehensive health benefits
- Flexible work hours and remote work options
- Professional development and training opportunities
- A supportive and inclusive work environment
Posted 3 weeks ago
3.0 - 5.0 years
0 - 3 Lacs
Ahmedabad
Work from Office
Analysis & Design: Analyse legal/business requirements and translate them into product features. Collaborate with regulatory experts and stakeholders to define scope and align with product strategy. Conduct feasibility studies and visualize solutions through models/diagrams.
Logic Development: Develop and optimize regulatory logic for seamless system integration. Maintain development environments and deliverables. Follow best practices and mentor new developers. Ensure timely, high-quality logic releases.
Documentation & Testing: Create internal/external documentation as per SOPs. Perform developer, functional, and regression testing. Identify and resolve defects to maintain product quality.
Customer Support: Provide technical and regulatory support to internal/external clients. Participate in meetings to ensure alignment with business needs.
Continuous Improvement: Identify and implement improvements in development processes. Automate development and QA steps where possible. Lead improvement initiatives and tool development.
Innovation & Collaboration: Stay updated with new technologies and methodologies. Engage with team members and stakeholders across departments. Participate in Agile ceremonies and foster team synergy.
Communication & Growth: Ensure transparent, effective communication. Provide feedback and mentoring. Pursue continuous learning and professional development.

Who You Are (Education and Qualification Requirements):
- Bachelor's degree or equivalent experience.
- Good to have: knowledge of SAP EHS.
- Experience in SDLC, programming, and database systems (Oracle, SQL Server, Access).
- Knowledge of SQL, data analysis, and database design.
- Familiarity with version control (Git), debugging, and software testing.
- Strong communication, analytical, and problem-solving skills.
- Experience with Agile/Scrum methodologies.
- Fluent in English (written and spoken).
- Strong attention to detail, quality, and multi-tasking ability.
Posted 3 weeks ago
1.0 - 3.0 years
5 - 7 Lacs
Bengaluru
Hybrid
Role & responsibilities As a Business Analyst in the HR team, you will play a crucial role in managing confidential HRIS data, analyzing compensation structures, and ensuring accurate incentive and variable pay calculations. You will be responsible for generating key HR reports on a cyclical basis, providing data-driven insights to improve workforce planning and decision-making. Handle and analyze confidential HRIS data with precision. Calculate incentives and variable pay based on defined parameters. Develop and maintain HR dashboards and cyclical reports to drive business decisions. Ensure data accuracy and integrity in HR-related financial calculations. Provide actionable insights through data visualization and analytics. Collaborate with stakeholders to improve workforce metrics and trends. If this opportunity aligns with your interests and experience, you may apply by sending your resume to surjish.suresh@livspace.com.
Posted 3 weeks ago
5.0 - 10.0 years
11 - 21 Lacs
Bhubaneswar, Pune, Bengaluru
Work from Office
About Client: Hiring for one of our multinational corporations!

Job Title: Python Developer
Qualification: Any Graduate or above
Relevant Experience: 5 to 12 years

Must-Have Skills:
1. Python
2. Pandas
3. NumPy
4. Flask/Django
5. Docker
6. SQL

Roles and Responsibilities:
1. Design, develop, test, and deploy Python-based applications.
2. Work with libraries such as Pandas and NumPy for data manipulation and analysis.
3. Develop RESTful APIs using Flask or Django.
4. Containerize applications using Docker for deployment in cloud or on-premise environments.
5. Interact with SQL databases for data storage, retrieval, and optimization.
6. Write clean, maintainable, and efficient code following best practices.
7. Collaborate with front-end developers, data engineers, and DevOps teams to deliver complete solutions.
8. Troubleshoot and resolve technical issues as they arise.

Location: Bangalore, Pune, Hyderabad
CTC Range: Up to 30 LPA (Lakhs Per Annum)
Notice period: 90 days
Mode of Interview: Virtual
Mode of Work: Work From Office

[MADHUSHREE C]
[HR Recruiter]
Black and White Business Solutions Pvt Ltd
Bangalore, Karnataka, INDIA.
Direct Number: 08067432410 | WhatsApp 8431051997 | madhushree.c@blackwhite.in | www.blackwhite.in
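Because the must-have skills combine Python, Pandas, and Flask/Django for RESTful APIs, a hedged sketch of a minimal Flask endpoint serving aggregated DataFrame data is shown below; the route, data, and port are invented for illustration:

```python
# Hedged sketch: a minimal Flask REST endpoint that serves aggregated data
# from a pandas DataFrame. Data, route, and port are illustrative only.
import pandas as pd
from flask import Flask, jsonify

app = Flask(__name__)

# Stand-in for data that would normally come from a SQL database.
orders = pd.DataFrame(
    {"region": ["North", "South", "North"], "amount": [120.0, 80.0, 200.0]}
)

@app.route("/api/revenue-by-region", methods=["GET"])
def revenue_by_region():
    summary = orders.groupby("region", as_index=False)["amount"].sum()
    return jsonify(summary.to_dict(orient="records"))

if __name__ == "__main__":
    # In production this would run behind a WSGI server inside a Docker image.
    app.run(host="0.0.0.0", port=8000, debug=False)
```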
Posted 3 weeks ago
6.0 - 10.0 years
15 - 25 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Roles and Responsibilities
- Design, develop, and maintain databases using SQL Server Management Studio (SSMS) to support business intelligence initiatives.
- Extract data from various sources such as Oracle, Teradata, and Sybase using advanced querying techniques.
- Develop complex queries to analyze large datasets and provide insights through reporting tools like Tableau or Power BI.
- Collaborate with cross-functional teams to identify requirements for data extraction and transformation into a standardized format.
- Troubleshoot issues related to database performance tuning, indexing strategies, and stored procedures.

Desired Candidate Profile
- 6-10 years of experience in the Data Warehousing & Business Intelligence domain with expertise in Advanced Excel, advanced SQL, Linux, scripting languages (Python), and reporting tools (Tableau/Power BI).
- Bachelor's degree in Engineering (B.Tech/B.E.) or Master's degree (M.Tech) from a reputed institution.
- Strong understanding of ETL concepts, including designing star schemas, snowflake schemas, dimensional models, etc.
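The profile combines advanced SQL extraction from warehouses such as Oracle or Teradata with Python scripting. A hedged sketch of that pattern, pulling a window-function query into pandas via SQLAlchemy, follows; the connection string, driver, and table are placeholders:

```python
# Hedged sketch: pulling data from a warehouse with an analytical (window)
# SQL query into pandas for further analysis. The connection string and table
# are hypothetical; the same pattern works with any SQLAlchemy-supported source.
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical Oracle connection via the python-oracledb driver.
engine = create_engine("oracle+oracledb://user:pass@host:1521/?service_name=ORCL")

sql = """
    SELECT account_id,
           txn_date,
           amount,
           SUM(amount) OVER (PARTITION BY account_id ORDER BY txn_date) AS running_total
    FROM transactions
"""

df = pd.read_sql(sql, engine)
print(df.head())
```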
Posted 3 weeks ago
3.0 - 7.0 years
7 - 12 Lacs
Hyderabad
Work from Office
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities
- Design, develop, and implement BI applications using Microsoft Azure, including Azure SQL Database, Azure Data Lake Storage, Azure Databricks, and Azure Blob Storage
- Manage the entire software development life cycle, encompassing requirements gathering, designing, coding, testing, deployment, and support
- Collaborate with cross-functional teams to define, design, and release new features
- Utilize CI/CD pipelines to automate deployment using Azure and DevOps tools
- Monitor application performance, identify bottlenecks, and devise solutions to address these issues
- Foster a positive team environment and skill development
- Write clean, maintainable, and efficient code that adheres to company standards and best practices
- Participate in code reviews to ensure code quality and share knowledge
- Troubleshoot complex software issues and provide timely solutions
- Engage in Agile/Scrum development processes and meetings
- Stay updated with the latest and emerging technologies in software development and incorporate new technologies into solution design as appropriate
- Proactively identify areas for improvement or enhancement in current architecture
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications
- Bachelor's or Master's degree in CS, IT, or a related field with 4+ years of experience in software development
- 3+ years of experience writing advanced-level SQL and PySpark code
- 3+ years of experience in Azure Databricks and Azure SQL
- 3+ years of experience in Azure Data Factory (ADF)
- Knowledge of advanced SQL, ETL, and visualization tools, along with data warehouse concepts
- Proficient in building enterprise-level data warehouse projects using Azure Databricks and ADF
- Proficient in code versioning tools such as GitHub
- Proven excellent understanding of Agile methodologies
- Proven solid problem-solving skills with the ability to work independently and manage multiple tasks simultaneously
- Proven excellent interpersonal, written, and verbal communication skills

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life.
Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes — an enterprise priority reflected in our mission.
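The qualifications above emphasise advanced SQL and PySpark on Azure Databricks. As a hedged illustration only (table and column names are hypothetical, not from the posting), a common pattern of this kind is deduplicating to the latest record per key with a window function:

```python
# Hedged sketch: "advanced SQL in PySpark" style deduplication that keeps the
# most recent record per member. Table and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("claims-dedup").getOrCreate()

claims = spark.table("raw.claims")          # hypothetical source table

w = Window.partitionBy("member_id").orderBy(F.col("updated_at").desc())

latest = (
    claims.withColumn("rn", F.row_number().over(w))
          .filter(F.col("rn") == 1)
          .drop("rn")
)

latest.write.format("delta").mode("overwrite").saveAsTable("curated.claims_latest")
```

In an ADF-orchestrated pipeline this notebook step would typically run after ingestion and before the reporting layer is refreshed.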
Posted 4 weeks ago
3.0 - 8.0 years
15 - 30 Lacs
Gurugram, Bengaluru
Hybrid
Salary: 15 to 30 LPA
Experience: 3 to 8 years
Location: Bangalore/Gurgaon
Notice: Immediate joiners only
Key Skills: SQL, advanced SQL, BI tools, ETL, etc.

Roles and Responsibilities
- Extract, manipulate, and analyze large datasets from various sources such as Hive, SQL databases, and ETL processes.
- Develop and maintain dashboards using Tableau to provide insights on banking performance, market trends, and customer behavior.
- Collaborate with cross-functional teams to identify key performance indicators (KPIs) and develop data visualizations to drive business decisions.

Desired Candidate Profile
- 3-8 years of experience in Data Analytics or a related field with expertise in Banking Analytics, Business Intelligence, Campaign Analytics, Marketing Analytics, etc.
- Strong proficiency in tools like Tableau for data visualization; advanced SQL knowledge preferred.
- Experience working with big data technologies like the Hadoop ecosystem (Hive) and Spark; familiarity with the Python programming language required.
Posted 4 weeks ago
4.0 - 9.0 years
4 - 8 Lacs
Chennai
Work from Office
Primary Skills
- Expertise in SAP Data Services, ETL, and data warehouse concepts.
- SQL writing and advanced SQL tuning, preferably on Oracle.
- Knowledge of SAP BO and Informatica is an added advantage.
- Batch and real-time data processing of large-scale data sets.
- Optimal use-case data structure design, build, and performance tuning.
- Ingestion, transformation, and cleansing of structured and unstructured data across a heterogeneous landscape.

Secondary Skills
- Excellent problem solving and analytical skills.
- Experience of working in traditional waterfall and agile methodologies, working both independently and in project/product teams under pressure and to deadlines.
- Excellent stakeholder management skills (supporting use cases and data consumers).
Posted 4 weeks ago
6.0 - 11.0 years
20 - 35 Lacs
Bengaluru
Work from Office
Job Overview: We are seeking a highly skilled Advanced Software Engineer to join our dynamic team. The ideal candidate will have extensive experience working on a variety of software development projects, particularly in Java Full Stack Development. This position requires a strong understanding of Spring, Hibernate, and MS SQL, and proficiency in SQL query optimization as well as creating and managing stored procedures.

Key Responsibilities:
- Design, develop, and maintain scalable web applications using Java and related technologies.
- Collaborate with cross-functional teams to define, design, and ship new features and functionality.
- Utilize the Spring Framework for building robust and flexible service layers.
- Work with Hibernate ORM to manage data handling and improve application performance.
- Develop and optimize complex SQL queries for application performance.
- Write and maintain SQL stored procedures to support the needs of the application.
- Conduct code reviews, provide constructive feedback, and ensure adherence to best practices.
- Troubleshoot, debug, and resolve application and database issues.
- Stay up-to-date with emerging technologies and industry trends to continuously improve the development process.

Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of [6 years] of professional experience in software development, specifically in Java Full Stack development.
- Strong proficiency in Java and related frameworks (Spring, Hibernate).
- Experience with database management, particularly MS SQL Server.
- In-depth knowledge of SQL query optimization techniques.
- Proven experience crafting and maintaining SQL stored procedures.
- Familiarity with web technologies including HTML, CSS, JavaScript, and frameworks like Angular or React.
- Strong understanding of RESTful APIs and microservices architecture.
- Experience with Agile and DevOps methodologies.
- Excellent problem-solving skills and a proactive attitude.
- Strong communication skills and ability to work in a team environment.

Preferred Skills:
- Experience with cloud services (AWS, Azure, etc.).
- Familiarity with containerization (Docker, Kubernetes).
- Understanding of the software development lifecycle (SDLC) and version control systems (Git).
- Knowledge of security best practices in web application development.
Posted 4 weeks ago
1.0 - 3.0 years
5 - 7 Lacs
Bengaluru
Hybrid
Role & responsibilities As a Business Analyst in the HR team, you will play a crucial role in managing confidential HRIS data, analyzing compensation structures, and ensuring accurate incentive and variable pay calculations. You will be responsible for generating key HR reports on a cyclical basis, providing data-driven insights to improve workforce planning and decision-making. Handle and analyze confidential HRIS data with precision. Calculate incentives and variable pay based on defined parameters. Develop and maintain HR dashboards and cyclical reports to drive business decisions. Ensure data accuracy and integrity in HR-related financial calculations. Provide actionable insights through data visualization and analytics. Collaborate with stakeholders to improve workforce metrics and trends. If this opportunity aligns with your interests and experience, you may apply by sending your resume to surjish.suresh@livspace.com.
Posted 1 month ago
4.0 - 7.0 years
14 - 18 Lacs
Bengaluru
Work from Office
Responsibilities:
- Drive and gather business requirements, assessments, and solutions, especially in the areas of data analysis, data extraction/delivery, source/target mappings, and reporting.
- Create Excel and Tableau dashboards and analyses that provide visibility into KPIs, marketing and product effectiveness, and business trends/drivers.
- Work closely with Business Operations, Product, Marketing & Sales, Finance, and Engineering teams to solve problems, identify trends, and define key metrics.
- Able to quickly understand the business process and needs, and translate them into business requirements.
- Detail-oriented and able to think of all the scenarios for functional and non-functional requirements.

Skills:
- 1-6 years of analytical experience in an analytics-based consulting role, exploring large data sets in order to answer strategic questions for customers (either as an internal analyst or an external consultant).
- Strong analytical skills and ability to make fast decisions with limited and noisy data.
- Strong knowledge of the overall ad tech landscape (RTB and DSP experience preferred).
- High level of proficiency with MS Excel (pivot tables, complex functions; VBA preferred).
- Advanced SQL and Python skills.
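Much of this analyst work mirrors Excel pivot-table analysis done in code. A hedged pandas sketch of that kind of aggregation follows, with invented campaign data standing in for real ad-tech metrics:

```python
# Hedged sketch: a pandas equivalent of an Excel pivot-table summary of
# campaign KPIs by channel. The data is invented for illustration.
import pandas as pd

campaigns = pd.DataFrame(
    {
        "channel":     ["display", "display", "video", "video"],
        "region":      ["IN", "US", "IN", "US"],
        "impressions": [120_000, 90_000, 45_000, 60_000],
        "conversions": [840, 630, 540, 660],
    }
)

pivot = pd.pivot_table(
    campaigns,
    index="channel",
    values=["impressions", "conversions"],
    aggfunc="sum",
)
pivot["cvr"] = pivot["conversions"] / pivot["impressions"]  # conversion rate
print(pivot)
```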
Posted 1 month ago
8.0 - 13.0 years
25 - 30 Lacs
Nashik
Work from Office
Dreaming big is in our DNA. It's who we are as a company. It's our culture. It's our heritage. And more than ever, it's our future. A future where we're always looking forward. Always serving up new ways to meet life's moments. A future where we keep dreaming bigger. We look for people with passion, talent, and curiosity, and provide them with the teammates, resources, and opportunities to unleash their full potential. The power we create together when we combine your strengths with ours is unstoppable. Are you ready to join a team that dreams as big as you do?

AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics.

Do You Dream Big? We Need You.

Job Description
Job Title: Azure Data Engineer
Location: Bengaluru
Reporting to: Senior Manager, Data Engineering

Purpose of the role
We are seeking an experienced Data Engineer with over 4 years of expertise in data engineering and a focus on leveraging GenAI solutions. The ideal candidate will have a strong background in Azure services, relational databases, and programming languages, including Python and PySpark. You will play a pivotal role in designing, building, and optimizing scalable data pipelines while integrating AI-driven solutions to enhance our data capabilities.

Key tasks & accountabilities
- Data Pipeline Development: Design and implement efficient ETL/ELT pipelines using Azure Data Factory (ADF) and Azure Databricks (ADB). Ensure high performance and scalability of data pipelines.
- Relational Database Management: Work with relational databases to structure and query data efficiently. Design, optimize, and maintain database schemas.
- Programming and Scripting: Write, debug, and optimize Python, PySpark, and SQL code to process large datasets. Develop reusable code components and libraries for data processing.
- Data Quality and Governance: Implement data validation, cleansing, and monitoring mechanisms. Ensure compliance with data governance policies and best practices.
- Performance Optimization: Identify and resolve bottlenecks in data processing and storage. Optimize resource utilization on Azure services.
- Collaboration and Communication: Work closely with cross-functional teams, including AI, analytics, and product teams. Document processes, solutions, and best practices for future use.

Qualifications, Experience, Skills

Previous Work Experience
- 4+ years of experience in data engineering.
- Proficiency in Azure Data Factory (ADF) and Azure Databricks (ADB).
- Expertise in relational databases and advanced SQL.
- Strong programming skills in Python and PySpark.
- Experience with GenAI solutions is a plus.
- Familiarity with data governance and best practices.

Level of Educational Attainment Required
- Bachelor's degree in Computer Science, Information Technology, or a related field.

Technical Expertise
- Knowledge of machine learning pipelines and GenAI workflows.
- Experience with Azure Synapse or other cloud data platforms.
- Familiarity with CI/CD pipelines for data workflows.

And above all of this, an undying love for beer! We dream big to create a future with more cheers.
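One of the accountabilities above is data validation and monitoring within the pipeline. A hedged PySpark sketch of simple pre-publication quality checks follows; the table, columns, and failure handling are hypothetical:

```python
# Hedged sketch: basic data-quality checks (nulls, duplicates, invalid values)
# on a pipeline's output before publishing it. Names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

df = spark.table("curated.sales_orders")        # hypothetical curated table

checks = {
    "null_order_ids": df.filter(F.col("order_id").isNull()).count(),
    "duplicate_order_ids": df.count() - df.dropDuplicates(["order_id"]).count(),
    "negative_amounts": df.filter(F.col("amount") < 0).count(),
}

failed = {name: n for name, n in checks.items() if n > 0}
if failed:
    # In a real pipeline this would raise an alert, e.g. via an ADF failure
    # path or a monitoring webhook, rather than just raising an exception.
    raise ValueError(f"Data quality checks failed: {failed}")
print("All data quality checks passed.")
```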
Posted 1 month ago
5.0 - 10.0 years
15 - 30 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Roles and Responsibilities Develop data models using Erwin, ER/Studio, etc. to support business intelligence initiatives. Design and implement dimensional models for data warehousing projects. Create ETL processes using Informatica PowerCenter, SQL Server Integration Services (SSIS), etc. to extract, transform, and load data from various sources into a centralized repository. Utilize Tableau or Power BI to create interactive dashboards and reports for end-users. Collaborate with stakeholders to gather requirements and develop solutions that meet their needs. Desired Candidate Profile 5-10 years of experience in Data Modeling, Data Warehousing, ETL Development & Business Intelligence. Bachelor's degree in Engineering (B.Tech/B.E.) or Master's degree (M.Tech) in Any Specialization. Strong expertise in tools such as Erwin, ER/Studio, Informatica PowerCenter, SQL Server Integration Services (SSIS), Tableau & Power BI.
Posted 1 month ago
5.0 - 10.0 years
14 - 24 Lacs
Hyderabad
Work from Office
Position: SQL Developer with Analytics
Experience: 5+ years
Mandatory Skills: Very strong SQL, complex SQL queries, advanced SQL, Python, Tableau or Power BI, dashboards, data modelling, and data visualization.
Location: Hyderabad

Roles & Responsibilities:
- Be responsible for the development of conceptual, logical, and physical data models.
- Work with application/solution teams to implement data strategies, build data flows, and develop/execute logical and physical data models.
- Implement and maintain data analysis scripts using SQL and Python.
- Develop and support reports and dashboards using Google PLX/Data Studio/Looker.
- Monitor performance and implement necessary infrastructure optimizations.
- Demonstrate ability and willingness to learn quickly and complete large volumes of work with high quality.
- Demonstrate excellent collaboration, interpersonal communication, and written skills with the ability to work in a team environment.

Minimum Qualifications:
- 5+ years of solid hands-on experience with SQL, complex SQL, Google analytics tools, and dashboard development.
- Hands-on experience with design, development, and support of data pipelines.
- Strong SQL programming skills (joins, subqueries, queries with analytical functions, stored procedures, functions, etc.).
- Hands-on experience using statistical methods for data analysis.
- Experience with data platform and visualization technologies such as Google PLX dashboards, Data Studio, Tableau, Pandas, Qlik Sense, Splunk, Humio, and Grafana.
- Experience in web development: HTML, CSS, jQuery, Bootstrap.
- Experience with machine learning packages such as scikit-learn, NumPy, SciPy, Pandas, NLTK, BeautifulSoup, Matplotlib, and statsmodels.
- Strong design and development skills with meticulous attention to detail.
- Familiarity with Agile software development practices and working in an agile environment.
- Strong analytical, troubleshooting, and organizational skills.
- Ability to analyse and troubleshoot complex issues, and proficiency in multitasking.
- Ability to navigate ambiguity.
- BS degree in Computer Science, Math, Statistics, or equivalent academic credentials.

Interested candidates, share your CV with updated projects to dikshith.nalapatla@motivitylabs.com with the below details for a quick response:
- Total experience
- Relevant SQL experience
- Tableau or Power BI experience
- Current role / skillset
- Current CTC (fixed; variables, if any; bonus, if any)
- Payroll company (name)
- Client company (name)
- Expected CTC
- Official notice period
- Serving notice (yes/no)
- CTC of offer in hand
- Last working day (in current organization)
- Location of the offer in hand
- Willing to work from office (yes/no)
- UK shift (yes/no); note: the UK shift is 2:00 PM to 11:00 PM

Note: 5 days work from office.
Posted 1 month ago
3.0 - 8.0 years
15 - 30 Lacs
Pune, Gurugram, Bengaluru
Hybrid
Salary: 20 to 35 LPA
Experience: 3 to 8 years
Location: Pune/Bangalore/Gurgaon (hybrid)
Notice: Immediate joiners only
Key Skills: SQL, advanced SQL, BI tools, etc.

Roles and Responsibilities
- Extract, manipulate, and analyze large datasets from various sources such as Hive, SQL databases, and BI tools.
- Develop and maintain dashboards using Tableau to provide insights on banking performance, market trends, and customer behavior.
- Collaborate with cross-functional teams to identify key performance indicators (KPIs) and develop data visualizations to drive business decisions.

Desired Candidate Profile
- 3-8 years of experience in Data Analytics or a related field with expertise in Banking Analytics, Business Intelligence, Campaign Analytics, Marketing Analytics, etc.
- Strong proficiency in tools like Tableau for data visualization; advanced SQL knowledge preferred.
- Experience working with big data technologies like the Hadoop ecosystem (Hive) and Spark; familiarity with the Python programming language required.
Posted 1 month ago