10.0 - 18.0 years
25 - 30 Lacs
Noida
Work from Office
Responsibilities:
- Collaborate with the sales team to understand customer challenges and business objectives, and propose solutions, POCs, etc.
- Develop and deliver impactful technical presentations and demos showcasing the capabilities of GCP Data and AI and GenAI solutions
- Conduct technical proof-of-concepts (POCs) to validate the feasibility and value proposition of GCP solutions
- Collaborate with technical specialists and solution architects from the COE team to design and configure tailored cloud solutions
- Manage and qualify sales opportunities, working closely with the sales team to progress deals through the sales funnel
- Stay up to date on the latest GCP offerings, trends, and best practices

Experience:
- Design and implement a comprehensive strategy for migrating and modernizing existing relational on-premise databases to scalable and cost-effective solutions on Google Cloud Platform (GCP)
- Design and architect solutions for DWH modernization, with experience building data pipelines in GCP
- Strong experience in BI reporting tools (Looker, Power BI and Tableau)
- In-depth knowledge of Google Cloud Platform (GCP) services, particularly Cloud SQL, Postgres, AlloyDB, BigQuery, Looker, Vertex AI and Gemini (GenAI)
- Strong knowledge of and experience in designing solutions to process massive datasets in real time and in batch using cloud-native/open-source orchestration techniques
- Build and maintain data pipelines using Cloud Dataflow to orchestrate real-time and batch processing of streaming and historical data
- Strong knowledge of and experience with best practices for data governance, security, and compliance
- Excellent communication and presentation skills, with the ability to tailor technical information to customer needs
- Strong analytical and problem-solving skills
- Ability to work independently and as part of a team
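The Cloud Dataflow work described above centers on windowed aggregation of streaming and batch events. As a rough, dependency-free illustration of the core idea only (the function, event names, and window logic here are invented for the sketch and are not Dataflow APIs):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Group (timestamp, key) events into fixed-size tumbling windows
    and count occurrences of each key per window."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        # Each event falls into exactly one non-overlapping window.
        window_start = (ts // window_seconds) * window_seconds
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

# Hypothetical clickstream: (epoch seconds, event type)
events = [(0, "login"), (12, "login"), (61, "purchase"), (65, "login")]
print(tumbling_window_counts(events, 60))
```

The same grouping applies unchanged to historical (batch) data, which is why streaming and batch pipelines can share one model.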
Posted 3 weeks ago
4.0 - 8.0 years
6 - 10 Lacs
Gurugram
Work from Office
Key Responsibilities:
- Gather and analyze data from a variety of sources, including SQL databases, BigQuery, Excel, Power BI, and Python. Good SQL coding skills are a must.
- Work with stakeholders to understand their needs and translate them into data-driven solutions
- Communicate effectively with stakeholders, both verbally and in writing
- Lead and manage a team of business analysts
- Must be a self-starter, able to manage multiple tasks and projects simultaneously, own deliverables end to end, prioritize workload effectively and thrive in a dynamic environment
- Must be a problem solver with outstanding skills in discovering techniques and a proven ability to translate underlying business needs into actionable insights
- Works well under pressure, can work within stringent timelines and collaborate with teams to achieve results

Desired Profile:
- Relevant experience of 4-7 years in the field of analytics
- Technical capabilities (hands-on): SQL, Advanced Excel, Power BI, BigQuery, R/Python (good to have)
- Strong analytical skills and a sharp point of view within the organization
- Penchant for business, curiosity about numbers and persistence in working with data to generate insights
- Provides customized knowledge for client work; prepares accurate, well-developed client deliverables
- Experience in app analytics preferred
- Experience with e-commerce/retail businesses preferred
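Since the role calls out SQL coding as a must, here is a small self-contained sketch of the kind of grouped aggregation such an analyst writes daily; the `orders` table and its columns are made up for illustration, and stdlib sqlite3 stands in for BigQuery:

```python
import sqlite3

# In-memory database with a hypothetical orders table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("North", 120.0), ("North", 80.0), ("South", 50.0)])

# Typical analyst query: revenue and order count per region.
rows = conn.execute(
    """SELECT region, COUNT(*) AS n_orders, SUM(amount) AS revenue
       FROM orders
       GROUP BY region
       ORDER BY revenue DESC"""
).fetchall()
print(rows)  # [('North', 2, 200.0), ('South', 1, 50.0)]
conn.close()
```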
Posted 3 weeks ago
1.0 - 4.0 years
10 - 14 Lacs
Pune
Work from Office
Overview
Design, develop, and maintain data pipelines and ETL/ELT processes using PySpark/Databricks. Optimize performance for large datasets through techniques such as partitioning, indexing, and Spark optimization. Collaborate with cross-functional teams to resolve technical issues and gather requirements.

Responsibilities
- Ensure data quality and integrity through data validation and cleansing processes.
- Analyze existing SQL queries, functions, and stored procedures for performance improvements.
- Develop database routines such as procedures, functions, and views.
- Participate in data migration projects and understand technologies like Delta Lake/warehouse.
- Debug and solve complex problems in data pipelines and processes.

Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Strong understanding of distributed data processing platforms like Databricks and BigQuery.
- Proficiency in Python, PySpark, and SQL.
- Experience with performance optimization for large datasets.
- Strong debugging and problem-solving skills.
- Fundamental knowledge of cloud services, preferably Azure or GCP.
- Excellent communication and teamwork skills.

Nice to Have:
- Experience in data migration projects.
- Understanding of technologies like Delta Lake/warehouse.

What we offer you
- Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
- Flexible working arrangements, advanced technology, and collaborative workspaces.
- A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
- A global network of talented colleagues who inspire, support, and share their expertise to innovate and deliver for our clients.
- Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development.
- Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles.
- We actively nurture an environment that builds a sense of inclusion, belonging and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.

At MSCI we are passionate about what we do, and we are inspired by our purpose: to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry.

MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process.

MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law.
MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.

To all recruitment agencies: MSCI does not accept unsolicited CVs/resumes. Please do not forward CVs/resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/resumes.

Note on recruitment scams: We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try to elicit personal information from job seekers. Read our full note on careers.msci.com.
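One of the responsibilities in the role above is ensuring data quality and integrity through validation and cleansing. That pattern is usually a reject-or-normalize pass over incoming rows; a minimal sketch, where the field names and rules are entirely hypothetical:

```python
def clean_records(records):
    """Drop rows missing required fields, normalize types on the rest.
    Returns (clean_rows, rejected_count)."""
    required = ("id", "amount")
    clean, rejected = [], 0
    for row in records:
        # Reject rows where any required field is absent or empty.
        if any(row.get(f) in (None, "") for f in required):
            rejected += 1
            continue
        # Normalize types so downstream consumers see a uniform schema.
        clean.append({"id": int(row["id"]), "amount": float(row["amount"])})
    return clean, rejected

raw = [{"id": "1", "amount": "9.5"},
       {"id": None, "amount": "3"},
       {"id": "2", "amount": ""}]
print(clean_records(raw))  # ([{'id': 1, 'amount': 9.5}], 2)
```

In a real PySpark/Databricks pipeline the same logic would be expressed as DataFrame filters and casts rather than a Python loop.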
Posted 3 weeks ago
4.0 - 8.0 years
16 - 25 Lacs
Bengaluru
Hybrid
Required Skills: Successful candidates will have demonstrated the following skills and characteristics:

Must Have:
- Proven expertise in supply chain analytics across domains such as demand forecasting, inventory optimization, logistics, segmentation, and network design
- Well-versed, hands-on experience with optimization methods such as linear programming, mixed integer programming, and scheduling optimization; an understanding of third-party optimization solvers like Gurobi is an added advantage
- Proficiency in forecasting techniques (e.g., Holt-Winters, ARIMA, ARIMAX, SARIMA, SARIMAX, FBProphet, NBeats) and machine learning techniques (supervised and unsupervised)
- Strong command of statistical modeling, testing, and inference
- Proficient in using GCP tools: BigQuery, Vertex AI, Dataflow, Looker
- Building data pipelines and models for forecasting, optimization, and scenario planning
- Strong SQL and Python programming skills; experience deploying models in a GCP environment
- Knowledge of orchestration tools like Cloud Composer (Airflow)

Nice to have:
- Familiarity with MLOps, containerization (Docker, Kubernetes), and orchestration tools (e.g., Cloud Composer)
- Strong communication and stakeholder engagement skills at the executive level

Roles and Responsibilities:
- Assist analytics projects within the supply chain domain, driving design, development, and delivery of data science solutions
- Develop and execute project and analysis plans under the guidance of the Project Manager
- Interact with and advise consultants/clients in the US as a subject matter expert to formalize the data sources to be used, datasets to be acquired, and the data and use-case clarifications needed to get a strong hold on the data and the business problem to be solved
- Drive and conduct analysis using advanced analytics tools and coach junior team members
- Implement the necessary quality control measures to ensure deliverable integrity, such as data quality, model robustness, and explainability for deployments
- Validate analysis outcomes and recommendations with all stakeholders, including the client team
- Build storylines and make presentations to the client team and/or PwC project leadership team
- Contribute to knowledge- and firm-building activities
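Of the forecasting techniques listed above (Holt-Winters, ARIMA, and so on), the simplest member of the family is exponential smoothing. A dependency-free sketch of simple exponential smoothing, the building block that Holt-Winters extends with trend and seasonality terms; the demand numbers are invented:

```python
def simple_exponential_smoothing(series, alpha):
    """Smoothed level per step: level = alpha * y + (1 - alpha) * level.
    forecasts[i] is the level after observing series[i], which serves
    as the one-step-ahead forecast for the next period."""
    level = series[0]          # initialize level at the first observation
    forecasts = [level]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
        forecasts.append(level)
    return forecasts

demand = [100, 110, 104, 120]  # hypothetical weekly demand
print(simple_exponential_smoothing(demand, 0.5))
```

With alpha = 0.5 each forecast is an even blend of the newest observation and the running level; larger alpha reacts faster to demand shifts, smaller alpha smooths noise harder.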
Posted 3 weeks ago
7.0 - 10.0 years
15 - 18 Lacs
Hyderabad, Bengaluru
Hybrid
We are seeking an experienced MuleSoft Developer who will be responsible for designing, developing, and maintaining integrations between various enterprise applications and systems. You will work with business and technical teams to ensure smooth data flow across systems, leveraging MuleSoft's Anypoint Platform to implement end-to-end integration solutions.

Responsibilities:
• Design and develop integration solutions using MuleSoft's Anypoint Platform, including Mule ESB, CloudHub, and API management tools.
• Work closely with business analysts, stakeholders, and system architects to gather integration requirements and transform them into technical solutions.
• Create and maintain reusable MuleSoft APIs and connectors for internal and external integrations.
• Experience integrating on-prem databases with the Salesforce ecosystem.
• Detailed knowledge of VPN implementation for on-prem data movement.
• Develop, test, deploy, and monitor APIs and integration services.
• Troubleshoot and resolve issues related to integrations, APIs, and MuleSoft components.
• Collaborate with cross-functional teams to support API lifecycle management and promote best practices for API design and development.
• Ensure high availability, scalability, and security of integration services.
• Integrate with different legacy systems and databases such as MySQL and SQL Server.
• Ability to work with BigQuery, Oracle connectors, and the MinIO API.
• Ability to perform end-to-end integration between on-prem and cloud databases and Salesforce Data Cloud and Marketing Cloud.
• Ability to work with Parquet files.
• Extensive experience in designing, implementing and optimizing REST, SOAP and Bulk APIs.
• Conduct unit tests, code reviews and quality assurance checks to ensure integration solutions meet high standards.
• Develop technical design documents.
• Experience implementing large-scale software (high transaction volume, high-availability concepts).
• Good programming skills and experience troubleshooting Mule ESB, including working with debuggers, flow analyzers, and configuration tools.
• Experience with service-oriented architecture and web services development.
• 2-3 years of experience working on an integration platform.

Qualifications:
• Bachelor's degree in Computer Science, Information Technology, Business, Engineering, or a related field.
• Minimum of 4-5 years of experience in integration development, with at least 2-3 years of hands-on experience with MuleSoft.
• Experienced in building integration projects using Mule ESB, Mule API, and Mule CloudHub.
• Strong ability to manage and communicate with both technical and non-technical stakeholders.
• Solid understanding of software development, systems integration, and cloud technologies.
• Strong strategic thinking and planning skills.
• Ability to work in a fast-paced, dynamic environment and manage multiple priorities.
• Experience with Agile methodologies and version control systems like Git.

Preferred:
• MBA or advanced degree in a related field.
• MuleSoft certification (e.g., MuleSoft Certified Developer - Level 1).
• Knowledge of cloud environments like AWS, Azure, or Google Cloud Platform.
• Experience with API management, OAuth, JWT, and OpenID Connect.
• Experience in API integration with legacy systems.
Posted 3 weeks ago
10.0 - 15.0 years
30 - 40 Lacs
Bhopal, Pune, Gurugram
Hybrid
Job Title: Senior Data Engineer - GCP | Big Data | Airflow | dbt
Company: Xebia
Location: All Xebia locations
Experience: 10+ Years
Employment Type: Full Time
Notice Period: Immediate to max 30 days only

Job Summary
Join the digital transformation journey of one of the world's most iconic global retail brands! As a Senior Data Engineer, you'll be part of a dynamic Digital Technology organization, helping build modern, scalable, and reliable data products to power business decisions across the Americas. You'll work in the Operations Data Domain, focused on ingesting, processing, and optimizing high-volume data pipelines using Google Cloud Platform (GCP) and other modern tools.

Key Responsibilities
- Design, develop, and maintain highly scalable big data pipelines (batch & streaming)
- Collaborate with cross-functional teams to understand data needs and deliver efficient solutions
- Architect robust data solutions using GCP-native services (BigQuery, Pub/Sub, Cloud Functions, etc.)
- Build and manage modern Data Lake/Lakehouse platforms
- Create frameworks and reusable components for scalable ingestion and processing
- Implement data governance and security, and ensure regulatory compliance
- Mentor junior engineers and lead an offshore team of 8+ engineers
- Monitor pipeline performance, troubleshoot bottlenecks, and ensure data quality
- Engage in code reviews, CI/CD deployments, and agile product releases
- Contribute to internal best practices and engineering standards

Must-Have Skills & Qualifications
- 8+ years in data engineering with strong hands-on experience in production-grade pipelines
- Expertise in GCP data services: BigQuery, Vertex AI, Pub/Sub, etc.
- Proficiency in dbt (Data Build Tool) for data transformation
- Strong programming skills in Python, Java, or Scala
- Advanced SQL & NoSQL knowledge
- Experience with Apache Airflow for orchestration
- Hands-on with Git, GitHub Actions, and Jenkins for CI/CD
- Solid understanding of data warehousing (BigQuery, Snowflake, Redshift)
- Exposure to tools like Hadoop, Spark, Kafka, Databricks (nice to have)
- Familiarity with BI tools like Tableau, Power BI, or Looker (optional)
- Strong leadership qualities to manage offshore engineering teams
- Excellent communication skills and stakeholder management experience

Preferred Education
Bachelor's or Master's degree in Computer Science, Engineering, Mathematics, or a related field

Notice Period Requirement
Only immediate joiners or candidates with a max 30-day notice period will be considered.

How to Apply
If you are passionate about solving real-world data problems and want to be part of a global data-driven transformation, apply now by sending your resume to vijay.s@xebia.com with the subject line: "Sr Data Engineer Application – [Your Name]"

Kindly include the following details in your email: Full Name, Total Experience, Current CTC, Expected CTC, Current Location, Preferred Location, Notice Period / Last Working Day, Key Skills.

Please do not apply if you are currently in process with any other role at Xebia or have recently interviewed.
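Apache Airflow's core job in a stack like the one above is running pipeline tasks in dependency order. Stripped of scheduling, retries, and operators, that ordering is a topological sort; a stdlib-only sketch with hypothetical task names (this is not the Airflow API, just the underlying idea):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task lists the upstream tasks it waits on,
# mirroring how an Airflow DAG declares dependencies.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

# static_order() yields tasks so that every task appears after
# all of its dependencies.
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'transform', 'load', 'report']
```

Airflow layers scheduling, parallel execution of independent branches, and failure handling on top of exactly this ordering guarantee.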
Posted 3 weeks ago
5.0 - 8.0 years
27 - 42 Lacs
Bengaluru
Work from Office
Years of Exp: 5-12 yrs
Location: PAN India

OFSAA Data Modeler
- Experience in designing, building and customizing the OFSAA data model; validation of the data model
- Excellent knowledge of data model guidelines for Staging, Processing and Reporting tables
- Knowledge of data model support for configuring UDPs and subtype/supertype relationship enhancements
- Experience on the OFSAA platform (OFSAAI) with one or more of the following OFSAA modules:
  o OFSAA Financial Solution Data Foundation (preferred)
  o OFSAA Data Integrated Hub (optional)
- Good in SQL and PL/SQL
- Strong in data warehouse principles and ETL/data flow tools
- Excellent analytical and communication skills

OFSAA Integration SME - DIH/batch run framework
- Experience in ETL processes; familiar with OFSAA
- DIH setup in EDS, EDD, T2T, etc.
- Familiar with different seeded tables, SCD, DIM, hierarchies, lookups, etc.
- Worked with FSDF, knowing the STG, CSA and FACT table structures
- Experience working with different APIs, out-of-the-box connectors, etc.
- Familiar with Oracle patching and SRs
Posted 3 weeks ago
5.0 - 10.0 years
15 - 27 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Job Title: Data Engineer - GCP
Company: Xebia
Location: Hybrid - any Xebia location
Experience: 5+ Years
Salary: As per industry standards
Job Type: Full Time

About the Role:
Xebia is hiring a seasoned Data Engineer (L4) to join a high-impact team building scalable data platforms using GCP, Databricks, and Airflow. If you thrive on architecting future-ready solutions and have strong experience in big data transformations, we'd love to hear from you.

Project Overview:
We currently manage 1000+ data pipelines using Databricks clusters for end-to-end data transformation (Raw → Silver → Gold), with orchestration handled via Airflow, all on Google Cloud Platform (GCP). Curated datasets are delivered through BigQuery and Databricks notebooks. Our roadmap includes migrating to a GCP-native data processing framework optimized for Spark workloads.

Key Responsibilities:
- Design and implement a GCP-native data processing framework
- Analyze and plan migration of existing workloads to a cloud-native architecture
- Ensure data availability, integrity, and consistency
- Build reusable tools and standards for the Data Engineering team
- Collaborate with stakeholders and document processes thoroughly

Required Experience:
- 5+ years in Data Engineering with strong data architecture experience
- Hands-on expertise in Databricks, Airflow, BigQuery, and PySpark
- Deep knowledge of GCP services for data processing (Dataflow, Dataproc, etc.)
- Familiarity with data lake table formats like Delta and Iceberg
- Experience with orchestration tools (Airflow, Dagster, or similar)

Key Skills:
- Python programming
- Strong understanding of data lake architectures and cloud-native best practices
- Excellent problem-solving and communication skills

Notice Period Requirement:
Only immediate joiners or candidates with a max 30-day notice period will be considered.

How to Apply:
Interested candidates can share their details and updated resume with vijay.s@xebia.com in the following format: Full Name; Total Experience (must be 5+ years); Current CTC; Expected CTC; Current Location; Preferred Location; Notice Period / Last Working Day (if serving notice); Primary Skill Set.

Note: Please apply only if you have not recently applied or interviewed for any open roles at Xebia.
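The Raw → Silver → Gold transformation mentioned in the project overview is a layered refinement: the silver layer standardizes and filters raw records, and the gold layer produces business-level aggregates. A toy pure-Python sketch of the pattern; the schema and rejection rules are invented, and in the actual stack this would be PySpark running on Databricks:

```python
def to_silver(raw_rows):
    """Silver layer: drop malformed raw rows, standardize types/casing."""
    out = []
    for r in raw_rows:
        try:
            out.append({"store": r["store"].strip().upper(),
                        "sales": float(r["sales"])})
        except (KeyError, ValueError, AttributeError):
            continue  # reject malformed raw records
    return out

def to_gold(silver_rows):
    """Gold layer: curated business aggregate (total sales per store)."""
    totals = {}
    for r in silver_rows:
        totals[r["store"]] = totals.get(r["store"], 0.0) + r["sales"]
    return totals

raw = [{"store": " blr ", "sales": "10"},
       {"store": "blr", "sales": "5"},
       {"store": "del", "sales": "bad"}]   # rejected in silver
print(to_gold(to_silver(raw)))  # {'BLR': 15.0}
```

Keeping each layer as a pure function of the previous one is what makes the pipeline easy to re-run, test, and eventually migrate between engines.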
Posted 3 weeks ago
1.0 - 3.0 years
10 - 15 Lacs
Kolkata, Gurugram, Bengaluru
Hybrid
Salary: 10 to 16 LPA
Exp: 1 to 3 years
Location: Gurgaon / Bangalore / Kolkata (Hybrid)
Notice: Immediate to 30 days
Key Skills: GCP, Cloud, Pub/Sub, Data Engineer
Posted 3 weeks ago
3.0 - 8.0 years
15 - 30 Lacs
Gurugram, Bengaluru
Hybrid
Salary: 15 to 30 LPA
Exp: 3 to 8 years
Location: Gurgaon / Bangalore (Hybrid)
Notice: Immediate to 30 days
Key Skills: GCP, Cloud, Pub/Sub, Data Engineer
Posted 3 weeks ago
13.0 - 15.0 years
35 - 50 Lacs
Hyderabad
Work from Office
We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (18000+ experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We are looking for great new colleagues. That is where you come in!

REQUIREMENTS:
- Total experience: 13+ years.
- Hands-on experience in big data engineering and data architecture.
- Strong working knowledge of Google Cloud Platform services.
- Strong working experience with Google BigQuery, Google Cloud Storage, Cloud Dataflow, Cloud Composer, and Dataproc.
- Advanced proficiency in Python and SQL for data engineering and automation.
- Solid understanding of data modelling, data warehousing, and distributed computing frameworks like Apache Spark.
- Deep knowledge of data governance, security controls, and regulatory compliance within cloud ecosystems.
- Strong leadership, problem-solving, and communication skills to drive technical direction and stakeholder engagement.
- Experience working in agile development environments.
- Strong communication and coordination skills to work with cross-functional and globally distributed teams.

RESPONSIBILITIES:
- Writing and reviewing great quality code.
- Understanding the project's functional and non-functional requirements and the business context of the application being developed.
- Understanding and documenting requirements validated by the SMEs.
- Interacting with clients to identify the scope of testing, expectations, acceptance criteria and availability of test data and environment.
- Working closely with the product owner in defining and refining acceptance criteria.
- Preparing the test plan/strategy.
- Estimating the test effort and preparing schedules for testing activities, assigning tasks, and identifying constraints and dependencies.
- Risk management: identifying, mitigating and resolving business and technical risks; determining the potential causes of problems and analysing multiple alternatives.
- Designing and developing a framework for automated testing following the project's design and coding guidelines, and setting up best practices for test automation.
- Preparing test reports to summarize the outcome of the testing phase and recommending whether the application is in a shippable state.
- Communicating measurable quality metrics, with the ability to highlight problem areas and suggest solutions.
- Participating in retrospective meetings, helping identify the root cause of any quality-related issues and identifying ways to continuously improve the testing process.
- Conducting demos of the application for internal and external stakeholders.
- Reviewing all testing artifacts prepared by the team and ensuring that defects found during the review are tracked to closure.
- Working with the team and stakeholders to triage and prioritize defects for resolution.
- Giving constructive feedback to team members and setting clear expectations.
Posted 3 weeks ago
0.0 - 1.0 years
6 - 10 Lacs
Pune
Work from Office
Job Title: Software Engineer
Location: Pune, Maharashtra
Company: Zerebral IT Solutions Pvt. Ltd.
Website: www.zerebral.co.in
Experience Level: 0-1 years
Employment Type: Full-time

About Zerebral
Founded in 2011 and headquartered in Pune, Zerebral IT Solutions is a privately held technology company specializing in building scalable products, big data platforms, and vertical search engines. We focus on large-scale web data extraction, massive-scale APIs, and cloud-native microservices. Our team thrives on solving complex engineering challenges and delivering real-world impact through innovation.

Role Overview
We are seeking a passionate Software Engineer to join our dynamic team. In this role, you will be instrumental in designing, developing, and optimising high-performance systems that handle large-scale data processing and API integrations. You will collaborate with cross-functional teams to deliver robust solutions that align with our commitment to excellence and innovation.

Key Responsibilities
- Design, develop, and maintain scalable APIs and microservices for data-intensive applications.
- Implement large-scale web data extraction and aggregation systems.
- Optimize backend services for performance, scalability, and reliability.
- Collaborate with product managers, data scientists, and frontend developers to deliver end-to-end solutions.
- Ensure code quality through code reviews, testing, and adherence to best practices.
- Monitor system performance and troubleshoot issues proactively.

Why Join Zerebral?
- Work on cutting-edge technologies and challenging projects.
- Collaborate with a team of passionate and skilled professionals.
- Opportunities for continuous learning and career growth.
- Flexible work environment with a focus on work-life balance.
- Competitive compensation and benefits package.

Educational Qualification (minimum cut-off percentages)
- HSC: 70% and above
- SSC: 80% and above
- B.E. in Computer Science/IT/Electronics and related branches: 60% and above
Posted 3 weeks ago
4.0 - 8.0 years
6 - 10 Lacs
Bengaluru
Work from Office
We are looking to hire a Senior Manager, Customer Analytics to join our team in Bangalore! Reporting to the Director of Customer Insights & Analytics, you'll lead a high-performing team of analysts and subject matter authorities to deliver customer-centric insights, business intelligence, and data-driven strategies. In this role, you'll act as a strategic partner to global collaborators across Marketing, Sales, and CRM, helping drive improvements in customer experience, marketing performance, and commercial outcomes. You'll also play a key role in mentoring talent, crafting analytics processes, and scaling insight capabilities spanning multiple business units.

What You'll Do:
- Lead and mentor a team of 6-8 analysts and data professionals, encouraging a collaborative, high-performance culture
- Drive customer analytics initiatives including segmentation, cohort analysis, and performance tracking
- Partner with global Sales, Marketing, CRM, and Data leaders to align on priorities and deliver actionable insights
- Translate sophisticated data into clear recommendations and compelling narratives for non-technical collaborators
- Influence senior-level decision-making through data-driven storytelling and strategic insight
- Coordinate end-to-end delivery of analytics projects across divisions, ensuring quality and business relevance
- Improve and standardize analytics tools, processes, and documentation for scalability and consistency
- Collaborate with Tech and Data Engineering teams to improve customer data infrastructure and analytics capabilities
- Serve as a trusted advisor to senior leadership on leveraging customer intelligence for business growth
- Leverage AI to improve team productivity and analytics efficiency

What You'll Bring:
- Extensive experience in data analytics, customer insights, or marketing analytics, with a proven record in team leadership and management
- Successfully led customer-centric analytics initiatives within global or matrixed organizations
- Sophisticated analytical skills with hands-on expertise in SQL and tools like Python, R, or similar statistical platforms
- Strong proficiency in BI tools such as Power BI and Google Data Studio
- Familiarity with cloud-based data platforms like Google BigQuery and similar analytics stacks
- Skill in communicating sophisticated analytical insights clearly and persuasively to senior collaborators
- A business-savvy approach with the ability to translate data into impactful strategic actions
- Proven success in mentoring and developing high-performing analytics teams
Posted 3 weeks ago
5.0 - 10.0 years
1 - 1 Lacs
Hyderabad
Work from Office
Job title: Data Scientist/AI Engineer About Quantco: At Quantaco, we deliver state-of-the-art predictive financial data services for the Australian hospitality industry. We are the eighth-fastest growing company in Australia as judged by the countrys flagship financial newspaper, The Australian Financial Review. We are continuing our accelerating through hyper-automation. Our engineers are thought leaders in the business and provide significant input into the design and direction of our technology. Our engineering roles are not singular in their focus. You will develop new data models and predictive models, ensure pipelines are fully automated and run with bullet-proof reliability. We are a friendly and collaborative team. We work using a mature, design-first development process focused on delivering new features to enhance our customer's experience and improve their bottom line. You'll always be learning at Quantaco. About the role We are looking for a Data Scientist with strong software engineering capabilities to join our growing team. This is a key role in helping us unlock the power of data across our platform and deliver valuable insights to hospitality businesses. You will work on projects ranging from statistical modelling and anomaly detection to productionizing ML pipelines (ranged from time series forecasting to neural networks and custom LLMs), integrating with Django and Flask-based web applications, and building data products on Google Cloud using PostgreSQL and BigQuery, as well as ML routines in Databricks/VertexAI. This role is ideal for someone who thrives in a cross-functional environment, enjoys solving real-world problems with data, and can contribute to production-grade systems. Position Description Data Scientist Our culture and values Quantaco is a happy and diverse group of professionals who value a strong work ethic, authenticity, creativity, and flexibility. We work hard for each other and for our customers while having fun along the way. 
You can see what our team says about life at Quantaco here. If you've got a passion for creating new and impactful data-driven technology and want to realise your potential in a team that values your ideas, then we want to hear from you. Responsibilities of the role: Build and deploy data-driven solutions and machine learning models into production. Collaborate with engineers to integrate models into Django/Flask applications and APIs. Develop and maintain data pipelines using Python and SQL. Proactively seek to link analytical outputs to commercial outcomes Provide technical expertise for proof-of-concept (PoC) and minimum viable product (MVP) phases Clean, transform, and analyse large datasets to extract meaningful insights. Write clean, maintainable Python code and contribute to the platform’s architecture. Work with cloud-native tools (Google Cloud, BigQuery, Cloud Functions, etc.). Participate in sprint planning, stand-ups, and team ceremonies as part of an Agile team. Document MLOps processes, workflows, and best practices to facilitate knowledge sharing and ensure reproducibility You’ll fit right in if you Have 3+ years of experience in a data-centric or backend software engineering role. Are proficient in production Python, including Django or Flask, and SQL (PostgreSQL preferred). Are curious, analytical, and love solving data problems end-to-end. You demonstrate a scientific and design-led approach to delivering effective data solutions Have experience with data modelling, feature engineering, and applying ML algorithms in real-world applications. Can develop scalable data pipelines and integrate them with cloud platforms (preferably Google Cloud). Communicate clearly and can collaborate across technical and non-technical teams. 
- Are self-motivated and can work both individually and as part of a team.
- Love innovation and are always looking for ways to improve.
- Have MLOps experience (mainly with time-series forecasting, LLM and text analysis, and classification & clustering problems).

It would be fantastic (but not essential) if you:
- Hold a degree in data science, mathematics, statistics, or computer science.
- Have experience with BigQuery, Vertex AI, DBT, Databricks, or Terraform.
- Are familiar with containerisation and serverless architecture (Docker/Kubernetes/GCP).
- Have worked with BI tools or data visualisation frameworks (e.g. Looker, Power BI).
- Have exposure to financial data systems or the hospitality industry.

Preferred technical skill set:
- Google Cloud Platform (BigQuery, Vertex AI, Cloud Run)
- Python (Django/Flask)
- Azure (MS SQL Server, Databricks)
- Postman (API development)
- DBT, stored procedures
- ML (time-series forecasting, LLM, text analysis, classification)
- Tableau/Looker Studio/Power BI
Posted 3 weeks ago
2.0 - 7.0 years
4 - 7 Lacs
Thiruvananthapuram
Work from Office
Service Manager

Role & responsibilities:
- Primary responsibility will be to provide commercial and administrative support to the service department.
- Plan and attain the sales target of the service department.
- Handle service department operations, including day-to-day work allocation and planning of service engineers/technicians for job execution at site.
- Coordinate service activities with in-house technicians and subcontractors for the West India territory and key accounts.
- Spare part management and accountability.
- Monitor billings and receivables.
- Collection of payments and necessary tax forms.
- Prepare quotations for service contracts and casual service jobs.
- Prepare, maintain, and regularly update the AMC database and ensure renewal of contracts.
- Manage AMC accounts of customers.
- Prepare and coordinate service schedules.
- Prepare and update the list of warranty units and AMC units.
- Visit key customers for meetings as and when required.
- Depute technicians to sites for service works and approve their expense vouchers.
- Monitor service report files and the database.
- Organise training programs for on-field service technicians and subcontractors, including customer maintenance personnel, from time to time.
- Supervise and monitor the customer complaint register to ensure no escalation of open calls.
- Involve technical resources to analyse the root cause of repeated complaints or failures, and maintain reports of such findings and analyses for future reference.
- Check quality of work, measurement certification, billing certification, and collection of payments.
- Material planning and delivery arrangement for projects/service.
- Check service reports and inform of material requirements.
- Close complaints within the specified response and resolution times.
- Monitor movements of service people.
- Coordinate with vendors/contractors.
- Support the Sales, Project & Service function in all aspects required from time to time.
Posted 3 weeks ago
6.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Role: Senior Data Analyst. Experience: 6 to 10 years. Location: Bangalore, Pune, Hyderabad, Gurgaon, Noida. Notice: Immediate joiners only. Key skills: EDA (Exploratory Data Analysis), communication, strong hands-on SQL, documentation experience, GCP experience, data pipeline experience.

Requirements:
- 8+ years of experience in data mining, working with large relational databases using advanced data extraction and manipulation tools (for example, BigQuery, Teradata, etc.) on both structured and unstructured data.
- Excellent communication skills, both written and verbal; able to explain solutions and problems in a clear and concise manner.
- Experience in conducting business analysis to capture requirements from non-technical partners.
- Superb analytical and conceptual thinking skills; able not only to manipulate data but also to derive relevant interpretations from it.
- Proven knowledge of the data management lifecycle, including experience with data quality and metadata management.
- Background in Computer Science, Statistics, Mathematics, or Information Systems.
- Experience in cloud, including GCP BigQuery and complex SQL querying.
- 1-2 years of experience/exposure in the following:
  1. CI/CD release processes using GitLab, Jira, and Confluence.
  2. Familiarity with creating YAML files and understanding unstructured data such as JSON.
  3. Experience with Looker Studio and Dataplex is a plus.
- Hands-on engineering experience is an asset.
- Exposure to Python or Java is nice to have.
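The JSON-handling requirement in the listing above is worth practising concretely. Below is a minimal, stdlib-only sketch (the record shape and key names are invented for illustration) that flattens a nested JSON document into dotted column names, the usual first step before loading such data into a relational table:

```python
import json

def flatten(record, parent_key="", sep="."):
    """Recursively flatten a nested JSON object into dotted column names."""
    items = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            items.update(flatten(value, new_key, sep))
        else:
            items[new_key] = value
    return items

# A toy nested event record, as might arrive from an application log.
raw = '{"user": {"id": 7, "geo": {"city": "Pune"}}, "event": "click"}'
row = flatten(json.loads(raw))
print(row)  # {'user.id': 7, 'user.geo.city': 'Pune', 'event': 'click'}
```

The same dotted-path convention mirrors how nested fields are addressed in warehouse SQL dialects, which makes the flattened keys easy to map onto column names.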
Posted 3 weeks ago
3.0 - 6.0 years
5 - 9 Lacs
Pune
Work from Office
Data engineers are responsible for building reliable and scalable data infrastructure that enables organizations to derive meaningful insights, make data-driven decisions, and unlock the value of their data assets.

Grade Specific: The role supports the team in building and maintaining data infrastructure and systems within an organization.

Skills (competencies): Ab Initio, Agile (Software Development Framework), Apache Hadoop, AWS Airflow, AWS Athena, AWS CodePipeline, AWS EFS, AWS EMR, AWS Redshift, AWS S3, Azure ADLS Gen2, Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Event Hub, Azure Stream Analytics, Azure Synapse, Bitbucket, Change Management, Client Centricity, Collaboration, Continuous Integration and Continuous Delivery (CI/CD), Data Architecture Patterns, Data Format Analysis, Data Governance, Data Modeling, Data Validation, Data Vault Modeling, Database Schema Design, Decision-Making, DevOps, Dimensional Modeling, GCP BigTable, GCP BigQuery, GCP Cloud Storage, GCP Dataflow, GCP Dataproc, Git, Greenplum, HQL, IBM DataStage, IBM DB2, Industry Standard Data Modeling (FSLDM), Industry Standard Data Modeling (IBM FSDM), Influencing, Informatica IICS, Inmon methodology, JavaScript, Jenkins, Kimball, Linux - RedHat, Negotiation, Netezza, NewSQL, Oracle Exadata, Performance Tuning, Perl, Platform Update Management, Project Management, PySpark, Python, R, RDD Optimization, CentOS, SAS, Scala, Spark, Shell Script, Snowflake, Spark Code Optimization, SQL, Stakeholder Management, Sun Solaris, Synapse, Talend, Teradata, Time Management, Ubuntu, Vendor Management
Posted 3 weeks ago
8.0 - 10.0 years
12 - 22 Lacs
Pune, Bengaluru, Delhi / NCR
Hybrid
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Consultant - Insurance Manual Functional Testing. We are seeking a dynamic and versatile consultant with deep expertise in the insurance industry and hands-on exposure to test automation (Playwright preferred). The ideal candidate will bring industry knowledge and strong testing skills, driving innovation and engagement within the company and the broader community.

Responsibilities:
- Working knowledge of Manual Functional Testing.
- Experience in TestRail, JIRA & Confluence.
- Experience in the insurance P&C (Property and Casualty) domain.
- Working knowledge of Behaviour Driven Development (BDD) and DevOps concepts.
- Strong understanding of and experience with the Agile framework (design, terminology, ceremonies, construct, etc.).
- Working knowledge of Agile and test management tools (i.e., JIRA, X-Ray, etc.).
- Understanding of several types of automation frameworks: behaviour-driven, data-driven, etc.
- General testing skills, including manual testing, functional test case preparation, test data preparation, test environment setup, etc.
- Effective communication skills; highly proactive in approach.
- Ability to manage and prioritize deliverables.
- Ability to learn and apply new processes and tools.

Qualifications we seek in you!
Minimum Qualifications / Skills:
- BE/B Tech/MCA/M Tech.
- Valid and relevant years of testing experience.

Preferred Qualifications / Skills:
- Strong specialty insurance domain & IT knowledge.
- Manual functional testing with good integration testing experience.
- Automation testing (Playwright preferable).
- Data warehouse testing experience (mandatory).
- Should be able to test web apps, desktop apps, API services, DB, and data validations.
- Experience with JIRA/Remedy tools.
- Iterative/Agile/DevOps/ITIL practices & tools.
- Execution of transformation, integration & automation programs/projects.
- Excellent verbal and written communication skills and analytical reasoning ability.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at www.genpact.com and on X, Facebook, LinkedIn, and YouTube. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Posted 3 weeks ago
8.0 - 12.0 years
12 - 13 Lacs
Hyderabad
Work from Office
We are seeking a Technical Architect specializing in Healthcare Data Analytics with expertise in Google Cloud Platform (GCP). The role involves designing and implementing data solutions tailored to healthcare analytics requirements. The ideal candidate will have experience in GCP tools like BigQuery, Dataflow, Dataprep, and Healthcare APIs, and should stay up to date with GCP updates. Knowledge of healthcare data standards, compliance requirements (e.g., HIPAA), and healthcare interoperability is essential. The role requires experience in microservices, containerization (Docker, Kubernetes), and programming languages like Python and Spark. The candidate will lead the implementation of data analytics solutions and collaborate with cross-functional teams, data scientists, and engineers to deliver secure, scalable systems.
Posted 3 weeks ago
12.0 - 17.0 years
30 - 45 Lacs
Bengaluru
Hybrid
Required Skills: Successful candidates will have demonstrated the following skills and characteristics:

Must have:
- Proven expertise in supply chain analytics across domains such as demand forecasting, inventory optimization, logistics, segmentation, and network design.
- Well versed in and hands-on experience with optimization methods such as linear programming, mixed-integer programming, and scheduling optimization. An understanding of third-party optimization solvers like Gurobi is an added advantage.
- Proficiency in forecasting techniques (e.g., Holt-Winters, ARIMA, ARIMAX, SARIMA, SARIMAX, FBProphet, NBeats) and machine learning techniques (supervised and unsupervised).
- Strong command of statistical modeling, testing, and inference.
- Proficient in using GCP tools: BigQuery, Vertex AI, Dataflow, Looker.
- Building data pipelines and models for forecasting, optimization, and scenario planning.
- Strong SQL and Python programming skills; experience deploying models in a GCP environment.
- Knowledge of orchestration tools like Cloud Composer (Airflow).

Nice to have:
- Familiarity with MLOps, containerization (Docker, Kubernetes), and orchestration tools (e.g., Cloud Composer).
- Strong communication and stakeholder engagement skills at the executive level.

Roles and Responsibilities:
- Assist analytics projects within the supply chain domain, driving design, development, and delivery of data science solutions.
- Develop and execute on project & analysis plans under the guidance of the Project Manager.
- Interact with and advise consultants/clients in the US as a subject matter expert, to formalize the data sources to be used, the datasets to be acquired, and the data & use case clarifications needed to get a strong hold on the data and the business problem to be solved.
- Drive and conduct analysis using advanced analytics tools and coach junior team members.
- Implement the necessary quality control measures to ensure deliverable integrity, such as data quality, model robustness, and explainability for deployments.
- Validate analysis outcomes and recommendations with all stakeholders, including the client team.
- Build storylines and make presentations to the client team and/or PwC project leadership team.
- Contribute to knowledge and firm building activities.
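As a flavour of the forecasting techniques this listing names, here is a minimal stdlib-only sketch of Holt's linear-trend exponential smoothing, the trend component underlying Holt-Winters; the demand series and smoothing parameters are invented for illustration, and production work would normally use a library such as statsmodels:

```python
def holt_forecast(series, alpha=0.5, beta=0.3, horizon=3):
    """Holt's linear-trend exponential smoothing; returns point forecasts.

    alpha smooths the level, beta smooths the trend (illustrative values).
    """
    level, trend = series[0], series[1] - series[0]
    for y in series[1:]:
        last_level = level
        level = alpha * y + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
    # h-step-ahead forecast: last level plus h increments of the trend.
    return [level + (h + 1) * trend for h in range(horizon)]

demand = [100, 110, 120, 130, 140]  # toy, perfectly linear demand series
print(holt_forecast(demand))  # ≈ [150.0, 160.0, 170.0]
```

On a perfectly linear series the method locks onto the trend exactly, which makes it a handy sanity check before applying it to noisy demand data.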
Posted 3 weeks ago
5.0 - 10.0 years
7 - 14 Lacs
Gurugram, Bengaluru
Work from Office
Required Skills:
- 5+ years of hands-on experience in the relevant field.
- Knowledge of platforms like BigQuery.
- Experience with AWS, Azure, or Google Cloud for scalable data solutions.
- Should be skilled in Azure SQL Managed Services.
- Skilled in utilizing Incorta's direct data mapping for optimized data ingestion.
- Expertise in Extract, Transform, Load (ETL) processes and workflow automation.
- AI-enhanced automation, AMS, Oracle AMS, and Salesforce exposure are added skills.

Role & responsibilities:
- Build and optimize data pipelines for ingestion, transformation, and storage using BigQuery, Azure SQL Managed Services, and Incorta.
- Implement AI-driven automation for data pipeline monitoring, performance tuning, and anomaly detection.
- Ensure data governance, security, and compliance standards are met across all platforms.
- Optimize data workflows for cost efficiency and scalability.
- Collaborate with BI, AI, and application teams for seamless data access and analytics.
- Integrate data from multiple sources, including Salesforce, Oracle AMS, and other enterprise applications.
Posted 3 weeks ago
5.0 - 6.0 years
12 - 14 Lacs
Gurugram, Bengaluru
Work from Office
GCP Data Engineer. Location: Gurgaon / Bengaluru. Experience: 5+ years. Job Type: Full-time.

Role Overview: The Data Engineer will focus on developing, maintaining, and optimizing data pipelines. You will work on BigQuery, Azure SQL Managed Services, and Incorta, ensuring efficient data ingestion, transformation, and governance while integrating AI-driven automation.

Key Responsibilities:
- Build and optimize data pipelines for ingestion, transformation, and storage using BigQuery, Azure SQL Managed Services, and Incorta.
- Implement AI-driven automation for data pipeline monitoring, performance tuning, and anomaly detection.
- Ensure data governance, security, and compliance standards are met across all platforms.
- Optimize data workflows for cost efficiency and scalability.
- Collaborate with BI, AI, and application teams for seamless data access and analytics.
- Integrate data from multiple sources, including Salesforce, Oracle AMS, and other enterprise applications.

Required Skills:
- Primary: BigQuery, ETL Development, Azure SQL Managed Services, Incorta.
- Secondary: Data Pipeline Optimization, Cost Optimization, Pipeline Maintenance & Automation.
- Additional: AMS, AI-enhanced automation, Salesforce exposure, Oracle AMS.

Regards, Team BGT bougaintechbgt@gmail.com 9560201779
Posted 3 weeks ago
4.0 - 9.0 years
8 - 18 Lacs
Hyderabad, Bengaluru
Work from Office
Role & responsibilities

Job Description: We are looking for an independent contributor experienced in the Data Engineering space. Primary responsibilities include implementation of large-scale data processing (structural, statistical, etc.) pipelines, creating production inference pipelines, associated APIs, and analytics that support and provide insights for data-driven decision making. The role designs and develops data models, APIs, and pipelines to handle analytical workloads, data sharing, and movement across multiple systems at various grains in a large-scale data processing environment; designs and maintains data systems and data structures for optimal read/write performance; and implements machine learning or statistical/heuristic learning in data pipelines based on input from Data Scientists.

Roles and Responsibilities:
- Work in data streaming, movement, data modelling, and data pipeline development.
- Develop pipelines and data model changes in support of rapidly emerging business and project requirements.
- Develop code and maintain systems to support the analytics infrastructure & data lake.
- Partner on and contribute to data analysis and machine learning pipelines.
- Design data recovery processes and alternate pipelines to check data quality.
- Create and maintain continuous data quality evaluation processes.
- Optimize performance of the analytics platform and develop self-healing workflows.
- Be part of a global team; collaborate and co-develop solutions.

Qualifying Criteria:
- Bachelor's degree in computer science, information technology, or engineering.
- 5+ years of prior experience in Data Engineering and databases.
- Experience with a code-based ETL framework like Airflow/Prefect.
- Experience with Google BigQuery, Google Pub/Sub, and Google Dataflow.
- Experience building data pipelines on AWS or GCP.
- Experience developing data APIs and pipelines using Python.
- Experience with databases like MySQL/Postgres.
- Experience with intermediate Python programming.
- Experience with advanced SQL (analytical queries).

Preferred Qualifications:
- Experience with visualization tools like Tableau/QlikView/Looker.
- Experience building machine learning pipelines.

Mandatory Skills: Data Engineering, Python, Airflow, AWS/Google Cloud/GCP, Data Streaming, Data Lake, Data Pipelines, BigQuery, ETL, Google Pub/Sub, Google Dataflow, REST API, MySQL, Postgres, SQL Analytics
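"Advanced SQL (analytical queries)", as asked for above, usually means window functions. A small sketch using Python's stdlib sqlite3 module is shown below; the table and data are invented for illustration, and the same OVER/PARTITION BY pattern carries over to BigQuery and Postgres:

```python
import sqlite3

# In-memory database with a toy sales table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, day INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("north", 1, 100.0), ("north", 2, 150.0),
     ("south", 1, 80.0), ("south", 2, 120.0)],
)

# Running total per region: an analytical (window) query.
rows = conn.execute(
    """
    SELECT region, day, amount,
           SUM(amount) OVER (PARTITION BY region ORDER BY day) AS running_total
    FROM sales
    ORDER BY region, day
    """
).fetchall()
for r in rows:
    print(r)
```

Note that the window `SUM` keeps one output row per input row, unlike `GROUP BY`, which is exactly the distinction interviewers tend to probe.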
Posted 3 weeks ago
5.0 - 8.0 years
9 - 19 Lacs
Gurugram, Bengaluru
Work from Office
Hi, greetings of the day! Hiring for an MNC for a Sr Data Engineer.

Profile: Sr Data Engineer. Experience: 4-10 years. Interview Mode: Virtual.
Mandatory Skills: PySpark, Python, AWS (Glue, EC2, Redshift, Lambda), Spark, Big Data, ETL, SQL, Data Warehousing.
Good to have: Data structures and algorithms.

Responsibilities:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience as a Data Engineer or in a similar role.
- Experience with Python and big data technologies (Hadoop, Spark, Kafka, etc.).
- Experience with relational SQL and NoSQL databases.
- Strong analytic skills related to working with unstructured datasets.
- Strong project management and organizational skills.
- Experience with AWS cloud services: EC2, Lambda (Step Functions), RDS, Redshift.
- Ability to work in a team environment.
- Excellent written and verbal communication skills.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.

Interested candidates can share their resume at avanya@niftelresources.com or contact 9219975840.
Posted 3 weeks ago
13.0 - 20.0 years
0 Lacs
Pune
Work from Office
Data Platform Engineer - Technical Skills:
- 7+ years of experience in software engineering, with a focus on Big Data and GCP technologies such as Hadoop, PySpark, Terraform, BigQuery, and Dataproc, and on data management.
- Proven experience in leading software engineering teams, with a focus on mentorship, guidance, and team growth.
- Strong expertise in designing and implementing data pipelines, including ETL processes and real-time data processing.
- Hands-on experience with Hadoop ecosystem tools such as HDFS, MapReduce, Hive, Pig, and Spark.
- Hands-on experience with a cloud platform, particularly Google Cloud Platform (GCP) and its data management services (e.g., Terraform, BigQuery, Cloud Dataflow, Cloud Dataproc, Cloud Storage).
- Solid understanding of data quality management and best practices for ensuring data integrity.
- Familiarity with containerization and orchestration tools such as Docker and Kubernetes is a plus.
- Excellent problem-solving skills and the ability to troubleshoot complex systems.
- Strong communication skills and the ability to collaborate with both technical and non-technical stakeholders.
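The ETL pipeline skills this listing emphasises can be sketched in miniature with stdlib-only Python: an extract-transform-load flow with basic row validation, using SQLite as a stand-in for a warehouse such as BigQuery (the CSV payload, table, and column names are all illustrative):

```python
import csv
import io
import sqlite3

# Extract: read raw CSV (an in-memory string stands in for a file or bucket object).
raw = "id,amount\n1,10.5\n2,bad\n3,7.0\n"
records = list(csv.DictReader(io.StringIO(raw)))

# Transform: cast types and quarantine rows that fail validation.
clean, rejected = [], []
for rec in records:
    try:
        clean.append((int(rec["id"]), float(rec["amount"])))
    except ValueError:
        rejected.append(rec)  # kept aside for a data-quality report

# Load: write only the validated rows to the warehouse table.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE fact_amounts (id INTEGER, amount REAL)")
db.executemany("INSERT INTO fact_amounts VALUES (?, ?)", clean)
db.commit()

total = db.execute("SELECT SUM(amount) FROM fact_amounts").fetchone()[0]
print(total, len(rejected))  # 17.5 1
```

The quarantine list is the seed of the "data quality evaluation" responsibility: in a real pipeline it would feed a rejects table or an alerting metric rather than being silently dropped.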
Posted 3 weeks ago
BigQuery, a powerful cloud-based data warehouse provided by Google Cloud, is in high demand in the job market in India. Companies are increasingly relying on BigQuery to analyze and manage large datasets, driving the need for skilled professionals in this area.
The average salary range for BigQuery professionals in India varies based on experience level. Entry-level positions may start at around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15-20 lakhs per annum.
In the field of BigQuery, a typical career progression may include roles such as Junior Developer, Developer, Senior Developer, Tech Lead, and eventually moving into managerial positions such as Data Architect or Data Engineering Manager.
Alongside BigQuery, professionals in this field often benefit from having skills in SQL, data modeling, data visualization tools like Tableau or Power BI, and cloud platforms like Google Cloud Platform or AWS.
As you explore opportunities in the BigQuery job market in India, remember to continuously upskill and stay updated with the latest trends in data analytics and cloud computing. Prepare thoroughly for interviews by practicing common BigQuery concepts and showcase your hands-on experience with the platform. With dedication and perseverance, you can excel in this dynamic field and secure rewarding career opportunities. Good luck!