3417 Databricks Jobs - Page 4

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

8.0 years

0 Lacs

Greater Kolkata Area

On-site

Role: Technical Architect
Experience: 8-15 years
Location: Bangalore, Chennai, Gurgaon, Pune, and Kolkata
Mandatory Skills: Python, PySpark, SQL, ETL, pipelines, Azure Databricks, Azure Data Factory, and architecture design.

Primary Roles and Responsibilities:
- Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack
- Provide forward-thinking solutions in the data engineering and analytics space
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements
- Triage issues to find gaps in existing pipelines and fix them
- Work with the business to understand reporting-layer needs and develop data models to fulfill them
- Help junior team members resolve issues and technical challenges
- Drive technical discussions with client architects and team members
- Orchestrate data pipelines via the Airflow scheduler

Skills and Qualifications:
- Bachelor's and/or master's degree in computer science, or equivalent experience
- 8+ years of total IT experience, including 5+ years in data warehouse/ETL projects
- Deep understanding of star and snowflake dimensional modelling
- Strong knowledge of data management principles
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture
- Hands-on experience in SQL, Python, and Spark (PySpark)
- Experience with the AWS/Azure stack
- ETL with batch and streaming (Kinesis) is desirable
- Experience building ETL / data warehouse transformation processes
- Experience with Apache Kafka for streaming / event-based data
- Experience with other open-source big data products, including Hadoop (Hive, Pig, Impala)
- Experience with open-source non-relational / NoSQL data repositories (MongoDB, Cassandra, Neo4j)
- Experience working with structured and unstructured data, including imaging and geospatial data
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Databricks Certified Data Engineer Associate/Professional certification (desirable)
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects
- Experience working in Agile methodology
- Strong verbal and written communication skills
- Strong analytical and problem-solving skills with high attention to detail
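
The Airflow orchestration duty above can be pictured with a minimal DAG sketch; the DAG id, schedule, and the placeholder ETL callable are hypothetical illustrations, not part of the posting (assumes Airflow 2.4+ for the `schedule` argument).

```python
# A minimal sketch of scheduling a daily ETL step in Airflow; in practice the
# callable would submit a Databricks notebook or spark-submit job.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def run_etl_step():
    # Placeholder for the real extract -> transform -> load logic
    print("extract -> transform -> load")


with DAG(
    dag_id="dw_daily_load",           # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    etl = PythonOperator(task_id="etl_step", python_callable=run_etl_step)
```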

Posted 1 day ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

HCL is hiring for an AI/ML Developer role.
Location: Noida (Hybrid)
Must-have skills: Generative AI (GPT-3), MLOps, and Python

- Proficient in Python, with experience in machine learning, deep learning, and NLP.
- Experience developing and implementing generative AI models, with a strong understanding of deep learning techniques such as GPT, VAEs, and GANs.
- Proficient in LangChain.
- LLM prompt engineering: engineer prompts and optimize few-shot techniques to enhance LLM performance on specific tasks, e.g., personalized recommendations.
- Model evaluation and optimization: evaluate LLM zero-shot and few-shot capabilities, fine-tune hyperparameters, ensure task generalization, and explore model interpretability for robust web app integration.
- Response quality: collaborate with ML and integration engineers to leverage the LLM's pre-trained potential, delivering contextually appropriate responses in a user-friendly web app.
- Solid understanding of data structures, algorithms, and software engineering principles.
- Experience with vector databases, RDBMS, MongoDB, and NoSQL databases; proficiency in working with embeddings.
- Strong distributed systems and system architecture skills; experienced in building and running a large platform at scale.
- Hands-on experience with Python, Hugging Face, TensorFlow, Keras, PyTorch, Spark, or similar statistical tools.
- Experience as an ML/NLP data modeling scientist, including performance tuning, fine-tuning, RLHF, and performance optimization.
- Validated background with ML toolkits such as PyTorch, TensorFlow, Keras, LangChain, LlamaIndex, SparkML, or Databricks.
- Proficient in integrating data from multiple data sources.
- Experience with NoSQL databases such as HBase, Elasticsearch, and MongoDB; API design and API/data mapping to schema.
- Strong knowledge of applying AI/ML, particularly LLMs, and eagerness to apply this rapidly changing technology.
- Good knowledge of Kubernetes and RESTful design.
- Prior experience developing public cloud services or open-source ML software is an advantage.
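
A minimal sketch of the few-shot prompting technique this posting names, assuming an OpenAI-compatible chat API; the model choice and example messages are hypothetical.

```python
# Few-shot prompting: prepend worked examples so the model imitates the pattern.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

few_shot = [
    {"role": "system", "content": "Recommend one product category per user note."},
    {"role": "user", "content": "Loves trail running and camping."},
    {"role": "assistant", "content": "Outdoor gear"},
    {"role": "user", "content": "Buys mystery novels every month."},
    {"role": "assistant", "content": "Books"},
]

resp = client.chat.completions.create(
    model="gpt-4o-mini",  # hypothetical model choice
    messages=few_shot + [{"role": "user", "content": "Collects vintage film cameras."}],
)
print(resp.choices[0].message.content)
```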

Posted 1 day ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Modernization Program Manager – Director Level (Pune, India)
Years of experience: 8 to 13
Location: Pune, India
Seniority: Director / Program Leader
Industry: IT, Cloud, Digital Transformation

Key Responsibilities:
- Lead end-to-end offshore modernization program management
- Manage program planning, delivery, tracking, and risk mitigation
- Coordinate with US stakeholders, GCC IT, and third-party contractors
- Handle contracts: SOWs, MSAs, and vendor agreements
- Track budget, subcontractor burn rate, and project milestones
- Act as the main escalation point for daily operations and risks
- Set up processes to monitor program deliverables and timelines
- Identify and manage risks, issues, and action plans
- Align offshore delivery with global strategy and business goals
- Drive collaboration across teams to meet program objectives

Key Skills & Experience:
- 5+ years in program management, client delivery, or consulting
- Experience with global teams and complex program leadership
- Strong in contract management and budget tracking
- Excellent communication with business and technical teams
- Background in Agile, DevOps, and cloud technologies
- Knowledge of Snowflake, Databricks, or modern cloud platforms
- Skilled in roadmap creation, stakeholder engagement, and risk management
- Strong problem-solving, decision-making, and prioritization skills

Education: Master's in Project Management, Computer Science, IT, Engineering, Data Science, or related fields

Posted 1 day ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Title: Data Scientist – Data and Artificial Intelligence
Location: Hyderabad
Job Type: Full-time

Company Description
Qylis is a leading provider of innovative IT solutions, specializing in Cloud, Data & AI, and Cyber Security. We help businesses unlock the full potential of these technologies to achieve their goals and gain a competitive edge. Our unique approach focuses on delivering value through bespoke solutions tailored to customer-specific needs. We are driven by a customer-centric mindset and committed to delivering continuous value through intellectual-property accelerators and automation. Our team of experts is passionate about technology and dedicated to making a positive impact. We foster an environment of growth and innovation, constantly pushing the boundaries to deliver exceptional results. Website: www.qylis.com, LinkedIn: www.linkedin.com/company/qylis

Job Summary
We are an engineering organization collaborating directly with clients to address challenges using the latest technologies. Our focus is on joint code development with clients' engineers for cloud-based solutions, accelerating organizational progress. Working with product teams, partners, and open-source communities, we contribute to open source, striving for platform improvement. This role involves creating impactful solution patterns and open-source assets. As a team member, you'll collaborate with engineers from both teams, applying your skills and creativity to solve complex challenges and contribute to open source, while fostering professional growth.

Responsibilities
- Research and develop production-grade models (forecasting, anomaly detection, optimization, clustering, etc.) for a global cloud business using statistical and machine learning techniques.
- Manage large volumes of data, and create new and improved solutions for data collection, management, analysis, and data science model development.
- Drive the onboarding of new data and the refinement of existing data sources through feature engineering and feature selection.
- Apply statistical concepts and cutting-edge machine learning techniques to analyze cloud demand, and optimize data science model code for distributed computing platforms and task automation.
- Work closely with other data scientists and data engineers to deploy models that drive cloud infrastructure capacity planning.
- Present analytical findings and business insights to project managers, stakeholders, and senior leadership; keep abreast of new statistical and machine learning techniques and implement them as appropriate to improve predictive performance.
- Oversee the analysis of data and lead the team in identifying trends, patterns, correlations, and insights to develop new forecasting models and improve existing ones.
- Lead collaboration across the team and leverage data to identify pockets of opportunity where state-of-the-art algorithms can improve a solution to a business problem.
- Consistently apply knowledge of techniques to optimize analysis using algorithms; modify statistical analysis tools for evaluating machine learning models.
- Solve deep and challenging problems, such as when model predictions are incorrect, when models do not match the training data or the design outcomes, when the data is not clean, when it is unclear which analyses to run, and when the process is ambiguous.
- Provide coaching to team members on business context, interpretation, and the implications of findings.
- Interpret findings and their implications for multiple businesses, and champion methodological rigour by calling attention to the limitations of knowledge wherever biases in data, methods, and analysis exist.
- Generate and leverage insights that inform future studies and reframe the research agenda; inform current business decisions by implementing and adapting supply-chain strategies through complex business intelligence.

Qualifications
- M.Sc. in Statistics, Applied Mathematics, Applied Economics, Computer Science or Engineering, Data Science, Operations Research, or a similar applied quantitative field.
- 7+ years of industry experience developing production-grade statistical and machine learning code in a collaborative team environment.
- Prior experience in machine learning using R or Python (scikit-learn / NumPy / pandas / statsmodels).
- Prior experience working on computer vision projects is a plus.
- Knowledge of the AWS and Azure clouds.
- Prior experience in time series forecasting.
- Prior experience with typical data management systems and tools such as SQL.
- Knowledge of, and ability to work within, a large-scale computing or big data context, with hands-on experience in Hadoop, Spark, Databricks, or similar.
- Excellent analytical skills; ability to understand business needs and translate them into technical solutions, including analysis specifications and models.
- Experience in machine learning using R or Python (scikit-learn / NumPy / pandas / statsmodels) at or near fluency.
- Experience with deep learning frameworks (e.g., TensorFlow, PyTorch, CNTK) and solid knowledge of theory and practice.
- Practical and professional experience contributing to and maintaining a large code base with code versioning systems such as Git.
- Creative thinking skills, with an emphasis on developing innovative methods to solve hard problems under ambiguity.
- Good interpersonal and communication (verbal and written) skills, including the ability to write concise and accurate technical documentation and communicate technical ideas to non-technical audiences.

Skills: Python, FastAPI, Large Language Models (LLM) tuning, and MySQL
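
A minimal time-series forecasting sketch in the statsmodels/pandas stack the posting names; the synthetic demand series and ARIMA order are illustrative only.

```python
# Fit a simple ARIMA model to a synthetic monthly demand series and forecast ahead.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Synthetic monthly demand with trend plus noise (stand-in for real cloud-demand data)
idx = pd.date_range("2022-01-01", periods=36, freq="MS")
demand = pd.Series(100 + np.arange(36) * 2.5 + np.random.normal(0, 5, 36), index=idx)

model = ARIMA(demand, order=(1, 1, 1))  # order chosen for illustration only
fitted = model.fit()
print(fitted.forecast(steps=6))  # six-month-ahead demand forecast
```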

Posted 1 day ago

Apply

7.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Job Title: Senior Technical Delivery Manager – ETL, Data Warehouse and Analytics
Experience: 15+ years in IT delivery management, with at least 7 years in Big Data, Cloud, and Analytics. Experience should span ETL, data management, data visualization, and project management.
Location: Mumbai, India
Department: Big Data and Cloud – Data Analytics Delivery

Company: Smartavya Analytica Private Limited is a niche Data and AI company based in Pune. We are pioneers in data-driven innovation, transforming enterprise data into strategic insights. Established in 2017, our team has experience handling large datasets of up to 20 PB in a single implementation and has delivered many successful data and AI projects across major industries, including retail, finance, telecom, manufacturing, insurance, and capital markets. We are leaders in Big Data, Cloud, and Analytics projects, specialising in very large data platforms. https://smart-analytica.com
Empowering Your Digital Transformation with Data Modernization and AI

Job Overview: Smartavya Analytica Private Limited is seeking an experienced Senior Delivery Manager to oversee and drive the successful delivery of large-scale Big Data, Cloud, and Analytics projects. The ideal candidate will have a strong background in IT delivery management, excellent leadership skills, and a proven record of managing complex projects from initiation to completion, with the right blend of client engagement, project delivery, and data management skills.

Key Responsibilities:
• Technical Project Management:
  o Lead the end-to-end technical delivery of multiple projects in Big Data, Cloud, and Analytics; lead teams in technical solutioning, design, and development.
  o Develop detailed project plans, timelines, and budgets, ensuring alignment with client expectations and business goals.
  o Monitor project progress, manage risks, and implement corrective actions as needed to ensure timely, quality delivery.
• Client Engagement and Stakeholder Management:
  o Build and maintain strong client relationships, acting as the primary point of contact for project delivery.
  o Understand client requirements, anticipate challenges, and provide proactive solutions.
  o Coordinate with internal and external stakeholders to ensure seamless project execution.
  o Communicate project status, risks, and issues to senior management and stakeholders in a clear and timely manner.
• Team Leadership:
  o Lead and mentor a team of data engineers, analysts, and project managers.
  o Ensure effective resource allocation and utilization across projects.
  o Foster a culture of collaboration, continuous improvement, and innovation within the team.
• Technical and Delivery Excellence:
  o Leverage data management expertise to guide and lead technical conversations effectively; identify technical areas where the team needs support and work to resolve them, either through your own expertise or by networking with internal and external stakeholders to unblock the team.
  o Implement best practices in project management, delivery, and quality assurance.
  o Drive continuous improvement initiatives to enhance delivery efficiency and client satisfaction.
  o Stay updated on the latest trends and advancements in Big Data, Cloud, and Analytics technologies.

Requirements:
• Experience in IT delivery management, particularly in Big Data, Cloud, and Analytics.
• Strong knowledge of project management methodologies and tools (e.g., Agile, Scrum, PMP).
• Excellent leadership, communication, and stakeholder management skills.
• Proven ability to manage large, complex projects with multiple stakeholders.
• Strong critical thinking skills and the ability to make decisions under pressure.

Academic Qualifications:
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• Relevant certifications in Big Data, cloud platforms (e.g., GCP, Azure, AWS, Snowflake, Databricks), project management, or similar areas are preferred.

If you have a passion for leading high-impact projects and delivering exceptional results, we encourage you to apply and be a part of our innovative team at Smartavya Analytica Private Limited.

Posted 1 day ago

Apply

4.0 - 6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

The Purpose of the Role
We are seeking a full-stack data scientist for the Advanced Analytics team who will be at the forefront of developing innovative, data-driven solutions with cutting-edge machine learning and AI, end to end. The AI/ML Data Scientist is a technical role that uses AI and machine learning techniques to automate underwriting processes and improve claims outcomes and/or risk solutions. This person will develop data science solutions that require data engineering, AI/ML algorithm, and Ops engineering skills to build and deploy them for the business. The ideal candidate has a strong education in computer science, data science, statistics, applied math, or a related field, and is eager to tackle problems with innovative thinking without compromising detailed business insights. You are adept at solving diverse problems by utilizing a variety of tools, strategies, machine learning techniques, algorithms, and programming languages.

Responsibilities
- Work with business partners globally; determine analyses to be performed, manage deliverables against timelines, present results, and implement the model.
- Use a broad spectrum of machine learning, text, and image AI models to extract impactful features from structured and unstructured data.
- Develop and implement models that help with automation, insights, and smart decisions; ensure that the model meets the desired KPIs post-production.
- Develop and deploy scalable and efficient machine learning models.
- Package and publish code and solutions in reusable Python package formats (PyPI, scikit-learn pipelines, etc.).
- Keep code ready for seamless building of CI/CD pipelines and workflows for machine learning applications.
- Ensure high-quality code that meets business objectives, quality standards, and secure development guidelines.
- Build reusable tools to streamline the modeling pipeline and share knowledge.
- Build real-time monitoring and alerting systems for machine learning systems.
- Develop and maintain automated testing and validation infrastructure.
- Troubleshoot pipelines across multiple touchpoints such as the CI server, artifact storage, and deployment cluster.
- Implement best practices for versioning, monitoring, and reusability.

Requirements
- Sound understanding of ML concepts: supervised/unsupervised learning, ensemble techniques, and hyperparameter tuning.
- Good knowledge of Random Forest, XGBoost, SVM, clustering, building data pipelines in Azure/Databricks, deep learning models, OpenCV, BERT and newer transformer models for NLU, and LLM applications in ML.
- Strong experience with Azure cloud computing and containerization technologies (e.g., Docker, Kubernetes).
- 4-6 years of experience delivering end-to-end data science models.
- Experience with Python/OOP programming and data science frameworks (pandas, NumPy, TensorFlow, Keras, PyTorch, scikit-learn).
- Knowledge of DevOps tools such as Git, Jenkins, Sonar, and Nexus is a must.
- Experience building Python wheels and debugging the build process.
- Data pipeline building and debugging (by creating and following log traces).
- Basic knowledge of DevOps practices; concepts of unit testing and test-driven development.
- SDE skills such as OOP and functional programming are an added advantage.
- Experience with Databricks and its ecosystem is an added advantage.
- Degree in analytics, statistics, mathematics, or a related domain.
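
A minimal sketch of the reusable scikit-learn pipeline pattern the posting mentions; the synthetic data and the choice of Random Forest are illustrative.

```python
# Packaging preprocessing and model as one Pipeline keeps the whole workflow
# serializable and reusable across training and deployment.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

pipe = Pipeline([
    ("scale", StandardScaler()),
    ("clf", RandomForestClassifier(n_estimators=200, random_state=42)),
])
pipe.fit(X_train, y_train)
print("holdout accuracy:", pipe.score(X_test, y_test))
```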

Posted 1 day ago

Apply

2.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title: Data Engineer - Assistant Manager

Job Description
Job title: Data Engineer - Deputy Manager
Location: Chennai

Your role: The Data Engineering manager needs to be well versed in the Microsoft Business Intelligence stack, with strong skills and experience in the development and implementation of BI and advanced analytics solutions per business requirements.
- Strong hands-on experience with Microsoft ADF pipelines, Databricks notebooks, and PySpark.
- Adept in the design and development of data flows using ADF.
- Expertise in implementing complex ETL logic through Databricks notebooks.
- Experience implementing CI/CD pipelines through Azure DevOps.
- Experience writing complex T-SQL.
- Understand business requirements and develop data models accordingly.
- Knowledge and experience in prototyping, design, and requirements analysis.
- Excellent knowledge of data usage, scheduling, data refresh, and diagnostics.
- Experience with tools such as Microsoft Azure, SQL Data Warehouse, Visual Studio, etc.
- Experience working in an agile (Scrum) environment with globally distributed teams.
- Analytical bent of mind; business acumen and articulation skills; ability to capture business needs and translate them into a solution.
- Ability to manage interactions with business stakeholders and others within the organization.
- Good communication and documentation skills.
- Proven experience interfacing with different source systems and in data modelling.

Minimum required education: BE Computer Science / MCA / MSc IT
Minimum required experience: 2 years in Data Engineering or equivalent with a Bachelor's degree.
Preferred certification: Azure ADF/Databricks/T-SQL
Preferred skills: Azure ADF/Databricks, PySpark / T-SQL, data governance, data harmonization and processing, data quality assurance, business intelligence tools, requirements analysis, root cause analysis (RCA), requirements gathering.

How We Work Together
We believe that we are better together than apart. For our office-based teams, this means working in person at least 3 days per week. Onsite roles require full-time presence in the company's facilities. Field roles are most effectively done outside of the company's main facilities, generally at customers' or suppliers' locations.

About Philips
We are a health technology company. We built our entire company around the belief that every human matters, and we won't stop until everybody everywhere has access to the quality healthcare that we all deserve. Do the work of your life to help the lives of others. If you're interested in this role and have many, but not all, of the experiences needed, we encourage you to apply. You may still be the right candidate for this or other opportunities at Philips.
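
A minimal sketch of a Databricks-notebook-style PySpark transform of the kind this role involves; the table and column names are hypothetical placeholders.

```python
# Read a raw table, aggregate, and publish a curated Delta table for BI use.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks

orders = spark.read.table("raw.orders")  # hypothetical source table

daily_revenue = (
    orders
    .filter(F.col("status") == "COMPLETE")
    .groupBy(F.to_date("order_ts").alias("order_date"))
    .agg(F.sum("amount").alias("revenue"))
)

daily_revenue.write.format("delta").mode("overwrite").saveAsTable("curated.daily_revenue")
```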

Posted 1 day ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Primary & Secondary Skill Databricks – Pyspark, Python and Collibra ( Primary) Unity catalog ETL AWS JD (Detailed) Design of data solutions on Databricks including delta lake, data warehouse, data marts and other data solutions to support the analytics needs of the organization. Proficiency in using Collibra Data Governance Center, Data Catalog, and Collibra Connect for data management and governance. Apply best practices during design in data modeling (logical, physical) and ETL pipelines (streaming and batch) using cloud-based services especially Python & Pyspark Design, develop and manage the pipelining (collection, storage, access), data engineering (data quality, ETL, Data Modelling) and understanding (documentation, exploration) of the data. Interact with stakeholders regarding data landscape understanding, conducting discovery exercises, developing proof of concepts, and demonstrating it to stakeholders. Implement data quality frameworks and standards using Collibra to ensure the integrity and accuracy of data Excellent collaboration skills to work effectively with cross-functional teams Strong verbal and written communication skills Show more Show less

Posted 1 day ago

Apply

3.0 - 5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Assistant Vice President level role to join our Analytics & Data organization as part of the Key Platform & Data Solutions function. The candidate will be responsible for the ongoing support, analysis, and reporting of key business data metrics for management and stakeholders, collaborating to help determine timely solutions and providing actionable insights.

Key Responsibilities
Key responsibilities will include, but will not be limited to, the following:
> Work in a fun, fast-paced environment designing, building, and optimizing analytics solutions that drive value for our clients.
> Act as a key support resource for users of the Wealth Management Analytics Platform, assisting them with Dataiku, Tableau, and other tools to solve business problems.
> Develop and maintain data pipelines and workflows in Dataiku, ensuring data quality and governance best practices are followed.
> Leverage Snowflake, Azure, and Databricks to manage, integrate, and transform large datasets for analytics purposes.
> Implement and maintain data security and access controls through Immuta in compliance with organizational policies.
> Participate in proofs of concept (POCs) and evaluation of new analytics tools and technologies.
> Collaborate with development teams to ensure features and functionality meet user needs and align with the product vision.
> Work closely with platform, technology, legal, and compliance teams to drive decisions across cross-functional groups.
> Assist with defining critical user journeys, use cases, workflows, and business processes that align with the product vision and address critical user needs.
> Support the prioritization of the product roadmap, including release planning and development of business requirements.
> Help maintain a prioritized product backlog; identify, groom, and validate epics and user stories for agile sprints.
> Contribute to backlog refinement and sprint planning ceremonies to communicate requirements effectively to the development team.
> Teamwork skills: the candidate must be flexible in their work style and able to work collaboratively with A&D team members in Mumbai, NY/NJ, and Budapest.
> Create documentation, enhance user experience, and drive user adoption.
> Provide reporting support as needed.

Experience
> 3-5 years' experience in a data-centric role (~6 years overall).
> Bachelor's degree required.
> Experience in Data & Analytics and/or Financial Services is a plus.

Required Skills
> Bachelor's degree in Computer Science, Information Systems, Statistics, Mathematics, or a related field.
> Proven hands-on experience with Dataiku for building data pipelines and workflows.
> Knowledge of Snowflake, Azure Data Services, and Databricks for cloud data warehousing and analytics.
> Understanding of Immuta or similar data access control tools.
> Solid understanding of data governance and data quality frameworks and best practices.
> Experience working with SQL and large datasets.
> Strong communication skills, with the ability to support and train users in a dynamic environment.
> Familiarity with agile methodologies and experience contributing to backlog grooming and sprint planning.

What You Can Expect From Morgan Stanley
We are committed to maintaining the first-class service and high standard of excellence that have defined Morgan Stanley for over 89 years. Our values - putting clients first, doing the right thing, leading with exceptional ideas, committing to diversity and inclusion, and giving back - aren't just beliefs; they guide the decisions we make every day to do what's best for our clients, communities, and more than 80,000 employees in 1,200 offices across 42 countries. At Morgan Stanley, you'll find an opportunity to work alongside the best and the brightest, in an environment where you are supported and empowered. Our teams are relentless collaborators and creative thinkers, fueled by their diverse backgrounds and experiences. We are proud to support our employees and their families at every point along their work-life journey, offering some of the most attractive and comprehensive employee benefits and perks in the industry. There's also ample opportunity to move about the business for those who show passion and grit in their work. To learn more about our offices across the globe, please visit https://www.morganstanley.com/about-us/global-offices.

Morgan Stanley is an equal opportunities employer. We work to provide a supportive and inclusive environment where all individuals can maximize their full potential. Our skilled and creative workforce is comprised of individuals drawn from a broad cross-section of the global communities in which we operate, and who reflect a variety of backgrounds, talents, perspectives, and experiences. Our strong commitment to a culture of inclusion is evident through our constant focus on recruiting, developing, and advancing individuals based on their skills and talents.
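
A minimal sketch of the Dataiku pipeline work described above, assuming the standard dataiku Python API available inside the platform; the dataset names and grouping column are hypothetical.

```python
# A Dataiku Python recipe: read an input dataset, transform it with pandas,
# and write the result to an output dataset.
import dataiku

trades = dataiku.Dataset("wm_trades_raw").get_dataframe()        # hypothetical input
summary = trades.groupby("advisor_id", as_index=False)["notional"].sum()

out = dataiku.Dataset("wm_trades_by_advisor")                    # hypothetical output
out.write_with_schema(summary)
```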

Posted 1 day ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

Remote

Entity: Finance
Job Family Group: Business Support Group

Job Description:
We are a global energy business involved in every aspect of the energy system. We are working towards delivering light, heat and mobility to millions of people, every day. In India, we operate bp's FBT, which is a coordinated part of bp. Our people want to play their part in solving the big, sophisticated challenges facing our world today and, guided by our bp values, are working to help meet the world's need for more energy while lowering carbon emissions. In our offices at Pune, we work in customer service, finance, accounting, procurement, HR services and other enabling functions - providing solutions across all of bp. Would you like to discover how our diverse, hardworking people are leading the way in making energy cleaner and better - and how you can play your part in our outstanding team? Join our team, and develop your career in an encouraging, forward-thinking environment!

Key Accountabilities

Data Quality/Modelling/Design Thinking:
- Drawing on SAP MDG/ECC experience, investigate and perform root cause analysis for assigned use cases; work with Azure Data Lake (via Databricks) using SQL/Python.
- Identify and build conceptual and physical data models that provide an automated mechanism to supervise ongoing DQ issues. Multiple workshops may be needed to work through options and identify the one that is most efficient and effective.
- Work with the business (Data Owners/Data Stewards) to profile data and expose patterns indicating data quality issues; identify the impact on specific CDEs deemed relevant for each individual business.
- Identify the financial impact of data quality issues, along with the business benefit (quantitative/qualitative) of remediation, and lead implementation timelines.
- Schedule regular working groups with businesses that have identified DQ issues and ensure progress on RCA/remediation or on presenting in DGFs.
- Identify business DQ rules on which critical metrics/measures are stood up, feeding into the dashboards and workflows for BAU monitoring; raise and investigate red flags.
- Understand the data quality value chain, starting with Critical Data Element concepts, data quality issues, and key DQ metrics/measures; experience owning and completing data quality issue assessments to aid improvements to operational processes and BAU initiatives.
- Highlight risks and hidden DQ issues to the Lead/Manager for further guidance or escalation. Interpersonal skills are significant in this outward-facing role, and the focus has to be on clearly articulating messages.

Dashboarding & Workflow:
- Build and maintain effective analytics and escalation mechanisms that detect poor data and help business lines drive resolution.
- Support the crafting, building and deployment of data quality dashboards via Power BI.
- Resolve critical issue paths and construct workflows and alerts that advise process and data owners of unresolved data quality issues.
- Collaborate with IT and analytics teams to drive innovation (AI, ML, cognitive science, etc.).

DQ Improvement Plans:
- Create, embed and drive business ownership of DQ improvement plans.
- Work with business functions and projects to create data quality improvement plans.
- Set targets for data improvements; monitor and intervene when sufficient progress is not being made.
- Support initiatives driving data clean-up of the existing data landscape.

Project Delivery:
- Oversee and advise Data Quality Analysts and participate in the delivery of data quality activities, including profiling, establishing conversion criteria, and resolving technical and business DQ issues.
- Own and develop relevant data quality work products as part of the DAS data change methodology.
- Ensure data quality aspects are delivered as part of Gold and Silver data-related change projects.
- Support the creation of cases with insight into the cost of poor data.

Essential Experience and Job Requirements:
- 11-15 total years of experience in Oil & Gas or the Financial Services/Banking industry within the data management space.
- Experience working with data models/structures and investigating in order to design and fine-tune them.
- Experience of data quality management, i.e. governance, DQI management (root cause analysis, remediation/solution identification), governance forums (papers production, quorum maintenance, minutes publication), CDE identification, and data lineage (identification of authoritative data sources) preferred; understanding of key metrics/measures is needed as well.
- Experience working with senior partners across multiple data domains/business areas, the CDO, and Technology; ability to operate in global teams within multiple time zones.
- Ability to operate in a multifaceted and changing setup, identify priorities, and operate independently without much direction.

Desirable Criteria:
- SAP MDG/SAP ECC experience (T-codes, table structures, etc.)
- Azure Data Lake / AWS / Databricks
- Crafting dashboards and workflows (Power BI, QlikView, Tableau, etc.)
- Crafting analytics and insight in a DQ setting (Power BI / Power Query)
- Profiling and analysis skills (SAP DI, Informatica or Collibra)
- Persuading, influencing and communicating at a senior management level
- Certification in Data Management, Data Science, or Python/R desirable

Travel Requirement: No travel is expected with this role.
Relocation Assistance: This role is eligible for relocation within country.
Remote Type: This position is a hybrid of office/remote working.

Legal Disclaimer:
We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp's recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.
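
A minimal sketch of the column-level data-quality profiling described above, using PySpark over a lake table; the path, columns, and the allowed-value rule are hypothetical.

```python
# Profile completeness per critical data element (CDE) and apply a simple DQ rule.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.read.format("delta").load("/mnt/datalake/mdg/materials")  # hypothetical path

# Completeness: share of non-null values per CDE
total = df.count()
completeness = df.select([
    (F.count(c) / F.lit(total)).alias(f"{c}_completeness")
    for c in ["material_id", "plant_code", "base_unit"]
])
completeness.show()

# A simple DQ rule: flag records breaching an allowed value set
violations = df.filter(~F.col("base_unit").isin("EA", "KG", "L"))
print("rule violations:", violations.count())
```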

Posted 1 day ago

Apply

0 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site

Job Summary: We are looking for a highly skilled Data Scientist with deep expertise in time series forecasting, particularly in demand forecasting and customer lifetime value (CLV) analytics. The ideal candidate will be proficient in Python or PySpark, have hands-on experience with tools like Prophet and ARIMA, and be comfortable working in Databricks environments. Familiarity with classic ML models and optimization techniques is a plus.

Key Responsibilities
• Develop, deploy, and maintain time series forecasting models (Prophet, ARIMA, etc.) for demand forecasting and customer behavior modeling.
• Design and implement Customer Lifetime Value (CLV) models to drive customer retention and engagement strategies.
• Process and analyze large datasets using PySpark or Python (pandas).
• Partner with cross-functional teams to identify business needs and translate them into data science solutions.
• Leverage classic ML techniques (classification, regression) and boosting algorithms (e.g., XGBoost, LightGBM) to support broader analytics use cases.
• Use Databricks for collaborative development, data pipelines, and model orchestration.
• Apply optimization techniques where relevant to improve forecast accuracy and business decision-making.
• Present actionable insights and communicate model results effectively to technical and non-technical stakeholders.

Required Qualifications
• Strong experience in time series forecasting, with hands-on knowledge of Prophet, ARIMA, or equivalent - mandatory.
• Proven track record in demand forecasting - highly preferred.
• Experience modeling Customer Lifetime Value (CLV) or similar customer analytics use cases - highly preferred.
• Proficiency in Python (pandas) or PySpark - mandatory.
• Experience with Databricks - mandatory.
• Solid foundation in statistics, predictive modeling, and machine learning.
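
A minimal Prophet demand-forecast sketch matching the stack named above; the synthetic daily-demand frame and the 30-day horizon are illustrative.

```python
# Prophet expects a frame with columns ds (date) and y (value).
import pandas as pd
from prophet import Prophet

history = pd.DataFrame({
    "ds": pd.date_range("2023-01-01", periods=365, freq="D"),
    "y": [100 + (i % 7) * 5 for i in range(365)],  # weekly-seasonality stand-in
})

m = Prophet(weekly_seasonality=True)
m.fit(history)

future = m.make_future_dataframe(periods=30)          # 30-day horizon
forecast = m.predict(future)
print(forecast[["ds", "yhat", "yhat_lower", "yhat_upper"]].tail())
```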

Posted 1 day ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

Remote

Job Title: Data Science - Intern
Division: WBS - Data Tower
Location: Remote
Work Arrangement: Hybrid

Purpose of Role
A Data Science Intern in a mining manufacturing organization like Weir plays a pivotal role in bridging academic knowledge with real-world data challenges. By applying analytical techniques, building predictive models, and uncovering actionable insights that support strategic decision-making, the intern contributes to the organization's data-driven transformation.

Why Choose Weir
Be part of a global organization dedicated to building a better future: at Weir, we constantly reinvent, adapt, and find sustainable ways to access essential resources. Each of us contributes to this mission, making it an exciting challenge.
An opportunity to grow your own way: Weir offers a dynamic environment where you can take on new challenges, explore new areas, and tailor your career path with support and freedom.
Feel empowered to be yourself and belong: Weir is inclusive and values each individual's contribution. We focus on people and their well-being, promoting fairness, honesty, transparency, and authenticity.

Key Responsibilities
- Perform exploratory data analysis (EDA) to uncover insights.
- Conduct targeted capability testing of data platforms such as Microsoft Fabric and Databricks.
- Create and develop an evaluation framework for agentic systems.
- Assist in developing frameworks for the creation of reusable patterns in generative AI product development.
- Collaborate with cross-functional teams to understand business requirements.
- Document findings, methodologies, and code for reproducibility.
- Demonstrate 100% commitment to our zero harm behaviors in support of our drive towards developing a world-class safety culture.

Job Knowledge/Education and Qualifications
- Basic understanding of statistics and machine learning concepts.
- Proficiency in Python (pandas, NumPy, scikit-learn, etc.).
- Familiarity with data visualization tools (e.g., Matplotlib, Seaborn).
- Experience with SQL is a plus.
- Exposure to generative AI is a plus.
- Strong analytical and problem-solving skills.
- Good communication and teamwork abilities.
- Currently pursuing a bachelor's or master's degree in engineering.
- Available for a full-time internship for 2 months.

Founded in 1871, Weir is a world-leading engineering business with a purpose to make mining operations smarter, more efficient and sustainable. Thanks to Weir's technology, our customers can produce essential metals and minerals using less energy, water and waste at lower cost. With the increasing need for metals and minerals for climate change solutions, Weir colleagues are playing their part in powering a low-carbon future. We are a global family of 11,000 uniquely talented people in over 60 countries, inspiring each other to do the best work of our lives. For additional information about what it is like to work at Weir, please visit our Career Page and LinkedIn Life Page.
Weir is committed to an inclusive and diverse workplace. We are an equal opportunity employer and do not discriminate on the basis of race, national origin, gender, gender identity, sexual orientation, veteran status, disability, age, or any other legally protected status.
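
A minimal EDA sketch in the pandas/seaborn stack listed above; the CSV path and column names are hypothetical placeholders.

```python
# Summary statistics, missing-value checks, and two quick visual checks.
import matplotlib.pyplot as plt
import pandas as pd
import seaborn as sns

df = pd.read_csv("pump_sensor_readings.csv")  # hypothetical dataset

print(df.describe())        # summary statistics
print(df.isna().mean())     # missing-value share per column

sns.histplot(df["vibration_mm_s"], bins=30)   # distribution of a key measure
plt.show()
sns.heatmap(df.select_dtypes("number").corr(), annot=True)  # pairwise correlations
plt.show()
```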

Posted 1 day ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Description
Senior Data Analyst

Job Description: We are looking for an experienced and highly analytical Senior Data Analyst to join our team. In this role, you will leverage your expertise to lead complex data analysis projects, deliver actionable insights, and support strategic decision-making across the organization. You will collaborate with cross-functional teams and mentor junior analysts to drive a data-driven culture and business outcomes.

Required skill sets:
- Experience with cloud data platforms (e.g., AWS, Azure, GCP).
- Familiarity with data warehousing concepts and tools.
- Knowledge of business intelligence (BI) best practices.
- Exposure to machine learning concepts and predictive analytics.

Responsibilities:
- Lead the design, implementation, and delivery of advanced data analyses and reporting solutions.
- Partner with business stakeholders to identify opportunities, define metrics, and translate business requirements into analytical solutions.
- Develop, maintain, and optimize dashboards, reports, and data visualizations for various audiences.
- Perform deep-dive analyses to uncover trends, patterns, and root causes in large, complex datasets.
- Present findings and recommendations to senior management and non-technical stakeholders.
- Ensure data quality, integrity, and governance across all analytics initiatives.
- Mentor and provide guidance to junior analysts and team members.
- Collaborate with data engineering and IT teams to improve data infrastructure and processes.

Must have: SQL, Databricks
Good to have: AWS

Skills: cloud data platforms (AWS, Azure, GCP), machine learning concepts, Databricks, business intelligence (BI), SQL, data warehousing concepts, predictive analytics
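
A minimal sketch of a Databricks-style deep-dive query of the kind described above; the table, columns, and metrics are hypothetical.

```python
# A month-over-month trend cut - the sort of slice a dashboard would surface.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

trend = spark.sql("""
    SELECT date_trunc('month', order_date) AS month,
           region,
           SUM(revenue)                    AS revenue,
           COUNT(DISTINCT customer_id)     AS active_customers
    FROM   analytics.orders
    GROUP  BY 1, 2
    ORDER  BY 1, 2
""")
trend.show()
```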

Posted 1 day ago

Apply

4.0 years

0 Lacs

Thiruvananthapuram, Kerala, India

On-site

The world's top banks use Zafin's integrated platform to drive transformative customer value. Powered by an innovative AI-powered architecture, Zafin's platform seamlessly unifies data from across the enterprise to accelerate product and pricing innovation, automate deal management and billing, and create personalized customer offerings that drive expansion and loyalty. Zafin empowers banks to drive sustainable growth, strengthen their market position, and define the future of banking centered around customer value.

Zafin is privately owned and operates out of multiple global locations including North America, Europe, and Australia. Zafin is backed by significant financial partners committed to accelerating the company's growth, fueling innovation and ensuring Zafin's enduring value as a leading provider of cloud-based solutions to the financial services sector. Zafin is proud to be recognized as a top employer. In Canada, the UK and India, we are certified as a "Great Place to Work". The Great Place to Work program recognizes employers who invest in and value their people and organizational culture. The company's culture is driven by strategy and focused on execution. We make and keep our commitments.

What is the opportunity? This role is at the intersection of banking and analytics. It requires diving deep into the banking domain to understand and define the metrics, and into the technical domain to implement and present the metrics through business intelligence tools. We're building a next-generation analytics product to help banks maximize the financial wellness of their clients. The product is ambitious - that's why we're looking for a team member who is laterally skilled and comfortable with ambiguity. Reporting to the Senior Vice President, Analytics, as part of the Zafin Product Team, you are a data-visualization subject matter expert who can define and implement the insights to be embedded in the product using data visualization (DataViz) tools and analytics expertise. If storytelling with data is a passion of yours, and data visualization and analytics expertise is what has enabled you to reach your current level in your career, take a look at how we do it on one of the most advanced banking technology products in the market today - connect with us to learn more.

Location: Chennai or Trivandrum, India

Purpose of the Role
As a Software Engineer - APIs & Data Services, you will own the "last mile" that transforms data pipelines into polished, product-ready APIs and lightweight microservices. Working alongside data engineers and product managers, you will deliver features that power capabilities like Dynamic Cohorts, Signals, and our GPT-powered release notes assistant.

What You'll Build & Run (approximate focus: 60% API / 40% data)
- Product-facing APIs: design REST/GraphQL endpoints for cohort, feature-flag, and release-notes data; build microservices in Java/Kotlin (Spring Boot or Vert.x) or Python (FastAPI) with production-grade SLAs.
- Schema & contract management: manage JSON/Avro/Protobuf schemas, generate client SDKs, and enforce compatibility through CI/CD pipelines.
- Data-ops integration: interface with Delta Lake tables in Databricks using Spark/JDBC; transform datasets with PySpark or Spark SQL and surface them via APIs.
- Pipeline stewardship: extend Airflow 2.x DAGs (Python), orchestrate upstream Spark jobs, and manage downstream triggers; develop custom operators as needed.
- DevOps & quality: manage GitHub Actions, Docker containers, Kubernetes manifests, and Datadog dashboards to ensure service reliability.
- LLM & AI features: enable prompt engineering and embeddings exposure via APIs; experiment with tools like OpenAI, LangChain, or LangChain4j to support product innovation.

About You
You're a language-flexible engineer with a solid grasp of system design and the discipline to ship robust, well-documented, and observable software. You're curious, driven, and passionate about building infrastructure that scales with evolving product needs.

Mandatory Skills
- 4 to 6 years of professional experience in Java (11+) and Spring Boot
- Solid command of API design principles (REST, OpenAPI, GraphQL)
- Proficiency in SQL databases
- Experience with Docker, Git, and JUnit
- Hands-on knowledge of low-level design (LLD) and system design fundamentals

Highly Preferred / Optional Skills
- Working experience with Apache Airflow
- Familiarity with cloud deployment (e.g., Azure AKS, GCP, AWS)
- Exposure to Kubernetes and microservice orchestration
- Frontend/UI experience in any modern framework (e.g., React, Angular)
- Experience with Python (FastAPI, Flask)

Good-to-Have Skills
- CI/CD pipeline development using GitHub Actions
- Familiarity with code reviews, HLD, and architectural discussions
- Experience integrating with LLM APIs like OpenAI and building prompt-based systems
- Exposure to schema validation tools such as Pydantic, Jackson, Protobuf
- Monitoring and alerting with Datadog, Prometheus, or equivalent

What's in it for you
Joining our team means being part of a culture that values diversity, teamwork, and high-quality work. We offer competitive salaries, annual bonus potential, generous paid time off, paid volunteering days, wellness benefits, and robust opportunities for professional growth and career advancement. Want to learn more about what you can look forward to during your career with us? Visit our careers site and our openings: zafin.com/careers

Zafin welcomes and encourages applications from people with disabilities. Accommodations are available on request for candidates taking part in all aspects of the selection process. Zafin is committed to protecting the privacy and security of the personal information collected from all applicants throughout the recruitment process. The methods by which Zafin collects, uses, stores, handles, retains, or discloses applicant information can be accessed by reviewing Zafin's privacy policy at https://zafin.com/privacy-notice/. By submitting a job application, you confirm that you agree to the processing of your personal data by Zafin as described in the candidate privacy notice.
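
A minimal sketch of a product-facing cohort endpoint in the FastAPI stack the posting lists as an option; the path, model fields, and in-memory store are hypothetical placeholders for a real Delta Lake-backed service.

```python
# Serve cohort metadata over REST with validated response models.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="cohort-service")


class Cohort(BaseModel):
    cohort_id: str
    name: str
    member_count: int


# Stand-in for data served from Delta Lake via Spark/JDBC in production
COHORTS = {"c1": Cohort(cohort_id="c1", name="high-value-savers", member_count=1042)}


@app.get("/cohorts/{cohort_id}", response_model=Cohort)
def get_cohort(cohort_id: str) -> Cohort:
    cohort = COHORTS.get(cohort_id)
    if cohort is None:
        raise HTTPException(status_code=404, detail="cohort not found")
    return cohort
```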

Posted 1 day ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Do you like working with data and analytics to gain insight to solve problems? Do you enjoy collaborating across teams to build and deliver products that make a difference? Join our inclusive team.

About The Team

Within Elsevier Operations, Platform Operations is responsible for ensuring that product content meets quality standards, is delivered on time and within budget, and is made available to end-users via Elsevier's product websites such as Knovel, Engineering Village, and Scopus.

About The Role

The Senior Production Manager is a member of the Engineering Segment and leads support for the Engineering Collection (Engineering Village - EV). The successful candidate takes ownership of end-to-end production workflows and process improvements and is responsible for key decisions related to content analysis, content production, and content delivery. Success in this role requires knowledge of publishing and bibliographic metadata standards, and the ability to correlate multiple data sets to one or more strategic priorities.

Responsibilities

Build and maintain strong relationships with EV Product and Content teams
Develop knowledge of the research landscape to understand EV use cases, product requirements and product vision
Understand and coordinate development of workflows for content types currently outside of RDP (e.g., Standards, Patents, Pre-Prints, Dissertations)
Working with suppliers (in consultation with Supplier Management), serve as the project manager for optimization of existing workflows and development of new workflows, and ensure successful delivery of content to EV
Improve data quality with a focus on completeness and error reduction
Identify key metrics and work with the CDA team to deliver dashboards and visualizations
Organize and lead stakeholder meetings to review product health and align priorities
Assist customer support to resolve customer-reported issues quickly and successfully
Prepare budget forecasts and track spending on production and indexing by suppliers

Requirements

Strong analytical skills and facility with analytics tools
Ability to dive into data, frame hypotheses and arrive at logical conclusions
Ability to create reliable data that can stand alone or be integrated with other data sets
Strong communication skills
Strong research skills
Project management, business process management (BusinessOptix), stakeholder management
Minimum one year working with a product development team
Minimum one year of exposure to agile methodologies
Familiarity with data analysis methods and tools for handling large data sets (e.g., Databricks)
Familiarity with markup languages (e.g., XML), query languages (e.g., SQL) and scripting languages (e.g., Python)
Knowledge of bibliographic metadata and publishing standards and best practices
Project and stakeholder management
Leading Change: Champions Change
Focus on Results: Drives for Results
Focus on Results: Takes Initiative
Personal Capability: Solves Problems & Analyzes Issues
Personal Capability: Practices Self-Development
Interpersonal Skills: Collaboration & Teamwork
Interpersonal Skills: Builds Relationships

Working With Us

We promote a healthy work/life balance across the organisation. We offer an appealing working prospect for our people. With numerous wellbeing initiatives, shared parental leave, study assistance and sabbaticals, we will help you meet your immediate responsibilities and your long-term goals.

Working For You

We know that your wellbeing and happiness are key to a long and successful career. These are some of the benefits we are delighted to offer:

Comprehensive Health Insurance: Covers you, your immediate family, and parents.
Enhanced Health Insurance Options: Competitive rates negotiated by the company.
Group Life Insurance: Ensuring financial security for your loved ones.
Group Accident Insurance: Extra protection for accidental death and permanent disablement.
Flexible Working Arrangement: Achieve a harmonious work-life balance.
Employee Assistance Program: Access support for personal and work-related challenges.
Medical Screening: Your well-being is a top priority.
Modern Family Benefits: Maternity, paternity, and adoption support.
Long-Service Awards: Recognizing dedication and commitment.
New Baby Gift: Celebrating the joy of parenthood.
Subsidized Meals in Chennai: Enjoy delicious meals at discounted rates.
Various Paid Time Off: Take time off with Casual Leave, Sick Leave, Privilege Leave, Compassionate Leave, Special Sick Leave, and Gazetted Public Holidays.
Free Transport: Pick-up and drop between home and office (applies in Chennai).

About The Business

We are a global leader in information and analytics, helping researchers and healthcare professionals advance science and improve health outcomes. Building on our publishing heritage, we combine quality information and vast data sets with analytics to support visionary science, research, health education, interactive learning, and exceptional healthcare and clinical practice. At Elsevier, your work contributes to addressing the world's grand challenges and creating a more sustainable future. We harness innovative technologies to support science and healthcare, partnering for a better world.

Posted 1 day ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Vestas is a major player in wind technology and a driving force in the development of the wind power industry. Vestas' core business comprises the development, manufacture, sale, marketing, and maintenance of wind turbines. Come and join us at Vestas!

Digital Solutions & Development > Digital Architecture & Data & AI > Data Domains & AI > Data Domain - Tech Area

Responsibilities

Create and maintain scalable data pipelines for analytics use cases, assembling large, complex data sets that meet functional and non-functional business requirements
Develop logical and physical data models using optimal data model structures for data warehouse and data mart designs to support analytical needs
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability
Collaborate with technology and platform management partners to optimize data sourcing and processing rules to ensure appropriate data quality
Hands-on role (100%) - building data solutions using best practices and architecture recommendations

Qualifications

Bachelor's / Master's in engineering (degree in Computer Science, IT, Engineering or similar)
Work experience as a Data Engineer in a Data & Analytics team, with 3+ years of relevant work experience and an overall experience of 6-10 years

Data Engineering Experience:

Advanced working SQL knowledge and experience in building and maintaining scalable ETL/EL data pipelines to support a continuing increase in data volume and complexity
Enterprise experience in business intelligence/analytics teams supporting design, development, and maintenance of the backend data layer for BI/ML solutions
Deep understanding of data structures / data models to design and develop data solutions ensuring data availability, security, and accessibility

Competencies

Tools/Technologies/Frameworks:

Expertise in working with various data warehouse solutions and constructing data products using technologies such as Snowflake, Databricks, and the Azure Data Engineering stack (storage accounts, key vaults, MS SQL, etc.) is mandatory
Strong work experience in SQL/stored procedures and relational modeling to build the data layer for BI/analytics is mandatory
Extensive hands-on data modelling experience in cloud data warehouses and data structures; hands-on working experience in one of the ETL/EL tools like DBT/Azure Data Factory/SSIS will be an advantage
Proficiency in code management / version control tools such as GIT, DevOps

Business/Soft Skills:

Strong in data/software engineering fundamentals; experience in an Agile/Scrum environment preferred
Ability to communicate with stakeholders across different geographies and collaborate with analytics and data science teams to match technical solutions with customer business requirements
Familiar with business metrics such as KPIs, PPIs and other indicators
Curious and passionate about building value-creating and innovative data solutions

What We Offer

An opportunity to impact climate change and the future of the next generations through data, analytics, cloud and machine learning
A steep learning curve. We are building a strong team of Data Engineers with both broad and deep knowledge. That means that everyone will have somebody to learn from, just as we will invest in continuous learning, knowledge sharing and upskilling
Strong relationships. We will strive to build an environment of mutual trust and a tightly knit team, where we can support and inspire each other to deliver great impact for Vestas
Opportunity to shape your role. We have been asked to scale and deliver data & insights products. The rest is up to us
Healthy work-life balance
Commitment to fostering a diverse and inclusive workplace environment where everyone can thrive and bring their unique perspectives and skills to the team

Overall, we offer you the opportunity to make a difference and work in a multicultural international company, where you have the opportunity to improve your skills and grow professionally to reach new heights.

Additional Information

Your primary workplace will be Chennai.

Please note: We do amend or withdraw our jobs and reserve the right to do so at any time, including prior to the advertised closing date. Please be advised to apply on or before 16th July 2025.

BEWARE – RECRUITMENT FRAUD

It has come to our attention that there are a number of fraudulent emails from people pretending to work for Vestas. Read more via this link: https://www.vestas.com/en/careers/our-recruitment-process

DEIB Statement

At Vestas, we recognise the value of diversity, equity, and inclusion in driving innovation and success. We strongly encourage individuals from all backgrounds to apply, particularly those who may hesitate due to their identity or feel they do not meet every criterion. As our CEO states, "Expertise and talent come in many forms, and a diverse workforce enhances our ability to think differently and solve the complex challenges of our industry". Your unique perspective is what will help us power the solution for a sustainable, green energy future.

About Vestas

Vestas is the energy industry's global partner on sustainable energy solutions. We are specialised in designing, manufacturing, installing, and servicing wind turbines, both onshore and offshore. Across the globe, we have installed more wind power than anyone else. We consider ourselves pioneers within the industry, as we continuously aim to design new solutions and technologies to create a more sustainable future for all of us. With more than 185 GW of wind power installed worldwide and 40+ years of experience in wind energy, we have an unmatched track record demonstrating our expertise within the field.

With 30,000 employees globally, we are a diverse team united by a common goal: to power the solution - today, tomorrow, and far into the future. Vestas promotes a diverse workforce which embraces all social identities and is free of any discrimination. We commit to create and sustain an environment that acknowledges and harvests different experiences, skills, and perspectives. We also aim to give everyone equal access to opportunity.

To learn more about our company and life at Vestas, we invite you to visit our website at www.vestas.com and follow us on our social media channels. We also encourage you to join our Talent Universe to receive notifications on new and relevant postings.

Posted 1 day ago

Apply

5.0 - 10.0 years

0 Lacs

Cochin

On-site

Orion Innovation is a premier, award-winning, global business and technology services firm. Orion delivers game-changing business transformation and product development rooted in digital strategy, experience design, and engineering, with a unique combination of agility, scale, and maturity. We work with a wide range of clients across many industries including financial services, professional services, telecommunications and media, consumer products, automotive, industrial automation, professional sports and entertainment, life sciences, ecommerce, and education.

Data Engineer

Locations: Kochi/Chennai/Coimbatore/Mumbai/Pune/Hyderabad

Job Overview

We are seeking a highly skilled and experienced Senior Data Engineer to join our growing data team. The ideal candidate will have deep expertise in Azure Databricks and Python, and experience building scalable data pipelines. Familiarity with Data Fabric architectures is a plus. You'll work closely with data scientists, analysts, and business stakeholders to deliver robust data solutions that drive insights and innovation.

Key Responsibilities:

Design, build, and maintain large-scale, distributed data pipelines using Azure Databricks (PySpark) and Azure Data Factory.
Develop and optimize data workflows and ETL processes in Azure cloud environments.
Write clean, maintainable, and efficient code in Python for data engineering tasks.
Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
Monitor and troubleshoot data pipelines for performance and reliability issues.
Implement data quality checks and validations, and ensure data lineage and governance.
Contribute to the design and implementation of a Data Fabric architecture (desirable).

Required Qualifications:

Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
5–10 years of experience in data engineering or related roles.
Expertise in Azure Databricks, Delta Lake, and Spark.
Strong proficiency in Python, especially in a data processing context.
Experience with Azure Data Lake, Azure Data Factory, and related Azure services.
Hands-on experience in building data ingestion and transformation pipelines.
Familiarity with CI/CD pipelines and version control systems (e.g., Git).

Good to Have:

Experience with or understanding of Data Fabric concepts (e.g., data virtualization, unified data access, metadata-driven architectures).
Knowledge of modern data warehousing and lakehouse principles.
Exposure to tools like Apache Airflow, dbt, or similar.
Experience working in agile/scrum environments.
DP-500 and DP-600 certifications.

What We Offer:

Competitive salary and performance-based bonuses.
Flexible work arrangements.
Opportunities for continuous learning and career growth.
A collaborative, inclusive, and innovative work culture.

www.orioninc.com

Orion is an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, creed, religion, sex, sexual orientation, gender identity or expression, pregnancy, age, national origin, citizenship status, disability status, genetic information, protected veteran status, or any other characteristic protected by law.

Candidate Privacy Policy

Orion Systems Integrators, LLC and its subsidiaries and its affiliates (collectively, "Orion," "we" or "us") are committed to protecting your privacy. This Candidate Privacy Policy (orioninc.com) ("Notice") explains: what information we collect during our application and recruitment process and why we collect it; how we handle that information; and how to access and update that information. Your use of Orion services is governed by any applicable terms in this notice and our general Privacy Policy.
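The "data quality checks" responsibility above often starts as simple assertions over a DataFrame before graduating to a dedicated framework. A minimal PySpark sketch, with hypothetical table paths and column names:

    from pyspark.sql import SparkSession
    import pyspark.sql.functions as F

    spark = SparkSession.builder.appName("dq-checks").getOrCreate()
    df = spark.read.format("delta").load("/mnt/lake/orders")  # hypothetical path

    null_keys = df.filter(F.col("order_id").isNull()).count()   # rule 1: keys must be present
    bad_amounts = df.filter(F.col("amount") < 0).count()        # rule 2: amounts must be non-negative

    if null_keys or bad_amounts:
        raise ValueError(f"DQ failure: {null_keys} null keys, {bad_amounts} negative amounts")

Failing the pipeline loudly like this is a common design choice: it keeps bad records from silently propagating into downstream reports.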

Posted 1 day ago

Apply

10.0 years

4 - 8 Lacs

Hyderābād

On-site

Summary

Internal title: Assoc. Dir. DDIT US&I Data Architect #LI-Hybrid
Location: Hyderabad, India
Relocation Support: Yes

Step into a pivotal role where your expertise in data architecture will shape the future of analytics at Novartis. As Associate Director - Data Architect, you'll lead the design and implementation of innovative data solutions that empower business decisions and drive digital transformation. This is your opportunity to influence enterprise-wide strategies, collaborate with cross-functional teams, and bring emerging technologies to life - all while making a meaningful impact on global healthcare.

About the Role

Key Responsibilities

Design and implement scalable data architecture solutions aligned with business strategy and innovation goals
Lead architecture for US&I Analytics Capabilities including GenAI, MLOps, NLP, and data visualization
Collaborate with cross-functional teams to ensure scalable, future-ready data solutions
Define and evolve architecture governance frameworks, standards, and best practices
Drive adoption of emerging technologies through rapid prototyping and enterprise-scale deployment
Architect data solutions using AWS, Snowflake, Databricks, and other modern platforms
Oversee delivery of data lake projects including acquisition, transformation, and publishing
Ensure data security, governance, and compliance across all architecture solutions
Promote a data product-centric approach to solution design and delivery
Align innovation efforts with business strategy, IT roadmap, and regulatory requirements

Essential Requirements

Bachelor's degree in computer science, engineering, or a related field
Over 10 years of experience in analytical and technical frameworks for descriptive and prescriptive analytics
Strong expertise in AWS, Databricks, and Snowflake service offerings
Proven experience delivering data lake projects from acquisition to publishing
Deep understanding of data security, governance policies, and enforcement mechanisms
Agile delivery experience managing multiple concurrent delivery cycles
Strong knowledge of MLOps and analytical data lifecycle management
Excellent communication, problem-solving, and cross-functional collaboration skills

Desirable Requirements

Experience working with pharmaceutical data and familiarity with global healthcare data sources
Exposure to regulatory frameworks and compliance standards in the life sciences industry

Commitment to Diversity and Inclusion: Novartis is committed to building an outstanding, inclusive work environment and diverse teams representative of the patients and communities we serve.

Accessibility and accommodation: Novartis is committed to working with and providing reasonable accommodation to individuals with disabilities. If, because of a medical condition or disability, you need a reasonable accommodation for any part of the recruitment process, or in order to perform the essential functions of a position, please send an e-mail to diversityandincl.india@novartis.com and let us know the nature of your request and your contact information. Please include the job requisition number in your message.

Why Novartis: Helping people with disease and their families takes more than innovative science. It takes a community of smart, passionate people like you. Collaborating, supporting and inspiring each other. Combining to achieve breakthroughs that change patients' lives. Ready to create a brighter future together? https://www.novartis.com/about/strategy/people-and-culture

Join our Novartis Network: Not the right Novartis role for you? Sign up to our talent community to stay connected and learn about suitable career opportunities as soon as they come up: https://talentnetwork.novartis.com/network

Benefits and Rewards: Read our handbook to learn about all the ways we'll help you thrive personally and professionally: https://www.novartis.com/careers/benefits-rewards

Division: Operations
Business Unit: CTS
Location: India
Site: Hyderabad (Office)
Company / Legal Entity: IN10 (FCRS = IN010) Novartis Healthcare Private Limited
Functional Area: Technology Transformation
Job Type: Full time
Employment Type: Regular
Shift Work: No

Posted 1 day ago

Apply

7.0 years

0 Lacs

Hyderābād

On-site

Digital Solutions Consultant I - HYD015Q

Company: Worley
Primary Location: IND-AP-Hyderabad
Job: Digital Solutions
Schedule: Full-time
Employment Type: Agency Contractor
Job Level: Experienced
Job Posting: Jun 16, 2025
Unposting Date: Jul 16, 2025
Reporting Manager Title: Senior General Manager

We deliver the world's most complex projects. Work as part of a collaborative and inclusive team. Enjoy a varied and challenging role.

Building on our past. Ready for the future.

Worley is a global professional services company of energy, chemicals and resources experts headquartered in Australia. Right now, we're bridging two worlds as we accelerate to more sustainable energy sources, while helping our customers provide the energy, chemicals, and resources that society needs now. We partner with our customers to deliver projects and create value over the life of their portfolio of assets. We solve complex problems by finding integrated data-centric solutions from the first stages of consulting and engineering to installation and commissioning, to the last stages of decommissioning and remediation. Join us and help drive innovation and sustainability in our projects.

The Role

As a Digital Solutions Consultant with Worley, you will work closely with our existing team to deliver projects for our clients while continuing to develop your skills and experience.

We are looking for a skilled Data Engineer to join our Digital Customer Solutions team. The ideal candidate should have experience in cloud computing and big data technologies. As a Data Engineer, you will be responsible for designing, building, and maintaining scalable data solutions that can handle large volumes of data. You will work closely with stakeholders to ensure that the data is accurate, reliable, and easily accessible.

Responsibilities:

Design, build, and maintain scalable data pipelines that can handle large volumes of data.
Document the design of proposed solutions, including structuring data (data modelling applying different techniques, including 3-NF and dimensional modelling) and optimising data for further consumption (working closely with Data Visualization Engineers, Front-end Developers, Data Scientists and ML Engineers).
Develop and maintain ETL processes to extract data from various sources (including sensor, semi-structured and unstructured data, as well as structured data stored in traditional databases, file stores, or SOAP and REST data interfaces).
Develop data integration patterns for batch and streaming processes, including implementation of incremental loads (see the sketch after this posting).
Build quick prototypes and proofs-of-concept to validate assumptions and prove the value of proposed solutions or new cloud-based services.
Define data engineering standards and develop data ingestion/integration frameworks.
Participate in code reviews and ensure all solutions are aligned to architectural and requirement specifications.
Develop and maintain cloud-based infrastructure to support data processing using Azure Data Services (ADF, ADLS, Synapse, Azure SQL DB, Cosmos DB).
Develop and maintain automated data quality pipelines.
Collaborate with cross-functional teams to identify opportunities for process improvement.
Manage a team of Data Engineers.

About You

To be considered for this role it is envisaged you will possess the following attributes:

Bachelor's degree in Computer Science or a related field.
7+ years of experience in big data technologies such as Hadoop, Spark, Hive & Delta Lake.
7+ years of experience in cloud computing platforms such as Azure, AWS or GCP.
Experience working in cloud data platforms, including a deep understanding of scaled data solutions.
Experience working with different data integration patterns (batch and streaming) and implementing incremental data loads.
Proficient in scripting in Java, Windows and PowerShell.
Proficient in at least one programming language like Python or Scala.
Expert in SQL.
Proficient in working with data services like ADLS, Azure SQL DB, Azure Synapse, Snowflake, NoSQL (e.g. Cosmos DB, Mongo DB), Azure Data Factory, Databricks or similar on AWS/GCP.
Experience in using ETL tools (like Informatica IICS Data Integration) is an advantage.
Strong understanding of data quality principles and experience in implementing them.

Moving forward together

We want our people to be energized and empowered to drive sustainable impact. So, our focus is on a values-inspired culture that unlocks brilliance through belonging, connection and innovation. We're building a diverse, inclusive and respectful workplace. Creating a space where everyone feels they belong, can be themselves, and are heard. And we're not just talking about it; we're doing it. We're reskilling our people, leveraging transferable skills, and supporting the transition of our workforce to become experts in today's low carbon energy infrastructure and technology.

Whatever your ambition, there's a path for you here. And there's no barrier to your potential career success. Join us to broaden your horizons, explore diverse opportunities, and be part of delivering sustainable change.

Worley takes personal data protection seriously and respects EU and local data protection laws. You can read our full Recruitment Privacy Notice here. Please note: If you are being represented by a recruitment agency you will not be considered; to be considered you will need to apply directly to Worley.
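One of the patterns the responsibilities above call out - incremental loads - is commonly implemented as a Delta Lake MERGE (upsert). A sketch under assumed paths and keys, not Worley's actual pipelines:

    from pyspark.sql import SparkSession
    from delta.tables import DeltaTable

    spark = SparkSession.builder.getOrCreate()

    # Hypothetical daily extract landed by an upstream ADF pipeline.
    updates = spark.read.parquet("/landing/customers/2025-06-16")

    target = DeltaTable.forPath(spark, "/curated/customers")
    (target.alias("t")
        .merge(updates.alias("s"), "t.customer_id = s.customer_id")
        .whenMatchedUpdateAll()      # update rows that already exist
        .whenNotMatchedInsertAll()   # insert new rows
        .execute())

Only the changed and new rows are written, which is what makes the load incremental rather than a full refresh.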

Posted 1 day ago

Apply

5.0 - 8.0 years

4 - 9 Lacs

Hyderābād

On-site

Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients' most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.

Job Description

Role Purpose

The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations and developing technical capability within the Production Specialists.

Do

Oversee and support the process by reviewing daily transactions on performance parameters
Review the performance dashboard and the scores for the team
Support the team in improving performance parameters by providing technical support and process guidance
Record, track, and document all queries received, problem-solving steps taken and total successful and unsuccessful resolutions
Ensure standard processes and procedures are followed to resolve all client queries
Resolve client queries as per the SLAs defined in the contract
Develop understanding of process/product for the team members to facilitate better client interaction and troubleshooting
Document and analyze call logs to spot the most frequently occurring trends to prevent future problems
Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution
Ensure all product information and disclosures are given to clients before and after the call/email requests
Avoid legal challenges by monitoring compliance with service agreements

Handle technical escalations through effective diagnosis and troubleshooting of client queries
Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
If unable to resolve the issues, escalate the issues to TA & SES in a timely manner
Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions
Troubleshoot all client queries in a user-friendly, courteous and professional manner
Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
Organize ideas and effectively communicate oral messages appropriate to listeners and situations
Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs

Build people capability to ensure operational excellence and maintain superior customer service levels of the existing account/client
Mentor and guide Production Specialists on improving technical knowledge
Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
Develop and conduct trainings (triages) within products for Production Specialists as per target
Inform the client about the triages being conducted
Undertake product trainings to stay current with product features, changes and updates
Enroll in product-specific and any other trainings per client requirements/recommendations
Identify and document the most common problems and recommend appropriate resolutions to the team
Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Deliver

No. | Performance Parameter | Measure
1 | Process | No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT
2 | Team Management | Productivity, efficiency, absenteeism
3 | Capability Development | Triages completed, technical test performance

Mandatory Skills: DataBricks - Data Engineering
Experience: 5-8 Years

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 1 day ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Company: Kanerika Inc
Website: Visit Website
Business Type: Startup
Company Type: Product & Service
Business Model: B2B
Funding Stage: Pre-seed
Industry: IT Services

Job Description

Roles and Responsibilities:

The Jr. .NET Data Engineer will be responsible for designing and developing scalable backend systems using .NET Core, Web API, and Azure-based data engineering tools like Databricks, MS Fabric, or Snowflake. They will build and maintain data pipelines, optimize SQL/NoSQL databases, and ensure high-performance systems through design patterns and microservices architecture. Strong communication skills and the ability to collaborate with US counterparts in an Agile environment are essential. Experience with Azure DevOps, Angular, and MongoDB is a plus.

Technical Skills

Strong hands-on experience with C#, SQL Server, OOPS concepts, and microservices architecture.
At least one year of hands-on experience with .NET Core, ASP.NET Core, Web API, SQL, NoSQL, Entity Framework 6 or above, Azure, database performance tuning, applying design patterns, and Agile.
.NET back-end development with data engineering expertise.
Must have experience with Azure Data Engineering and Azure Databricks, with MS Fabric, Snowflake, or similar tools as the data platform.
Skilled at writing reusable libraries.
Excellent communication skills, both oral and written.
Excellent troubleshooting skills and the ability to communicate clearly with US counterparts.

What we need

Educational Qualification: B.Tech, B.E, MCA, M.Tech.
Work Mode: Must be willing to work from the office (onsite only).

Nice To Have

Knowledge of Angular, MongoDB, NPM and Azure DevOps build/release configuration.
Self-starter with solid analytical and problem-solving skills.
This is an experienced-level position, and we train the qualified candidate in the required applications.
Willingness to work extra hours to meet deliverables.

Posted 1 day ago

Apply

8.0 years

2 - 8 Lacs

Gurgaon

On-site

Requisition Number: 101352

Architect I - Data

Location: This is a hybrid opportunity in the Delhi-NCR, Bangalore, Hyderabad, Gurugram area.

Insight at a Glance

14,000+ engaged teammates with operations in 25 countries across the globe
Received 35+ industry and partner awards in the past year
$9.2 billion in revenue
#20 on Fortune's World's Best Workplaces™ list
#14 on Forbes World's Best Employers in IT - 2023
#23 on Forbes Best Employers for Women in IT - 2023
$1.4M+ total charitable contributions in 2023 by Insight globally

Now is the time to bring your expertise to Insight. We are not just a tech company; we are a people-first company. We believe that by unlocking the power of people and technology, we can accelerate transformation and achieve extraordinary results. As a Fortune 500 Solutions Integrator with deep expertise in cloud, data, AI, cybersecurity, and intelligent edge, we guide organisations through complex digital decisions.

About the role

As an Architect I, you will focus on leading our Business Intelligence (BI) and Data Warehousing (DW) initiatives. We will count on you to be involved in designing and implementing end-to-end data pipelines using cloud services and data frameworks. Along the way, you will get to:

Architect and implement end-to-end data pipelines, data lakes, and warehouses using modern cloud services and architectural patterns.
Develop and build analytics tools that deliver actionable insights to the business.
Integrate and manage large, complex data sets to meet strategic business requirements.
Optimize data processing workflows using frameworks such as PySpark.
Establish and enforce best practices for data quality, integrity, security, and performance across the entire data ecosystem.
Collaborate with cross-functional teams to prioritize deliverables and design solutions.
Develop compelling business cases and return on investment (ROI) analyses to support strategic initiatives.
Drive process improvements for enhanced data delivery speed and reliability.
Provide technical leadership, training, and mentorship to team members, promoting a culture of excellence.

What we're looking for

8+ years in Business Intelligence (BI) solution design, with 6+ years specializing in ETL processes and data warehouse architecture.
6+ years of hands-on experience with Azure Data services including Azure Data Factory, Azure Databricks, Azure Data Lake Gen2, Azure SQL DB, Synapse, Power BI, and MS Fabric.
Strong Python and PySpark software engineering proficiency, coupled with a proven track record of building and optimizing big data pipelines, architectures, and datasets.
Proficient in transforming, processing, and extracting insights from vast, disparate datasets, and building robust data pipelines for metadata, dependency, and workload management.
Familiarity with software development lifecycles/methodologies, particularly Agile.
Experience with SAP/ERP/Datasphere data modeling is a significant plus.
Excellent presentation and collaboration skills, capable of creating formal documentation and supporting cross-functional teams in a dynamic environment.
Strong problem-solving, time management, and organizational abilities.
Keen to learn new languages and technologies continually.
Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or an equivalent field.

What you can expect

We're legendary for taking care of you and your family, and for helping you engage with your local community. We want you to enjoy a full, meaningful life and own your career at Insight. Some of our benefits include:

Freedom to work from another location - even an international destination - for up to 30 consecutive calendar days per year.
Medical Insurance
Health Benefits
Professional Development: Learning Platform and Certificate Reimbursement
Shift Allowance

But what really sets us apart are our core values of Hunger, Heart, and Harmony, which guide everything we do, from building relationships with teammates, partners, and clients to making a positive impact in our communities.

Join us today, your ambITious journey starts here. When you apply, please tell us the pronouns you use and any reasonable adjustments you may need during the interview process. At Insight, we celebrate diversity of skills and experience, so even if you don't feel like your skills are a perfect match, we still want to hear from you!

Today's talent leads tomorrow's success. Learn more about Insight: https://www.linkedin.com/company/insight/

Insight is an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, sexual orientation or any other characteristic protected by law.

Insight India Location: Level 16, Tower B, Building No 14, DLF Cyber City IT/ITES SEZ, Sector 24 & 25A, Gurugram, Gurgaon, HR 122002, India

Posted 1 day ago

Apply

10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Role: Senior Manager
Location: Bengaluru

WHAT YOU'LL DO

We're MiQ, a global programmatic media partner for marketers and agencies. Our people are at the heart of everything we do, so you will be too. No matter the role or the location, we're all united in the vision to lead the programmatic industry and make it better.

As part of the DnA team, you will lead a team of analysts driving the analytics delivery on digital media campaigns for a specific market or region within MiQ. You will be a part of the DnA leadership, responsible for defining strategic priorities for the team that drive revenue growth, market sustainability and account innovation. You will be responsible for team development, operational excellence, building analytics expertise in the team and sharing new learnings/analytics best practices across the business.

Develop strong commercial awareness, identify opportunities to grow the business and actively take part in market and account objective setting.
Focus on the Growth and Retain strategy: conceptualise and propose solutions to address business challenges. Being part of the leadership team, enhance the analytics and DS solutions outlook of MiQ's offering.
Build and strengthen relationships with commercial leaders and play an influential role in sales, client services, trading, solutions and other teams.
Lead by example! Be a hands-on leader demonstrating strong business, technical and functional knowledge.
Work with DnA leadership to identify focus areas and build department-level short/long-term strategy, baking in micro and macro factors.
Provide analytics and data science project leadership and oversee development, deployment, and adoption of solutions in the specific market and in DnA.
Be responsible for providing technical and analytics expertise to the team and continuously bring better ways of analytics and problem solving to the team.
Play a key stakeholder role for Product & Tech developments and spearhead internal tool adoption within the market and team.
Set performance standards for the team! Own the OKR creation, development planning, L&D plan, feedback and performance appraisals for the team.
Drive operational excellence: set up processes and frameworks for effort and cost tracking, and effectively measure the impact of delivered outcomes.
Ensure effective resource planning for the market by forecasting demand and making data-backed assumptions.
Develop a culture of feedback and continuous learning within the team. Ensure the team develops an experimental and innovation-focussed mindset and finds newer, more efficient ways of doing things.
Have an innovative and transformation mindset to identify improvement opportunities to optimize processes, decrease costs and increase client/business value.
Manage team wellbeing and ensure the team is engaged.
Active involvement in recruitment, branding and external event participation.

Who are your stakeholders?

As a Senior Manager you are required to work with different stakeholders across the MiQ ecosystem:

Programmatic Traders: DnA collaborates with traders to optimize campaigns. By leveraging our data analysis skills and understanding of the data landscape, we provide insights on audience targeting, ad performance, and bidding strategies. This helps traders make data-driven decisions, optimize their advertising campaigns, and improve overall campaign effectiveness and ROI.

Account Managers: We work closely with account managers to leverage the power of data partnerships. Through our analysis, we help uncover valuable insights about customer behavior, market trends, and campaign performance. This information allows account managers to create a compelling narrative, enhance engagement with advertisers, and showcase the effectiveness of MiQ's advertising solutions.

Sales Representatives: We help the sales team by creating insights based on key market trends and events. Our analysis helps identify potential opportunities and develop a gripping sales narrative. Additionally, we assist in responding to Requests for Proposals (RFPs) by providing data-driven insights and recommendations that help us increase revenue streams.

Agencies & Clients: Our expertise in data analytics and data science is invaluable for agency and advertiser clients. By providing detailed analysis reports and solutions, we empower them to make informed decisions regarding their marketing strategies. Our insights help clients optimize their advertising budgets, target the right audience, and maximize the effectiveness of their campaigns. Additionally, we promote MiQ's internal solutions and capabilities, showcasing MiQ's unique value proposition in the programmatic landscape.

In summary, as a Senior Manager, you add value by building strong partnerships with leaders in these key teams and collectively building market strategies that foster business growth. You also guide the DnA team to build data-driven insights and recommendations for traders, account managers, sales teams, and agency/advertiser clients that empower MiQ and its stakeholders to reach the right audience with the right content at the right time.

What You'll Bring

10+ years' industry experience in business analytics or analytics consulting
Proven leadership and people management experience; 5+ years developing the careers of 8 or more direct reports
A Bachelor's degree in Computer Science, mathematical or statistical sciences or related quantitative disciplines is required
Strong analytical acumen and problem-solving abilities to address complex client problems leveraging data
Expertise in SQL, Excel and PowerPoint
High degree of comfort with either R or Python
Good understanding of statistical concepts
Knowledge of big data processing tools/frameworks like Qubole/Databricks/Spark, AWS
Excellent storytelling and visualization skills
Programmatic media / ad-tech / digital advertising domain knowledge
Knowledge of Tableau/Power BI/Google Data Studio
Ability to thrive in an unstructured environment, working autonomously on a strong team to find opportunity and deliver business impact

We've highlighted some key skills, experience and requirements for this role. But please don't worry if you don't meet every single one. Our talent team strives to find the best people. They might see something in your background that's a fit for this role, or another opportunity at MiQ. If you have a passion for the role, please still apply.

What impact will you create?

As a Senior Manager, your role will create value for MiQ in the following ways:

Driving client stickiness: With your analytics expertise you will help our stakeholders make informed, data-driven decisions. By providing accurate and actionable insights, you contribute to improving campaign performance and identifying new opportunities, thereby improving customer stickiness.
Driving profitability: By leveraging the power of data, you are expected to identify areas where we can optimize costs and thereby maintain a competitive edge.
MiQ growth: Stay on top of market trends and developments to suggest strategic measures that can help support MiQ's business and tap into new revenue streams to drive growth.
Supporting key decision making: Your expertise in data analysis and reporting provides decision-makers with the necessary information to make informed choices. You will help guide agencies, advertisers and internal stakeholders in making strategic and tactical decisions that align with MiQ's or the client's objectives.
Analytics best practices: As a Senior Manager for Analytics Excellence, you are expected to introduce analytics and data best practices within the team, helping set up structures and quality frameworks within the team and with internal stakeholders.
Developing custom analytics solutions: Leveraging your experience in data science and advanced analytics, you will be expected to provide recommendations on MiQ products and assist in enhancing their consumption within the target market.

What's in it for you?

Our Center of Excellence is the very heart of MiQ, and it's where the magic happens. It means everything you do and everything you create will have a huge impact across our entire global business.

MiQ is incredibly proud to foster a welcoming culture. We do everything possible to make sure everyone feels valued for what they bring. With global teams committed to diversity, equity, and inclusion, we're always moving towards becoming an even better place to work.

Values

Our values are so much more than statements. They unite MiQers in every corner of the world. They shape the way we work and the decisions we make. And they inspire us to stay true to ourselves and to aim for better. Our values are there to be embraced by everyone, so that we naturally live and breathe them. Just like inclusivity, our values flow through everything we do - no matter how big or small.

We do what we love - Passion
We figure it out - Determination
We anticipate the unexpected - Agility
We always unite - Unite
We dare to be unconventional - Courage

Benefits

Every region and office have specific perks and benefits, but every person joining MiQ can expect:

A hybrid work environment
New hire orientation with job-specific onboarding and training
Internal and global mobility opportunities
Competitive healthcare benefits
Bonus and performance incentives
Generous annual PTO and paid parental leave, with two additional paid days to acknowledge holidays, cultural events, or inclusion initiatives
Employee resource groups designed to connect people across all MiQ regions, drive action, and support our communities

Apply today!

Equal Opportunity Employer

Posted 1 day ago

Apply

0 years

5 - 7 Lacs

Pune

Remote

Entity: Finance
Job Family Group: Business Support Group

Job Description:

We are a global energy business involved in every aspect of the energy system. We are working towards delivering light, heat and mobility to millions of people, every day. In India, we operate bp's FBT, which is a coordinated part of bp. Our people want to play their part in solving the big, sophisticated challenges facing our world today and, guided by our bp values, are working to help meet the world's need for more energy while lowering carbon emissions. In our offices at Pune, we work in customer service, finance, accounting, procurement, HR services and other enabling functions - providing solutions across all bp. Would you like to discover how our diverse, hardworking people are owning the way in making energy cleaner and better - and how you can play your part in our outstanding team? Join our team, and develop your career in an encouraging, forward-thinking environment!

Key Accountabilities

Data Quality/Modelling/Design Thinking:

Drawing on SAP MDG/ECC experience, investigate and perform root cause analysis for assigned use cases; work with Azure Data Lake (via Databricks) using SQL/Python.
Identify and build the data model (conceptual and physical) that provides an automated mechanism to monitor ongoing DQ issues. Multiple workshops may also be needed to work through various options and identify the one that is most efficient and effective.
Work with the business (Data Owners/Data Stewards) to profile data to expose patterns indicating data quality issues, and identify the impact on specific CDEs deemed relevant for each individual business.
Identify the financial impact of data quality issues, as well as the business benefit (quantitative/qualitative) of remediation, along with leading implementation timelines.
Schedule regular working groups with businesses that have identified DQ issues and ensure progress on RCA/remediation or on presenting in DGFs.
Identify business DQ rules on the basis of which critical metrics/measures are stood up, which feed into the dashboarding/workflows for BAU monitoring; red flags are raised and investigated.
An understanding of the Data Quality value chain is needed, starting with Critical Data Element concepts, Data Quality Issues, and Data Quality metrics/measures, along with experience owning and completing Data Quality Issue assessments to aid improvements to operational processes and BAU initiatives.
Highlight risks/hidden DQ issues to the Lead/Manager for further guidance/escalation. Interpersonal skills are significant in this role, as it is outward facing and the focus has to be on clearly articulating messages.

Dashboarding & Workflow:

Build and maintain effective analytics and escalation mechanisms which detect poor data and help business lines drive resolution
Support crafting, building and deployment of data quality dashboards via Power BI
Define escalation paths and construct workflows and alerts which advise process and data owners of unresolved data quality issues
Collaborate with IT & analytics teams to drive innovation (AI, ML, cognitive science etc.)

DQ Improvement Plans:

Create, embed and drive business ownership of DQ improvement plans
Work with business functions and projects to create data quality improvement plans
Set targets for data improvements; monitor and intervene when sufficient progress is not being made
Support initiatives which drive data clean-up of the existing data landscape

Project Delivery:

Oversee and advise Data Quality Analysts, and participate in delivery of data quality activities including profiling, establishing conversion criteria and resolving technical and business DQ issues
Own and develop relevant data quality work products as part of the DAS data change methodology
Ensure data quality aspects are delivered as part of Gold and Silver data-related change projects
Support the creation of business cases with insight into the cost of poor data

Essential Experience and Job Requirements:

11-15 total years of experience in Oil & Gas or the Financial Services/Banking industry within the Data Management space
Experience of working with data models/structures and investigating in order to design and fine-tune them
Experience of Data Quality Management preferred, i.e. governance, DQI management (root cause analysis, remediation/solution identification), governance forums (paper production, quorum maintenance, minutes publication), CDE identification, and data lineage (identification of authoritative data sources), along with an understanding of the metrics/measures needed
Experience of having worked with senior partners in multiple data domains/business areas, CDO and Technology
Ability to operate in global teams within multiple time zones
Ability to operate in a multifaceted and changing setup and identify priorities, as well as the ability to operate independently without much direction

Desirable criteria

SAP MDG/SAP ECC experience (T-codes, table structures etc.)
Azure Data Lake/AWS/Databricks
Crafting dashboards & workflows (Power BI, QlikView or Tableau etc.)
Crafting analytics and insight in a DQ setting (Power BI/Power Query)
Profiling and analysis skills (SAP DI, Informatica or Collibra)
Persuading, influencing and communicating at a senior management level
Certification in Data Management, Data Science, Python/R desirable

Travel Requirement: No travel is expected with this role

Relocation Assistance: This role is eligible for relocation within country

Remote Type: This position is a hybrid of office/remote working

Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp's recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us.

If you are selected for a position and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.
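The profiling work described under Key Accountabilities can be prototyped in a few lines of PySpark in a Databricks notebook. A minimal sketch; the table name is an assumption for illustration, not bp's actual MDG data:

    from pyspark.sql import SparkSession
    import pyspark.sql.functions as F

    spark = SparkSession.builder.getOrCreate()
    df = spark.table("mdg.material_master")  # hypothetical table

    total = df.count()
    # F.count(col) counts non-null values, so this yields % completeness per column.
    completeness = df.select([
        (F.count(F.col(c)) / total * 100).alias(c) for c in df.columns
    ])
    completeness.show()

A completeness profile like this is a typical first pass at exposing the "patterns indicating data quality issues" the role mentions, before formal DQ rules and dashboards are stood up.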

Posted 1 day ago

Apply

Exploring Databricks Jobs in India

Databricks is a popular technology in the field of big data and analytics, and the job market for Databricks professionals in India is growing rapidly. Companies across various industries are actively looking for skilled individuals with expertise in Databricks to help them harness the power of data. If you are considering a career in Databricks, here is a detailed guide to help you navigate the job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Chennai
  5. Mumbai

Average Salary Range

The average salary range for Databricks professionals in India varies based on experience level:

  • Entry-level: INR 4-6 lakhs per annum
  • Mid-level: INR 8-12 lakhs per annum
  • Experienced: INR 15-25 lakhs per annum

Career Path

In the field of Databricks, a typical career path may include:

  1. Junior Developer
  2. Senior Developer
  3. Tech Lead
  4. Architect

Related Skills

In addition to Databricks expertise, other skills that are often expected or helpful alongside Databricks include:

  • Apache Spark
  • Python/Scala programming
  • Data modeling
  • SQL
  • Data visualization tools

Interview Questions

  • What is Databricks and how is it different from Apache Spark? (basic)
  • Explain the concept of lazy evaluation in Databricks (see the sketch after this list). (medium)
  • How do you optimize performance in Databricks? (advanced)
  • What are the different cluster modes in Databricks? (basic)
  • How do you handle data skewness in Databricks? (medium)
  • Explain how you can schedule jobs in Databricks. (medium)
  • What is the significance of Delta Lake in Databricks? (advanced)
  • How do you handle schema evolution in Databricks? (medium)
  • What are the different file formats supported by Databricks for reading and writing data? (basic)
  • Explain the concept of checkpointing in Databricks. (medium)
  • How do you troubleshoot performance issues in Databricks? (advanced)
  • What are the key components of Databricks Runtime? (basic)
  • How can you secure your data in Databricks? (medium)
  • Explain the role of MLflow in Databricks. (advanced)
  • How do you handle streaming data in Databricks? (medium)
  • What is the difference between Databricks Community Edition and Databricks Workspace? (basic)
  • How do you set up monitoring and alerting in Databricks? (medium)
  • Explain the concept of Delta caching in Databricks. (advanced)
  • How do you handle schema enforcement in Databricks? (medium)
  • What are the common challenges faced in Databricks projects and how do you overcome them? (advanced)
  • How do you perform ETL operations in Databricks? (medium)
  • Explain the concept of MLflow Tracking in Databricks. (advanced)
  • How do you handle data lineage in Databricks? (medium)
  • What are the best practices for data governance in Databricks? (advanced)
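To make the preparation concrete, here is a minimal PySpark sketch illustrating the lazy-evaluation question above. Transformations such as withColumn and filter only build a logical plan; nothing executes until an action like count is called. The numbers and column names are illustrative only:

    from pyspark.sql import SparkSession
    import pyspark.sql.functions as F

    spark = SparkSession.builder.getOrCreate()

    df = spark.range(1_000_000)                      # transformation source: no job runs yet
    doubled = df.withColumn("x2", F.col("id") * 2)   # still lazy: only the plan grows
    filtered = doubled.filter(F.col("x2") % 3 == 0)  # still lazy

    print(filtered.count())                          # action: Spark optimizes and executes the plan

Because execution is deferred, Spark's Catalyst optimizer can collapse the transformations into a single scan before any work is done, which is also a useful talking point for the performance-optimization questions.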

Closing Remark

As you prepare for Databricks job interviews, make sure to brush up on your technical skills, stay updated with the latest trends in the field, and showcase your problem-solving abilities. With the right preparation and confidence, you can land your dream job in the exciting world of Databricks in India. Good luck!
