4.0 - 8.0 years
0 Lacs
Pune, Maharashtra
On-site
Posted On: 31 Jul 2025 | End Date: 31 Dec 2025 | Required Experience: 4 - 8 Years
Role: Senior Software Engineer | Employment Type: Full Time Employee
Company: NewVision (New Vision Softcom & Consultancy Pvt. Ltd)
Function: Business Units (BU) | Department/Practice: Data, Data Engineering
Region: APAC | Country: India | Base Office Location: Pune | State: Maharashtra
Working Model: Hybrid | Weekly Off: Pune Office Standard
Skills: Azure Databricks
Highest Education: Graduation/equivalent course
Certifications: DP-201 (Designing an Azure Data Solution), DP-203T00 (Data Engineering on Microsoft Azure)
Working Language: English

Job Description

Position Summary: We are seeking a talented Databricks Data Engineer with a strong background in data engineering to join our team. You will play a key role in designing, building, and maintaining data pipelines using a variety of technologies, with a focus on the Microsoft Azure cloud platform.

Responsibilities:
Design, develop, and implement data pipelines using Azure Data Factory (ADF) or other orchestration tools.
Write efficient SQL queries to extract, transform, and load (ETL) data from various sources into Azure Synapse Analytics.
Use PySpark and Python for complex data processing tasks on large datasets within Azure Databricks.
Collaborate with data analysts to understand data requirements and ensure data quality.
Design and develop data lakes and warehouses hands-on.
Implement data governance practices to ensure data security and compliance.
Monitor and maintain data pipelines for optimal performance and troubleshoot any issues.
Develop and maintain unit tests for data pipeline code.
Work collaboratively with other engineers and data professionals in an Agile development environment.
Preferred Skills & Experience:
Good knowledge of PySpark and working knowledge of Python
Full-stack Azure data engineering skills (Azure Data Factory, Databricks, and Synapse Analytics)
Experience handling large datasets
Hands-on experience designing and developing data lakes and warehouses

New Vision is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.
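The responsibilities above centre on extract-transform-load work. As an illustrative sketch only (the posting names Azure Databricks, ADF, and PySpark; plain Python is used here for self-containment, and the column names and cleaning rules are hypothetical), a minimal transform step might look like:

```python
import csv
import io

def transform_rows(raw_csv: str) -> list[dict]:
    """Minimal ETL transform sketch: parse CSV, drop rows with a
    missing amount (a simple data-quality rule), and normalise types.
    Column names (order_id, amount, region) are hypothetical."""
    reader = csv.DictReader(io.StringIO(raw_csv))
    out = []
    for row in reader:
        if not row["amount"]:  # skip incomplete rows rather than load bad data
            continue
        out.append({
            "order_id": int(row["order_id"]),
            "amount": round(float(row["amount"]), 2),
            "region": row["region"].strip().upper(),
        })
    return out

raw = "order_id,amount,region\n1,10.5, apac \n2,,emea\n3,7.25,amer\n"
rows = transform_rows(raw)
print(rows)  # row 2 is dropped; region codes are trimmed and upper-cased
```

In a Databricks pipeline the same validate-then-normalise step would typically be expressed as PySpark DataFrame operations rather than a Python loop.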
Posted 2 days ago
2.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Description

Alexa Shopping Operations strives to become the most reliable source for dataset generation and annotations. We work in collaboration with Shopping feature teams to enhance customer experience (CX) quality across shopping features, devices, and locales. Our primary focus lies in handling annotations for training, measuring, and improving Artificial Intelligence (AI) and Large Language Models (LLMs), enabling Amazon to deliver a superior shopping experience to customers worldwide. Our mission is to empower Amazon's LLMs through Reinforcement Learning from Human Feedback (RLHF) across various categories at high speed. We aspire to provide an end-to-end data solution for the LLM lifecycle, leveraging technology alongside our operational excellence. By joining us, you will play a pivotal role in shaping the future of the shopping experience for customers worldwide.

Key job responsibilities
Become a subject matter expert on one or more services
Provide support activities for these services and regularly work with development teams to establish and improve service support
Operate with guidance from management and drive issues to resolution
Understand the business logic and architecture of supported services to regularly resolve undocumented trouble tickets
Read and understand complex application code and make approved code fixes to resolve support issues
Provide mentoring, training, documentation, and tools to other Support Engineers to enable them to perform support activities
Regularly contribute to the creation and improvement of all support documentation
Perform code builds and deployments, communicating status regularly before, during, and after each deployment
Create and interpret metrics that measure support success and service performance
Help develop and refine operational policies and procedures used by teams and internal customers
Participate fully and constructively in the planning of the team's work
Write simple and efficient tools to improve operational efficiency
Learn to contribute to the design and development of support tools using software engineering best practices
Mentor other Support Engineers and take part in interviewing and onboarding new team members

Basic Qualifications
2+ years of software development or 2+ years of technical support experience
Experience scripting in modern programming languages
Experience troubleshooting and debugging technical systems

Preferred Qualifications
Knowledge of web services, distributed systems, and web application development
Experience troubleshooting and maintaining hardware and software RAID
Experience with REST web services, XML, JSON

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company: ADCI MAA 12 SEZ
Job ID: A3047884
Posted 2 days ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Description

Job Title: Python Developer/Backend Developer (AI/ML knowledge preferred), AWS Cloud

We are currently seeking a talented Python Developer with a strong foundation in software development and a keen interest in artificial intelligence and machine learning. While AI/ML knowledge is not mandatory, it is considered an asset for this role. As a Python Developer at EXL you will have the opportunity to work on diverse projects and collaborate with cross-functional teams to deliver high-quality solutions.

Responsibilities:
Develop and maintain scalable and robust Python applications and services.
Collaborate with software engineers, data scientists, and other stakeholders to integrate AI/ML components into software solutions.
Assist in implementing AI/ML algorithms and models using Python-based libraries and frameworks.
Participate in code reviews, testing, and debugging activities to ensure the quality and reliability of software products.
Stay updated on emerging technologies and trends in AI/ML to contribute insights and ideas for enhancing our products and services.
Work closely with data engineers to access, preprocess, and analyze data for AI/ML model development.
Document code, processes, and best practices to facilitate knowledge sharing and collaboration within the team.
Provide support and assistance to other team members as needed.

Qualifications:
Bachelor's or master's degree in computer science, engineering, or a related field.
Strong proficiency in the Python programming language, including Python programming for ML development.
Strong proficiency in AWS Cloud.
Familiarity with software development methodologies, tools, and best practices.
Understanding of basic concepts in artificial intelligence and machine learning is good to have.
Hands-on experience working with ML frameworks (e.g., TensorFlow, scikit-learn).
Knowledge of Azure cloud, especially working with Azure ML Studio and Cognitive Services.
Knowledge of working with SQL and NoSQL databases and REST APIs.
Knowledge of Azure OpenAI is good to have and preferred.
Dataset preparation and cleansing for model creation.
Working knowledge of different types of data (structured, semi-structured, and unstructured).
Expertise in Python frameworks such as FastAPI, Flask, and Django.
Experience working with huge datasets and data analysis with Pandas and NumPy.
Experience working with Python ORM libraries.
Ability to handle large datasets.
Ability to work independently and collaboratively in a fast-paced environment.
Excellent problem-solving skills and attention to detail.
Effective communication and interpersonal skills.

While prior experience or knowledge in AI/ML is preferred, we welcome candidates who are passionate about learning and growing in this field. If you are a talented Python Developer looking to expand your skills and contribute to exciting projects, we encourage you to apply and join our dynamic team at EXL.
Posted 2 days ago
0 years
0 Lacs
Thane, Maharashtra, India
On-site
Calling all innovators – find your future at Fiserv.

We're Fiserv, a global leader in fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we're involved. If you want to make an impact on a global scale, come make a difference at Fiserv.

Job Title: Data Analyst

What does a successful Data Analyst do at Fiserv?
A Data Analyst is responsible for identifying issues with, and ways to improve, the collection, distribution, and consumption of data. Data analysts also monitor performance and quality control plans to identify issues and opportunities to improve data orchestrations. This role requires collaborating with architects and developers to implement effective automation processes.

What Will You Do
Manage time and priorities across multiple tasks and projects, and work with loosely defined requirements.
Analyze, query, and manipulate financial and business-level data.
Validate that data sets are in sync with their sources.
Perform reconciliations of defined data.
Identify, compare, and resolve data quality problems.
Evaluate large datasets for quality and accuracy.
Determine the business impact level of data quality issues.
Work with programmers to correct data quality errors.
Determine the root cause of data quality errors and make recommendations for solutions.
Research and determine the scope and complexity of an issue to identify the steps to fix it.
Develop process improvements to enhance overall data quality and execute data cleanup measures.
Maintain a record of original data and corrected data.
Ensure adherence to data quality standards.
Identify areas of improvement to achieve data quality.
Resolve all internal data exceptions in a timely and accurate manner.
What Will You Need To Know
Bachelor's degree or equivalent experience.
Analytical, problem-solving, and team-building skills.
Ability to work independently, prioritize tasks, and solve problems.
Proficiency in MS Power BI, SQL, and Python.
Excellent communication (verbal and written), interpersonal, organizational, collaboration, and troubleshooting skills.

What Would Be Great To Have
Exposure to Foundry or Snowflake is a plus.
Experience in VBA is a plus.

We welcome and encourage diversity in our workforce. Fiserv is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.

Explore the possibilities of a career with Fiserv and Find your Forward with us!

Thank You For Considering Employment With Fiserv. Please apply using your legal name, and complete the step-by-step profile and attach your resume (either is acceptable, both are preferable).

Our Commitment To Diversity And Inclusion: Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.

Note To Agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.

Warning About Fake Job Posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.
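The Data Analyst duties above include validating that data sets are in sync with their sources and reconciling defined data. A minimal sketch of that reconciliation idea (pure Python; the record IDs and tolerance are hypothetical, and a real job would run this as SQL against both systems):

```python
def reconcile(source: dict[str, float], target: dict[str, float]) -> dict:
    """Compare two keyed datasets and report discrepancies:
    keys missing on either side, and keys whose values differ
    beyond a small numeric tolerance."""
    missing_in_target = sorted(source.keys() - target.keys())
    missing_in_source = sorted(target.keys() - source.keys())
    mismatched = sorted(
        k for k in source.keys() & target.keys()
        if abs(source[k] - target[k]) > 1e-9
    )
    return {
        "missing_in_target": missing_in_target,
        "missing_in_source": missing_in_source,
        "mismatched": mismatched,
    }

src = {"A1": 100.0, "A2": 250.0, "A3": 75.5}   # source-system balances
tgt = {"A1": 100.0, "A2": 249.0, "A4": 10.0}   # target-system balances
print(reconcile(src, tgt))
```

Each discrepancy bucket then feeds the root-cause and data-cleanup steps the posting describes.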
Posted 2 days ago
0 years
0 Lacs
India
Remote
Location: Remote
Type: Project-Based / Contract

We're looking for an independent AI Pipeline Developer with deep expertise in Kohya.ss (LoRA/full training) and ComfyUI to build a complete AI model development and generation pipeline for our internal use. You will not be working with a team – this is a solo, project-delivery role where you will set up the full pipeline, optimize our existing infrastructure, and deliver clear SOPs (Standard Operating Procedures) so that our internal team can use and maintain the system after delivery.

What You'll Build
A complete training and generation pipeline using Kohya.ss and ComfyUI
Customizable templates for LoRA/full model training
ComfyUI workflows for prompt-based image and video generation
Automation scripts for dataset handling, class-image prep, and model management
Documentation and SOPs for using, updating, and scaling the pipeline

Scope of the Role
Fully independent execution: no internal collaboration expected
You'll analyze our current computing setup and design an efficient workflow around it
Proficiency in generating images using Stable Diffusion models
Deliver usable tools plus documentation for us to run training and generation ourselves post-project
Bonus: Recommend or create basic UIs/scripts if needed for easier operation

What You Need to Have
Proven experience with Kohya.ss (LoRA and full model training)
Mastery of ComfyUI and the ability to design flexible, scalable generation workflows
Proficiency in Python scripting and automation
Experience setting up training pipelines from scratch
Ability to document your work in simple, repeatable step-by-step SOPs

Environment & Tools Available
High-end multi-GPU compute setup already prepared
Structured storage for datasets, models, and outputs
Freedom to design and optimize pipeline components as you see fit

Deliverables
Complete working pipeline for training and generation
ComfyUI workflows with reusable templates
Supporting scripts/tools (for automation, dataset prep, model versioning, etc.)
Clear and comprehensive SOP document for internal operation and maintenance

If you're a self-driven AI practitioner capable of building and documenting end-to-end training and generation pipelines, and you're excited to contribute to a platform focused on high-efficiency model development, we'd love to hear from you. Please share your portfolio, GitHub, or examples of similar projects you've completed.

➡️ Apply via this Google Form: https://docs.google.com/forms/d/e/1FAIpQLSdVQd0LoMzeI6uRtmxZoXkwqgh9p1VHIIhryEQH_Ow_7FzGuQ/viewform
Posted 2 days ago
8.0 years
0 Lacs
India
On-site
About Ellucian

Ellucian is a global market leader in education technology. We power innovation for higher education, partnering with more than 2,900 customers across 50 countries and serving over 20 million students. Ellucian's AI-powered platform, trained on the richest dataset available in higher education, drives efficiency, personalized experiences, and strengthened engagement for all students, faculty, and staff. Fueled by decades of experience with a singular focus on the unique needs of learning institutions, the Ellucian platform features best-in-class SaaS capabilities and delivers insights needed now and into the future. These solutions and services span the entire student lifecycle, from student recruitment, enrollment, and retention to workforce analytics, fundraising, and alumni engagement. Ellucian's innovative solutions, vast ecosystem of partners, and user community of more than 45,000 provide best practices leading to greater institutional success and better student outcomes.

Values Rooted in Purpose

We embrace the power to lead, the courage to innovate, and the determination to grow. At our core, we believe in humanizing our approach, recognizing that our people are our greatest strength. With a shared vision of transformation, we endeavor to shape a brighter future for higher education.

About The Opportunity

Ellucian is seeking to hire a Senior SAP Business Analyst to join its Information Technology department. As a Senior Analyst, you will be responsible for managing Order to Cash processes using the SAP SD module on the ECC platform, and will be part of the transformation initiative to migrate to the SAP S/4 HANA solution. This role will also focus on bridging the gap between business operations and IT to drive technological solutions that optimize financial and operational processes in our software business.
Where you will make an impact
Act as the key liaison across all functional areas – business units, information technology, and outside vendors – and provide recommendations for improved system processes.
Configure, implement, and maintain the SAP S/4 HANA SD/Subscription Billing module for Order to Cash (OTC) processes, including Sales Order Management, Billing, Delivery Processing, Revenue Recognition, Pricing, and Credit Management.
Configure the SAP RAR (Revenue Accounting and Reporting) module to ensure accurate revenue recognition, especially in complex scenarios.
Support migration activities from SAP ECC SD to S/4 HANA SD, including data migration, validation, and testing.
Coordinate user acceptance testing (UAT) and support business users in testing.
Support end-to-end integration flows, train end users, and provide post-implementation support, ensuring a smooth transition to new systems.
Apply problem-solving skills and provide day-to-day support in production environments, adhering to SLAs.
Provide ongoing support to business users on day-to-day issues related to the Order to Cash process in the SAP ECC SD and RAR modules.
Troubleshoot and resolve issues across SD modules with a focus on Revenue Recognition (RAR), invoicing, and sales order management.
Demonstrate curiosity and aptitude for leveraging AI tools to enhance productivity, problem-solving, and decision-making in day-to-day work.

What You Will Bring
8 years of hands-on experience in the SAP ECC and S/4 HANA SD modules, with a focus on Order to Cash (OTC) processes and at least 2 full implementation cycles.
Strong knowledge of SD module configuration, including Sales Order Management, Billing, Revenue Recognition, Pricing, Credit Management, Delivery Processing, and Invoicing.
Experience with RAR (Revenue Accounting and Reporting), including configuration and troubleshooting.
Good experience in SAP S/4 HANA SD migration, including configuration and post-migration support.
A good understanding of integration points between SD and FI.
Strong communication skills, with the ability to work with both technical teams and business stakeholders.
Ability to work in a team-oriented environment and manage multiple priorities effectively.

What makes #Ellucianlife
22 days annual leave plus 11 public holidays
Competitive gratuity policy
Group insurance and annual health checkup plan with a variety of family and wellness benefits
Thrive Flex Lifestyle Account (LSA) that allows you to contribute towards your health, financial, or learning interests
5 charitable days to support the community that supports us
Wellness: Headspace (mental health), Wellbeats (virtual fitness classes), RethinkCare (caregiver support)
Diversity and inclusion programs that promote employee resource groups such as Buzzinga and Lean In Team, to name a few
Parental leave
Employee referral bonuses to encourage the addition of great new people to the team
A learning culture with an Education Assistance Program and professional development opportunities
Posted 2 days ago
3.0 years
6 - 9 Lacs
India
On-site
Job Title: Core PHP Developer (3–10 Years Experience)
Location: Madhapur, Hyderabad
Job Type: Full-Time | Permanent

About the Role:
We are seeking a Core PHP Developer with 3 to 10 years of experience who has a solid foundation in PHP development and can work independently or in small teams. Many of our projects are legacy applications (7–10 years old) built using Core PHP, so a deep understanding of non-framework PHP development, MySQL optimization, and backend integrations is essential.

Key Responsibilities:
Maintain, enhance, and troubleshoot legacy Core PHP projects
Work independently or collaboratively to manage backend, frontend, and database development
Optimize performance for large datasets (e.g., import 1 lakh+ records efficiently into MySQL)
Troubleshoot and recover crashed MySQL databases and perform advanced DB operations
Implement or manage integrations with tools like Elasticsearch or MongoDB (preferred)
Handle deployment and server-level configurations (DNS, SSL, TLS, FTP)
Use Git for version control and collaborate effectively with the team

Required Skills & Experience:
3+ years of hands-on experience with Core PHP (non-framework projects)
Experience working on long-term, monolithic PHP applications
Strong MySQL knowledge: optimization, large dataset handling, backup/recovery
Experience with frontend basics (HTML, CSS, JavaScript) as needed
Ability to manage full-stack tasks independently
Familiarity with Git-based version control
Exposure to DNS, SSL/TLS, FTP, and hosting-related configurations

Preferred (Not Mandatory):
Working knowledge of Elasticsearch or MongoDB
Familiarity with Linux server environments
Experience with importing bulk data and performance tuning

Ideal Candidate:
Self-driven problem-solver, capable of taking ownership of complete modules or projects
Comfortable handling both code and server-side aspects
Efficient at debugging legacy codebases and refactoring when needed

Job Type: Full-time
Pay: ₹50,000.00 – ₹80,000.00 per month
Ability to commute/relocate: Madhapur, Hyderabad, Telangana – reliably commute or plan to relocate before starting work (Preferred)
Location: Madhapur, Hyderabad, Telangana (Required)
Work Location: In person
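The posting above asks for efficient import of 1 lakh+ (100,000+) records into MySQL. The core technique is batching inserts inside a single transaction rather than issuing one statement per row. A sketch of that pattern (standard-library sqlite3 is used here purely so the example is self-contained; with MySQL the same shape applies via a driver's `executemany`, or `LOAD DATA INFILE` for raw files, and the table/column names are hypothetical):

```python
import sqlite3

def bulk_insert(conn: sqlite3.Connection, rows, batch_size: int = 1000) -> int:
    """Insert rows in fixed-size batches and commit once at the end --
    the key to importing very large datasets without per-row overhead."""
    cur = conn.cursor()
    total = 0
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == batch_size:
            cur.executemany("INSERT INTO t(id, name) VALUES (?, ?)", batch)
            total += len(batch)
            batch = []
    if batch:  # flush the final partial batch
        cur.executemany("INSERT INTO t(id, name) VALUES (?, ?)", batch)
        total += len(batch)
    conn.commit()
    return total

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, name TEXT)")
n = bulk_insert(conn, ((i, f"row{i}") for i in range(2500)), batch_size=1000)
print(n)  # 2500
```

Accepting a generator rather than a list keeps memory flat even when the source file has millions of rows.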
Posted 2 days ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Title: Data Engineer – Senior
Location: Noida
Employment Type: Permanent
Experience Required: Minimum 5 years
Primary Skills: Cloud – AWS (AWS Lambda, AWS EventBridge, AWS Fargate)

---

Job Description

We are seeking a highly skilled Senior Data Engineer to design, implement, and maintain scalable data pipelines that support machine learning model training and inference.

Responsibilities:
Build and maintain large-scale data pipelines ensuring scalability, reliability, and efficiency.
Collaborate with data scientists to streamline the deployment and management of machine learning models.
Design and optimize ETL (Extract, Transform, Load) processes and integrate data from multiple sources into structured storage systems.
Automate ML workflows using MLOps tools and frameworks (e.g., Kubeflow, MLflow, TensorFlow Extended – TFX).
Monitor model performance, data lineage, and system health in production environments.
Work cross-functionally to improve data architecture and enable seamless ML model integration.
Manage and optimize cloud platforms and data storage solutions (AWS, GCP, Azure).
Ensure data security, integrity, and compliance with governance policies.
Troubleshoot and optimize pipelines to improve reliability and performance.

---

Required Skills
Languages: Python, SQL, PySpark
Cloud: AWS services (Lambda, EventBridge, Fargate); cloud platforms (AWS, GCP, Azure)
DevOps: Docker, Kubernetes, containerization
ETL Tools: AWS Glue, SQL Server (SSIS, SQL packages)
Nice to Have: Redshift, SAS dataset knowledge

---

Mandatory Competencies
DevOps/Configuration Management: Docker; Cloud Platforms – AWS; Containerization (Docker, Kubernetes)
ETL: AWS Glue
Database: SQL Server – SQL Packages
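The role above pairs AWS Lambda with EventBridge triggers for pipeline steps. As a hedged sketch of what such a step can look like (a Lambda handler in Python is genuinely a function taking `event` and `context`; however, the `detail`/`records` payload shape and field names below are hypothetical, not a real EventBridge schema):

```python
import json

def handler(event: dict, context=None) -> dict:
    """Minimal Lambda-style handler sketch for an EventBridge-triggered
    transform step: pull a payload, drop records with missing values,
    coerce types, and return a processing summary."""
    records = event.get("detail", {}).get("records", [])
    cleaned = [
        {"id": r["id"], "value": float(r["value"])}
        for r in records
        if r.get("value") not in (None, "")  # simple data-quality gate
    ]
    summary = {"processed": len(cleaned), "skipped": len(records) - len(cleaned)}
    return {"statusCode": 200, "body": json.dumps(summary)}

event = {"detail": {"records": [{"id": 1, "value": "3.5"}, {"id": 2, "value": ""}]}}
print(handler(event))
```

In a real pipeline the cleaned records would be written onward (e.g., to S3 or a warehouse) rather than only summarized.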
Posted 3 days ago
3.0 - 4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
What You'll Do

This position reports to the Director, Portfolio Strategy & Analytics and is part of the COE function providing financial analysis, data analytics, reporting, and transaction support on key GRE projects and initiatives.

Job Responsibilities
Execute all GRE reporting, using dashboards and other reporting tools such as Power BI to track key real estate portfolio metrics and monitor performance.
Conduct monthly quality checks aligned to location governance to ensure data accuracy and integrity.
Maintain and update the real estate database (CoStar) to ensure timely and accurate entry of non-lease data; assign and manage location IDs for newly acquired or established properties.
Assist with the design and implementation of a set of standardized GRE templates and tools for stakeholder discussions and presentations, as well as with the preparation of playbooks on best practices and standard operating procedures for all GRE-related activities.
Support sustainability reporting for the real estate portfolio for ESG disclosures and compliance initiatives.
Administer and maintain the GRE MS Teams site as the centralized repository and key distribution channel for all key real estate data, analytics, playbooks, tools, and related materials.
Administer and maintain the Real Estate inbox and calendar as the key centralized communication channel with internal and external GRE stakeholders.
Manage any vendor or landlord payment and/or other property/lease-specific issues with timely coordination and communication among internal stakeholders.
Assist GRE Managers and key internal customers with the preparation and completion of support for Capital Appropriation Requests.
Work on ad-hoc special projects as assigned.

Qualifications
Bachelor's degree in business, finance, or a related field required.
3–4 years of proven experience in corporate real estate, finance, or data analysis.

Skills
Knowledge of real estate practices, and familiarity with real estate finance, accounting, and legal concepts.
Skilled in MS Excel, with the ability to analyze and synthesize large amounts of raw data and turn them into meaningful analysis.
A tech and digital mindset.
Experienced in PowerPoint and other data presentation tools such as Power BI.
Strong analytical, reasoning, organization, and problem-solving skills.
Ability to multi-task and work well under deadlines.
Strong written and verbal communication skills.
Flexible, adaptable, and able to deal with ambiguity and change.
Posted 3 days ago
3.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Risk
Management Level: Senior Associate

Job Description & Summary

At PwC, our people in audit and assurance focus on providing independent and objective assessments of financial statements, internal controls, and other assurable information, enhancing the credibility and reliability of this information with a variety of stakeholders. They evaluate compliance with regulations, including assessing governance and risk management processes and related controls.

Those in internal audit at PwC help build, optimise, and deliver end-to-end internal audit services to clients in all industries. This includes IA function setup and transformation, co-sourcing, outsourcing and managed services, using AI and other risk technology and delivery models. IA capabilities are combined with other industry and technical expertise, in areas like cyber, forensics, and compliance, to address the full spectrum of risks. This helps organisations harness the power of IA to protect value, navigate disruption, and gain the confidence to take risks to power growth.

Why PwC

At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences, and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Job Description & Summary

Are you looking for a technically challenging role? Then we have one for you. We are looking for a seasoned software engineer to design and execute our platform migration from a monolithic to a microservice-based architecture.

Your main responsibilities
You'll be responsible for redesigning the application from the present monolithic architecture to a microservices-based architecture in the most efficient and scalable way.
You'll own the application migration from the current platform to a data-driven streaming platform.

Responsibilities
Hybrid working model
Level: Senior Associate
Experience: 3 to 6 years
BCBS239 work experience in Data Governance; data analytical skills are a must.
Thorough Collibra experience.
Advanced SQL and Power BI.
The person should be up and running in making PDIs (Priority Data Initiatives) work.

Below is the full JD – 4 Data Divisional Office: Services Required
The team will support execution of prioritized data initiatives in compliance with the Global Data Policy and BCBS239 requirements across FRM via:
1. Providing governance and advising risk coverage teams to identify and document risk data ownership, authority, sourcing, controls, and quality.
2. Review of controls, including controls identified at dataset level.
3. Supporting documentation and cataloguing of information in the firmwide data catalogue, Collibra.

Experience
Experience in risk management.
Experience in a governance role (data quality governance preferred).
Experience in review of controls (operational risk, audit)/control assessment (preferred).

Desired Skills and Competencies
Experience in data analysis and documentation.
Experience with industry-standard data cataloguing techniques (knowledge of Collibra preferred).
Strong communication skills.
Keen attention to detail.
Ability to work with multiple stakeholders, and to review and challenge with strong rationale.
Good Excel and PowerPoint skills.

Education, Background & Experience Required
Bachelor's degree or equivalent.
Good understanding of risk management concepts and financial products, particularly with respect to commercial financial market data.

Qualifications Desired
No specific additional qualifications required.

Mandatory skill sets
The 4 Data Divisional Office services described above: governance and advisory on risk data ownership, authority, sourcing, controls, and quality; review of controls, including controls identified at dataset level; and supporting documentation and cataloguing of information in the firmwide data catalogue, Collibra.

Preferred skill sets
Experience in investment banking is a must.

Years of experience required
3 to 6 years of experience.

Education Qualification
MBA

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Master of Business Administration
Degrees/Field of Study preferred:

Certifications (if blank, certifications not specified)

Required Skills
Business Analysis

Optional Skills
Accepting Feedback, Accounting and Financial Reporting Standards, Active Listening, Analytical Thinking, Artificial Intelligence (AI) Platform, Auditing, Auditing Methodologies, Business Process Improvement, Communication, Compliance Auditing, Corporate Governance, Creativity, Data Analysis and Interpretation, Data Ingestion, Data Modeling, Data Quality, Data Security, Data Transformation, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Financial Accounting, Financial Audit {+ 24 more}

Desired Languages (if blank, desired languages not specified)

Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No

Job Posting End Date
Posted 3 days ago
1.0 - 2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Purpose
The role provides day-to-day case monitoring, assessment, and reporting of quality and adverse-event complaint records involving Alcon-manufactured products, and carries out the complaint activities required to comply with local and international regulations, guidelines, and applicable directives. This role may be required to work in shifts.

Major Accountabilities
Case Processing:
- Process case files according to Standard Operating Procedures (SOPs).
- Work with affiliate offices to ensure the required dataset has been received/requested.
- Re-assess the data, ensure accurate product selection, and assign the required event code(s) in the system.
- Complete initial and follow-up reporting assessments as information is received (initial report, follow-up questionnaires, phone calls, investigation findings).
- Respond to Manufacturing Quality Assurance (QA) requests and Health Authority inquiries.
- Launch required quality investigation records.
- Schedule expedited and periodic regulatory reports based on local and international reporting regulations.
- Perform and receive quality feedback on case management and coding.
- Adhere to all corporate compliance guidelines and corporate programs.
- Maintain a working knowledge of the following: Alcon products for assigned therapeutic areas and corresponding documentation (Product Information, Directions for Use, manuals, promotional materials); eye anatomy; common diseases; ophthalmic evaluation procedures; eye terminology and abbreviations; safety database(s) and reporting tools.
- Process and review Surgical – Intra Ocular Lens (IOL) complaint records in accordance with Alcon SOPs.
- Provide support in reconciliation activities and audits as required.
- Evaluate and escalate potential safety issues to management.
Role Dimensions
- Number of associates: None
- Financial responsibility: None
- Impact on the organization: Low

Key Performance Indicators
KPIs will be outlined in detail in the goal sheet and will largely centre on the following parameters:
- Meets internal and external quality standards
- Reviews and closes files within prescribed timelines
- Creates high-quality regulatory reports for submission on or before assigned due dates

Ideal Background
Education
- Minimum: Graduation in Science
- Desirable: Graduation in Optometry/Pharmacy/M. Pharm/B. Pharm/BDS/BAMS/BHMS/Biomedical Engineering/Registered Nurse

Experience
- Minimum: Healthcare professional with 1-2 years of experience
- Desirable: Experience in Device Vigilance/Pharmacovigilance/Regulatory Submissions/Clinical Research/PvPI/Medical Coding

Languages
- Minimum: English (written and spoken)

Specific Professional Competencies
- Excellent listening ability and communication skills
- Excellent decision quality and negotiation skills
- Ability to manage multiple tasks, attention to detail, and strong prioritisation and time management
- Knowledge and understanding of national and international medical device regulations and regulatory guidelines
- Knowledge of the medical aspects of medical device safety and medical device vigilance in pre- and post-marketing safety practice
- Basic knowledge of MS Office

ATTENTION: Current Alcon employees/contingent workers should apply via the internal career site. Alcon is an Equal Opportunity Employer and takes pride in maintaining a diverse environment.
We do not discriminate in recruitment, hiring, training, promotion or other employment practices for reasons of race, color, religion, gender, national origin, age, sexual orientation, gender identity, marital status, disability, or any other reason.
Posted 3 days ago
3.0 years
0 Lacs
Goregaon, Maharashtra, India
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Risk Management Level Senior Associate Job Description & Summary At PwC, our people in audit and assurance focus on providing independent and objective assessments of financial statements, internal controls, and other assurable information enhancing the credibility and reliability of this information with a variety of stakeholders. They evaluate compliance with regulations including assessing governance and risk management processes and related controls. Those in internal audit at PwC help build, optimise and deliver end-to-end internal audit services to clients in all industries. This includes IA function setup and transformation, co-sourcing, outsourcing and managed services, using AI and other risk technology and delivery models. IA capabilities are combined with other industry and technical expertise, in areas like cyber, forensics and compliance, to address the full spectrum of risks. This helps organisations to harness the power of IA to help the organisation protect value and navigate disruption, and obtain confidence to take risks to power growth. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. 
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Job Description & Summary
Are you looking for a technically challenging role? Then we’ve got one for you. We are looking for a seasoned software engineer to design and execute our platform migration from a monolithic to a microservice-based architecture.

Your main responsibilities
- Redesign the application from the present monolithic architecture to a microservices-based architecture in the most efficient and scalable way.
- Own the application migration from the current platform to a data-driven streaming platform.

Role details
- Working model: hybrid
- Level: Senior Associate
- Experience: 3 to 6 years
- BCBS239 work experience in Data Governance; data analytical skills are a must
- In-depth Collibra experience
- Advanced SQL and Power BI
- Should be able to get Priority Data Initiatives (PDIs) up and running

Below is the full JD:
Data Divisional Office: Services
The team will support execution of prioritized data initiatives in compliance with the Global Data Policy and BCBS239 requirements across FRM via:
1. Providing governance and advice to risk coverage teams to identify and document risk data ownership, authority, sourcing, controls, and quality.
2. Review of controls, including controls identified at dataset level.
3. Supporting documentation and cataloguing of information in the firmwide data catalogue, Collibra.

Experience
- Experience in risk management.
- Experience of a governance role (data quality governance preferred).
- Experience in review of controls (operational risk, audit)/control assessment (preferred).

Desired Skills and Competencies
- Experience in data analysis and documentation.
- Experience with industry-standard data cataloguing techniques (knowledge of Collibra preferred).
- Strong communication skills.
- Keen attention to detail.
- Ability to work with multiple stakeholders, and to review and challenge with strong rationale.
- Good Excel and PowerPoint skills.

Education, Background & Experience Required
- Bachelor’s degree or equivalent.
- Good understanding of risk management concepts and financial products, particularly with respect to commercial financial market data.

Qualifications Desired
No specific additional qualifications required.

Mandatory skill sets
Data Divisional Office: Services. The team will support execution of prioritized data initiatives in compliance with the Global Data Policy and BCBS239 requirements across FRM via:
1. Providing governance and advice to risk coverage teams to identify and document risk data ownership, authority, sourcing, controls, and quality.
2. Review of controls, including controls identified at dataset level.
3. Supporting documentation and cataloguing of information in the firmwide data catalogue, Collibra.

Preferred skill sets
Experience in investment banking is a must.

Years of experience required
3 to 6 years of experience.

Education Qualification
MBA. Degrees/Field of Study required: Master of Business Administration.

Required Skills
Business Analytics

Optional Skills
Accepting Feedback, Accounting and Financial Reporting Standards, Active Listening, Analytical Thinking, Artificial Intelligence (AI) Platform, Auditing, Auditing Methodologies, Business Process Improvement, Communication, Compliance Auditing, Corporate Governance, Creativity, Data Analysis and Interpretation, Data Ingestion, Data Modeling, Data Quality, Data Security, Data Transformation, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Financial Accounting, Financial Audit {+ 24 more}

Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Posted 3 days ago
0.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Date Posted: 2025-07-30
Country: India
Location: North Gate Business Park Sy.No 2/1 and Sy.No 2/2, KIAL Road, Venkatala Village, Chowdeshwari Layout, Yelahanka, Bangalore, Karnataka 560064
Position Role Type: Unspecified

Who we are:
At Pratt & Whitney, we believe that powered flight has transformed – and will continue to transform – the world. That’s why we work with an explorer’s heart and a perfectionist’s grit to design, build, and service the world’s most advanced aircraft engines. We do this across a diverse portfolio – including Commercial Engines, Military Engines, Business Aviation, General Aviation, Regional Aviation, and Helicopter Aviation – and as a way of turning possibilities into realities for our customers. This is how we at Pratt & Whitney approach our work, and therefore we are inspired to go beyond.

What are our expectations:
P&WC engines are equipped with on-board data recorder solutions (FAST™/DCTU) that wirelessly transmit full-flight engine data to the P&WC Ground Station File Processing Center, where the data is processed and reported for diagnostic and health management analysis to determine applicable engine maintenance tasks. The DPHM Ground team is developing applications that process engine full-flight data and generate multiple reports and datasets distributed to various consumers. The DPHM Ground team is seeking a talented and system-aware Data Engineer to join our expanding data platform team. The ideal candidate will be responsible for supporting, developing, and optimizing our modern, event-driven data pipeline using Kafka Streams, stateful processing with RocksDB, Microsoft SQL Server, and Power BI-integrated dashboards. This role involves transforming legacy database components into a scalable data warehouse; enhancing observability, reliability, and performance; and converting raw data into actionable insights for clients and stakeholders.
Join our team and be part of evolving our real-time hybrid data pipeline infrastructure, ensuring high performance, resilience, and a scalable cloud-compatible environment for the P&WC Ground Station.

Qualifications You Must Have:
- Bachelor’s degree in Computer Science, Software Engineering, Data Engineering, or a related field.
- Minimum of 5 years of experience in data engineering or a similar role.
- Strong SQL development skills, including indexing, complex joins, window functions, stored procedures, and query optimization.
- Experience with data visualization tools such as Power BI or OpenSearch/Kibana.
- Proficiency in Microsoft Excel for data manipulation and reporting.
- Familiarity with Java, Python, and C# for API development and maintenance.
- Exposure to stream processing, schema registries, API contract versioning and evolution, and stateful operations.
- Analytical and performance mindset: ability to interpret large datasets, draw meaningful conclusions, and present insights effectively while considering latency, throughput, and operational cost.
- Communication skills: strong written and verbal communication to convey information between back-end developers, data analysts, and system engineers, while maintaining accountability and ownership of data design and change outcomes.

Qualifications We Prefer:
- Data pipeline engineering and event streaming: architect and maintain real-time streaming pipelines using Kafka Streams, implementing key-based aggregations, windowing, and stateful operations backed by RocksDB. Design event schemas and API contracts that serve both internal components and downstream consumers with minimal coupling.
- Data ingestion and persistence: upgrade and maintain ingestion logic to persist processed outputs into structured databases, primarily Microsoft SQL Server for on-prem deployments and cloud-native databases in AWS. Enhance and maintain database APIs for both batch and real-time data consumers throughout the Ground and Analytics pipeline.
- Database optimization and complex query engineering: optimize SQL queries and stored procedures for high-volume transactional loads. Collaborate with data analysts and business units to model data tables and relations that support Power BI needs. Fine-tune indexing strategies, partitioning, and caching logic.
- System monitoring, observability, and quality: instrument data pipeline components and APIs with structured logs for ingestion into OpenSearch and visualization in Kibana. Conduct continuous quality checks during data transformation and ingestion phases to ensure data traceability and capture anomalies.
- Cross-functional collaboration: work closely with the team to test and validate data pipeline artifacts. Support internal and external developers in producing and consuming data pipelines and APIs through documentation and well-defined contracts/schemas.

What We Offer
- Long-term deferred compensation programs
- Daycare for young children
- Advancement programs to enhance education skills
- Flexible work schedules
- Leadership and training programs
- Comprehensive benefits, savings, and pension plans
- Financial support for parental leave
- Reward programs for outstanding work

Work Location: Bangalore
Employment Type: Full-time

RTX adheres to the principles of equal employment. All qualified applications will be given careful consideration without regard to ethnicity, color, religion, gender, sexual orientation or identity, national origin, age, disability, protected veteran status or any other characteristic protected by law.
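The SQL skills this posting emphasizes (window functions, indexing, query optimization) can be illustrated with a minimal, hedged sketch using Python's bundled SQLite. The table and column names are invented for illustration and are not part of the P&WC system:

```python
import sqlite3

# In-memory database standing in for an engine-report store; the
# table name and columns below are hypothetical examples.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE flight_report (
        engine_id  TEXT,
        flight_ts  INTEGER,
        egt_margin REAL
    )
""")
conn.executemany(
    "INSERT INTO flight_report VALUES (?, ?, ?)",
    [("E1", 1, 40.0), ("E1", 2, 38.5), ("E1", 3, 37.0),
     ("E2", 1, 55.0), ("E2", 2, 54.0)],
)
# A composite index on (engine_id, flight_ts) supports the per-engine
# ordered scan that the window function below performs.
conn.execute("CREATE INDEX ix_report ON flight_report (engine_id, flight_ts)")

# Rolling average over the current and previous flight, per engine --
# the kind of windowed aggregation used for trend monitoring.
rows = conn.execute("""
    SELECT engine_id, flight_ts,
           AVG(egt_margin) OVER (
               PARTITION BY engine_id
               ORDER BY flight_ts
               ROWS BETWEEN 1 PRECEDING AND CURRENT ROW
           ) AS rolling_egt
    FROM flight_report
    ORDER BY engine_id, flight_ts
""").fetchall()
```

The same `PARTITION BY ... ORDER BY ... ROWS BETWEEN` pattern carries over to Microsoft SQL Server, where index and partition design determine whether such queries stay fast at transactional volumes.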
Posted 3 days ago
1.0 years
0 Lacs
Gurugram, Haryana, India
Remote
Who You'll Work With
Driving lasting impact and building long-term capabilities with our clients is not easy work. You are the kind of person who thrives in a high performance/high reward culture - doing hard things, picking yourself up when you stumble, and having the resilience to try another way forward. In return for your drive, determination, and curiosity, we'll provide the resources, mentorship, and opportunities you need to become a stronger leader faster than you ever thought possible. Your colleagues—at all levels—will invest deeply in your development, just as much as they invest in delivering exceptional results for clients. Every day, you'll receive apprenticeship, coaching, and exposure that will accelerate your growth in ways you won’t find anywhere else.

When you join us, you will have:
- Continuous learning: Our learning and apprenticeship culture, backed by structured programs, is all about helping you grow while creating an environment where feedback is clear, actionable, and focused on your development. The real magic happens when you take the input from others to heart and embrace the fast-paced learning experience, owning your journey.
- A voice that matters: From day one, we value your ideas and contributions. You'll make a tangible impact by offering innovative ideas and practical solutions. We not only encourage diverse perspectives, but they are critical in driving us toward the best possible outcomes.
- Global community: With colleagues across 65+ countries and over 100 different nationalities, our firm's diversity fuels creativity and helps us come up with the best solutions for our clients. Plus, you'll have the opportunity to learn from exceptional colleagues with diverse backgrounds and experiences.
- World-class benefits: On top of a competitive salary (based on your location, experience, and skills), we provide a comprehensive benefits package to enable holistic well-being for you and your family.
Your Impact
Throughout our 6-month benchmarking study cycles, you will work with multiple clients simultaneously. As an analyst in the project team, you will have high levels of responsibility from the start. You will be the owner of the dataset, engaging directly with your clients to gather, validate and assess the quality of the data. You will also work closely with your team to explore hypotheses and contribute insights through your analyses. Our benchmarks are typically repeated annually, giving you the opportunity to build specialist expertise in a specific topic, market or client. You will have your own portfolio of clients to manage each cycle, and will interact with our global teams working on the same benchmark. This connection to our wide and diverse team will expose you to different perspectives, innovative ideas, and global trends, further enhancing your professional growth and development.

While you will have the opportunity to meet senior stakeholders during the delivery of our findings, the bulk of the project cycle will be carried out remotely and does not require significant time on client site. This will allow you to plan your schedule and manage your work-life balance in a way that suits you. You will work in our Gurugram office with our Finalta Banking team as part of McKinsey's Financial Institutions Group (FIG) Practice. Building on twenty years' experience working with leading banks and insurers, Finalta's benchmarking work combines deep industry knowledge and proprietary performance data from more than 350 financial institutions in over 50 countries to generate insights on how our clients can achieve superior performance. Our annual benchmarking studies enable clients to objectively assess their performance against peers and quantify opportunities for improvement. This, combined with our case-study insights and recommendations, gives clients a roadmap to reach world-class performance while reducing the time, cost, and risk of change.
Your Qualifications and Skills
- MBA or master’s degree preferred
- Recent graduate or 1+ years of experience in strategy, market research, consulting, financial services, or similar industries
- Exceptional analytical skills and problem-solving capabilities
- Great interpersonal and client-management skills
- Entrepreneurial self-starter with the ability to manage time and produce high-quality work
- Excellent writing and communication skills
Posted 3 days ago
8.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
About the Company
Droisys is an innovation technology company focused on helping companies accelerate their digital initiatives from strategy and planning through execution. We leverage deep technical expertise, Agile methodologies, and data-driven intelligence to modernize systems of engagement and simplify human/tech interaction. Amazing things happen when we work in environments where everyone feels a true sense of belonging and when candidates have the requisite skills and opportunities to succeed. At Droisys, we invest in our talent and support career growth, and we are always on the lookout for amazing talent who can contribute to our growth by delivering top results for our clients. Join us to challenge yourself and accomplish work that matters.

The Role
We are seeking a highly experienced Computer Vision Architect with deep expertise in Python to design and lead the development of cutting-edge vision-based systems. The ideal candidate will architect scalable solutions that leverage advanced image and video processing, deep learning, and real-time inference. You will collaborate with cross-functional teams to deliver high-performance, production-grade computer vision platforms.

Key Responsibilities:
- Architect and design end-to-end computer vision solutions for real-world applications (e.g., object detection, tracking, OCR, facial recognition, scene understanding)
- Lead R&D initiatives and prototype development using modern CV frameworks (OpenCV, PyTorch, TensorFlow, etc.)
- Optimize computer vision models for performance, scalability, and deployment on cloud, edge, or embedded systems
- Define architecture standards and best practices for Python-based CV pipelines
- Collaborate with product teams, data scientists, and ML engineers to translate business requirements into technical solutions
- Stay updated with the latest advancements in computer vision, deep learning, and AI
- Mentor junior developers and contribute to code reviews, design discussions, and technical documentation

Required Skills & Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Electrical Engineering, or a related field (PhD is a plus)
- 8+ years of software development experience, with 5+ years in computer vision and deep learning
- Proficient in Python and libraries such as OpenCV, NumPy, scikit-image, Pillow
- Experience with deep learning frameworks like PyTorch, TensorFlow, or Keras
- Strong understanding of CNNs, object detection (YOLO, SSD, Faster R-CNN), semantic segmentation, and image classification
- Knowledge of MLOps, model deployment strategies (e.g., ONNX, TensorRT), and containerization (Docker/Kubernetes)
- Experience working with video analytics, image annotation tools, and large-scale dataset pipelines
- Familiarity with edge deployment (Jetson, Raspberry Pi, etc.) or cloud AI services (AWS SageMaker, Azure ML, GCP AI)

Droisys is an equal opportunity employer. We do not discriminate based on race, religion, color, national origin, gender, gender expression, sexual orientation, age, marital status, veteran status, disability status or any other characteristic protected by law. Droisys believes in diversity, inclusion, and belonging, and we are committed to fostering a diverse work environment.
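The CNN fundamentals this posting asks for rest on one core operation: sliding a small kernel over an image. As a hedged, framework-free illustration (the kernel and image here are invented toy examples, not part of any Droisys system), a valid-mode 2D cross-correlation — the "conv" inside a CNN layer — can be written in plain Python:

```python
def conv2d_valid(image, kernel):
    """Valid-mode 2D cross-correlation (the core op of a CNN conv layer)."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            acc = 0.0
            # Multiply-accumulate over the kernel window anchored at (i, j).
            for di in range(kh):
                for dj in range(kw):
                    acc += image[i + di][j + dj] * kernel[di][dj]
            row.append(acc)
        out.append(row)
    return out

# A tiny image with a vertical step edge, and a left-minus-right kernel
# that responds strongly to that edge (both purely illustrative).
image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
edge_kernel = [[1, 0, -1],
               [1, 0, -1]]
feature_map = conv2d_valid(image, edge_kernel)
```

Frameworks like PyTorch or TensorFlow compute exactly this (batched, on many channels, with learned kernels), which is why strong intuition for the raw operation transfers directly to debugging detection and segmentation models.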
Posted 3 days ago
9.0 years
0 Lacs
Gurugram, Haryana, India
Remote
Job Description
This is a remote position.

Job Summary
We are looking for an experienced Senior Data Engineer to lead the development of scalable AWS-native data lake pipelines, with a strong focus on time series forecasting and upsert-ready architectures. This role requires end-to-end ownership of the data lifecycle, from ingestion to partitioning, versioning, and BI delivery. The ideal candidate must be highly proficient in AWS data services, PySpark, and versioned storage formats such as Apache Hudi and Iceberg, and must understand the nuances of data quality and observability in large-scale analytics systems.

Responsibilities
- Design and implement data lake zoning (Raw → Clean → Modeled) using Amazon S3, AWS Glue, and Athena.
- Ingest structured and unstructured datasets, including POS, USDA, Circana, and internal sales data.
- Build versioned and upsert-friendly ETL pipelines using Apache Hudi or Iceberg.
- Create forecast-ready datasets with lagged, rolling, and trend features for revenue and occupancy modeling.
- Optimize Athena datasets with partitioning, CTAS queries, and metadata tagging.
- Implement S3 lifecycle policies, intelligent file partitioning, and audit logging.
- Build reusable transformation logic using dbt-core or PySpark to support KPIs and time series outputs.
- Integrate robust data quality checks using custom logs, AWS CloudWatch, or other DQ tooling.
- Design and manage a forecast feature registry with metrics versioning and traceability.
- Collaborate with BI and business teams to finalize schema design and deliverables for dashboard consumption.

Requirements
Essential Skills
Job:
- Deep hands-on experience with AWS Glue, Athena, S3, Step Functions, and the Glue Data Catalog.
- Strong command of PySpark, dbt-core, CTAS query optimization, and partition strategies.
- Working knowledge of Apache Hudi, Iceberg, or Delta Lake for versioned ingestion.
- Experience in S3 metadata tagging and scalable data lake design patterns.
- Expertise in feature engineering and forecasting dataset preparation (lags, trends, windows).
- Proficiency in Git-based workflows (Bitbucket), CI/CD, and deployment automation.
- Strong understanding of time series KPIs, such as revenue forecasts, occupancy trends, or demand volatility.
- Data observability best practices, including field-level logging, anomaly alerts, and classification tagging.

Personal:
- Independent, critical thinker with the ability to design for scale and evolving business logic.
- Strong communication and collaboration with BI, QA, and business stakeholders.
- High attention to detail in ensuring data accuracy, quality, and documentation.
- Comfortable interpreting business-level KPIs and transforming them into technical pipelines.

Preferred Skills
Job:
- Experience with statistical forecasting frameworks such as Prophet, GluonTS, or related libraries.
- Familiarity with Superset or Streamlit for QA visualization and UAT reporting.
- Understanding of macroeconomic datasets (USDA, Circana) and third-party data ingestion.

Personal:
- Proactive, ownership-driven mindset with a collaborative approach.
- Strong communication and collaboration skills.
- Strong analytical and problem-solving skills, with attention to detail.
- Ability to work under stringent deadlines and demanding client conditions, in fast-paced, delivery-focused environments.
- Strong mentoring and documentation skills for scaling the platform.

Other Relevant Information
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- Minimum 9+ years of experience in data engineering and architecture.

Benefits
This role offers the flexibility of working remotely in India. LeewayHertz is an equal opportunity employer and does not discriminate based on race, color, religion, sex, age, disability, national origin, sexual orientation, gender identity, or any other protected status. We encourage a diverse range of applicants.
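The "forecast-ready datasets with lagged, rolling, and trend features" this posting describes can be sketched in plain Python. Column names, the lag/window choices, and the toy series below are illustrative assumptions, not the team's actual pipeline (which would do the same thing in PySpark or dbt):

```python
def forecast_features(series, lags=(1, 2), window=3):
    """Build lag, rolling-mean, and first-difference trend features.

    Returns one feature dict per time step that has enough history
    for every requested lag and the rolling window.
    """
    rows = []
    start = max(max(lags), window - 1)
    for t in range(start, len(series)):
        feats = {"y": series[t]}
        for lag in lags:
            feats[f"lag_{lag}"] = series[t - lag]          # value lag steps back
        win = series[t - window + 1 : t + 1]               # trailing window incl. t
        feats[f"roll_mean_{window}"] = sum(win) / window
        feats["trend"] = series[t] - series[t - 1]         # simple first difference
        rows.append(feats)
    return rows

# Toy monthly revenue-style series.
features = forecast_features([10.0, 12.0, 11.0, 15.0, 14.0])
```

The key design point, which carries over to the Hudi/Iceberg pipelines above, is that every feature is computed only from values at or before time t, so the dataset is safe to train forecasting models on without leaking future information.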
Posted 3 days ago
2.0 years
0 Lacs
Vadodara, Gujarat, India
On-site
We’re reinventing the market research industry. Let’s reinvent it together. At Numerator, we believe tomorrow’s success starts with today’s market intelligence. We empower the world’s leading brands and retailers with unmatched insights into consumer behavior and the influencers that drive it.

What You'll Do
- Provide an excellent day-to-day service to your UK team(s) by taking ownership of the quality of our regular deliverables and the delivery of regular briefs.
- Prioritise your time to meet the day-to-day needs of your UK team(s) to ensure successful collaboration in servicing and delivering to our clients.
- Develop a strong knowledge of our core capabilities and platform functionality so you can take independent ownership of your client workload.
- Demonstrate the ability and confidence to consistently take day-to-day projects from initial client brief to quality insight delivery with only light-touch oversight from the UK team.
- Monitor and escalate data queries and challenges back to the UK team.
- Establish strong relationships with your key day-to-day contacts and become trusted to identify and deliver against their core needs.
- Develop expertise in your client(s)' business and category to enhance the service provided to your client(s) and help build the knowledge of your wider team.
- Demonstrate expertise in the UK retail and shopper marketplace.

What You'll Bring to Numerator
- Graduate level, with 18 months to 2 years' experience, preferably in industry
- Numerate and adept in Microsoft applications
- A passion and enthusiasm for the industry, shopper behaviour and providing an excellent level of service
- Ability to take and follow instruction
- The skills to analyse data and build insight-led and actionable presentations
- Strong people skills and an engaging communication style that enable you to build meaningful relationships with colleagues across the business
- Self-starter in ownership of self-development, prepared to seek out and act upon feedback
- Excellent organisation and communication skills to effectively manage the day-to-day workload and deadlines from your UK team
- A solid knowledge of the foundations of our dataset, analyses, and delivery platforms
- Excellent attention to detail
Posted 3 days ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
About This Role
This role sits within Preqin, a part of BlackRock. Preqin plays a key role in how we are revolutionizing private markets data and technology for clients globally, complementing our existing Aladdin technology platform to deliver investment solutions for the whole portfolio. The individual is responsible for aiding in the future growth and development of the team’s internal and external services, as well as assisting with the ongoing development of the team leaders and team members. This is achieved through operational excellence and clear day-to-day maintenance of the dataset and content, including core business-as-usual (BAU) tasks, and by going above and beyond to collaborate, innovate and execute on relevant ad-hoc projects to improve the accuracy, completeness and timeliness of data. The individual is also responsible for people management to create a high-performing team. Collaboration with the Business Units will be key to achieving the above.

Responsibilities
- Ensure data collected by the team is of high quality, meets client expectations and is world class in terms of its breadth and depth
- Manage day-to-day data operations with an aim to create and further operational excellence
- Create business optimization opportunities through understanding data workflows
- Lead the day-to-day development of the team, ensuring that training is well executed, protocols are followed, and the team remains engaged and motivated
- Manage vendor operations for the dataset, including training and performance management
- Drive data quality (DQ) remediation initiatives by conducting root cause analysis and impact assessment of changes when there are data quality problems
- Troubleshoot escalated data quality incidents, collaborating with regional stakeholders and/or global BUs as needed, and represent and own data to our clients
Assist in the communication of KPI reporting requirements for the team to relevant stakeholders to ensure reporting processes, including automated solutions, are accurate and updated in a timely manner when development is required
Create and implement data-driven innovation (ad-hoc projects) for products and processes to ensure the accuracy, completeness and timeliness of the dataset remain best in class
Adhere to, and promote, the business definitions and document transformations of data through data flows
Create management reports on the overall performance of the underlying teams
Understand the impact and relevancy of data sources and data collection processes available in industry with an aim to improve core data offerings
Collaborate across sub-functional teams to identify further opportunities to improve the core tenet of data services
Act as subject matter expert (SME) for the dataset, supporting internal (including sales teams) and external (including clients) stakeholders in gaining an understanding of internal processes concerning data management
Provide support to Research Insights, Product & Marketing Groups, etc., to ensure all related publications and content are delivered according to guidelines and deadlines
Identify areas of improvement for the data management organization globally
Handle hiring and performance management
Uphold and encourage adoption of company values and company policies
Key Requirements
At least 5+ years of relevant working experience in financial data services/alternative assets required, with an additional 1-2 years of leadership experience in a people management role
Prior management and leadership experience is key for the role
Relevant experience supporting a senior manager is a must
Proven track record of excellent operational management abilities, exhibiting persistence, patience, and a keen eye for detail within a well-respected data company
Experience introducing and tracking Key Performance Indicators and other performance metrics
Hands-on, motivational and entrepreneurial leader with the demonstrated professionalism to lead the team
Prior experience in collaborating across functions to execute on requirements is a must
Ability and enthusiasm to work as part of a team and independently to achieve business and individual goals
Proactive approach to identifying and investigating anomalies, as well as a desire to develop and rationalize solutions
Proven project management skills, including workflow design, preferred
Strong time management and prioritization skills
Demonstrated communication skills, with experience working with high-level executives
Business fluency in English is required
Education level: Bachelor's degree
Knowledge of business intelligence systems such as QlikSense/Tableau/Power BI, etc., as a consumer, to direct teams on valuable reports is a must
Candidates with the right to work will be prioritized
Desired Experience
Project Management certification preferred
Organisational design skills are key in order to work with senior management on strategic initiatives
Knowledge of tech stack and workflow systems preferred
Understanding of data lifecycle management from acquisition to data publishing preferred
Relevant project experience with outsourcing partners preferred
Experience working across cultures and locations and/or in a remote management setup preferred
Change management experience supporting global transitions preferred
Technical knowledge (SQL/Python) preferred
Prior experience working in, or knowledge of, the alternative assets industry, or prior expertise within data management in financial services, preferred
Experience with various workflow tools such as Alteryx, Salesforce, Outreach, Jira, Office 365, etc.
Our Benefits
To help you stay energized, engaged and inspired, we offer a wide range of benefits, including a strong retirement plan, tuition reimbursement, comprehensive healthcare, support for working parents and Flexible Time Off (FTO) so you can relax, recharge and be there for the people you care about.
Our hybrid work model
BlackRock's hybrid work model is designed to enable a culture of collaboration and apprenticeship that enriches the experience of our employees, while supporting flexibility for all. Employees are currently required to work at least 4 days in the office per week, with the flexibility to work from home 1 day a week. Some business groups may require more time in the office due to their roles and responsibilities. We remain focused on increasing the impactful moments that arise when we work together in person, aligned with our commitment to performance and innovation. As a new joiner, you can count on this hybrid model to accelerate your learning and onboarding experience here at BlackRock.
About BlackRock
At BlackRock, we are all connected by one mission: to help more and more people experience financial well-being. Our clients, and the people they serve, are saving for retirement, paying for their children's educations, buying homes and starting businesses. Their investments also help to strengthen the global economy: support businesses small and large; finance infrastructure projects that connect and power cities; and facilitate innovations that drive progress. This mission would not be possible without our smartest investment, the one we make in our employees.
It’s why we’re dedicated to creating an environment where our colleagues feel welcomed, valued and supported with networks, benefits and development opportunities to help them thrive. For additional information on BlackRock, please visit @blackrock | Twitter: @blackrock | LinkedIn: www.linkedin.com/company/blackrock BlackRock is proud to be an Equal Opportunity Employer. We evaluate qualified applicants without regard to age, disability, family status, gender identity, race, religion, sex, sexual orientation and other protected attributes at law.
Posted 3 days ago
6.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Primary Purpose of Role
This role combines data analysis and consultation from the start, with plenty of room for growth. You find the stories behind scattered datasets.
Responsibilities
Data Synthesis & Insights Generation:
Synthesize data from various sources (industry/internal data, brand track, custom research, global data, Statista)
Produce and analyze regular reports, for example monthly reviews, market shares, etc.
Participate in specific primary and secondary market research projects
Carry out prioritized analysis defined by the Senior Consumer Insight Manager covering e.g. business performance, competitors, customers and consumers
Support colleagues with data analysis and market insight tools (SPSS, Survey Reporter, Power BI, Discover and other tools for data analysis)
Work closely with the brand and trade marketing teams to provide insights based on strategic, tracking and ad hoc research to drive insights-based brand strategy and in-market interventions
A Little Bit About You
Proactive
Strong stakeholder management and interpersonal communication skills
Ability to analyze and interpret data in a challenging and insightful manner
Strong and articulate verbal and written communication skills
Efficient time management, ability to multi-task, and detail oriented
Displays maturity and creative problem-solving skills in handling crises
Recommends improvements in work processes within area of responsibility
Qualifications
Bachelor's, Master's or MBA in marketing or market research
5-6 years' experience in quantitative market research in FMCG (agency-side experience: Ipsos, Kantar)
Proficient with Excel and able to handle large datasets
Strong analytical skills with the ability to synthesize complex data and draw insights; familiarity with syndicated and first-party insight tools
Open to travel
Working knowledge of statistics
Working knowledge of research techniques and methodologies
Good to have exposure in: Product Test, Pack Test, Brand Track, Comms Test, Retail Audit (good to have, not necessary)
Posted 3 days ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Position Overview
Maintain Exchange Traded Instrument data, such as real-time, reference and end-of-day pricing data for equities, on the database by adhering to all efficiency, quality and compliance requirements, and handle customer queries within the Exchange Traded Instruments department.
Key Responsibilities
Validate the accuracy of data received from various sources
Develop expertise in data-related issues
Build up knowledge of financial regulations and market practices/conventions in relevant markets
Deliver projects efficiently
Ensure that this information is stored in databases and is accurately reflected on products by crafting or running data quality checks and standards
Ensure the quality and time-efficient production of financial information to products
Respond to data queries and provide highly accurate data to clients
Analyse client cases to form patterns and proactively improve data accuracy
Consolidate information around the dataset, leading to the establishment of standard processes
Monitor market events to anticipate changes in financial instruments and take action efficiently
Improve usage of available tools to best advantage to maintain and improve content quality during daily operations
Mentor and train analysts on data issues, databases and products
Frequently run automated/semi-automated checks to ensure accurate data is provided to our clients with high quality of content
Work on simplification and innovation
Support specific projects, as assigned by the manager
Implement change control procedures, data operations standards and current data policies and procedures
Key Requirements
Good financial market knowledge
Knowledge of Refinitiv products
Excellent verbal and written communication skills
Candidate should be open to working in shifts
Excel and VBA knowledge required
Qualification
Graduate/Post-Graduate, preferably in finance, accounting or marketing, or any other equivalent experience
LSEG is a leading global financial markets infrastructure and data provider. Our purpose is driving financial stability, empowering economies and enabling customers to create sustainable growth. Our purpose is the foundation on which our culture is built. Our values of Integrity, Partnership, Excellence and Change underpin our purpose and set the standard for everything we do, every day. They go to the heart of who we are and guide our decision making and everyday actions. Working with us means that you will be part of a dynamic organisation of 25,000 people across 65 countries. However, we will value your individuality and enable you to bring your true self to work so you can help enrich our diverse workforce. You will be part of a collaborative and creative culture where we encourage new ideas and are committed to sustainability across our global business. You will experience the critical role we have in helping to re-engineer the financial ecosystem to support and drive sustainable economic growth. Together, we are aiming to achieve this growth by accelerating the just transition to net zero, enabling growth of the green economy and creating inclusive economic opportunity. LSEG offers a range of tailored benefits and support, including healthcare, retirement planning, paid volunteering days and wellbeing initiatives. We are proud to be an equal opportunities employer. This means that we do not discriminate on the basis of anyone's race, religion, colour, national origin, gender, sexual orientation, gender identity, gender expression, age, marital status, veteran status, pregnancy or disability, or any other basis protected under applicable law. Conforming with applicable law, we can reasonably accommodate applicants' and employees' religious practices and beliefs, as well as mental health or physical disability needs.
Please take a moment to read this privacy notice carefully, as it describes what personal information London Stock Exchange Group (LSEG) (we) may hold about you, what it’s used for, and how it’s obtained, your rights and how to contact us as a data subject. If you are submitting as a Recruitment Agency Partner, it is essential and your responsibility to ensure that candidates applying to LSEG are aware of this privacy notice.
Posted 3 days ago
3.0 years
0 Lacs
Noida
On-site
Position Overview
Here at ShyftLabs, we are looking for an experienced Data Scientist who can drive performance improvement and cost efficiency in our product through a deep understanding of the ML and infrastructure systems, and provide data-driven insights and scientific solutions. ShyftLabs is a growing data product company, founded in early 2020, that works primarily with Fortune 500 companies. We deliver digital solutions built to help accelerate the growth of businesses in various industries by focusing on creating value through innovation.
Job Responsibilities
Data Analysis and Research: Analyze large datasets with queries and scripts, extract valuable signals from the noise, and produce actionable insights into how we could complete and improve a complex ML and bidding system
Simulation and Modelling: Validate and quantify the efficiency and performance gains from hypotheses through rigorous simulation and modelling
Experimentation and Causal Inference: Develop a robust experiment design and metric framework, and provide reliable and unbiased insights for product and business decision making
Basic Qualifications
Master's degree in a quantitative discipline or equivalent
3+ years minimum professional experience
Distinctive problem-solving skills; good at articulating product questions, pulling data from large datasets and using statistics to arrive at a recommendation
Excellent verbal and written communication skills, with the ability to present information and analysis results effectively
Ability to build positive relationships within ShyftLabs and with our stakeholders, and to work effectively with cross-functional partners in a global company
Statistics: Must have strong knowledge of and experience in experimental design, hypothesis testing, and various statistical analysis techniques such as regression or linear models
Machine Learning: Must have a deep understanding of ML algorithms (e.g., deep learning, random forests, gradient-boosted trees, k-means clustering) and their development, validation, and evaluation
Programming: Experience with Python, R, or another scripting language, and with a database language (e.g. SQL) or data manipulation library (e.g. Pandas)
We are proud to offer a competitive salary alongside a strong insurance package. We pride ourselves on the growth of our employees, offering extensive learning and development resources.
Posted 3 days ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Title: Full Stack Test Automation Engineer (with Power BI exposure)
Overview
We're looking for a skilled full stack automation tester to help build a robust test automation framework for our Power BI analytics platform. You'll work closely with BI developers, DevOps, and business stakeholders to ensure our reporting products meet quality, performance, and compliance standards across Dev, UAT, and Prod environments.
Key Responsibilities
Design and implement automated test strategies across data, UI, and performance layers
Develop and maintain test scripts to validate data accuracy (SQL, APIs), UI integrity (filters, layout), and report performance (load time, refresh success)
Integrate automated tests into CI/CD pipelines using Azure DevOps or similar tools
Monitor automated tests and dataset refreshes; support alerting and failure triage
Contribute to UAT coordination, version control workflows, and defect tracking
Collaborate with development teams to enforce QA best practices
Required Skills
Strong experience with automation frameworks (e.g., Selenium, PyTest, NUnit, Postman)
Proficient in at least one language: C#, Python, Java
Familiar with API testing and scripting (e.g., PowerShell, REST)
Solid understanding of CI/CD pipelines and version control (Git, Azure Repos)
Experience testing data platforms (SQL validation, ETL, reports) is a strong plus
Nice to Have
Exposure to Power BI Service, DAX, or Power BI REST APIs
Familiarity with tools like DAX Studio, ALM Toolkit, or Tabular Editor
Soft Skills
Strong attention to detail and a structured approach to testing
Clear communicator who can translate business logic into test coverage
Proactive and collaborative mindset
Posted 3 days ago
6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.
Title And Summary
Senior Technical Program Manager
Mastercard is a technology company in the global payments industry. We operate the world's fastest payments processing network, connecting consumers, financial institutions, merchants, governments and businesses in more than 210 countries and territories. Mastercard products and solutions make everyday commerce activities, such as shopping, travelling, running a business and managing finances, easier, more secure and more efficient for everyone.
Mastercard's Data & Services team is a key differentiator for Mastercard, providing cutting-edge services that help our customers grow. Focused on thinking big and scaling fast around the globe, this dynamic team is responsible for end-to-end solutions for a diverse global customer base. Centered on data-driven technologies and innovation, these services include payments-focused consulting, loyalty and marketing programs, business experimentation, and data-driven information and risk management services.
We are currently looking for a Senior Technical Program Manager for Business Intelligence platforms within the Data & Services group. You will manage end-to-end delivery of engineering projects for some of our analytics and BI solutions that leverage Mastercard datasets combined with proprietary analytics techniques to help businesses around the world solve multi-million-dollar business problems.
Roles And Responsibilities
Act as Technical Program Manager for new product development
Act as Scrum Master / SAFe Agilist and drive all ceremonies
Implement the Scaled Agile Framework and/or other Agile methodologies
Guide, mentor and coach the team(s) on Agile Scrum and SAFe principles
Remove impediments with the right sense of urgency, or guide teams in doing so
Build a trusting and safe environment for the team where problems are discussed without fear of blame, retribution, or being judged, with an emphasis on healing and problem solving
Coordinate with various groups in Mastercard across locations to ensure the success of projects
Be keenly aware of what is being delivered by the team and why, and of the big program picture
Maintain, radiate and present project-related information and metrics for leadership review
Recommend strategic direction, continuous improvements and policy changes
Lead project/program management activities around cost, schedule, quality, etc.
All About You
Overall career experience of 6-9 years in Technical Program Management
Experience playing the Agile Scrum Master role for at least 3 years
A very strong understanding of and experience with the Scaled Agile Framework, preferably with certifications
Proven experience in managing software delivery based on microservices architecture
Understanding of cloud technologies and CI/CD processes
Good skills around servant leadership, facilitation, situational awareness, conflict resolution, continual improvement, empowerment, and transparency
Knowledge and practice of patterns and techniques for planning and monitoring progress of work
Strong communication and stakeholder management skills involving Business Owners, Development Teams, etc.
Coordination and organization skills, and the ability to work with multiple stakeholders and vendors across locations to ensure the success of the project
Strong understanding of, inclination toward, and experience with Lean methodology in general
Strong understanding of project/program management techniques around cost, schedule, and quality
Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
Abide by Mastercard's security policies and practices;
Ensure the confidentiality and integrity of the information being accessed;
Report any suspected information security violation or breach; and
Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.
Posted 4 days ago
0 years
0 Lacs
India
On-site
Company Description
Shoplinky is a platform that enables content creators to monetize their content through personalized storefronts, AI-driven brand collaborations, seamless monetization options and social media automation. For brands, Shoplinky offers AI-powered features to integrate products into organic content that drives sales and builds trust. Shoplinky aims to create a community where creators, brands, and audiences collaborate to make commerce more engaging and rewarding.
Role Responsibilities
Participate in developing computer vision models for object detection, scene segmentation, and tracking in video content
Implement motion-aware logic for placing 3D objects accurately in dynamic video scenes
Build video processing workflows for analyzing frames and applying virtual product overlays
Contribute to NLP and recommendation models for suggesting relevant products based on content context
Collaborate with 3D and platform teams to support rendering and positioning of assets in real time
Engage in testing, debugging, and optimizing AI models
Participate in model evaluation and dataset handling for iterative improvements
Work closely with developers and the team
Qualifications
Foundational knowledge of computer vision, 3D geometry and motion tracking
Exposure to object detection, pose estimation, or scene understanding (OpenCV, MediaPipe, etc.)
Familiarity with 3D formats or engines (e.g., glTF, Three.js, Unity, Blender)
Proficiency in Python and basic experience with ML libraries (e.g., PyTorch, TensorFlow)
Bonus: exposure to NeRFs, SLAM, or AR/VR projects
Bachelor's degree in Computer Science, AI, or a related field (or a pre-final-year student with a solid understanding of the skillset mentioned)
Experience integrating generative AI models
Enthusiastic learner with strong problem-solving skills and an interest in content + commerce AI
Interested candidates, please share your resumes at nabeel@shoplinky.com
Posted 4 days ago
3.0 years
0 Lacs
Gurugram, Haryana
On-site
DESCRIPTION
Alexa Shopping Operations strives to become the most reliable source for dataset generation and annotations. We work in collaboration with Shopping feature teams to enhance customer experience (CX) quality across shopping features, devices, and locales. Our primary focus lies in handling annotations for training, measuring, and improving Artificial Intelligence (AI) and Large Language Models (LLMs), enabling Amazon to deliver a superior shopping experience to customers worldwide. Our mission is to empower Amazon's LLMs through Reinforcement Learning from Human Feedback (RLHF) across various categories at high speed. We aspire to provide an end-to-end data solution for the LLM lifecycle, leveraging cutting-edge technology alongside our operational excellence. By joining us, you will play a pivotal role in shaping the future of the shopping experience for customers worldwide.
Key job responsibilities
The candidate actively seeks to understand Amazon's core business values and initiatives, and translates those into everyday practices.
Some of the key result areas include, but are not limited to:
Managing process and operational escalations
Driving appropriate data-oriented analysis, adoption of technology solutions and process improvement projects to achieve operational and business goals
Managing stakeholder communication across multiple lines of business on operational milestones, process changes and escalations
Communicating and taking the lead role in identifying gaps in process areas, and working with all stakeholders to resolve them
Being an SME for the process and a referral point for peers and junior team members
Driving business/operational metrics through quantitative decision making and the adoption of different tools and resources
Meeting deadlines in a fast-paced work environment driven by complex software systems and processes
Performing deep dives into the process and coming up with process improvement solutions
Collaborating effectively with other teams, subject matter experts (SMEs) and Language Engineers (LaEs) to support launches of new processes and services
BASIC QUALIFICATIONS
A Bachelor's degree and relevant work experience of 3+ years
Excellent level of English and either Spanish, French, Italian or Portuguese at C1 level
Demonstrated ability to analyze and interpret complex SOPs
Excellent problem-solving skills with a proactive approach to identifying and implementing process improvements
Strong communication and interpersonal skills to effectively guide and mentor associates
Ability to work collaboratively with cross-functional teams
Thorough understanding of multiple SOPs and ability to ensure adherence to established processes
Ability to identify areas for process improvement and SOP enhancement, and to develop actionable plans for implementation
Ability to lead and participate in process improvement initiatives
Comfortable working in a fast-paced, highly collaborative, dynamic work environment
Willingness to support several projects at one time, and to accept re-prioritization as necessary
Adaptive to change and able to work in a fast-paced environment
PREFERRED QUALIFICATIONS
Experience with artificial intelligence interaction, such as prompt generation
Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
Posted 4 days ago