2.0 - 5.0 years
13 - 17 Lacs
Mumbai
Work from Office
Are you an innovator who loves to solve problems, with a keen interest in financial markets? We are looking for a Fixed Income Product Specialist like you. As a Fixed Income Product Specialist, you will manage presales and post-sales activity in the Asia-Pacific Private Banking space for the FI Advisory desk. You will maintain strong relationships with Fixed Income sales team members located in Hong Kong and Singapore. You will carry out Enhanced Product Due Diligence for all complex bonds supported by the Fixed Income Advisory desk, perform control tasks on fixed income products on a timely basis, and cover product governance and investment suitability aspects for bonds in line with HKMA and MAS regulations. You'll be part of the Unified Global Markets team, with colleagues across different countries in the APAC region; we own responsibility for a best-in-class, compliant product shelf across Capital Markets products for contractual and non-contractual advisory business. We are looking for: 2-5 years of experience in Private Banking, along with knowledge of Fixed Income products; a deep understanding of complex and conventional bond features; 360-degree awareness of Capital Markets products offered to the Wealth Management arena, be it from a client-facing, operational, or regulatory perspective; knowledge of Bloomberg, including data extraction using Bloomberg; strong knowledge of MS Office applications (VBA will be a plus); excellent written and verbal communication, with comfort interacting with everyone at all levels; strong analytical, problem-solving, and project management skills with strong attention to detail; reliability when working independently, with sound judgment for when to escalate issues; the ability to excel under pressure and a positive can-do attitude; adaptability and strong interpersonal skills.
Posted 1 month ago
5.0 - 10.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: SAP BW/4HANA
Good-to-have skills: NA
Minimum Experience: 5 year(s)
Educational Qualification: 15 years of full-time education
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements in Hyderabad. You will play a crucial role in developing solutions that enhance business operations and processes.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the team in implementing innovative solutions.
- Conduct regular team meetings to ensure project progress.
- Mentor junior team members to enhance their skills.
Professional & Technical Skills:
- Must-have skills: Proficiency in SAP BW/4HANA.
- Strong understanding of data modeling and data extraction.
- Experience in developing complex data models.
- Knowledge of SAP BW/4HANA architecture.
- Hands-on experience in SAP BW/4HANA implementation.
- Good-to-have skills: Experience with SAP BusinessObjects.
Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP BW/4HANA.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
Posted 1 month ago
3.0 - 8.0 years
5 - 9 Lacs
Chennai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: SAP BW/4HANA
Good-to-have skills: NA
Minimum Experience: 3 year(s)
Educational Qualification: 15 years of full-time education
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions for work-related problems.
- Develop and implement SAP BW/4HANA applications.
- Collaborate with cross-functional teams to ensure application functionality.
- Conduct testing and debugging of applications.
- Provide technical support and troubleshooting for application issues.
- Stay updated on industry trends and best practices.
Professional & Technical Skills:
- Must-have skills: Proficiency in SAP BW/4HANA.
- Strong understanding of data modeling and data extraction.
- Experience with SAP BW/4HANA data integration.
- Knowledge of SAP BW/4HANA reporting tools.
- Hands-on experience in SAP BW/4HANA implementation.
Additional Information:
- The candidate should have a minimum of 3 years of experience in SAP BW/4HANA.
- This position is based at our Chennai office.
- 15 years of full-time education is required.
Posted 1 month ago
0.0 - 1.0 years
2 - 6 Lacs
Kochi
Work from Office
We are looking for detail-oriented and enthusiastic fresh graduates to join our data team. As a Data Associate, you will be responsible for ensuring the quality of data inputs and outputs in our AI-powered system. You will work closely with AI developers, product managers, and QA leads to ensure high-quality deliverables.
Key Responsibilities:
1. RFI Question Extraction & Verification: Identify and extract questions from customer-submitted RFI documents (PDF, Word, Excel, etc.); categorize and structure questions accurately under the guidance of domain experts; ensure completeness and clarity of extracted data.
2. Digitized Document Format Review: Review digitized versions of documents for formatting errors, misinterpretations, and incomplete data capture; make corrections and annotate issues to improve data quality for the AI training process; assist in maintaining metadata such as document type, language, and structure.
3. AI Answer Review: Evaluate answers generated by the AI model for accuracy, relevance, and clarity; highlight incorrect, ambiguous, or missing responses and suggest better alternatives; tag responses as Verified / Needs Review / Incorrect based on defined quality standards.
4. Reporting & Collaboration: Document and report issues found during data extraction, verification, or AI output evaluation; communicate clearly with development and QA teams to relay edge cases or recurring issues; assist in creating feedback loops that help the AI system learn from human corrections.
Qualifications
Required: Strong attention to detail and analytical thinking. Good command of the English language, both written and comprehension. Comfortable working with documents in various formats and using basic tools (Google Docs, Sheets, PDFs, Excel).
Preferred: Familiarity with ChatGPT or other AI tools is a plus. Interest in AI, document processing, or quality control. Ability to learn quickly and follow structured processes.
Skills & Competencies: Document interpretation and summarization; basic understanding of data accuracy and formatting; communication and teamwork; initiative and adaptability in fast-paced environments.
What We Offer: Competitive fresher salary with performance-based incentives. Work with cutting-edge AI systems. Mentorship from experienced AI/QA professionals. Friendly environment. The chance to own responsibility and demonstrate accountability. Flexible work arrangements: onsite or remote options (as per company policy). International horizons: work with international teams and clients from all corners of the world. Company parties, team events, and the opportunity to work with great colleagues from all over the world. Opportunity to transition into technical or analytical roles. Medical insurance.
Location & Work Mode: Remote or hybrid. Flexible work hours, but availability during core collaboration hours is preferred.
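The RFI question-extraction step described above can be approximated with a simple heuristic: treat any sentence that ends in a question mark or opens with a common interrogative lead-in as a candidate question. The sketch below is a minimal illustration in Python; the heuristics, function name, and sample text are invented for this example and are not part of the role's actual tooling.

```python
import re

INTERROGATIVES = ("what", "how", "why", "when", "where", "which", "who",
                  "do", "does", "can", "is", "are", "please describe")

def extract_questions(text: str) -> list[str]:
    """Heuristically pull candidate RFI questions out of free text."""
    # Split on sentence-ending punctuation, keeping each sentence intact.
    sentences = re.split(r"(?<=[.?!])\s+", text.strip())
    questions = []
    for s in sentences:
        s = s.strip()
        if not s:
            continue
        # A sentence is a candidate question if it ends with '?' or
        # starts with a common interrogative lead-in.
        if s.endswith("?") or s.lower().startswith(INTERROGATIVES):
            questions.append(s)
    return questions

sample = ("Vendor must be ISO 27001 certified. What is your data retention "
          "policy? Please describe your disaster recovery process.")
print(extract_questions(sample))
```

In practice, a human reviewer would still verify the extracted list for completeness, which is exactly the verification half of this responsibility.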
Posted 1 month ago
3.0 - 5.0 years
10 - 11 Lacs
Pune
Work from Office
We are seeking an experienced Azure Data Factory Engineer to design, develop, and manage data pipelines using Azure Data Factory. The ideal candidate will possess hands-on expertise in ADF components and activities, and have practical knowledge of incremental data loading, file management, API integration, and cloud storage solutions. This role involves automating data workflows, optimizing performance, and ensuring the seamless flow of data within our cloud environment. Key Responsibilities: Design and Develop Data Pipelines: Build and maintain scalable data pipelines using Azure Data Factory, ensuring efficient and reliable data movement and transformation. Incremental Data Loads: Implement and manage incremental data loading processes to ensure that only updated or new data is processed, optimizing data pipeline performance and reducing resource consumption. File Management: Handle data ingestion and management from various file sources, including CSV, JSON, and Parquet formats, ensuring data accuracy and consistency. API Integration: Develop and configure data pipelines to interact with RESTful APIs for data extraction and integration, handling authentication and data retrieval processes effectively. Cloud Storage Management: Work with Azure Blob Storage and Azure Data Lake Storage to manage and utilize cloud storage solutions, ensuring data is securely stored and easily accessible. ADF Automation: Leverage Azure Data Factory's automation capabilities to schedule and monitor data workflows, ensuring timely execution and error-free operations. Performance Optimization: Continuously monitor and optimize data pipeline performance, troubleshoot issues, and implement best practices to enhance efficiency. Collaboration: Work closely with data engineers, analysts, and other stakeholders to gather requirements, provide technical guidance, and ensure successful data integration solutions.
Qualifications: Educational Background: Bachelor's degree in Computer Science, Information Technology, or a related field (B.E., B.Tech, MCA, MCS); advanced degrees or certifications are a plus. Experience: Minimum 3-5 years of hands-on experience with Azure Data Factory, including designing and implementing complex data pipelines. Technical Skills: Strong knowledge of ADF components and activities, including datasets, pipelines, data flows, and triggers. Proficiency in incremental data loading techniques and optimization strategies. Experience working with various file formats and handling large-scale data files. Proven ability to integrate and interact with APIs for data retrieval and processing. Hands-on experience with Azure Blob Storage and Azure Data Lake Storage. Familiarity with ADF automation features and scheduling. Soft Skills: Strong problem-solving and analytical skills. Excellent communication and collaboration abilities. Ability to work independently and manage multiple tasks effectively.
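The incremental-loading technique this role centers on is commonly implemented as a high-watermark pattern: each run selects only rows modified since the last recorded watermark value, then advances the watermark. Below is a minimal, ADF-independent sketch of that pattern using SQLite in Python; the table names, columns, and timestamps are illustrative, and in ADF itself the same idea is expressed with a Lookup activity plus a parameterized Copy activity.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (id INTEGER, payload TEXT, modified_at TEXT)")
conn.execute("CREATE TABLE watermark (last_value TEXT)")
conn.execute("INSERT INTO watermark VALUES ('2024-01-01T00:00:00')")
conn.executemany("INSERT INTO src VALUES (?, ?, ?)", [
    (1, "old row",   "2023-12-31T10:00:00"),
    (2, "new row",   "2024-01-02T09:00:00"),
    (3, "newer row", "2024-01-03T11:30:00"),
])

def incremental_load(conn):
    """Copy only rows newer than the stored watermark, then advance it."""
    (wm,) = conn.execute("SELECT last_value FROM watermark").fetchone()
    rows = conn.execute(
        "SELECT id, payload, modified_at FROM src WHERE modified_at > ?",
        (wm,)).fetchall()
    if rows:
        new_wm = max(r[2] for r in rows)
        conn.execute("UPDATE watermark SET last_value = ?", (new_wm,))
    return rows

changed = incremental_load(conn)
print(len(changed))            # only the rows newer than the watermark
print(incremental_load(conn))  # second run finds nothing new
```

Because the watermark only advances when rows are found, a failed run can simply be retried without skipping data, which is why this pattern reduces resource consumption compared with full reloads.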
Posted 1 month ago
5.0 - 10.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: SAP BW/4HANA
Good-to-have skills: NA
Minimum Experience: 5 year(s)
Educational Qualification: 15 years of full-time education
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements in a dynamic work environment.
Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the development and implementation of new software applications.
- Conduct code reviews and ensure adherence to coding standards.
- Troubleshoot and resolve complex technical issues.
- Stay updated with the latest technologies and trends in application development.
Professional & Technical Skills:
- Must-have skills: Proficiency in SAP BW/4HANA.
- Strong understanding of data modeling and data extraction in SAP BW/4HANA.
- Experience in developing and optimizing SAP BW/4HANA data models.
- Knowledge of SAP BW/4HANA administration and performance tuning.
- Experience with SAP Analytics Cloud for reporting and visualization.
Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP BW/4HANA.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
Posted 1 month ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: SAP BW on HANA Data Modeling & Development
Good-to-have skills: NA
Minimum Experience: 3 year(s)
Educational Qualification: 15 years of full-time education
Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with cross-functional teams to gather requirements, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to deliver high-quality applications that meet user expectations and business goals.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions for work-related problems.
- Assist in the documentation of application specifications and user guides.
- Engage in continuous learning to stay updated with the latest technologies and best practices.
Professional & Technical Skills:
- Must-have skills: Proficiency in SAP BW on HANA Data Modeling & Development.
- Strong understanding of data modeling concepts and techniques.
- Experience with SAP BW reporting tools and data extraction methods.
- Familiarity with performance tuning and optimization of SAP BW applications.
- Ability to troubleshoot and resolve application issues effectively.
Additional Information:
- The candidate should have a minimum of 3 years of experience in SAP BW on HANA Data Modeling & Development.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Posted 1 month ago
5.0 - 10.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Syniti ADM for SAP
Good-to-have skills: NA
Minimum Experience: 5 year(s)
Educational Qualification: syniti
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will collaborate with teams to ensure successful project delivery and meet client expectations.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the design and development of software applications.
- Conduct code reviews and provide technical guidance to team members.
- Participate in project planning and provide input on technical feasibility and implementation.
Professional & Technical Skills:
- Must-have skills: Proficiency in Syniti ADM for SAP.
- Strong understanding of data migration and data quality management.
- Experience in SAP data migration projects.
- Knowledge of SAP data models and structures.
- Hands-on experience with SAP data extraction and transformation.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Syniti ADM for SAP.
- This position is based at our Bengaluru office.
Posted 1 month ago
5.0 - 7.0 years
6 - 12 Lacs
Gurugram
Hybrid
We are seeking a Market Research Analyst with 5-7 years of professional work experience to support our data and marketing teams in gathering, analyzing, and visualizing business data. Required Candidate profile: Strong proficiency in advanced Excel, SQL, Python (BeautifulSoup/Selenium), Tableau, data visualization, statistical analysis, web scraping tools (ParseHub, Octoparse), and SaaS research and analysis.
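The scraping skills listed above (BeautifulSoup/Selenium) boil down to pulling structured fields out of page markup. The core idea can be shown dependency-free with Python's standard-library html.parser; the HTML snippet and the "price" class name below are invented for illustration, and a real project would more likely use BeautifulSoup's selector API on fetched pages.

```python
from html.parser import HTMLParser

class PriceScraper(HTMLParser):
    """Collect the text of every element carrying class="price"."""
    def __init__(self):
        super().__init__()
        self._in_price = False
        self.prices = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the opening tag.
        if ("class", "price") in attrs:
            self._in_price = True

    def handle_endtag(self, tag):
        self._in_price = False

    def handle_data(self, data):
        if self._in_price:
            self.prices.append(data.strip())

html = """
<ul>
  <li><span class="name">Plan A</span> <span class="price">$29</span></li>
  <li><span class="name">Plan B</span> <span class="price">$49</span></li>
</ul>
"""
scraper = PriceScraper()
scraper.feed(html)
print(scraper.prices)
```

Selenium enters the picture only when the target page renders its data with JavaScript; for static markup, a parser like this (or BeautifulSoup) is sufficient.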
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Description: ACCOUNTABILITIES: Designs, codes, tests, debugs, and documents software according to Dell's systems quality standards, policies, and procedures. Analyzes business needs and creates software solutions. Responsible for preparing design documentation. Prepares test data for unit, string, and parallel testing. Evaluates and recommends software and hardware solutions to meet user needs. Resolves customer issues with software solutions and responds to suggestions for improvements and enhancements. Works with business and development teams to clarify requirements to ensure testability. Drafts, revises, and maintains test plans, test cases, and automated test scripts. Executes test procedures according to software requirements specifications. Logs defects and makes recommendations to address them. Retests software corrections to ensure problems are resolved. Documents the evolution of testing procedures for future replication. May conduct performance and scalability testing. RESPONSIBILITIES: Leads small-to-moderate-budget projects; may perform in a project leadership role and/or supervise the activities of lower-level personnel. Provides resolutions to a diverse range of complex problems. Executes schedules, costs, and documentation to ensure assigned projects come to a successful conclusion. May assist in training, assigning, and checking the work of less experienced developers. Performs estimation efforts on projects and tracks progress. Drafts and revises test plans and scripts with consideration for end-to-end system flows. Executes test scripts according to application requirements documentation. Logs defects, identifies courses of action, and performs preliminary root cause analysis. Analyzes and communicates test results to the project team.
Skills: Python, PySpark, and SQL. 5 years of experience in Spark, Scala, and PySpark for big data processing. Proficiency in Python programming for data manipulation and analysis. Experience with Python libraries such as Pandas and NumPy. Knowledge of Spark architecture and components (RDDs, DataFrames, Spark SQL). Strong knowledge of SQL for querying databases. Experience with database systems like Lakehouse, PostgreSQL, Teradata, and SQL Server. Ability to write complex SQL queries for data extraction and transformation. Strong analytical skills to interpret data and provide insights. Ability to troubleshoot and resolve data-related issues. Strong problem-solving skills to address data-related challenges. Effective communication skills to collaborate with cross-functional teams.
Role/Responsibilities: Work on development activities along with lead activities. Coordinate with the Product Manager (PdM) and Development Architect (Dev Architect) and handle deliverables independently. Collaborate with other teams to understand data requirements and deliver solutions. Design, develop, and maintain scalable data pipelines using Python and PySpark. Utilize PySpark and Spark scripting for data processing and analysis. Implement ETL (Extract, Transform, Load) processes to ensure data is accurately processed and stored. Develop and maintain Power BI reports and dashboards. Optimize data pipelines for performance and reliability. Integrate data from various sources into centralized data repositories. Ensure data quality and consistency across different data sets. Analyze large data sets to identify trends, patterns, and insights. Optimize PySpark applications for better performance and scalability. Continuously improve data processing workflows and infrastructure.
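The Spark component knowledge asked for above (RDD transformations such as flatMap, map, and reduceByKey) follows a small, fixed dataflow shape. The pure-Python analogue below shows that shape for a word count without requiring a Spark installation; in PySpark the same pipeline would be written as chained calls on an RDD.

```python
from collections import defaultdict

lines = ["spark makes big data simple", "python and spark", "big data"]

# flatMap: one record in, many records out (line -> words)
words = [w for line in lines for w in line.split()]

# map: pair each word with a count of 1
pairs = [(w, 1) for w in words]

# reduceByKey: sum the counts per key
counts = defaultdict(int)
for word, n in pairs:
    counts[word] += n

print(sorted(counts.items()))
```

The point of the Spark versions of these steps is that each stage is distributed across a cluster; the logical transformation per record is identical to the loop above.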
Posted 1 month ago
2.0 - 3.0 years
0 Lacs
Ahmedabad
Work from Office
Job Opening for Software Developer in Ahmedabad, Gujarat, India: AI Automation Internship (n8n, RAG & LLMs)
Job Description: Design and build smart automation workflows using n8n, Zapier, and Make.com. Integrate APIs and connect third-party apps to streamline business processes. Use LLMs (e.g., OpenAI, Cohere) for tasks like summarization, data extraction, and decision logic. Build RAG pipelines with vector databases like Pinecone, ChromaDB, or Weaviate. Develop and test autonomous agents using LangChain, AutoGen, or similar frameworks. Write clean, modular code in Python or JavaScript to support custom workflow logic. Prototype ideas quickly and ship real features used in production environments. Document your workflows and collaborate with developers, consultants, and product teams.
Key Skills: Final-year students only, in Computer Science, AI/ML, Data Science, Information Systems, or related fields, who are willing to work full time after the internship. Curiosity & Initiative: You love experimenting with new tools and technologies and aren't afraid to break things to learn. Basic to Intermediate Coding Skills: Comfortable writing Python or JavaScript/TypeScript; able to read API docs and write modular code. Familiarity with (or willingness to learn) Workflow Platforms: Exposure to n8n, Zapier, Make.com, or similar; if you haven't used n8n yet, we'll help you onboard. API Knowledge: Understanding of RESTful APIs, JSON, and authentication mechanisms. Interest in AI/LLMs: You know the basics of LLMs or are eager to dive into prompt engineering, embeddings, and RAG concepts. Problem-Solving Mindset: You can break down complex tasks into smaller steps, map flows, and foresee edge cases. Communication & Documentation: You can explain your workflows, document steps, and write clean READMEs/instructions. Team Player: Open to feedback, collaborative in agile/scrum-like setups, and willing to help peers troubleshoot.
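The RAG pipelines this internship mentions reduce to four steps: embed documents, embed the query, retrieve the nearest documents by vector similarity, and prepend them to the LLM prompt as context. The toy retrieval step below uses hand-made three-dimensional vectors and plain cosine similarity; in a real pipeline the vectors would come from an embedding model and live in a vector database such as Pinecone, ChromaDB, or Weaviate.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Pretend embeddings (in practice produced by an embedding model).
docs = {
    "refund policy":  [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.9, 0.1],
    "privacy notice": [0.0, 0.2, 0.9],
}

def retrieve(query_vec, k=1):
    """Return the k document keys most similar to the query vector."""
    ranked = sorted(docs, key=lambda d: cosine(query_vec, docs[d]), reverse=True)
    return ranked[:k]

# A query vector close to "refund policy".
context = retrieve([0.8, 0.2, 0.0], k=2)
print(context)  # these chunks would be prepended to the LLM prompt
```

In n8n or LangChain the same retrieve-then-generate loop is assembled from prebuilt nodes or chains, but the similarity ranking underneath is this calculation.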
Posted 1 month ago
1.0 - 4.0 years
9 - 13 Lacs
Mumbai
Work from Office
We are looking for an analyst for our Client Services Operations team, which performs data extraction and data analysis on financial models and financial valuation reports, along with report updates and various support services. The team undertakes research and collects financial and business data based on requests from internal Kroll business units. The relevant financial and business data is collected through various publicly available sources and Kroll proprietary files. Following collection, the data is summarized in the format prescribed by the Kroll business units. The team also undertakes subsequent analysis of the completeness of the data and verification of the accuracy of the information. This gives the business units easy access to information available from various sources. The Analyst will research and analyze financial information to help the company make well-informed commercial decisions, conduct research, and monitor financial movements. The day-to-day responsibilities include but are not limited to: Conduct investigations and analyses to evaluate client profiles in line with CIP standards, focusing on CDD and EDD globally. Perform research on Politically Exposed Persons, sanctions, adverse media, and screenings using tools like World-Check, Regulatory Data Corp, and LexisNexis. Prepare compliance-ready plausibility statements and manage periodic reviews, onboarding, and event-driven assessments. Analyze financial data to proactively highlight exceptions or variations. Maintain databases and fixed-asset models/templates, ensuring adherence to client and business unit guidelines. Ensure high-quality deliverables (>99% accuracy) within stipulated timelines (24-48 hours or as per TAT). Collaborate in team huddles, resolve discrepancies, and contribute ideas for workflow and process improvements. Review deliverables prepared by Analysts, maintaining high quality standards and compliance adherence.
Essential traits: Bachelor's degree (preferably in BAF, BFM, B.Com, B.Tech, BMS, BBI, BBA, etc.) or a Master's degree in Finance or Accounting; MBA or MMS from an accredited college or university. 1-4 years of experience in Due Diligence, KYC operations, or Customer Due Diligence, with a passion for data. Working knowledge of Customer Due Diligence (CDD) and Enhanced Due Diligence (EDD). Strong research background using both primary and secondary public databases. Proficiency in identifying Politically Exposed Persons, sanctions, adverse media, and name-list screening using third-party applications such as World-Check, Regulatory Data Corp, and LexisNexis. Experience drafting detailed plausibility statements. Good understanding of US/EU/APAC regulatory requirements. Attention to detail. Self-starter capable of working under pressure with a high level of accuracy. Excellent communication skills. Team player with the ability to build relationships and partnerships. Highly motivated and able to work independently. Proven ability to manage and prioritize multiple complex tasks with minimal supervision. Advanced expertise in regulatory frameworks and client onboarding standards (CIP, CDD, EDD, AML/KYC). Preferred: CAMS (Certified Anti-Money Laundering Specialist), CKYCA (Certified KYC Associate), or Globally Certified KYC Specialist (GO-AKS) certification. About Kroll: In a world of disruption and increasingly complex business challenges, our professionals bring truth into focus with the Kroll Lens. Our sharp analytical skills, paired with the latest technology, allow us to give our clients clarity, not just answers, in all areas of business. We value the diverse backgrounds and perspectives that enable us to think globally. As part of One team, One Kroll, you'll contribute to a supportive and collaborative work environment that empowers you to excel.
Kroll is the premier global valuation and corporate finance advisor with expertise in complex valuation, disputes and investigations, M&A, restructuring, and compliance and regulatory consulting. Our professionals balance analytical skills, deep market insight, and independence to help our clients make sound decisions. As an organization, we think globally and encourage our people to do the same. Kroll is committed to equal opportunity and diversity, and recruits people based on merit. In order to be considered for a position, you must formally apply via careers.kroll.com.
Posted 1 month ago
2.0 - 6.0 years
6 - 11 Lacs
Pune
Work from Office
Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support, and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Consultant Specialist. In this role, you will be a highly skilled and experienced development engineer with expertise in Java, Java 8, microservices, Spring Boot 3.0.0, PostgreSQL, JPA, UI (React, TypeScript, JavaScript), Apache Flink, Apache Beam, and MongoDB, with strong knowledge of Google Cloud Platform (GCP) services such as Dataflow, BigQuery/ClickHouse, Cloud Storage, Pub/Sub, Google Cloud Storage (GCS), and Composer. The ideal candidate should also have hands-on experience with Apache Airflow, Google Kubernetes Engine (GKE), and Python for scripting, automation, and automation testing. You will play a critical role in designing, developing, and maintaining scalable, high-performance pipelines and cloud-native solutions, with a strong focus on real-time stream processing using Apache Flink. You will collaborate with cross-functional teams to define, design, and deliver new features and enhancements; monitor and optimize the performance of data pipelines and applications; ensure data quality, integrity, and security across all data pipelines and storage solutions; provide technical guidance and mentorship to junior team members; and stay up to date with the latest data engineering technologies, best practices, and industry trends.
Requirements: To be successful in this role, you should meet the following requirements: 2 to 6 years of experience in development engineering, with a focus on ETL processes and data pipeline development. Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Strong expertise in Java and SQL for data extraction, transformation, and loading. Strong problem-solving and analytical skills, and the ability to troubleshoot complex data and application issues. Excellent communication and collaboration skills, with the ability to work effectively in a team environment. Familiarity with Helm charts for Kubernetes deployments. Experience with monitoring tools like Prometheus, Grafana, or Stackdriver. Knowledge of security best practices for cloud and Kubernetes environments. Knowledge of DevOps skills will be an added advantage.
Posted 1 month ago
3.0 - 8.0 years
12 - 22 Lacs
Chennai
Work from Office
Responsibilities: Collect, clean, and analyze large sets of structured and unstructured data to extract meaningful insights and trends. Develop and implement advanced machine learning algorithms to solve complex business problems. Support moving models to production by creating high-quality code modules that can be seamlessly integrated into existing systems (both on-prem and cloud). Communicate complex findings to both technical and non-technical audiences through effective data visualization and storytelling. Collaborate with cross-functional teams to identify data-driven opportunities and translate business requirements into actionable data solutions. Support the development and maintenance of data pipelines and infrastructure. Stay up to date with industry trends and advancements in Data Science and Machine Learning technologies. Skills Required: Strong foundation in statistics and machine learning algorithms. Strong proficiency in programming languages like Python and SQL. Excellent problem-solving and analytical skills. Ability to work independently and as part of a team. Should have built production models using at least two of the following ML techniques: clustering, regression, classification. Experience in Banking & Financial Services is preferred. Experience working on cloud platforms (e.g., AWS, GCP) is preferred. A passion for data and a curiosity to explore new trends and technologies.
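Of the ML techniques this role asks for (clustering, regression, classification), the simplest to show end to end is ordinary least-squares regression with one feature, which has a closed-form slope and intercept. The sketch below is plain Python with toy data; production models would use scikit-learn or statsmodels rather than hand-rolled math.

```python
def fit_ols(xs, ys):
    """Closed-form simple linear regression: y ~ slope * x + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # slope = covariance(x, y) / variance(x)
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Perfectly linear toy data: y = 2x + 1
xs = [1, 2, 3, 4]
ys = [3, 5, 7, 9]
slope, intercept = fit_ols(xs, ys)
print(slope, intercept)  # 2.0 1.0
```

The same covariance-over-variance idea generalizes to the matrix normal equations for multiple features, which is what library implementations solve.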
Posted 1 month ago
4.0 - 9.0 years
10 - 17 Lacs
Mumbai
Work from Office
Actively work with the latest technologies and leading practices specific to analytics, data visualization, AI/ML, and RPA to drive strategic benefits in the areas of audit quality, efficiency, and value creation, leading audit-related data extractions, enablement, and analytics. Required Candidate profile: Minimum 4 years of relevant data analytics experience; EDW concepts; a good understanding of data mining and programming languages such as Python and Oracle SQL; working experience with visualization tools such as Power BI and SAP BO.
Posted 1 month ago
2.0 - 7.0 years
1 - 5 Lacs
Mumbai
Work from Office
Job Title: Email Marketing Specialist
Responsibilities:
- Plan, design, and execute targeted email marketing campaigns.
- Generate relevant content for email campaigns, including subject lines, copy, and calls to action, tailored to the target audience's needs and preferences.
- Manage and segment email lists to ensure effective targeting, engagement, and compliance with email marketing regulations (e.g., GDPR, CAN-SPAM).
- Monitor and analyse campaign performance metrics to identify areas for improvement and implement necessary changes.
- Manage email databases for accuracy and segmentation purposes to maximize targeting effectiveness.
- Extract and compile data from specified sources.
- Conduct A/B testing on email campaigns to optimize open, click-through, and conversion rates.
- Run global email marketing campaigns.
Qualifications:
- Proven track record in email marketing with a strong understanding of email best practices.
- Proficiency in HTML, CSS, and email marketing platforms (e.g., Mailchimp, HubSpot, Marketo; ZoomInfo will be a plus).
- Excellent writing and communication skills.
Skills Required: Experience in the events and exhibitions industry or any B2B email marketing would be a plus.
Contact: +91 8169054726, Amruta Sawant, HR Manager, AARS Designs
Address: E-110, Crystal Plaza, Link Road, Opposite Infinity Mall, Andheri West, Mumbai Suburban, Maharashtra, India
If you are passionate about creating impactful exhibition experiences and thrive in a collaborative environment, we would love to hear from you!
Posted 1 month ago
1.0 - 3.0 years
1 - 2 Lacs
Thane, Navi Mumbai, Mumbai (All Areas)
Work from Office
Principal Duties and Responsibilities:
- Interpret data and analyze results using statistical techniques.
- Develop and implement data analyses, data collection systems and other strategies that optimize statistical efficiency and quality.
- Acquire data from primary or secondary data sources and maintain databases.
Key Responsibilities:
- Data Collection and Processing: Gather data from various sources, ensuring accuracy and completeness. Cleanse and preprocess data to remove errors and inconsistencies.
- Statistical Analysis and Interpretation: Utilize statistical methods to analyze data and identify trends, patterns, and correlations. Present findings through reports, visualizations, and presentations to stakeholders.
- Data Visualization and Reporting: Create visualizations and dashboards to effectively communicate insights. Prepare regular reports and ad-hoc analyses to support strategic decision-making.
- Problem-Solving and Recommendations: Collaborate with cross-functional teams to address business challenges using data-driven insights.
Posted 1 month ago
2.0 - 7.0 years
20 - 25 Lacs
Hyderabad
Work from Office
Transportation Financial Systems (TFS) owns the technology components that perform the financial activities for the transportation business. These systems are used across all transportation programs and retail expansion into new geographies. TFS systems provide financial document creation & management, expense auditing, accounting, payments and cost allocation functions. Our new-generation products are highly scalable and operate at a finer level of granularity to reconcile every dollar in transportation financial accounts with zero manual entries or corrections. The goal is to develop a global product suite for all freight modes, touching every single package movement across Amazon. Our mission is to abstract logistics complexities from the financial world and financial complexities from the logistics world. We are looking for an innovative, hands-on and customer-obsessed candidate for this role. The candidate must be detail oriented, have superior verbal and written communication skills, and should be able to juggle multiple tasks at once. The candidate must be able to make sound judgments and get the right things done. We seek a Business Intelligence (BI) Engineer to strengthen our data-driven decision-making processes. This role requires an individual with excellent statistical and analytical abilities, deep knowledge of business intelligence solutions, the ability to leverage GenAI technologies to analyse and solve problems, and the ability to collaborate with product, business & tech teams. The successful candidate will demonstrate the ability to work independently and learn quickly, comprehend Transportation Finance system functions rapidly, have a passion for data and analytics, be a self-starter comfortable with ambiguity, be able to work in a fast-paced and entrepreneurial environment, and be driven by a desire to innovate Amazon's approach to this space.
1) Translate business problems into analytical requirements and define the expected output.
2) Develop and implement key performance indicators (KPIs) to measure business performance and product impact. Responsible for deep-dive analysis on key metrics.
3) Create and execute an analytical approach to solve the problem in line with stakeholder expectations.
4) Strongly leverage GenAI technologies to solve problems and build solutions.
5) Be the domain expert and have knowledge of data availability from various sources.
6) Execute solutions with scalable development practices in scripting; write and optimize SQL queries, reporting, data extraction and data visualization.
7) Proactively and independently work with stakeholders to construct use cases and associated standardized outputs for your work.
8) Actively manage the timeline and deliverables of projects, focusing on interactions in the team.
Requirements:
- 2+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL etc.
- Experience with data visualization using Tableau, QuickSight, or similar tools
- Experience with one or more industry analytics visualization tools (e.g. Excel, Tableau, QuickSight, MicroStrategy, Power BI) and statistical methods (e.g. t-test, chi-squared)
- Experience with a scripting language (e.g., Python, Java, or R)
- Master's degree or advanced technical degree
- Knowledge of data modeling and data pipeline design
- Experience with statistical analysis and correlation analysis
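The statistical methods this listing names (t-test, chi-squared) can be sketched without any analytics platform. Below is a hand-rolled 2x2 chi-squared test comparing two conversion rates; the function name and sample counts are invented for illustration, and in practice a library such as SciPy would be used.

```python
import math

# Hand-rolled 2x2 chi-squared test for two conversion rates (illustrative
# sketch; scipy.stats.chi2_contingency or similar would be used in practice).
def chi2_two_proportions(conv_a, n_a, conv_b, n_b):
    # 2x2 contingency table: converted vs not converted, per variant.
    table = [[conv_a, n_a - conv_a], [conv_b, n_b - conv_b]]
    total = n_a + n_b
    col_totals = [conv_a + conv_b, total - conv_a - conv_b]
    row_totals = [n_a, n_b]
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            expected = row_totals[i] * col_totals[j] / total
            chi2 += (table[i][j] - expected) ** 2 / expected
    # For 1 degree of freedom the survival function has a closed form.
    p_value = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p_value

chi2, p = chi2_two_proportions(conv_a=120, n_a=1000, conv_b=150, n_b=1000)
print(f"chi2={chi2:.3f}, p={p:.4f}")
```

A p-value just under 0.05 here means the difference between a 12% and a 15% conversion rate on samples of 1000 is borderline significant at the conventional threshold.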
Posted 1 month ago
4.0 - 9.0 years
20 - 25 Lacs
Bengaluru
Work from Office
- Write complex SQL queries for data extraction and perform exploratory data analysis (EDA) to uncover insights.
- Strong proficiency in Python and PySpark for scalable data processing and analytics.
- Create, transform, and optimize features to enhance model performance.
- Train, evaluate, and maintain machine learning models in production.
- Write efficient, maintainable, and version-controlled code that handles large datasets.
- Regularly update internal teams and clients on project progress, results, and insights.
- Conduct hypothesis testing and experiment analysis to drive data-driven decisions using A/B testing.
- Scale machine learning algorithms to work on massive datasets under strict SLAs.
- Automate operations pipelines that run at regular intervals to update required datasets.
What you'll bring:
- A master's or bachelor's degree in computer science or a related field from a top university.
- 4+ years of hands-on experience in Machine Learning (ML) or Data Science with a focus on building scalable solutions.
- Strong programming expertise in Python and PySpark is a must.
- Proven ability to write highly optimized SQL queries for efficient data extraction and transformation.
- Experience in feature engineering, inferencing pipelines, and real-time model prediction deployment.
- Strong fundamentals in applied statistics, with expertise in A/B test design and hypothesis testing.
- Solid understanding of distributed computing systems and hands-on experience with at least one cloud platform (GCP, AWS, or Azure).
Additional Skills:
- Understanding of Git, DevOps, CI/CD, and data security; experience designing on a cloud platform.
- Experience automating operations using a job scheduler like Airflow.
- Experience in data engineering on Big Data systems.
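The SQL-extraction-plus-EDA workflow described above can be sketched end to end with Python's built-in sqlite3 module. The table and column names below are invented for the example; real work of this kind would run against a warehouse (BigQuery, Redshift, etc.) and use pandas or PySpark for the analysis step.

```python
import sqlite3
import statistics

# Illustrative sketch: extract with SQL, then do quick EDA in Python.
# The `orders` table and its columns are invented for this example.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("north", 120.0), ("north", 80.0), ("south", 200.0), ("south", 160.0)],
)

# Aggregate in SQL, then summarise the extracted values in Python.
rows = conn.execute(
    "SELECT region, AVG(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # per-region average order value

amounts = [amt for _, amt in conn.execute("SELECT region, amount FROM orders")]
print(statistics.mean(amounts), statistics.stdev(amounts))
```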
Posted 1 month ago
10.0 - 15.0 years
32 - 37 Lacs
Pune
Work from Office
Dashboard Development & Management : Design and maintain advanced Splunk dashboards to deliver comprehensive insights into system performance and File Transmission component health. Performance Optimization : Improve dashboard efficiency when handling large datasets using techniques such as optimized queries, summary indexing, and data models. Advanced Regex Utilization : Apply sophisticated regular expressions to create accurate search queries and extract meaningful data. Custom Alert Configuration : Implement highly customized alerting mechanisms to detect anomalies, manage alert actions, throttle conditions, and integrate with lookup tables and dynamic time-based arguments. File Transmission Monitoring : Track and report on each stage of file transmission, continuously refining monitoring strategies for enhanced reliability and visibility. Cross-Functional Collaboration : Work closely with various teams to integrate Splunk monitoring with broader IT systems and workflows. Conduct discovery of file transmission workflows, including file life cycle, endpoint configurations, log analysis, SLA definitions, and exception scenarios. Develop and deploy advanced Splunk queries to ensure end-to-end visibility into file transmission processes. Configure and optimize alerting mechanisms for timely detection and resolution of issues. Design and implement IT Service Intelligence (ITSI) strategies to enhance monitoring capabilities and deliver actionable insights. Establish and manage monitoring frameworks based on the file life cycle to ensure traceability and accountability. Collaborate with IT and operations teams to integrate Splunk with other tools and resolve data ingestion issues. Analyze monitoring data to identify trends, detect anomalies, and recommend improvements. Serve as a Splunk subject matter expert, providing guidance, best practices, and training to team members. 
What You Will Need to Have Education: Bachelor's and/or Master's degree in Information Technology, Computer Science, or a related field. Experience: Minimum of 10 years in IT, with a focus on Splunk, SFTP tools, data integration, or technical support roles. Splunk Expertise: Proficiency in advanced SPL techniques including subsearches, joins, and statistical functions. Regex Proficiency: Strong command of regular expressions for search and data extraction. Database Skills: Experience with relational databases and writing complex SQL queries with advanced joins. File Transmission Tools: Hands-on experience with platforms like Sterling File Gateway, IBM Sterling, or other MFT solutions. Analytical Thinking: Proven problem-solving skills and the ability to troubleshoot technical issues effectively. Communication: Strong verbal and written communication skills for collaboration with internal and external stakeholders. Attention to Detail: High level of accuracy to ensure data integrity and reliability. What Would Be Great to Have Scripting & Automation: Proficiency in Python or similar scripting languages to automate monitoring tasks. Tool Experience: Familiarity with tools such as Dynatrace, Sterling File Gateway, and other MFT solutions. Linux Proficiency: Strong working knowledge of Linux and command-line operations. Secure File Transfer Protocols: Hands-on experience with SFTP and tools like SFG, NDM, and MFT using SSH encryption. Task Scheduling Tools: Experience with job scheduling platforms such as AutoSys, Control-M, or cron.
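The regex-driven field extraction central to this role can be illustrated outside Splunk with Python's re module. The log format below is invented for the example; in Splunk itself the same named-capture pattern would typically be applied with the rex command.

```python
import re

# Named-group field extraction from a file-transfer log line (the log
# format here is invented; it only illustrates the regex skill the role
# requires, not any real transmission system's output).
LOG_PATTERN = re.compile(
    r"(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) "
    r"(?P<level>[A-Z]+) "
    r"file=(?P<file>\S+) status=(?P<status>\w+)"
)

line = "2024-05-01 10:15:42 ERROR file=/in/acct_0501.csv status=RETRY"
match = LOG_PATTERN.match(line)
fields = match.groupdict()
print(fields["level"], fields["file"], fields["status"])
```

Once fields are extracted this way, alerting reduces to simple conditions on them (for example, firing when level is ERROR and status is RETRY).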
Posted 1 month ago
8.0 - 10.0 years
10 - 12 Lacs
Bengaluru
Work from Office
Experience in collecting business requirements to develop and implement SSIS packages. Analyzing existing data in legacy systems, data cleansing, and transformation of data. Data mapping between source and destination systems. Involved in planning, designing and mapping discussions; schedules job-related meetings. Coordination between Dev teams and the Data Migration team. Understanding of D365 CE Apps from the Dataverse side. Understanding of CRM sales functionalities. Must have good knowledge of Dynamics 365 CRM and coding best practices. Includes discussion around data migration, design specifications, data extraction, problem diagnosis, debugging, testing, modification and assurance of technical requirements of computer programs per established industry practices and the Customer's Development Management Methodology (DMM). Understanding of data management tools like SSIS for ETL purposes; must be well-versed in data analysis. Must have good knowledge of Dynamics 365 CRM Sales. Must be able to work directly with the client and Microsoft. Must have good communication skills. Mandatory Skills: SSIS packages and Dynamics CRM modules; data migration; data mapping and migration knowledge using data migration tools like SSIS packages. Good to Have Skills: Must be well-versed in D365 architecture from a DB/Dataverse understanding.
Posted 1 month ago
7.0 - 10.0 years
5 - 9 Lacs
Gurugram
Work from Office
Title: Expert Engineer Department: GPS Technology Location: Gurugram, India Reports To: Project Manager Level: Grade 4 We're proud to have been helping our clients build better financial futures for over 50 years. How have we achieved this? By working together - and supporting each other - all over the world. So, join our [insert name of team/ business area] team and feel like you're part of something bigger. About your team The Technology function provides IT services to the Fidelity International business, globally. These include the development and support of business applications that underpin our revenue, operational, compliance, finance, legal, customer service and marketing functions. The broader technology organisation incorporates infrastructure services that the firm relies on to operate on a day-to-day basis, including data centre, networks, proximity services, security, voice, incident management and remediation. About your role An Expert Engineer is a seasoned technology expert with highly developed programming, engineering and problem-solving skills. They deliver value to the business faster and with superlative quality. Their code and designs meet business, technical, non-functional and operational requirements most of the time without defects and incidents. So, if a relentless focus and drive towards technical and engineering excellence, along with adding value to the business, excites you, this is absolutely the role for you. If technical discussions and whiteboarding with peers excite you, and pair programming and code reviews add fuel to your tank, we are looking for you. Understand system requirements; analyse, design, develop and test application systems following the defined standards. The candidate is expected to display professional ethics in his/her approach to work and to exhibit a high level of ownership within a demanding working environment.
About you Essential Skills You have excellent software design, programming, engineering, and problem-solving skills. Strong experience working on data ingestion, transformation and distribution using AWS or Snowflake. Exposure to SnowSQL, Snowpipe, role-based access controls, and ETL/ELT tools like NiFi, Matillion or dbt. Hands-on working knowledge of EC2, Lambda, ECS/EKS, DynamoDB and VPCs. Familiar with building data pipelines that leverage the full power and best practices of Snowflake, as well as with integrating common technologies that work with Snowflake (code CI/CD, monitoring, orchestration, data quality). Experience with designing, implementing, and overseeing the integration of data systems and ETL processes through SnapLogic. Designing data ingestion and orchestration pipelines using AWS and Control-M. Establish strategies for data extraction, ingestion, transformation, automation, and consumption. Experience with Data Lake concepts covering structured, semi-structured and unstructured data. Experience in creating CI/CD processes for Snowflake. Experience in strategies for data testing, data quality, code quality and code coverage. Ability, willingness and openness to experiment with, evaluate and adopt new technologies. Passion for technology, problem solving and teamwork. Go-getter with the ability to navigate across roles, functions and business units to collaborate, and to drive agreements and changes from drawing board to live systems. Lifelong learner who can bring contemporary practices, technologies and ways of working to the organization. Effective collaborator, adept at using all effective modes of communication and collaboration tools. Experience delivering on data-related non-functional requirements, such as: hands-on experience dealing with large volumes of historical data across markets/geographies; manipulating, processing, and extracting value from large, disconnected datasets.
Building water-tight data quality gates on investment management data. Generic handling of standard business scenarios in case of missing data, holidays, out-of-tolerance errors etc. Experience and Qualification: B.E./B.Tech. or M.C.A. in Computer Science from a reputed university. Total 7 to 10 years of relevant experience. Personal Characteristics: Good interpersonal and communication skills. Strong team player. Ability to work at a strategic and tactical level. Ability to convey strong messages in a polite but firm manner. Self-motivation is essential; should demonstrate commitment to high-quality design and development. Ability to develop and maintain working relationships with several stakeholders. Flexibility and an open attitude to change. Problem-solving skills with the ability to think laterally, and to think with a medium-term and long-term perspective. Ability to learn and quickly get familiar with a complex business and technology environment. Feel rewarded For starters, we'll offer you a comprehensive benefits package. We'll value your wellbeing and support your development. And we'll be as flexible as we can about where and when you work - finding a balance that works for all of us. It's all part of our commitment to making you feel motivated by the work you do and happy to be part of our team. For more about our work, our approach to dynamic working and how you could build your future here, .
Posted 1 month ago
1.0 - 6.0 years
1 - 4 Lacs
Mumbai
Work from Office
About the Opportunity: Magnifi is looking for a proactive and detail-oriented Editorial Content Executive to join our team on a 1-year contractual basis, working directly with the Jio Hotstar team. About Videoverse: VideoVerse is an innovative and dynamic video technology company that serves as an umbrella brand for our powerful AI-based products, Magnifi & Illusto. We are an enthusiastic, passionate, fast-growing, diverse, and vibrant team that works with some of the biggest names in broadcasting (3 of the top 5 in India, and growing quickly in Europe and the USA) and on some of the biggest sporting events in the world, like the Indian Premier League (T20IPL), multiple European football leagues, and much more. The company is at a stage of rapid growth and is actively hiring enthusiastic individuals who believe in making a difference and revolutionizing the way content is created, distributed, and consumed in the evolving video-centric world. For more information, please click the links mentioned below: Videoverse LinkedIn: https://www.linkedin.com/company/videoverse/ Videoverse: https://vverse.ai/ Magnifi: https://magnifi.ai/ About the products: Magnifi is an AI-powered enterprise product that automatically detects key moments in video content, enabling real-time creation of highlights and short-form videos. With a global presence, Magnifi collaborates with various industries, including OTT platforms, sports broadcasters, and e-gaming platforms. Their vision is to empower users to create and share impactful stories across digital platforms with ease. Fostering a culture of innovation and collaboration, Magnifi's leadership team is dedicated to leveraging AI for simplified video editing. The company has made notable acquisitions and received recognition for its contributions to the industry.
Role and Responsibilities:
For India:
- CMS tray creations, set-up, updates & maintenance
- Metadata changes, Jira ticket requests, on-call for CMS changes
- Masthead boosting for new & priority releases
- Editorial Masthead updates for tournament season
- Page Management on Retool
- HoPP: Prod and Pre-Prod widget & space creation, management and experiments - creation and execution, on-call for home page changes
- GEC data extraction & curation for channel teams
For HSI:
- CMS tray creations, set-up, updates & maintenance
- Page Management on Retool
- Editorial Masthead upkeep
Posted 1 month ago
5.0 - 6.0 years
8 - 9 Lacs
Pune
Work from Office
Overview: We are looking for a Python Data Engineer with expertise in real-time data monitoring, extraction, transformation, and visualization. The ideal candidate will have experience working with Oracle SQL databases, multithreading, and AI/ML techniques, and should be proficient in deploying Python applications on IIS servers. The role involves developing a system to monitor live files and folders, extract data, transform it using various techniques, and display insights on a Plotly Dash-based dashboard.
Responsibilities:
- Backend & Frontend Development: Build end-to-end solutions using Python for both backend and frontend functionalities.
- Data Extraction & Transformation: Implement data cleaning, regex, formatting, and data handling to process extracted information.
- Database Management: Insert and update records in an Oracle SQL database, ensuring data integrity and efficiency.
- Live File & Folder Monitoring: Develop Python scripts using Watchdog to monitor logs, detect new files/folders, and extract data in real time. Fetch live data from the database using multithreading for smooth real-time updates.
- Data Visualization: Develop an interactive dashboard using Plotly Dash or React for real-time data representation.
- Data Analytics & Pattern Finding: Perform exploratory data analysis (EDA) to identify trends, anomalies, and key insights.
- Cloud & AI/ML Integration: Leverage AI/ML techniques for data processing.
- Deployment & Maintenance: Deploy applications on an IIS server/cloud and ensure system scalability and security.
Qualifications: BE/BTech degree in Computer Science, EE, or a related field.
Essential skills:
- Strong Python programming skills
- Experience with Watchdog for real-time monitoring
- Expertise in Oracle SQL (data insertion, updates, query optimization)
- Knowledge of AI/ML techniques and their practical applications
- Hands-on experience with Plotly Dash/React/Angular or any UI framework for dashboard development
- Familiarity with IIS deployment and troubleshooting
- Good understanding of data cleaning, ETL pipelines, and real-time data streaming
- Strong debugging and problem-solving skills
- Prior experience working on real-time monitoring systems
Experience: 5 - 6 years
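The live file and folder monitoring this role describes is built with the third-party Watchdog library; the stdlib polling sketch below illustrates the same detect-new-files idea without that dependency. The function name and the sample file name are invented for the example.

```python
import os
import tempfile

# The role uses Watchdog for event-driven monitoring; this stdlib polling
# loop sketches the same idea (detecting files that appear in a folder).
def detect_new_files(folder, known, handler):
    """Compare the folder's current contents against `known` and call
    `handler` for anything that appeared since the last scan."""
    current = set(os.listdir(folder))
    for name in sorted(current - known):
        handler(os.path.join(folder, name))
    return current

seen_events = []
with tempfile.TemporaryDirectory() as folder:
    known = detect_new_files(folder, set(), seen_events.append)  # initial scan
    with open(os.path.join(folder, "feed_001.log"), "w") as f:
        f.write("payload")  # simulate a new file landing
    known = detect_new_files(folder, known, seen_events.append)  # picks it up
print(seen_events)
```

In the real system the handler would parse the new file and insert records into the database, and Watchdog's OS-level events would replace the polling loop.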
Posted 1 month ago
2.0 - 3.0 years
4 - 5 Lacs
Hyderabad
Work from Office
It's fun to work at a company where people truly believe in what they are doing! Job Description: Position Summary The Litigation Analyst works as a member of the Operations team within Epiq's Electronic Discovery. In this role, the analyst is responsible for both overseeing litigation support work and interacting with Client Services in order to maintain ECTs with respect to the processing team's work. Strong attention to detail, a high-quality work product and frequent interaction with project managers are also major functions of this role. Essential Job Responsibilities The Litigation Analyst is responsible for the following: Oversee daily tasks and workflows performed by the litigation support department as directed by management. Ensure daily service requests are assigned to team members and executed accurately in accordance with client deadlines. Ensure all QC procedures and protocols are followed. Responsible for performing searching, search term formatting and structured analytics; also responsible for managing processing team priorities, managing ECTs and communicating with project managers whenever required. Handle general requests and assign them to other teams as per instructions, so knowledge of the overall EDRM model is also required. Troubleshoot and resolve issues from litigation analysts and Client Services prior to escalation to managers. Requirements for the role include: At least 2-3 years of experience in the litigation support industry is required. Intermediate knowledge of several ESI data processing platforms (e.g. NUIX). Intermediate knowledge of several ESI data hosting platforms (e.g. Relativity, Concordance, Summation etc.).
Must be flexible about working long hours and able to work earlier and later than their scheduled shift to meet often last-minute and tightly compressed client deadlines. Must possess a strong understanding of electronic discovery tools and technology with an advanced-level understanding of eDiscovery processing and data extraction. Possess and employ effective verbal and written communication skills, and work positively and effectively with other company departments. Education & Experience Bachelor's degree or equivalent combination of education and experience; a degree in Computer Science, Business Management or a closely related field of study is preferred. Knowledge, Skills, and Abilities Experience working under tight deadlines in a fast-paced technical environment is strongly preferred. Ability to perform troubleshooting and learn customized proprietary software. Excellent communication skills (written and verbal). Strong organizational skills and extreme attention to detail are required. If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!
Posted 1 month ago