4.0 - 5.0 years
8 - 12 Lacs
Mumbai
Work from Office
We're looking for a Senior AI Engineer to join our AI team, where you'll design and develop highly scalable, robust systems that drive our AI initiatives and data operations. Success in this role depends on strong software engineering skills, familiarity with large-scale distributed systems, and expertise in AI technologies. Our ideal candidate has a proven ability to build reliable platforms (rather than standalone applications) and to iterate effectively over multiple release cycles. While AI experience isn't a requirement, you should also be enthusiastic about software engineering and about building scalable platforms and cloud services. A significant aspect of the job involves collaborating with cross-functional teams to translate complex requirements into scalable, efficient code, with responsibilities including the implementation and maintenance of software infrastructure for our AI platform. You'll regularly work alongside AI specialists, domain leads, and other engineering teams. Throughout your work, you'll apply best practices in software engineering and system architecture. If you're passionate about delivering high-impact AI solutions in a dynamic environment, this is an exciting opportunity to have a substantial influence on our AI capabilities through your expertise.
Your Key Responsibilities: Design and implement scalable distributed systems. Architect solutions that can handle large volumes of data for real-time and batch processing. Design and develop efficient AI pipelines with automation and reliability across the platform. Integrate agentic workflows and AI agents into data extraction processes, enabling systems to perform multi-step reasoning and tool usage to improve the accuracy and efficiency of data extraction. Deploy, monitor, and maintain the LLM-based extraction systems in production, ensuring reliability and scalability. Set up appropriate monitoring, logging, and evaluation metrics to track performance, and perform continual tuning and improvements based on human-in-the-loop feedback. Conduct applied research and experimentation with the latest generative AI models and techniques to enhance extraction capabilities. Prototype new approaches and iterate quickly to integrate successful methods into the production pipeline. Collaborate with cross-functional teams (data engineers, product managers, domain experts) to gather requirements and align AI solutions with business needs.
Your skills and experience that will help you excel: 4-5 years of experience in applied AI or machine learning engineering, with a track record of building and deploying AI solutions (especially in NLP). Hands-on experience using Generative AI models and APIs/frameworks (e.g., OpenAI GPT-4, Google Gemini). Ability to build agentic AI systems where LLMs interact with tools or perform multi-step workflows. Proficiency in Python (preferred) and experience deploying machine learning models or pipelines at scale. Good understanding of embeddings and LLMs, and experience with retrieval-augmented generation (RAG) workflows to incorporate external knowledge into LLM-based systems. Knowledge of LLMOps and cloud services (Azure, GCP, or similar) for deploying and managing AI solutions. Experience with containerization, orchestration, and monitoring of ML models in a production cloud environment. Excellent collaboration and communication skills, with the ability to work effectively in a team, translate complex technical concepts to non-technical stakeholders, and document work clearly.
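The agentic-workflow responsibility in the posting above (multi-step reasoning with tool usage for data extraction) can be pictured as a bounded tool-use loop. The sketch below is a minimal, self-contained illustration and not MSCI's actual stack: call_llm, the TOOLS registry, and the JSON action protocol are all hypothetical stand-ins, with the model call stubbed so the example runs offline.

```python
# Illustrative sketch only: a minimal agentic extraction loop in plain Python.
# call_llm() and the tool registry are hypothetical stand-ins, not a specific vendor's stack.
import json

def call_llm(prompt: str) -> str:
    """Hypothetical LLM call; a real system would invoke a hosted model API."""
    # Stubbed response so the sketch runs end to end.
    return json.dumps({"action": "finish", "result": {"company": "Acme Corp", "revenue": "12.4M"}})

TOOLS = {
    "search_documents": lambda query: f"top passages for '{query}'",
    "parse_table": lambda table_id: {"rows": 42},
}

def extract_with_agent(task: str, max_steps: int = 5) -> dict:
    """Run a bounded loop: the model either calls a tool or returns a final answer."""
    context = [f"Task: {task}"]
    for _ in range(max_steps):
        reply = json.loads(call_llm("\n".join(context)))
        if reply["action"] == "finish":
            return reply["result"]
        tool = TOOLS[reply["action"]]
        observation = tool(reply.get("input", ""))
        context.append(f"Observation: {observation}")  # feed tool output back for the next step
    raise RuntimeError("Agent did not converge within the step budget")

if __name__ == "__main__":
    print(extract_with_agent("Extract company name and revenue from the filing"))
```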
About MSCI. What we offer you: Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing. Flexible working arrangements, advanced technology, and collaborative workspaces. A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results. A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients. Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development. Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles. We actively nurture an environment that builds a sense of inclusion, belonging and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum. MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation, note that the accommodation e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.
Posted 1 month ago
2.0 - 10.0 years
10 - 11 Lacs
Gurugram
Work from Office
About Us. What's in it for YOU: SBI Card truly lives by the work-life balance philosophy. We offer a robust wellness and wellbeing program to support the mental and physical health of our employees. Admirable work deserves to be rewarded; we have a well-curated bouquet of rewards and recognition programs for employees. Dynamic, inclusive and diverse team culture. Gender Neutral Policy. Inclusive health benefits for all - medical insurance, personal accident, group term life insurance and annual health checkup, dental and OPD benefits. Commitment to the overall development of an employee through a comprehensive learning & development framework. Role Purpose: Data - Project Manager & Processing will be responsible for analytics-related support for the functional analytics team and for managing data projects, working in close coordination with the Insights & Reporting team. Role Accountability: Program Execution - The person is responsible for executing all the program initiatives undertaken in the data vertical by managing development/SI partners to ensure the execution of projects in a timely manner with the expected quality. He/She should be able to create a project plan, drive both business and IT teams to work to the plan, identify project/program risks and put mitigations in place or escalate in a timely manner for help needed. Assist the Program Delivery Leader in creating a team with data processing skills and a good understanding of Python, SAS, SQL, Tableau or other analytical tools, using which they should assist the Insights & Reporting team if they need help with extraction and processing of data. Maintain detailed project documentation, including project charters, status reports, FSD, TSD, etc., and ensure the project is handed over to the Insights and Reporting team upon successful completion. Work with business teams and the Data Lake technology team and lead the programs and data initiatives arising from new business needs, audits and regulatory requirements. Actively participate in new product initiatives and provide data requirements to be implemented for NPIs, and ensure that the same are implemented for appropriate data insights and analytics. The role is required to interact and collaborate with multiple functions, ensuring that their data requirements are correctly captured, analyzed and implemented along with new initiatives.
The person is required to ensure that data requirements are not missed in new initiatives and that a proper thought process is applied in coming up with data requirements, considering audits and regulatory reporting. Ensure technical support is provided to the Insights and Reporting team wherever required to meet data extraction and analysis requirements. The person is required to build a strong understanding of data processes across the card lifecycle, and of how and where the data is stored across multiple layers of data platforms. Collaborate with the senior leadership team, function heads and the BIU Program Management team to understand their data needs and deliver the same through the implementation of data initiatives and projects. He/She will be responsible for driving periodic meetings with business leaders to identify data projects and working closely with IT for their implementation. As a People Manager, the person is required to manage and lead the team, with up to 5 direct reportees. Measures of Success: Deliver data projects on time, within the approved budget, with no P1 defects in production. Technical Skills / Experience / Certifications: Good knowledge of SAS, Python, SQL and Tableau. Good understanding of ETL tools and processes. Good understanding of project management methodology. Competencies critical to the role: The person should have strong experience of delivering multiple programs and leading teams, preferably in the BFSI segment. Good knowledge of business processes and key business metrics to provide effective solutions. The person should have good knowledge of preparing high-, mid- and low-level project plans, and should be proficient with the Microsoft Project management tool. The person is required to lead cross-functional teams to drive data projects and execute data processing tasks. Should be a strong, inclusive team player who can collaborate with multiple teams and drive them towards achieving a common goal. Strong analytical skills - strong problem-solving skills, communicates in a clear and succinct manner and effectively evaluates information/data to make decisions; anticipates obstacles and develops plans to resolve them. Demonstrated customer focus - evaluates decisions through the eyes of the customer; builds strong relationships and creates processes which help with the timely availability of data to all stakeholders. Should have very good written and verbal communication skills. Qualification: B.E./MCA in Computer Science, or a Graduate or PG degree from a good institute. PMP Certification desired. Preferred Industry: BFSI.
Posted 1 month ago
6.0 - 11.0 years
6 - 10 Lacs
Pune
Work from Office
The future is our choice. At Atos, as the global leader in secure and decarbonized digital, our purpose is to help design the future of the information space. Together we bring the diversity of our people's skills and backgrounds to make the right choices with our clients, for our company and for our own futures. Roles & Responsibilities: Should have solid experience on Databricks, with 6+ years of overall experience. Requirements: Good experience with the medallion architecture. Very good knowledge of PySpark, Python and SQL. Lakeflow connectors, DLT, Delta Sharing. Data extraction from various source applications like SFDC, SAP, and cloud apps. Data integration from Databricks to various downstream systems. Good understanding and hands-on knowledge of data warehousing concepts. Our Offering: Global cutting-edge IT projects that shape the future of digital and have a positive impact on the environment. Wellbeing programs and work-life balance, with integration and passion-sharing events. Attractive salary and company initiative benefits. Courses and conferences. Hybrid work culture. Here at Atos, diversity and inclusion are embedded in our DNA. Read more about our commitment to a fair work environment for all. Atos is a recognized leader in its industry across Environment, Social and Governance (ESG) criteria. Find out more on our CSR commitment. Choose your future. Choose Atos.
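For readers unfamiliar with the medallion architecture mentioned above, here is a minimal PySpark sketch of the bronze/silver/gold pattern. Paths, schema and column names are invented for illustration; an actual Databricks pipeline would typically use Delta tables, Lakeflow connectors or DLT rather than plain parquet.

```python
# Minimal medallion-architecture sketch: bronze (raw) -> silver (cleaned) -> gold (aggregated).
# Paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-demo").getOrCreate()

# Bronze: land raw source data as-is (e.g. an SFDC or SAP extract).
bronze = spark.read.json("/tmp/raw/orders/")

# Silver: clean and conform - drop duplicates, cast types, filter bad records.
silver = (
    bronze.dropDuplicates(["order_id"])
          .withColumn("amount", F.col("amount").cast("double"))
          .filter(F.col("amount") > 0)
)

# Gold: business-level aggregate ready for downstream consumers.
gold = silver.groupBy("customer_id").agg(
    F.sum("amount").alias("total_spend"),
    F.count("order_id").alias("order_count"),
)

gold.write.mode("overwrite").parquet("/tmp/gold/customer_spend/")
```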
Posted 1 month ago
7.0 - 12.0 years
9 - 14 Lacs
Bengaluru
Work from Office
Amazon Web Services (AWS), the cloud product line of Amazon, is a pioneer in blazing new trails of cloud computing. AWS India is a high-growth, fast-moving group, where new and diverse challenges arise every day. This is an exciting opportunity to join one of the fastest growing divisions within Amazon. At AWS India, you will be surrounded by business partners who are exceptionally talented, bright, and driven and believe that world-class support is critical to customer success. The AWS India Finance team is looking for a Finance Manager to work closely with business leaders to drive the financial planning process and scaling mechanisms to support our India business, and to develop financial models to assist in evaluating new business concepts, turning insights into operational measurements. Our business partners are interested in understanding the value of our efforts and the efficiency of our field teams in working with customers. 1. Lead short and long-term financial planning; 2. Build financial models for new business concepts; 3. Present financial summaries and business insights to senior management; and 4. Tighten internal controls for program spending and other investments. 1. BA/BS degree in finance, business, economics, or a related field 2. 7+ years of relevant finance experience with increasing levels of responsibility 3. Analytical, financial modeling, and reporting skills required 1. An MBA or MS in Finance, Economics, or a related field 2. Demonstrated ability to work independently and self-motivate in a fast-paced and rapidly-changing environment 3. Proven ability to meet tight deadlines and prioritize workload 4. Demonstrated ability to influence decisions through effective verbal and written communication, logical reasoning and the presentation of alternatives 5. Passion for diving into the details of productivity metrics and cost measurement including ROI and customer lifetime value analysis 6. Exceptional business judgment capable of driving a diverse organization to the right results with a focused, pragmatic approach 7. Experience with TM1 and Oracle Financials or similar tools 8. Advanced Excel skills 9. Experience with data extraction using SQL or similar tools
Posted 1 month ago
1.0 - 3.0 years
3 - 5 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
As a Research Consultant on diverse projects, you will play a critical role on evidence synthesis projects, gaining experience in leading projects as well as strengthening your current skill set and experience in content delivery. Collaborating closely with Cytel's leadership team, you will have the opportunity to deliver exceptional services, conducting literature reviews using Cytel's cutting-edge products including LiveSLR and LiveNMA. An advanced degree in life sciences, health economics, or a clinical discipline is preferred (e.g., PhD, PharmD, MPH, MS). 1 to 3 years of experience in conducting SLRs, and 3-5 years within the field of health economics and outcomes research (HEOR) and/or consultancy. Proficiency in English, including reading, writing, and oral communication. Ability to absorb and synthesize a broad range of information, including clinical and scientific data. Working knowledge of pharmaceutical product development and HEOR. Methodological, disciplined, well-organized, and professional approach. Strong ability to work independently and as part of a team. Excellent attention to detail and efficiency in managing large databases of studies and data extraction. Strong multitasking and prioritization skills. Advanced Microsoft Office skills to present information in an engaging, clean, and concise manner. Join our team at Cytel, a dynamic and innovative company that values scientific excellence, collaboration, and personal growth. As a Research Consultant, you will have the opportunity to make a meaningful impact on evidence synthesis while working with cutting-edge SLR software products. We offer a supportive work environment that encourages innovation and professional development. Note: The candidate ideally lives in India and can travel to the office 2-3 days/week to meet with the team. Develop and execute SLR protocols for clinical efficacy and effectiveness, health-related quality of life (HRQoL), healthcare resource use (HCRU), costs, and economic evaluations. Conduct literature screening and data extraction. Perform quality checks on screening, data extraction, and data upload into our LiveSLR software. Interpret and summarize evidence, providing insights and updates to internal and external stakeholders. Can turn a proposal into a protocol and a protocol into a report, including table construction. Gain experience in managing internal and external project plans, ensuring timely delivery and effective communication. Whilst respecting the QC process, trends towards working independently on some tasks for his/her projects. Works on problems of moderate scope where analysis of the situation or data requires review of a variety of factors. Participates in at least one project from A to Z before the RC role and can juggle multiple projects (e.g., 4 mid-size projects). Knows SOPs inside out and has practice with applying these through projects. Communicates and collaborates actively with the PL and team members, builds stable working relationships with colleagues, and can present deliverables to clients.
Posted 1 month ago
8.0 - 13.0 years
35 - 40 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Description: ACCOUNTABILITIES: Designs, codes, tests, debugs and documents software according to Dell's systems quality standards, policies and procedures. Analyzes business needs and creates software solutions. Responsible for preparing design documentation. Prepares test data for unit, string and parallel testing. Evaluates and recommends software and hardware solutions to meet user needs. Resolves customer issues with software solutions and responds to suggestions for improvements and enhancements. Works with business and development teams to clarify requirements to ensure testability. Drafts, revises, and maintains test plans, test cases, and automated test scripts. Executes test procedures according to software requirements specifications. Logs defects and makes recommendations to address defects. Retests software corrections to ensure problems are resolved. Documents evolution of testing procedures for future replication. May conduct performance and scalability testing. RESPONSIBILITIES: Plans, conducts and leads assignments generally involving moderate to high budget projects or more than one project. Manages user expectations regarding appropriate milestones and deadlines. Assists in training, work assignment and checking of less experienced developers. Serves as technical consultant to leaders in the IT organization and functional user groups. Subject matter expert in one or more technical programming specialties; employs expertise as a generalist or a specialist. Performs estimation efforts on complex projects and tracks progress. Works on the highest level of problems where analysis of situations or data requires an in-depth evaluation of various factors. Documents, evaluates and researches test results; documents evolution of testing scripts for future replication. Identifies, recommends and implements changes to enhance the effectiveness of quality assurance strategies. Additional Details: Skills: Python, PySpark and SQL. 8+ years of experience in Spark, Scala, PySpark for big data processing. Proficiency in Python programming for data manipulation and analysis. Experience with Python libraries such as Pandas, NumPy. Knowledge of Spark architecture and components (RDDs, DataFrames, Spark SQL). Strong knowledge of SQL for querying databases. Experience with database systems like Lakehouse, PostgreSQL, Teradata, SQL Server. Ability to write complex SQL queries for data extraction and transformation. Strong analytical skills to interpret data and provide insights. Ability to troubleshoot and resolve data-related issues. Strong problem-solving skills to address data-related challenges. Effective communication skills to collaborate with cross-functional teams. Role/Responsibilities: Work on development activities along with lead activities. Coordinate with the Product Manager (PdM) and Development Architect (Dev Architect) and handle deliverables independently. Collaborate with other teams to understand data requirements and deliver solutions. Design, develop, and maintain scalable data pipelines using Python and PySpark. Utilize PySpark and Spark scripting for data processing and analysis. Implement ETL (Extract, Transform, Load) processes to ensure data is accurately processed and stored. Develop and maintain Power BI reports and dashboards. Optimize data pipelines for performance and reliability. Integrate data from various sources into centralized data repositories. Ensure data quality and consistency across different data sets.
Analyze large data sets to identify trends, patterns, and insights. Optimize PySpark applications for better performance and scalability. Continuously improve data processing workflows and infrastructure.
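As a rough illustration of the PySpark and Spark SQL skills listed above, the following is a small, self-contained ETL-style sketch; the data, view name and output path are made up.

```python
# Hedged sketch of a small PySpark job combining DataFrames and Spark SQL.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: a tiny in-memory DataFrame standing in for a source-system read.
sales = spark.createDataFrame(
    [("2024-01-01", "IN", 120.0), ("2024-01-01", "US", 340.5), ("2024-01-02", "IN", 95.0)],
    ["order_date", "country", "amount"],
)

# Transform: register as a view and express the aggregation in Spark SQL.
sales.createOrReplaceTempView("sales")
daily = spark.sql("""
    SELECT order_date, country, SUM(amount) AS revenue
    FROM sales
    GROUP BY order_date, country
""")

# Load: write the result to a warehouse-friendly format.
daily.write.mode("overwrite").parquet("/tmp/curated/daily_revenue/")
```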
Posted 1 month ago
5.0 - 10.0 years
10 - 15 Lacs
Bengaluru
Work from Office
About us: As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values different voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful. Overview about Target in India: At Target, we have a timeless purpose and a proven strategy, and that hasn't happened by accident. Some of the best minds from different backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target's global team and has more than 4,000 team members supporting the company's global strategy and operations. About the Role: The Senior RBX Data Specialist role at Target in India involves the end-to-end management of data, encompassing building and maintaining pipelines through ETL/ELT and data modeling, ensuring data accuracy and system performance, and resolving data flow issues. It also requires analyzing data to generate insights, creating visualizations for stakeholders, automating processes for efficiency, and effective collaboration across both business and technical teams. You will also answer ad-hoc questions from your business users by conducting quick analysis on relevant data, identify trends and correlations, and form hypotheses to explain the observations. Some of this will lead to bigger projects of increased complexity, where you will have to work as part of a bigger team, but also independently execute specific tasks. Finally, you are expected to always adhere to the project schedule and technical rigor as well as requirements for documentation, code versioning, etc. Key Responsibilities: Data Pipeline and Maintenance: Monitor data pipelines and warehousing systems to ensure optimal health and performance. Ensure data integrity and accuracy throughout the data lifecycle. Incident Management and Resolution: Drive the resolution of data incidents and document their causes and fixes, collaborating with teams to prevent recurrence. Automation and Process Improvement: Identify and implement automation opportunities and DataOps best practices to enhance the efficiency, reliability, and scalability of data processes. Collaboration and Communication: Work closely with data teams and stakeholders to understand data pipeline architecture and dependencies, ensuring timely and accurate data delivery while effectively communicating data issues and participating in relevant discussions. Data Quality and Governance: Implement and enforce data quality standards, monitor metrics for improvement, and support data governance by ensuring policy compliance. Documentation and Reporting: Create and maintain clear and concise documentation of data pipelines, processes, and troubleshooting steps. Develop and generate reports on data operations performance and key metrics. Core responsibilities are described within this job description. Job duties may change at any time due to business needs. About You: B.Tech / B.E.
or equivalent (completed) degree 5+ years of relevant work experience Experience in Marketing/Customer/Loyalty/Retail analytics is preferable Exposure to A/B testing Familiarity with big data technologies, data languages and visualization tools Exposure to languages such as Python and R for data analysis and modelling Proficiency in SQL for data extraction, manipulation, and analysis, with experience in big data query frameworks such as Hive, Presto, SQL, or BigQuery Solid foundation knowledge in mathematics, statistics, and predictive modelling techniques, including Linear Regression, Logistic Regression, time-series models, and classification techniques. Ability to simplify complex technical and analytical methodologies for easier comprehension for broad audiences. Ability to identify process and tool improvements and implement change Excellent written and verbal English communication skills for Global working Motivation to initiate, build and maintain global partnerships Ability to function in group and/or individual settings. Willing and able to work from our office location (Bangalore HQ) as required by business needs and brand initiatives Useful Links- Life at Target- https://india.target.com/ Benefits- https://india.target.com/life-at-target/workplace/benefits Culture- https://india.target.com/life-at-target/belonging
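To make the SQL extraction requirement above concrete, here is a tiny, self-contained example. sqlite3 merely stands in for Hive, Presto or BigQuery so the snippet runs locally; the orders table and its columns are hypothetical.

```python
# Self-contained sketch of SQL-based extraction and trend aggregation.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, order_date TEXT, store TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, '2024-06-01', 'BLR', 250.0),
        (2, '2024-06-01', 'MUM', 410.0),
        (3, '2024-06-02', 'BLR', 180.0);
""")

# Typical extraction query: daily revenue per store, the kind of cut used for trend checks.
rows = conn.execute("""
    SELECT order_date, store, SUM(amount) AS revenue, COUNT(*) AS orders
    FROM orders
    GROUP BY order_date, store
    ORDER BY order_date, store
""").fetchall()

for row in rows:
    print(row)
```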
Posted 1 month ago
4.0 - 5.0 years
15 - 19 Lacs
Mumbai
Work from Office
We're looking for a Senior AI Engineer to join our AI team, where you'll design and develop highly scalable, robust systems that drive our AI initiatives and data operations. Success in this role depends on strong software engineering skills, familiarity with large-scale distributed systems, and expertise in AI technologies. Our ideal candidate has a proven ability to build reliable platforms (rather than standalone applications) and to iterate effectively over multiple release cycles. While AI experience isn't a requirement, you should also be enthusiastic about software engineering and about building scalable platforms and cloud services. A significant aspect of the job involves collaborating with cross-functional teams to translate complex requirements into scalable, efficient code, with responsibilities including the implementation and maintenance of software infrastructure for our AI platform. You'll regularly work alongside AI specialists, domain leads, and other engineering teams. Throughout your work, you'll apply best practices in software engineering and system architecture. If you're passionate about delivering high-impact AI solutions in a dynamic environment, this is an exciting opportunity to have a substantial influence on our AI capabilities through your expertise.
Your Key Responsibilities: Design and implement scalable distributed systems. Architect solutions that can handle large volumes of data for real-time and batch processing. Design and develop efficient AI pipelines with automation and reliability across the platform. Integrate agentic workflows and AI agents into data extraction processes, enabling systems to perform multi-step reasoning and tool usage to improve the accuracy and efficiency of data extraction. Deploy, monitor, and maintain the LLM-based extraction systems in production, ensuring reliability and scalability. Set up appropriate monitoring, logging, and evaluation metrics to track performance, and perform continual tuning and improvements based on human-in-the-loop feedback. Conduct applied research and experimentation with the latest generative AI models and techniques to enhance extraction capabilities. Prototype new approaches and iterate quickly to integrate successful methods into the production pipeline. Collaborate with cross-functional teams (data engineers, product managers, domain experts) to gather requirements and align AI solutions with business needs.
Your skills and experience that will help you excel: 4-5 years of experience in applied AI or machine learning engineering, with a track record of building and deploying AI solutions (especially in NLP). Hands-on experience using Generative AI models and APIs/frameworks (e.g., OpenAI GPT-4, Google Gemini). Ability to build agentic AI systems where LLMs interact with tools or perform multi-step workflows. Proficiency in Python (preferred) and experience deploying machine learning models or pipelines at scale. Good understanding of embeddings and LLMs, and experience with retrieval-augmented generation (RAG) workflows to incorporate external knowledge into LLM-based systems. Knowledge of LLMOps and cloud services (Azure, GCP, or similar) for deploying and managing AI solutions. Experience with containerization, orchestration, and monitoring of ML models in a production cloud environment. Excellent collaboration and communication skills, with the ability to work effectively in a team, translate complex technical concepts to non-technical stakeholders, and document work clearly.
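A minimal sketch of the retrieval-augmented generation (RAG) workflow mentioned above, under stated assumptions: embed() is a deterministic random stand-in for a real embedding model (used only so the example runs offline), and the documents are invented.

```python
# Minimal RAG sketch: embed, retrieve top-k, build a grounded prompt.
import numpy as np

def embed(text: str, dim: int = 64) -> np.ndarray:
    """Toy embedding: a deterministic pseudo-random unit vector per text (not a real model)."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.normal(size=dim)
    return v / np.linalg.norm(v)

documents = [
    "Invoice 1042 totals USD 3,200 and is due on 2024-09-30.",
    "The vendor master lists Acme Corp under supplier code 7781.",
    "Quarterly revenue grew 8% driven by the APAC segment.",
]
doc_vectors = np.stack([embed(d) for d in documents])

def retrieve(query: str, k: int = 2) -> list[str]:
    scores = doc_vectors @ embed(query)          # cosine similarity (vectors are unit-norm)
    top = np.argsort(scores)[::-1][:k]
    return [documents[i] for i in top]

query = "What is the total amount of invoice 1042?"
context = "\n".join(retrieve(query))
prompt = f"Answer using only the context below.\n\nContext:\n{context}\n\nQuestion: {query}"
print(prompt)  # in a real pipeline this prompt would be sent to the LLM
```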
About MSCI. What we offer you: Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing. Flexible working arrangements, advanced technology, and collaborative workspaces. A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results. A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients. Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development. Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles. We actively nurture an environment that builds a sense of inclusion, belonging and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.
Posted 1 month ago
4.0 - 6.0 years
7 - 11 Lacs
Pune
Work from Office
Job Information: Job Opening ID ZR_2109_JOB. Date Opened 09/02/2024. Industry: Technology. Work Experience: 4-6 years. Job Title: CXO Analyst. City: Pune. Province: Maharashtra. Country: India. Postal Code: 411001. Number of Positions: 4. Location: Bangalore, Mumbai, Pune, Chennai, Hyderabad, Kolkata, Delhi. Technical Skills: Data strategist with expertise in Product Growth, Conversion Rate Optimization (experiments) and Personalisation, using digital analytics skills such as Adobe Analytics, Google Analytics, and others (quantitative and qualitative). Proven success in acquisition marketing and retention marketing by leveraging strategic and tactical implementation of Conversion Rate Optimization (A/B testing) and Product Optimization (landing page optimization, product-market fit) through data insights. Personalisation includes customer segmentation, targeted messaging, dynamic content, recommendations, behavioral targeting, and AI-powered personalization. Skilled in leveraging advanced analytics tools for actionable data extraction from extensive datasets, such as SQL, BigQuery, Excel, Python for data analysis, Power BI, and Data Studio. Proficient in implementing digital analytics measurement across diverse domains using tools such as Adobe Launch, Google Tag Manager, and Tealium. Soft skills: Experience in client-facing projects and stakeholder management, with excellent communication skills. Collaborative team player, aligning product vision with business objectives cross-functionally. Avid learner, staying current with industry trends and emerging technologies. Committed to delivering measurable results through creative thinking, exceeding performance metrics, and fostering growth.
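Since the role centres on A/B testing for conversion-rate optimization, here is a worked example of the underlying arithmetic: a two-sided two-proportion z-test on made-up conversion counts, using only the standard library.

```python
# Illustrative A/B (conversion-rate) significance check with invented numbers.
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_a, p_b, z, p_value

p_a, p_b, z, p = two_proportion_z(conv_a=480, n_a=10_000, conv_b=540, n_b=10_000)
print(f"control={p_a:.2%} variant={p_b:.2%} z={z:.2f} p={p:.4f}")
# A p-value below 0.05 would suggest the landing-page variant's lift is unlikely to be noise.
```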
Posted 1 month ago
9.0 - 12.0 years
10 - 15 Lacs
Noida
Hybrid
This person will be responsible for producing reports and dashboards which will enable the leadership to take informed decisions by analysing the data. Developing visual reports, KPI scorecards, and dashboards using Power BI desktop. Required Candidate profile Connecting data sources, importing data, and transforming data for Business intelligence. Analytical thinking for translating data into informative reports and visuals.
Posted 1 month ago
4.0 - 6.0 years
5 - 6 Lacs
Pune
Work from Office
Major responsibilities: Data extraction and data updating in SAP. Create 3D models & 2D drawings in UG NX for the required tasks. Ensuring completeness and correctness of data as per the provided guidelines and checklists. Maintaining quality and schedule/budget guidelines for given assignments. Requirement profile: Formal education: Diploma in Mechanical Engineering. Professional experience (in years): At least 1 year. Defined competencies: Good data preparation and product data upkeep skills. Knowledge of MS Excel and PowerPoint presentations. Good communication skills. UG NX knowledge is desirable.
Posted 1 month ago
3.0 - 8.0 years
12 - 22 Lacs
Pune, Bengaluru
Work from Office
Gen AI+AWS Skill-Gen AI,AWS,AI Platform,Python,Tensorflow,Langchain,LLM Model,ML Development, AWS Lambda,AWS Bedrock,Data Extraction Exp-3-9YRS In Gen AI AWS PKG Upto-25LPA Loc- Bang, Pune NP-Imm-30Days Ritika-8587970773 ritikab.imaginators@gmail.com Required Candidate profile Gen AI + AWS Mandatory Skill-Gen AI, AWS,AI Platform, Python, Tensor flow, Lang chain, LLM Model, ML Development, AWS Lambda, AWS Bedrock, AWS AI, Data Extraction, GEN AI Solution, RPA Tool
Posted 1 month ago
4.0 - 9.0 years
12 - 22 Lacs
Pune, Bengaluru
Work from Office
Gen AI+AWS Skill-Gen AI,AWS,AI Platform,Python,Tensorflow,Langchain,LLM Model,ML Development, AWS Lambda,AWS Bedrock,Data Extraction Exp-4-12YRS In Gen AI AWS PKG Upto-25LPA Loc-Bang, Pune NP-Imm-30Days Ritika-8587970773 ritikab.imaginators@gmail.com Required Candidate profile Gen AI + AWS Mandatory Skill-Gen AI, AWS,AI Platform, Python, Tensor flow, Lang chain, LLM Model, ML Development, AWS Lambda, AWS Bedrock, AWS AI, Data Extraction, GEN AI Solution, RPA Tool
Posted 1 month ago
0.0 - 1.0 years
1 - 2 Lacs
Ahmedabad
Work from Office
Apprenticeship Duration: 12 months. Apprentice Requirements: Candidates must not be already registered on the NAPS (National Apprenticeship Promotion Scheme) portal as apprentices. Candidates should not have an active UAN/PF account. Our Apprenticeship Program: The apprenticeship is a skill training program governed by The Apprenticeship Act, 1961 (India), where individuals have the chance to partner with our industry experts, with on-the-job experience focusing on high-impact work that allows you to apply and develop your skills. Networking, business insights and tailored learning opportunities all support your growth and development, setting you up for success as you begin your career. Only candidates with no prior experience as an apprentice can apply for the role. S&P Global is looking to hire young fresh graduates and postgraduates, specifically 2024 and 2025 pass-outs, for a 12-month apprenticeship program. The Team: The Apprentice will be part of a team that works on various research reports and company documents to collect information and generate meaningful consensus from the collected data. This effort is coupled with real-time monitoring of global industry trade publications and websites/news aggregators. Different clusters support business lines like Security Management, People Data, Fundamentals (Industry, General Fundamentals), Estimates, Market Data, Filings & Sourcing, Translations and many more. People Data Team: This team is focused on collecting Officers & Directors data from company-filed documents and websites, which is then published on our product. Engage in collecting and updating People Data for public and private companies worldwide, following established data collection procedures. Gain insights into various management structures across different companies and countries. Security Management Team: The Security Management team aims to create an integrated risk management framework by interconnecting security reference identifiers. We work on product-focused integration processes aimed at building a cross-referenced framework which helps create value for our clients. The team deals in ingestion of global reference identifiers from multiple vendors daily, along with running multiple checks on the database for various metrics. The team is also involved in resolving client issues with real-time data corrections through vendor interactions. A section of the team is responsible for the quality of the database by following a sigma approach, making the database better than the industry standard. Public Ownership Team: The ownership team's main mission is to provide our clients, through our platform, with the most accurate data in a timely manner. Our goal is to study the markets' legislation and the different public financial data sources that will allow us, together with the tech team, to build repeatable processes and deliver new solutions to the market. Success is measured through our capacity to find new ways to broaden our data coverage, as well as enhancing the collection procedures.
Responsibilities and Impact: High-quality data (financial and non-financial) collation, analysis, extraction and entry into work tools as per guideline specifications for the assigned vertical. Understand the working of the dataset, be aware of the workflows and have strong working knowledge of the work tools. Provide input and ideas for new collection methods and product enhancements related to the dataset. Deliver on predefined individual and team targets, including delivering outcomes with quality and excellence. Create tech expertise within the department. Troubleshoot problems or issues and support the team in enhancing the workflow/processes for the department. Review feedback involving transaction content to help correct errors and establish or refine procedures and processes to improve accuracy. Basic Required Qualifications: Fresher - BBA/B.Com graduated in 2024/2025, OR PGDM/MBA specializing in Finance graduating in 2024/2025. Knowledge of corporate finance and accountancy, i.e., financial statements and annual reports, is preferred. Excellent communication skills, both written and oral. Willing to work in a 24x5 environment on rotational shifts. Hybrid work environment, requiring apprentices to work from the base location for 3-4 days a week or in-office as per business requirements.
Posted 1 month ago
3.0 - 7.0 years
15 - 30 Lacs
Bengaluru
Hybrid
What lands you in this role: Prepare and transform datasets through thorough data wrangling, cleansing, and preprocessing to enable high-quality analysis. Build and maintain efficient data ingestion pipelines to seamlessly integrate data from multiple sources. Conduct deep exploratory data analysis (EDA) to identify trends, patterns, and actionable insights. Leverage Python and SQL extensively for data manipulation, querying, and deploying analytical models. Use Excel and BI tools (Tableau, Power BI, etc.) for dashboards and data analysis. Translate complex model outputs into clear, actionable insights for both technical and business stakeholders. Partner closely with data engineers, analysts, and business teams to embed analytical solutions into existing workflows. Contribute to the development and optimization of statistical models and machine learning algorithms for targeted business problems. Support the design and validation of forecasting models to anticipate future business outcomes. Keep abreast of emerging techniques and innovations in data science, machine learning, and statistical modeling. Required Skills & Qualifications: 3 to 6 years of experience in business analysis, statistics, data cleaning and transformation, and related fields. Good proficiency in statistical modeling and hypothesis testing. Hands-on experience with Python (NumPy, Pandas, Scikit-learn, etc.). Advanced SQL skills for querying and manipulating large datasets. Experience in Excel and creating dashboards is a plus. Experience in data wrangling, preprocessing, and feature engineering. Knowledge of data ingestion techniques and pipeline development. Ability to interpret predictive models, classification, regression, and clustering algorithms. Strong analytical and problem-solving skills with a keen eye for detail. Excellent communication and storytelling skills to present insights effectively.
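An illustrative snippet of the data wrangling, cleansing and feature-engineering work described above, using pandas on an invented toy dataset.

```python
# Sketch of cleansing + simple feature engineering with pandas; columns are hypothetical.
import pandas as pd
import numpy as np

df = pd.DataFrame({
    "order_id": [1, 2, 2, 3, 4],
    "segment": ["Retail", "Retail", "Retail", "Enterprise", None],
    "revenue": [120.0, 340.5, 340.5, np.nan, 95.0],
    "order_date": ["2024-06-01", "2024-06-01", "2024-06-01", "2024-06-02", "2024-06-03"],
})

# Cleansing: drop exact duplicates, fill missing values, enforce types.
clean = (
    df.drop_duplicates()
      .assign(
          segment=lambda d: d["segment"].fillna("Unknown"),
          revenue=lambda d: d["revenue"].fillna(d["revenue"].median()),
          order_date=lambda d: pd.to_datetime(d["order_date"]),
      )
)

# Feature engineering + exploratory cut: revenue by segment and day of week.
clean["day_of_week"] = clean["order_date"].dt.day_name()
summary = clean.groupby(["segment", "day_of_week"])["revenue"].agg(["count", "mean", "sum"])
print(summary)
```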
Posted 1 month ago
4.0 - 9.0 years
6 - 11 Lacs
Bengaluru
Work from Office
About the Role: Grade Level (for internal use): 11. S&P Global Mobility. The Role: Senior Data Engineer (AWS Cloud, Python). We are seeking a Senior Data Engineer with deep expertise in AWS Cloud Development to join our fast-paced data engineering organization. This role is critical to both the development of new data products and the modernization of existing platforms. The ideal candidate is a seasoned data engineer with hands-on experience designing, building, and optimizing large-scale data pipelines and architectures in both on-premises (e.g., Oracle) and cloud environments (especially AWS). This individual will also serve as a Cloud Development expert, mentoring and guiding other data engineers as they enhance their cloud skillsets. Responsibilities: Data Engineering & Architecture: Design, build, and maintain scalable data pipelines and data products. Develop and optimize ELT/ETL processes using a variety of data tools and technologies. Support and evolve data models that drive operational and analytical workloads. Modernize legacy Oracle-based systems and migrate workloads to cloud-native platforms. Cloud Development & DevOps (AWS-Focused): Build, deploy, and manage cloud-native data solutions using AWS services (e.g., S3, Lambda, Glue, EMR, Redshift, Athena, Step Functions). Implement CI/CD pipelines, IaC (e.g., Terraform or CloudFormation), and monitor cloud infrastructure for performance and cost optimization. Ensure data platform security, scalability, and resilience in the AWS cloud. Technical Leadership & Mentoring: Act as a subject matter expert on cloud-based data development and DevOps best practices. Mentor data engineers on AWS architecture, infrastructure as code, and cloud-first design patterns. Participate in code and architecture reviews, enforcing best practices and high-quality standards. Cross-functional Collaboration: Work closely with product managers, data analysts, software engineers, and other stakeholders to understand business needs and deliver end-to-end solutions. Support and evolve the roadmap for data platform modernization and new product delivery. What we're looking for: Required Qualifications: 7+ years of experience in data engineering or an equivalent technical role. 5+ years of hands-on experience with AWS Cloud Development and DevOps. Strong expertise in SQL, data modeling, and ETL/ELT pipelines. Deep experience with Oracle (PL/SQL, performance tuning, data extraction). Proficiency in Python and/or Scala for data processing tasks. Strong knowledge of cloud infrastructure (networking, security, cost optimization). Experience with infrastructure as code (Terraform). Familiarity with CI/CD pipelines and DevOps tooling (e.g., Jenkins, GitHub Actions). Preferred (Nice to Have): Experience with Google Cloud Platform (GCP), Snowflake. Knowledge of containerization and orchestration tools (e.g., Docker, Kubernetes). Experience with modern orchestration tools (e.g., Airflow, dbt). Exposure to data cataloging, governance, and quality tools.
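A hedged sketch of one cloud-native step named in the posting (querying S3-resident data via Athena with boto3). The region, database, bucket and SQL are placeholders, and running it requires AWS credentials and the referenced resources to exist; production pipelines would typically drive this from Step Functions or Airflow rather than a polling loop.

```python
# Hedged sketch: submit an Athena query over data in S3 and wait for the result state.
import time
import boto3

athena = boto3.client("athena", region_name="ap-south-1")

resp = athena.start_query_execution(
    QueryString="SELECT customer_id, SUM(amount) AS spend FROM orders GROUP BY customer_id",
    QueryExecutionContext={"Database": "analytics_db"},            # placeholder database
    ResultConfiguration={"OutputLocation": "s3://example-bucket/athena-results/"},
)
query_id = resp["QueryExecutionId"]

# Simple polling loop for illustration only.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

print(f"Athena query {query_id} finished with state {state}")
```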
Posted 1 month ago
5.0 - 10.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer. Project Role Description: Design, build and configure applications to meet business process and application requirements. Must have skills: Workday Data Mapping & Conversions. Good to have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with teams to develop solutions and ensure applications align with business needs. Roles & Responsibilities: Expected to be an SME. Collaborate and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute on key decisions. Provide solutions to problems for their immediate team and across multiple teams. Lead data mapping and conversion activities. Develop and maintain data mapping documentation. Ensure data integrity and accuracy in conversions. Professional & Technical Skills: Must To Have Skills: Proficiency in Workday Data Mapping & Conversions. Strong understanding of data integration concepts. Experience with ETL tools for data transformation. Knowledge of Workday HCM modules. Hands-on experience with Workday Studio. Ability to troubleshoot data mapping issues. Additional Information: The candidate should have a minimum of 5 years of experience in Workday Data Mapping & Conversions. This position is based at our Bengaluru office. A 15 years full-time education is required. Qualification: 15 years full time education.
Posted 2 months ago
7.0 - 12.0 years
9 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Developer. Project Role Description: Design, build and configure applications to meet business process and application requirements. Must have skills: Workday Data Mapping & Conversions. Good to have skills: No Function Specialty. Minimum 7.5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As an Application Developer, you will be involved in designing, building, and configuring applications to meet business process and application requirements. Your typical day will revolve around creating solutions that align with business needs and application specifications. Roles & Responsibilities: Expected to be an SME. Collaborate and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute on key decisions. Provide solutions to problems for their immediate team and across multiple teams. Lead data mapping and conversion projects effectively. Develop and maintain data mapping documentation. Ensure data integrity and accuracy throughout the conversion process. Professional & Technical Skills: Must To Have Skills: Proficiency in Workday Data Mapping & Conversions. Strong understanding of data integration concepts. Experience with ETL tools for data transformation. Knowledge of Workday HCM and Financials modules. Hands-on experience in data migration projects. Additional Information: The candidate should have a minimum of 7.5 years of experience in Workday Data Mapping & Conversions. This position is based at our Bengaluru office. A 15 years full time education is required. Qualifications: 15 years full time education.
Posted 2 months ago
5.0 - 7.0 years
20 - 25 Lacs
Gurugram
Work from Office
This is a techno-functional position that combines strong technical skills with a deep understanding of business needs and requirements, with 5-7 years of experience. The role focuses on developing and maintaining advanced data engineering solutions for pre-sales value quantification within the Services business unit. As a Sr. Analyst, you will be responsible for creating and optimizing data pipelines, managing large datasets, and ensuring the integrity and accessibility of data to support Mastercard's internal teams in quantifying the value of services, enhancing customer engagement, and driving business outcomes. The role requires close collaboration across teams to ensure data solutions meet business needs and deliver measurable impact. Role Responsibilities: Data Engineering & Pipeline Development: Develop and maintain robust data pipelines to support the value quantification process. Utilize tools such as Apache NiFi, Azure Data Factory, Pentaho, Talend, SSIS, and Alteryx to ensure efficient data integration and transformation. Data Management and Analysis: Manage and analyze large datasets using SQL, Hadoop, and other database management systems. Perform data extraction, transformation, and loading (ETL) to support value quantification efforts. Advanced Analytics Integration: Use advanced analytics techniques, including machine learning algorithms, to enhance data processing and generate actionable insights. Leverage programming languages such as Python (Pandas, NumPy, PySpark) and Impala for data analysis and model development. Business Intelligence and Reporting: Utilize business intelligence platforms such as Tableau and Power BI to create insightful dashboards and reports that communicate the value of services. Generate actionable insights from data to inform strategic decisions and provide clear, data-backed recommendations. Cross-Functional Collaboration & Stakeholder Engagement: Collaborate with Sales, Marketing, Consulting, Product, and other internal teams to understand business needs and ensure successful data solution development and deployment. Communicate insights and data value through compelling presentations and dashboards to senior leadership and internal teams, ensuring tool adoption and usage. All About You: Data Engineering Expertise: Proficiency in data engineering tools and techniques to develop and maintain data pipelines. Experience with data integration tools such as Apache NiFi, Azure Data Factory, Pentaho, Talend, SSIS, and Alteryx. Advanced SQL Skills: Strong skills in SQL for querying and managing large datasets. Experience with database management systems and data warehousing solutions. Programming Proficiency: Knowledge of programming languages such as Python (Pandas, NumPy, PySpark) and Impala for data analysis and model development. Business Intelligence and Reporting: Experience in creating insightful dashboards and reports using business intelligence platforms such as Tableau and Power BI. Statistical Analysis: Ability to perform statistical analysis to identify trends, correlations, and insights that support strategic decision-making. Cross-Functional Collaboration: Strong collaboration skills to work effectively with Sales, Marketing, Consulting, Product, and other internal teams to understand business needs and ensure successful data solution development and deployment.
Communication and Presentation: Excellent communication skills to convey insights and data value through compelling presentations and dashboards to senior leadership and internal teams. Execution Focus: A results-driven mindset with the ability to balance strategic vision with tactical execution, ensuring that data solutions are delivered on time and create measurable business value. Education: Bachelor's degree in Data Science, Computer Science, Business Analytics, Economics, Finance, or a related field. Advanced degrees or certifications in analytics, data science, AI/ML, or an MBA are preferred.
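A skeleton of the kind of extract-transform-load routine described above, with basic logging; the source data, the uplift metric and the output path are hypothetical stand-ins for the real pre-sales value-quantification inputs.

```python
# Illustrative extract-transform-load skeleton with logging; data and paths are invented.
import logging
import pandas as pd

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("value_pipeline")

def extract() -> pd.DataFrame:
    # Stand-in for a source-system pull (database query, API call, or file drop).
    return pd.DataFrame({
        "account": ["A1", "A2", "A3"],
        "baseline": [100.0, 250.0, 80.0],
        "uplift_pct": [0.12, 0.08, 0.20],
    })

def transform(df: pd.DataFrame) -> pd.DataFrame:
    df = df.copy()
    df["estimated_value"] = df["baseline"] * df["uplift_pct"]   # simple value-quantification metric
    return df.sort_values("estimated_value", ascending=False)

def load(df: pd.DataFrame, path: str = "/tmp/value_estimates.csv") -> None:
    df.to_csv(path, index=False)
    log.info("wrote %d rows to %s", len(df), path)

if __name__ == "__main__":
    log.info("starting pipeline run")
    load(transform(extract()))
```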
Posted 2 months ago
4.0 - 9.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Person should have 4+ years of working experience in SQL and Power BI. Should have exposure to end-to-end development from data extraction to dashboard development. Should have strong SQL skills along with Power BI. Knowledge of DWH is an added advantage. Understand business requirements. In-depth knowledge of creating DAX queries, filters, what-if analysis, performance tuning, and building different types of visualizations. Required Qualifications and Experience: 4-5+ years of relevant work experience in SQL and Power BI. Bachelor's/BSc/MSc.
Posted 2 months ago
1.0 - 5.0 years
3 - 6 Lacs
Gurugram
Work from Office
Title: Executive/Senior Executive, Control Tower. About the Team & Role: The Control Tower team works towards operational efficiency and customer satisfaction. It provides real-time and end-to-end visibility across the supply chain. This team works closely with stakeholders to fulfil customer requirements with more efficiency. This role will report to the Assistant Manager of the department. To succeed in this role, you should have the following: Experience in working on high volumes of data. Expertise in MS Office and advanced MS Excel skills. You will be responsible for: Data Accuracy and Integrity: Monitor the accuracy and integrity of data within systems, ensuring that information used for decision-making and reporting is reliable and consistent. Monitoring Operations: Monitor the flow of orders, information, and performance across all businesses. This involves keeping track of inventory levels, logistics performance, and production schedules. Stakeholder Coordination: Coordinate activities among different departments within the organization to ensure seamless operations and timely delivery of products. Keep working on RCAs closely with the respective stakeholders and identify the issues causing the escalations. Monitor and track all input KPIs like inventory, pendency and order flow, and give timely insight if any KPI is not on track for all LKs factories. Coordinate with the Tech and Analytics teams for all plant requirements and timely resolution of issues as per TATs. Insights and Recommendations: Must have good skills to share insights and solutions to complex problems.
Posted 2 months ago
4.0 - 6.0 years
10 - 20 Lacs
Jaipur, Bengaluru
Work from Office
Responsibilities: Implement and optimize AI/ML models and algorithms with a focus on Agentic AI and Retrieval-Augmented Generation (RAG). Implement scalable machine learning models and algorithms to solve complex challenges. Research and integrate the latest AI tools and frameworks into existing systems. Stay up-to-date with advancements in AI/ML research and apply them to practical problems. Experiment with and implement Generative AI technologies across multiple domains. Leverage cloud platforms like Google Cloud, AWS, and Azure for scalable AI/ML deployments. Utilize open-source large language models (LLMs) such as Llama 3 in AI application development. Key Skills: Strong understanding of Agentic AI concepts and applications. Hands-on experience with Retrieval-Augmented Generation (RAG) models. Expertise in machine learning and deep learning frameworks (e.g., TensorFlow, PyTorch). Experience with large language models (e.g., GPT-4, Llama, Amazon Bedrock) and integrating them into production systems. Proficiency in programming languages such as Python, and familiarity with libraries like Hugging Face Transformers, LangChain and LlamaIndex. Experience in building and deploying AI/ML solutions using cloud platforms (AWS, GCP, Azure). Knowledge of data retrieval, natural language processing (NLP), and reinforcement learning techniques. Strong problem-solving skills and ability to work in a fast-paced, collaborative environment. Experience with model optimization, fine-tuning, and inference strategies to improve system performance. Preferred Skills: Experience with advanced NLP tasks like text generation, summarization, and question answering; familiarity with reinforcement learning algorithms and their application in intelligent agent design. Familiarity with containerization technologies (e.g., Docker, Kubernetes). Background in computer vision and image generation techniques. If you're passionate about AI/ML and excited about contributing to the future of intelligent systems with Generative AI technologies, we encourage you to apply! Join us in shaping the next generation of AI-driven innovation.
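As a small, hedged illustration of using an open-source LLM via Hugging Face Transformers for grounded question answering: distilgpt2 is chosen only because it is tiny and publicly available; a production system would substitute a stronger model such as Llama 3 (gated access, larger hardware) and a real retrieval step.

```python
# Hedged sketch: generate an answer from retrieved context with a small open model.
from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2")  # placeholder model for illustration

retrieved_context = "Policy P-17: refunds are processed within 5 business days of approval."
prompt = (
    "Answer the question using only the context.\n"
    f"Context: {retrieved_context}\n"
    "Question: How long do refunds take after approval?\nAnswer:"
)

output = generator(prompt, max_new_tokens=30, do_sample=False)
print(output[0]["generated_text"])
```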
Posted 2 months ago
1.0 - 5.0 years
3 - 7 Lacs
Pune
Work from Office
The future is our choice. At Atos, as the global leader in secure and decarbonized digital, our purpose is to help design the future of the information space. Together we bring the diversity of our people's skills and backgrounds to make the right choices with our clients, for our company and for our own futures. Roles & Responsibilities: Should have solid experience on Databricks, with 6+ years of overall experience. Requirements: Good experience with the medallion architecture. Very good knowledge of PySpark, Python and SQL. Lakeflow connectors, DLT, Delta Sharing. Data extraction from various source applications like SFDC, SAP, and cloud apps. Data integration from Databricks to various downstream systems. Good understanding and hands-on knowledge of data warehousing concepts. Our Offering: Global cutting-edge IT projects that shape the future of digital and have a positive impact on the environment. Wellbeing programs and work-life balance, with integration and passion-sharing events. Attractive salary and company initiative benefits. Courses and conferences. Hybrid work culture. Here at Atos, diversity and inclusion are embedded in our DNA. Read more about our commitment to a fair work environment for all. Atos is a recognized leader in its industry across Environment, Social and Governance (ESG) criteria. Find out more on our CSR commitment. Choose your future. Choose Atos.
Posted 2 months ago
3.0 - 5.0 years
5 - 7 Lacs
Faridabad
Work from Office
We are seeking a skilled Pyramid BI Analyst to join our dynamic team. In this role, you will leverage the Pyramid Analytics platform to transform complex data into actionable insights, driving informed business decisions. Your expertise will empower stakeholders across the organization to make data-driven choices with confidence. Responsibilities: Develop and maintain dashboards, reports, and data visualizations using Pyramid Analytics tools to support business decision-making. Identify trends, patterns, and anomalies in data to provide insights and recommendations. Administer and manage the Pyramid BI platform, ensuring optimal performance and availability. Monitor system performance, identify issues, and implement solutions to enhance efficiency. Design and implement data models that integrate data from sources like Salesforce, AWS, etc. Replicate reports and dashboards from Tableau CRM and other tools to Pyramid Analytics. Collaborate with stakeholders to understand their data needs and provide tailored reports/dashboards. Provide training and support to end-users on Pyramid BI functionalities. Stay updated with the latest Pyramid BI features and best practices. Requirements: Strong knowledge of Pyramid BI architecture, components, analytic models and functionalities. Familiarity with Pyramid's PYRANA query engine is a plus. Strong experience with Pyramid Analytics, including data preparation, business analytics, and data science modules. Proficiency in SQL for data extraction and manipulation. Experience with other BI tools like Power BI or Tableau is advantageous. Excellent understanding of Pyramid licenses, user types and access controls. Excellent problem-solving skills and attention to detail. Strong communication and interpersonal skills. Ability to work independently and as part of a team. Preferred Qualifications: 3+ years of experience in a business intelligence or data analytics role, with hands-on experience using Pyramid Analytics. Certification in Pyramid BI or related technologies. Experience with other BI tools (e.g., Power BI, Tableau). Knowledge of data warehousing concepts. Experience in the Automobile sector is a plus.
Posted 2 months ago
3.0 - 7.0 years
5 - 9 Lacs
Bengaluru
Work from Office
JD for SAP BODS. Key Responsibilities: Design, develop, and deploy ETL processes using SAP BODS for data extraction, transformation, and loading. Work closely with functional teams to understand business requirements and translate them into technical specifications. Perform data profiling, data quality checks, and transformations to ensure high data integrity. Optimize BODS jobs for performance, scalability, and reliability. Create and maintain technical documentation for ETL jobs and data flows. Support data migration activities from legacy systems to SAP or other target systems. Troubleshoot and resolve issues related to ETL jobs, data loads, and performance bottlenecks. Collaborate with other data engineers, BI developers, and stakeholders across various teams. Implement best practices for error handling, logging, and scheduling of jobs using SAP Management Console or equivalent tools. Work on version control and deployment strategies across development, QA, and production environments.
Posted 2 months ago