
9452 Extract Jobs - Page 7

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

0.0 - 8.0 years

0 Lacs

Rajasthan

On-site

Aditya Birla Finance Limited (ABFL) - Area Sales Head, SME EM
Location: Udaipur, Rajasthan

Key Result Areas:

Branch Sales & Market Penetration Management: Defines the branch strategy to drive sales and achieve targets in terms of product mix, customer segments, channel/distributor mix, and budgets for marketing and lead-generation initiatives. Sets business targets for self and for the RMs/SRMs in the team and works toward achieving them. Identifies business growth opportunities in the region in terms of channel partners and key customer segments and directs RMs/SRMs toward tapping these opportunities. Engages with channel partners, DSAs, and other distributors to understand submitted proposals and obtain critical supporting documentation. Increases branch revenues through a strong focus on cross-selling initiatives and innovative product mixes. Enables and drives contests and marketing campaigns to spread brand and product awareness and expand business volumes for the branch. Monitors and achieves the target book size through the above activities.

Branch Profitability Management: Ensures achievement of branch book size, revenue, NII, and PF targets from direct/channel sales by aligning sales actions with the branch business strategy. Maximizes profits by ensuring targeted fee income and effective cost management. Ensures branch budgets are adhered to and optimally utilized for maximum returns.

Distribution Expansion Operations: Monitors local market trends and competitive offerings and identifies opportunities for distribution expansion for the branch. Engages regularly with key channel partners and develops consistent touch points with them to enable quicker and better customer connectivity. Devises a strategy to enable branch channel partners through knowledge sharing via engagement programs and sales training, in order to build long-term partnerships and capabilities. Monitors SLAs, sales efficiencies, and ROI of channels. Effectively deploys schemes and prioritizes sales of high-revenue products and structures through the distribution network.

Branch Customer Servicing & Relationship Management: Monitors client-servicing metrics for the branch and sets standards for them. Mentors and develops RMs/SRMs to achieve client centricity in their interactions. Supports RMs/SRMs in moving exception cases through the credit risk team within regulatory and compliance guidelines. Identifies and implements market best practices for enhancing operational efficiency, productivity, and customer satisfaction across branch operations. Enables RMs/SRMs to develop strong client relationships in order to carry out pre-sanction due diligence and post-sanction surveillance from a de-risking perspective.

Branch Sales Operations & Internal Compliance: Drives faster TATs on deal closures, tighter due diligence, and compliant operations to improve branch operations metrics. Shares policy inputs and updates based on market intelligence of the region with relevant internal stakeholders. Recommends process changes and improvements to enhance operational efficiencies and strengthen process controls. Supports branch audit activities and addresses observations, if any, with appropriate urgency. Acts as the point of escalation on delinquent cases and potential NPAs and closely monitors collection of dues through the team.

Branch Sales MIS & Reporting: Ensures all branch sales metrics are shared and reported in a timely, accurate, and compliant manner to the RSM/ARSM and the Business Analyst - Mortgages. Monitors discrepancies and variances in reporting and ensures they are corrected and reconciled with actual target-achievement numbers. Leverages sales MIS to track branch progress against targets on book size, NII, PF, and overall P&L.

People Management: Evaluates branch manpower plans and ensures effective retention through performance-linked incentive structures. Oversees sourcing, recruitment, onboarding, and capability development of team members to drive productivity. Guides RMs/SRMs toward better customer acquisition and retention and helps them achieve superior outcomes by setting performance standards. Trains RMs/SRMs on product structuring and business finance to enable greater customer connect and increase their credibility as financial advisors and representatives. Ensures optimal work allocation within the branch team and drives accountability for results.

Job Purpose of Direct Reports:

RM: Effectively contributes toward building the asset book of the Mortgages Division by marketing and selling targeted Home Loan products and solutions to potential and existing customers at targeted yields. Ensures client and channel acquisition through effective networking, local area programs, a direct builder network, and cross-selling through group systems (ABMM/ABG group companies).

SRM: Effectively contributes toward building the Mortgages line of business and loan book by marketing and selling all products (LAP/LRD/HL) and solutions to potential and existing customers at targeted yields and fees through a strong distribution network. Strengthens the distribution network through the identification, empanelment, and activation of capable DSAs via knowledge sharing and capability building. Ensures client and channel acquisition through effective networking, local area programs, a direct builder network, and cross-selling through the group (ABG group companies). Establishes and nurtures strong customer relationships through effective customer-management measures and techniques. Leverages an understanding of local markets and preferences to facilitate loan structuring accordingly. Ensures all necessary due diligence is conducted to prevent fraudulent loans and that all sales processes comply with internal and regulatory guidelines.

Key Relationships:

Internal (daily): RSM/ARSM (new client development, deal closures, market trend analysis, new market potential); Credit Risk team (loan proposals, documentation execution, loan sanctions); Operations team (timely disbursements, monitoring for deviations); Business Analyst - Mortgages (monitoring targets and sales MIS).

External (daily): Group and non-group clients (customer relationship management, lead generation); key channel partners (lead generation, referrals, market and competitive intelligence).

Minimum Experience Level: 4-8 years
Job Qualifications: B.Com Honours, Graduate

Posted 1 day ago

Apply

8.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Category: Engineering | Experience: Sr. Manager | Primary Address: Bangalore, Karnataka

Overview: Voyager (94001), India, Bangalore, Karnataka. Senior Manager, Machine Learning Engineering.

Our mission at Capital One is to create trustworthy, reliable, human-in-the-loop AI systems, changing banking for good. For years, Capital One has led the industry in using machine learning to create real-time, intelligent, automated customer experiences. From informing customers about unusual charges to answering their questions in real time, our applications of AI and ML bring humanity and simplicity to banking. Because of our investments in public cloud infrastructure and machine learning platforms, we are uniquely positioned to harness the power of AI. We are committed to building world-class applied science and engineering teams and to extending our industry-leading capabilities with breakthrough product experiences and scalable, high-performance AI infrastructure. At Capital One, you will help bring the transformative power of emerging AI capabilities to reimagine how we serve the customers and businesses who have come to love the products and services we build.

We are looking for an experienced Senior Manager, Machine Learning Engineering on the MLX Platform to help us build our Model Governance and Observability systems. In this role you will build robust SDKs and platform components that collect metadata, traces, and parameters of models running at scale, and you will work on cutting-edge generative AI frameworks and their instrumentation. You will also lead teams that analyze and optimize model performance, latency, and resource utilization to maintain high standards of efficiency, reliability, and compliance. You will build and lead a highly talented software engineering team to unlock innovation, speed to market, and real-time processing.

This leader must be a deep technical expert and thought leader who helps accelerate adoption of sound engineering practices and stays current with industry innovations, trends, and practices in software engineering and machine learning. Success in the role requires an innovative mind and a proven track record of delivering highly available, scalable, and resilient governance and observability platforms.

What You'll Do:
- Lead, manage, and grow multiple teams of product-focused software engineers and managers to build and scale Machine Learning Model Governance and AI Observability platforms and SDKs
- Mentor and guide the professional and technical development of the engineers on your team
- Work with product leaders to define the strategy, roadmap, and destination architecture
- Stay on top of tech trends, experiment with and learn new technologies, participate in internal and external technology communities, and mentor other members of the engineering community
- Encourage innovation, implementation of state-of-the-art (SOTA) research technologies, inclusion, outside-of-the-box thinking, teamwork, self-organization, and diversity
- Work on cutting-edge generative AI frameworks and LLMs and provide observability using OpenTelemetry
- Lead the team in search (semantic and keyword-based) and the pipelines required to extract data, ingest it, convert it into embeddings, and expose the APIs
- Analyze and optimize model performance, latency, and resource utilization to maintain high standards of efficiency, reliability, and compliance
- Collaborate as part of a cross-functional Agile team to create and enhance software that enables state-of-the-art, next-generation big data and machine learning applications

Basic Qualifications:
- Bachelor's degree in Computer Science, Computer Engineering, or a technical field
- At least 15 years of experience programming with Python, Go, Scala, or C/C++
- At least 5 years of experience designing, building, and deploying enterprise AI or ML applications or platforms
- At least 3 years of experience implementing full-lifecycle ML automation using MLOps (scalable development through deployment of complex data science workflows)
- At least 4 years of experience leading and scaling teams developing machine learning solutions
- At least 8 years of people-management experience, including managing managers

Preferred Qualifications:
- Master's degree or PhD in Engineering, Computer Science, a related technical field, or equivalent practical experience with a focus on modern AI techniques
- Strong problem-solving and analytical skills, with the ability to work independently with ownership and as part of a team with a strong sense of responsibility
- Experience designing large-scale distributed platforms and/or systems in cloud environments such as AWS, Azure, or GCP
- Experience architecting cloud systems for security, availability, performance, scalability, and cost
- Experience delivering very large models through the MLOps lifecycle, from exploration to serving
- Ability to move fast amid ambiguity, competing priorities, and deadlines
- Experience at tech and product-driven companies/startups preferred
- Ability to iterate rapidly with researchers and engineers to improve a product experience while building the core platform components for Observability and Model Governance
- Experience with one or more areas of the generative AI technology stack, including prompt engineering, guardrails, vector databases/knowledge bases, LLM hosting, retrieval, pre-training and fine-tuning, and an understanding of generative AI observability (agentic AI, open-source Gen AI observability frameworks, OpenTelemetry)

No agencies please.
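The metadata-and-trace collection this role describes can be sketched in miniature: a decorator records each model invocation's parameters, latency, and output size into an in-memory registry, analogous to what a governance SDK would ship to a central store. All names here (InvocationRecord, fake_llm, "demo-llm") are illustrative, not Capital One's actual MLX SDK.

```python
import time
from dataclasses import dataclass


@dataclass
class InvocationRecord:
    """One captured model call: who, with what, how long, how big."""
    model_name: str
    params: dict
    latency_ms: float
    output_chars: int


# In a real platform this would be an exporter to a trace backend,
# not a process-local list.
REGISTRY: list[InvocationRecord] = []


def observe(model_name):
    """Decorator that traces calls to a model-serving function."""
    def wrap(fn):
        def inner(**params):
            start = time.perf_counter()
            out = fn(**params)
            REGISTRY.append(InvocationRecord(
                model_name=model_name,
                params=params,
                latency_ms=(time.perf_counter() - start) * 1000,
                output_chars=len(out),
            ))
            return out
        return inner
    return wrap


@observe("demo-llm")
def fake_llm(prompt, temperature=0.2):
    # Stand-in for a real model call.
    return prompt.upper()


print(fake_llm(prompt="hello", temperature=0.1))
print(len(REGISTRY), REGISTRY[0].model_name)
```

A production version would emit these records as OpenTelemetry spans rather than appending to a list, but the captured fields (model identity, parameters, latency) are the same kind of governance metadata the posting refers to.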
Capital One is an equal opportunity employer (EOE, including disability/vet) committed to non-discrimination in compliance with applicable federal, state, and local laws. Capital One promotes a drug-free workplace. Capital One will consider for employment qualified applicants with a criminal history in a manner consistent with the requirements of applicable laws regarding criminal background inquiries, including, to the extent applicable, Article 23-A of the New York Correction Law; San Francisco, California Police Code Article 49, Sections 4901-4920; New York City’s Fair Chance Act; Philadelphia’s Fair Criminal Records Screening Act; and other applicable federal, state, and local laws and regulations regarding criminal background inquiries. If you have visited our website in search of information on employment opportunities or to apply for a position, and you require an accommodation, please contact Capital One Recruiting at 1-800-304-9102 or via email at RecruitingAccommodation@capitalone.com. All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodations. For technical support or questions about Capital One's recruiting process, please send an email to Careers@capitalone.com Capital One does not provide, endorse nor guarantee and is not liable for third-party products, services, educational tools or other information available through this site. Capital One Financial is made up of several different entities. Please note that any position posted in Canada is for Capital One Canada, any position posted in the United Kingdom is for Capital One Europe and any position posted in the Philippines is for Capital One Philippines Service Corp. (COPSSC).

Posted 1 day ago

Apply

5.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Description (GPP Database Link: https://cummins365.sharepoint.com/sites/CS38534/)

Leads projects for the design, development, and maintenance of a data and analytics platform. Effectively and efficiently processes, stores, and makes data available to analysts and other consumers. Works with key business stakeholders, IT experts, and subject-matter experts to plan, design, and deliver optimal analytics and data science solutions. Works on one or many product teams at a time.

Key Responsibilities:
- Designs and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured)
- Designs and implements a framework to continuously monitor and troubleshoot data quality and data integrity issues
- Implements data governance processes and methods for managing metadata, access, and retention for internal and external users
- Designs and provides guidance on building reliable, efficient, scalable, high-quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages
- Designs and implements physical data models to define the database structure, optimizing database performance through efficient indexing and table relationships
- Participates in optimizing, testing, and troubleshooting data pipelines
- Designs, develops, and operates large-scale data storage and processing solutions using distributed and cloud-based platforms (e.g., data lakes, Hadoop, HBase, Cassandra, MongoDB, Accumulo, DynamoDB)
- Uses innovative, modern tools, techniques, and architectures to partially or completely automate the most common, repeatable, and tedious data preparation and integration tasks, minimizing manual, error-prone processes and improving productivity
- Assists with renovating the data management infrastructure to drive automation in data integration and management
- Ensures the timeliness and success of critical analytics initiatives by using agile development practices such as DevOps, Scrum, and Kanban
- Coaches and develops less experienced team members

Competencies:
- System Requirements Engineering: Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation, and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule, and resources; creates and maintains information linkages to related artifacts.
- Collaborates: Builds partnerships and works collaboratively with others to meet shared objectives.
- Communicates effectively: Develops and delivers multi-mode communications that convey a clear understanding of the unique needs of different audiences.
- Customer focus: Builds strong customer relationships and delivers customer-centric solutions.
- Decision quality: Makes good and timely decisions that keep the organization moving forward.
- Data Extraction: Performs extract-transform-load (ETL) activities from a variety of sources and transforms the data for consumption by downstream applications and users, using appropriate tools and technologies.
- Programming: Creates, writes, and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance, and compliance requirements.
- Quality Assurance Metrics: Applies the science of measurement to assess whether a solution meets its intended outcomes, using the IT Operating Model (ITOM), including SDLC standards, tools, metrics, and key performance indicators, to deliver a quality product.
- Solution Documentation: Documents information and solutions based on knowledge gained during product development; communicates to stakeholders to enable improved productivity and effective knowledge transfer to others who were not part of the initial learning.
- Solution Validation Testing: Validates a configuration-item change or solution using the function's defined best practices, including Systems Development Life Cycle (SDLC) standards, tools, and metrics, to ensure it works as designed and meets customer requirements.
- Data Quality: Identifies, understands, and corrects flaws in data to support effective information governance across operational business processes and decision-making.
- Problem Solving: Solves problems, and may mentor others on effective problem solving, using a systematic analysis process and industry-standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies systemic root causes and ensures actions to prevent recurrence are implemented.
- Values differences: Recognizes the value that different perspectives and cultures bring to an organization.

Education, Licenses, Certifications: College, university, or equivalent degree in a relevant technical discipline, or equivalent relevant experience, required. This position may require licensing for compliance with export controls or sanctions regulations.

Experience: Intermediate experience (5-8 years) in a relevant discipline is required. Knowledge of the latest technologies and trends in data engineering is highly preferred, including:
- Familiarity with analyzing complex business systems, industry requirements, and/or data regulations
- Background in processing and managing large data sets
- Design and development for a big data platform using open-source and third-party tools (Spark, Scala/Java, MapReduce, Hive, HBase, and Kafka, or equivalent college coursework)
- SQL query language
- Clustered, cloud-based compute implementation experience
- Experience developing applications requiring large file movement in a cloud-based environment, plus other data extraction tools and methods for a variety of sources
- Experience building analytical solutions

Intermediate experience in the following is preferred:
- IoT technology
- Agile software development

Qualifications:
- Work closely with the business Product Owner to understand the product vision.
- Play a key role across DBU Data & Analytics Power Cells to define and develop data pipelines for efficient data transport into the Cummins Digital Core (Azure Data Lake, Snowflake).
- Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment with DBU project data pipeline design standards.
- Independently design, develop, test, and implement complex data pipelines from transactional systems (ERP, CRM) to data warehouses and the data lake.
- Create, maintain, and manage DBU Data & Analytics data engineering documentation and standard operating procedures (SOPs).
- Take part in evaluating new data tools and POCs and provide suggestions.
- Take full ownership of developed data pipelines, providing ongoing support for enhancements and performance optimization.
- Proactively address and resolve issues that compromise data accuracy and usability.

Preferred Skills:
- Programming languages: proficiency in Python, Java, and/or Scala
- Database management: expertise in SQL and NoSQL databases
- Big data technologies: experience with Hadoop, Spark, Kafka, and other big data frameworks
- Cloud services: experience with Azure, Databricks, and AWS cloud platforms
- ETL processes: strong understanding of extract-transform-load (ETL) processes
- Data replication: working knowledge of replication technologies such as Qlik Replicate is a plus
- APIs: working knowledge of APIs to consume data from ERP and CRM systems
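The continuous data-quality monitoring this role calls for can be illustrated with a small, self-contained quality gate: each incoming record is checked against simple rules, valid rows flow on, and invalid rows are quarantined with a reason, which is the pattern a pipeline alert mechanism is built around. The field names and rules here are hypothetical, not Cummins' actual pipeline.

```python
def validate(record):
    """Return a list of rule violations for one record."""
    errors = []
    if not record.get("id"):
        errors.append("missing id")
    amount = record.get("amount")
    if not isinstance(amount, (int, float)) or amount < 0:
        errors.append("bad amount")
    return errors


def run_quality_gate(records):
    """Split records into clean rows and quarantined (row, reasons) pairs."""
    clean, quarantined = [], []
    for r in records:
        errs = validate(r)
        if errs:
            quarantined.append((r, errs))  # held back for alerting/triage
        else:
            clean.append(r)  # flows on to the next pipeline stage
    return clean, quarantined


rows = [
    {"id": "a1", "amount": 10.5},   # valid
    {"id": "", "amount": 3},        # missing id
    {"id": "a2", "amount": -1},     # negative amount
]
clean, bad = run_quality_gate(rows)
print(len(clean), len(bad))  # 1 clean row, 2 quarantined
```

In a real deployment the quarantine list would feed a metrics/alerting system (row counts, violation rates per rule) rather than staying in memory, but the gate-and-quarantine split is the core of the monitoring framework described above.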

Posted 1 day ago

Apply

6.0 - 10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

About the Client: Our client is a French multinational information technology (IT) services and consulting company headquartered in Paris, France. Founded in 1967, it has been a leader in business transformation for over 50 years, leveraging technology to address a wide range of business needs, from strategy and design to managing operations. The company is committed to unleashing human energy through technology for an inclusive and sustainable future, helping organizations accelerate their transition to a digital and sustainable world. It provides a variety of services, including consulting, technology, professional, and outsourcing services.

Job Details:
Position: Data Analyst - AI & Bedrock
Experience Required: 6-10 years
Notice: Immediate
Work Location: Pune
Mode of Work: Hybrid
Type of Hiring: Contract to Hire

Job Description: FAS - Data Analyst - AI & Bedrock Specialization

About Us: We are seeking a highly experienced and visionary Data Analyst with a deep understanding of artificial intelligence (AI) principles and hands-on expertise with cutting-edge tools like Amazon Bedrock. This role is pivotal in transforming complex datasets into actionable insights, enabling data-driven innovation across our organization.

Role Summary: The Lead Data Analyst, AI & Bedrock Specialization, will spearhead advanced data analytics initiatives, leveraging AI and generative AI capabilities, particularly with Amazon Bedrock. With 5+ years of experience, you will lead the design, development, and implementation of sophisticated analytical models, provide strategic insights to stakeholders, and mentor a team of data professionals. The role requires a blend of strong technical skills, business acumen, and a passion for pushing the boundaries of data analysis with AI.

Key Responsibilities:
• Strategic Data Analysis & Insight Generation:
o Lead end-to-end data analysis projects, from defining business problems to delivering actionable insights that influence strategic decisions.
o Utilize advanced statistical methods, machine learning techniques, and AI-driven approaches to uncover complex patterns and trends in large, diverse datasets.
o Develop and maintain comprehensive dashboards and reports, translating complex data into clear, compelling visualizations and narratives for executive and functional teams.
• AI/ML & Generative AI Implementation (Bedrock Focus):
o Implement data analytics solutions leveraging Amazon Bedrock, including selecting appropriate foundation models (e.g., Amazon Titan, Anthropic Claude) for specific use cases (text generation, summarization, complex data analysis).
o Design and optimize prompts for Large Language Models (LLMs) to extract meaningful insights from unstructured and semi-structured data within Bedrock.
o Explore and integrate other AI/ML services (e.g., Amazon SageMaker, Amazon Q) to enhance data processing, analysis, and automation workflows.
o Contribute to the development of AI-powered agents and intelligent systems for automated data analysis and anomaly detection.
• Data Governance & Quality Assurance:
o Ensure the accuracy, integrity, and reliability of data used for analysis.
o Develop and implement robust data cleaning, validation, and transformation processes.
o Establish best practices for data management, security, and governance in collaboration with data engineering teams.
• Technical Leadership & Mentorship:
o Evaluate and recommend new data tools, technologies, and methodologies to enhance analytical capabilities.
o Collaborate with cross-functional teams, including product, engineering, and business units, to understand requirements and deliver data-driven solutions.
• Research & Innovation:
o Stay abreast of the latest advancements in AI, machine learning, and data analytics, particularly generative AI and cloud-based AI services.
o Proactively identify opportunities to apply emerging technologies to solve complex business challenges.

Required Skills & Qualifications:
• Bachelor's or Master's degree in Computer Science, Data Science, Statistics, Mathematics, Engineering, or a related quantitative field.
• 5+ years of progressive experience as a Data Analyst, Business Intelligence Analyst, or in a similar role, with a strong portfolio of successful data-driven projects.
• Proven hands-on experience with AI/ML concepts and tools, with a specific focus on generative AI and Large Language Models (LLMs).
• Demonstrable experience with Amazon Bedrock is essential, including knowledge of its foundation models, prompt engineering, and the ability to build AI-powered applications.
• Expert-level proficiency in SQL for data extraction and manipulation from various databases (relational, NoSQL).
• Advanced proficiency in Python (Pandas, NumPy, scikit-learn, etc.) or R for data analysis, statistical modeling, and scripting.
• Strong experience with data visualization tools such as Tableau, Power BI, or Qlik Sense, with a focus on creating insightful, interactive dashboards.
• Experience with cloud platforms (AWS preferred) and related data services (e.g., S3, Redshift, Glue, Athena).
• Excellent analytical, problem-solving, and critical-thinking skills.
• Strong communication and presentation skills, with the ability to convey complex technical findings to non-technical stakeholders.
• Ability to work independently and collaboratively in a fast-paced, evolving environment.

Preferred Qualifications:
• Experience with other generative AI frameworks or platforms (e.g., OpenAI, Google Cloud AI).
• Familiarity with data warehousing concepts and ETL/ELT processes.
• Knowledge of big data technologies (e.g., Spark, Hadoop).
• Experience with MLOps practices for deploying and managing AI/ML models.
• Familiarity with building AI agents using Bedrock and Knowledge Bases for data analysis and customer-service use cases.
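As a rough illustration of the Bedrock prompt-engineering work described above: a plain prompt-builder helper plus the general shape of a Bedrock Converse call via boto3. The client name and model ID follow real AWS identifiers, but the network call is a sketch and is not executed here; a production setup needs AWS credentials, region configuration, and error handling, and request details vary by model family.

```python
def build_summary_prompt(document, max_sentences=3):
    """Compose a structured instruction prompt for an LLM summarizer."""
    return (
        f"Summarize the following text in at most {max_sentences} sentences. "
        "Respond with only the summary.\n\n"
        f"Text:\n{document}"
    )


def invoke_bedrock(prompt, model_id="anthropic.claude-3-haiku-20240307-v1:0"):
    # Sketch only: requires boto3 and AWS credentials. The shape follows
    # the Bedrock Converse API; not executed in this illustration.
    import boto3
    client = boto3.client("bedrock-runtime")
    resp = client.converse(
        modelId=model_id,
        messages=[{"role": "user", "content": [{"text": prompt}]}],
    )
    return resp["output"]["message"]["content"][0]["text"]


prompt = build_summary_prompt("Quarterly revenue rose 12% on cloud growth.")
print(prompt.splitlines()[0])
```

Keeping prompt construction in a pure function like `build_summary_prompt` makes the prompt-engineering iteration the role describes testable in isolation from the model endpoint.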

Posted 1 day ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Level AI was founded in 2019 and is a Series C startup headquartered in Mountain View, California. Level AI revolutionizes customer engagement by transforming contact centers into strategic assets. Our AI-native platform leverages advanced technologies such as Large Language Models to extract deep insights from customer interactions. By providing actionable intelligence, Level AI empowers organizations to enhance customer experience and drive growth. Consistently updated with the latest AI innovations, Level AI stands as the most adaptive and forward-thinking solution in the industry. We are a fast-growing startup looking for a top-notch QA engineer with a proven track record of building test automation for an enterprise product. You will be at the core of every new feature, every product decision, and every touchpoint of our users. You will work directly with a small, experienced team of technologists to identify and solve a new set of problems. Our team is composed of leaders from Amazon Alexa, Zoho, Facebook, and other tech companies. If you are a problem solver and enjoy thinking creatively, you would love to be on this team! AI adoption is not optional: it is foundational to how we work, learn, and build better.
Your responsibilities at Level AI include, but are not limited to:
- Serve as a product expert with a deep understanding of all user scenarios
- Work cross-functionally to gather input on test scenarios
- Define, develop, and implement test requirements for product features
- Assess overall product quality and define a strategy to deliver high-quality features
- Develop new automated tests to reduce the time to release product features
- Develop a framework for load and stress testing of features
- Plan, execute, monitor, and validate testing processes
- Support manual functional testing, such as creating and executing test cases when required
- Develop new build and release pipelines
- Design and execute tools and scripts to develop multiple product versions
- Maintain and evaluate tools supporting process automation for product releases
- Correct build errors and maintain formal release records to track release content
- Continuously explore, adopt, and leverage AI tools and techniques to improve engineering workflows, code quality, testing, and delivery speed

We'd love to learn more about you if you have:
- 5+ years of experience as a Quality Engineer doing automation and manual testing
- A strong understanding of a programming language (Python)
- Experience with web application standards and APIs
- Strong knowledge of SaaS model/cloud-delivered application testing
- A good understanding of database concepts
- Experience with test and defect management tools
- Excellent analytical and problem-solving skills, excellent written and oral communication, a self-starter attitude, and high motivation

To learn more, visit: https://thelevel.ai/
Funding: https://www.crunchbase.com/organization/level-ai
LinkedIn: https://www.linkedin.com/company/level-ai/
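The automated API testing the role describes can be miniaturized as follows: a tiny test harness exercises an endpoint function and checks both status code and response schema. The endpoint here is a local stand-in with hypothetical fields, not Level AI's actual API; in practice the same assertions would run against HTTP responses via a client library.

```python
def get_conversation(conv_id):
    # Stand-in for an HTTP GET against a product REST API endpoint.
    if not conv_id:
        return {"status": 400, "body": {"error": "missing id"}}
    return {"status": 200, "body": {"id": conv_id, "sentiment": "positive"}}


def test_happy_path():
    """A valid id returns 200 and exactly the expected schema keys."""
    resp = get_conversation("c-42")
    assert resp["status"] == 200
    assert set(resp["body"]) == {"id", "sentiment"}
    assert resp["body"]["id"] == "c-42"


def test_missing_id():
    """An empty id is rejected with a 400 and an error body."""
    resp = get_conversation("")
    assert resp["status"] == 400
    assert "error" in resp["body"]


test_happy_path()
test_missing_id()
print("2 checks passed")
```

Written this way, the same functions drop straight into a pytest suite (pytest discovers `test_*` functions), which keeps manual exploratory checks and CI automation on one code path.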

Posted 1 day ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

Remote

As the global leader in high-speed connectivity, Ciena is committed to a people-first approach. Our teams enjoy a culture focused on prioritizing a flexible work environment that empowers individual growth, well-being, and belonging. We’re a technology company that leads with our humanity—driving our business priorities alongside meaningful social, community, and societal impact. Why Ciena We are big proponents of life-work integration and provide the flexibility and tools to make it a reality with remote work and, potentially, part-time work. We believe an inclusive, diverse, and barrier-free work environment makes for empowered and committed employees. We recognize the importance of well-being and offer programs and benefits to support and sustain the mental and physical health of our employees and their families, and also offer a variety of paid family leave programs. We are committed to employee development, offering tuition reimbursement and a variety of in-house learning and mentorship opportunities. We know that financial security is important. We offer competitive salaries and incentive programs, RSUs (job level specific) and an employee share option purchase program. We realize time away to recharge is important. We offer flexible paid time off! Great work deserves recognition. We have a robust recognition program, with ongoing and enhanced awards for exemplary performance. How You Will Contribute Reporting to the Senior Manager, IT Applications, as a DocuSign Insight Developer you will play a critical role in ensuring the system is available to the business and functioning correctly. You will be responsible for providing system access and ensuring security profiles are set up correctly and configured per users’ roles and responsibilities. You will use your technical expertise to replicate problems, find solutions, and work with internal teams and external vendors to resolve issues, and use DocuSign Insight to automatically extract critical clauses and terms.
Support: Collaborate with the End User Support team and the internal Legal Application Support team to provide DocuSign Insight support, handling incoming day-to-day issues in the ServiceNow ticketing tool. User Access: Understand the system security model and assign the correct roles so users can perform their tasks in the tool. Ticket Management: Monitor tickets in ServiceNow and provide level 2 support for all tickets routed to the DocuSign Insight support group. Environment Management: Work with the vendor support team to report and resolve system performance issues and product bugs, and test software upgrade releases from DocuSign. Continuous Improvement: Stay up to date with the latest industry trends, tools, and best practices in application support. Identify areas for process improvement and propose innovative solutions to enhance the product and internal processes. What Does Ciena Expect of You? Initiative – you’re a self-starter who works with limited direction and is committed to delivering against aggressive deadlines. A customer-first mentality – what’s important to the customer is also important to you. Agility – with an ability to flex between the strategic and the tactical, you manage competing and ever-changing priorities and maintain a balanced and methodical approach to problem solving. Communication expertise – you possess the ability to tailor your message and ideas to the audience to ensure understanding and consensus. The flexibility to work independently and as part of a broader team – you thrive in a team environment, are comfortable working independently, and know how to get things done in a virtual environment. Relationship builder – with a proven ability to influence at all levels, you’re able to quickly develop trusted connections and get work done through others. A commitment to innovation – you keep abreast of competitive developments and are always keen to formulate new ideas and problem solve.
The Must Haves Bachelor's degree in Computer Science or Engineering. Programming experience in at least one core language such as Java or C#. Strong knowledge of Microsoft Office (MS Excel). Strong knowledge of software development and UI/UX. Excellent analytical and logical thinking and problem-solving skills. Strong attention to detail and accuracy. Effective communication and collaboration skills. Ability to work independently and within a team in a fast-paced, dynamic environment. Not ready to apply? Join our Talent Community to get relevant job alerts straight to your inbox. At Ciena, we are committed to building and fostering an environment in which our employees feel respected, valued, and heard. Ciena values the diversity of its workforce and respects its employees as individuals. We do not tolerate any form of discrimination. Ciena is an Equal Opportunity Employer, including disability and protected veteran status. If contacted in relation to a job opportunity, please advise Ciena of any accommodation measures you may require.

Posted 1 day ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

Remote

As the global leader in high-speed connectivity, Ciena is committed to a people-first approach. Our teams enjoy a culture focused on prioritizing a flexible work environment that empowers individual growth, well-being, and belonging. We’re a technology company that leads with our humanity—driving our business priorities alongside meaningful social, community, and societal impact. Why Ciena We are big proponents of life-work integration and provide the flexibility and tools to make it a reality with remote work and, potentially, part-time work. We believe an inclusive, diverse, and barrier-free work environment makes for empowered and committed employees. We recognize the importance of well-being and offer programs and benefits to support and sustain the mental and physical health of our employees and their families, and also offer a variety of paid family leave programs. We are committed to employee development, offering tuition reimbursement and a variety of in-house learning and mentorship opportunities. We know that financial security is important. We offer competitive salaries and incentive programs, RSUs (job level specific) and an employee share option purchase program. We realize time away to recharge is important. We offer flexible paid time off! Great work deserves recognition. We have a robust recognition program, with ongoing and enhanced awards for exemplary performance. How You Will Contribute Reporting to the Senior Manager, IT Applications, as a DocuSign Insight Developer you will play a critical role in ensuring the system is available to the business and functioning correctly. You will be responsible for providing system access and ensuring security profiles are set up correctly and configured per users’ roles and responsibilities. You will use your technical expertise to replicate problems, find solutions, and work with internal teams and external vendors to resolve issues, and use DocuSign Insight to automatically extract critical clauses and terms.
Support: Collaborate with the End User Support team and the internal Legal Application Support team to provide DocuSign Insight support, handling incoming day-to-day issues in the ServiceNow ticketing tool. User Access: Understand the system security model and assign the correct roles so users can perform their tasks in the tool. Ticket Management: Monitor tickets in ServiceNow and provide level 2 support for all tickets routed to the DocuSign Insight support group. Environment Management: Work with the vendor support team to report and resolve system performance issues and product bugs, and test software upgrade releases from DocuSign. Continuous Improvement: Stay up to date with the latest industry trends, tools, and best practices in application support. Identify areas for process improvement and propose innovative solutions to enhance the product and internal processes. What Does Ciena Expect of You? Initiative – you’re a self-starter who works with limited direction and is committed to delivering against aggressive deadlines. A customer-first mentality – what’s important to the customer is also important to you. Agility – with an ability to flex between the strategic and the tactical, you manage competing and ever-changing priorities and maintain a balanced and methodical approach to problem solving. Communication expertise – you possess the ability to tailor your message and ideas to the audience to ensure understanding and consensus. The flexibility to work independently and as part of a broader team – you thrive in a team environment, are comfortable working independently, and know how to get things done in a virtual environment. Relationship builder – with a proven ability to influence at all levels, you’re able to quickly develop trusted connections and get work done through others. A commitment to innovation – you keep abreast of competitive developments and are always keen to formulate new ideas and problem solve.
The Must Haves Bachelor's degree in Computer Science or Engineering. Programming experience in at least one core language such as Java or C#. Strong knowledge of Microsoft Office (MS Excel). Strong knowledge of software development and UI/UX. Excellent analytical and logical thinking and problem-solving skills. Strong attention to detail and accuracy. Effective communication and collaboration skills. Ability to work independently and within a team in a fast-paced, dynamic environment. Not ready to apply? Join our Talent Community to get relevant job alerts straight to your inbox. At Ciena, we are committed to building and fostering an environment in which our employees feel respected, valued, and heard. Ciena values the diversity of its workforce and respects its employees as individuals. We do not tolerate any form of discrimination. Ciena is an Equal Opportunity Employer, including disability and protected veteran status. If contacted in relation to a job opportunity, please advise Ciena of any accommodation measures you may require.

Posted 1 day ago

Apply

0 years

0 Lacs

Sahibzada Ajit Singh Nagar, Punjab, India

On-site

About Us NestorBird is recognized as one of the top ten Frappe Certified Partners worldwide, proudly showcasing over six years of experience in providing services to clients in ERPNext and Frappe. We have successfully executed over 200 ERP projects across the globe. Our proficiency extends across diverse domains, encompassing Manufacturing, Healthcare, Education, Retail, Agriculture, Food, Distribution, Trading, and Nonprofit sectors. We specialize in providing comprehensive ERP services, ranging from consultation, implementation, development, third-party software integration, and customization, to continuous support. Our team comprises certified professionals dedicated to serving our global clientele. Job Description We are seeking a results-oriented and highly motivated Outbound Sales Executive / Intern to join our growing sales team. This role is ideal for someone who thrives in a fast-paced environment and is passionate about building strong client relationships. You will be responsible for executing a wide range of outbound activities to generate leads, nurture prospects, and drive business growth. Key Responsibilities Conduct data mining and research to identify potential leads and decision-makers. Make direct sales calls to pitch services and understand client requirements. Execute email marketing campaigns through strategic research and targeting. Initiate client communication and outreach via Telegram and Threads. Use LinkedIn Sales Navigator to filter and engage with relevant prospects. Collaborate with the marketing and sales team to support outbound campaigns. Maintain and update the CRM system with lead data, follow-ups, and progress notes. Continuously test and optimize outreach strategies to improve conversion rates. Required Skills & Qualifications Excellent verbal and written communication skills in English and Hindi. Strong interest in or background in sales, business development, or marketing.
Familiarity with tools like LinkedIn Sales Navigator, Telegram, Threads, and email automation platforms is a plus. Ability to conduct thorough online research and extract actionable insights. Proactive, self-driven, and able to work independently with minimal supervision. Basic understanding of B2B sales cycles and client engagement tactics. About Company: NestorBird stands as the pinnacle of excellence in North India, providing its exceptional expertise and innovative solutions across the globe, boasting over six years of expertise in serving clients in ERPNext and Frappe. We have successfully executed over 200 ERPNext projects across more than 20 countries. Our proficiency extends across diverse domains, encompassing manufacturing, healthcare, education, retail, agriculture, food, distribution, trading, construction, and nonprofit sectors. We specialize in providing comprehensive ERPNext services, ranging from consultation, implementation, development, third-party software integration, and customization to continuous support. Our team comprises certified professionals dedicated to serving our global clientele.

Posted 1 day ago

Apply

2.0 years

0 Lacs

Gujarat, India

Remote

About The Job About CloudLabs : CloudLabs Inc was founded in 2014 with the mission to provide exceptional IT & Business consulting services at a competitive price, to help clients realize the best value from their investments. Within a short span, CloudLabs evolved from pure-play consulting into a transformative partner for Business Acceleration Advisory, Transformative Application Development & Managed Services - enabling digital transformations, M&A transitions, Automation & Process-driven optimizations & complex Integration initiatives for enterprises across the globe. As a Strategic Planning & Implementation Partner for global companies, CloudLabs has seen a 200% uptake in winning high-value, high-impact and high-risk projects that are critical for the business. With offices in the US, Canada, Mexico & India and with a team of 200+ experienced specialists, CloudLabs is now at an inflection point and ready for its next curve of progress. What We Offer We welcome candidates rejoining the workforce after a career break or parental leave and support their journey to reacclimatize to corporate life. Flexible remote work. Competitive pay package. Attractive policies, medical insurance benefits, and industry-leading training. Experience : Minimum 2-3 years of relevant experience. Job Type : Onsite. Location : Gujarat. Job Description We are looking for a motivated and technically sound Data Engineer with 2 to 3 years of experience to join our data engineering team. The ideal candidate will have a solid understanding of database systems, strong SQL/PLSQL skills, and a willingness to grow in modern cloud data technologies like Snowflake. Duties And Responsibilities Design, develop, and maintain robust data pipelines and workflows. Write optimized SQL/PLSQL scripts to extract, transform, and load data. Support data integration across systems and ensure high data quality.
Collaborate with cross-functional teams to understand data needs and deliver solutions. Participate in performance tuning, data modeling, and code reviews. Continuously explore and adopt cloud data technologies to improve systems and workflows. Ensure timely delivery of data solutions and documentation. Work from the Gujarat office (minimum 4 days per week) as part of a collaborative team environment. What We're Looking For 2 to 3 years of experience in data engineering or database development roles. Strong understanding of database concepts and relational data modeling. Ability to write and troubleshoot complex SQL and PL/SQL queries. Hands-on experience in Python. This role requires working from our Gujarat office 4 days a week. Preferred But Not Required Qualifications Exposure to ETL processes and tools. Experience working with Snowflake or other cloud data warehouse platforms. Strong written and verbal communication skills. Willingness to learn and complete certifications in cloud data warehouse technologies (e.g., Snowflake) with minimal supervision (ref:hirist.tech)
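As an illustration of the extract-transform-load work described in this role, here is a minimal Python sketch using the standard library's sqlite3 as a stand-in for a warehouse such as Snowflake; the table names, columns, and currency rates are illustrative assumptions, not a real schema.

```python
# Minimal ETL sketch: extract raw rows, apply quality checks and a
# currency normalisation, and load into a curated table. sqlite3 is
# a stand-in for a real warehouse; all names here are illustrative.
import sqlite3


def run_etl(conn):
    cur = conn.cursor()
    # Extract: raw orders as they arrive from a source system.
    rows = cur.execute("SELECT id, amount, currency FROM raw_orders").fetchall()
    # Transform: drop invalid rows, normalise amounts to USD.
    rate = {"USD": 1.0, "INR": 0.012}  # illustrative rates
    cleaned = [
        (oid, round(amount * rate[ccy], 2))
        for oid, amount, ccy in rows
        if amount is not None and ccy in rate
    ]
    # Load: write into the curated target table.
    cur.executemany("INSERT INTO orders_usd (id, amount_usd) VALUES (?, ?)", cleaned)
    conn.commit()
    return len(cleaned)


conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE raw_orders (id INTEGER, amount REAL, currency TEXT);
    CREATE TABLE orders_usd (id INTEGER, amount_usd REAL);
    INSERT INTO raw_orders VALUES (1, 100.0, 'USD'), (2, 5000.0, 'INR'), (3, NULL, 'USD');
    """
)
loaded = run_etl(conn)
print(loaded)  # 2 rows survive the quality checks
```

In practice the same shape holds with PL/SQL procedures or a Snowflake task: extract from a staging table, validate and transform, then load into a curated target.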

Posted 1 day ago

Apply

6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Microsoft is a company where passionate innovators come to collaborate, envision what can be, and take their careers further. This is a world of more possibilities, more innovation, more openness, and sky-is-the-limit thinking in a cloud-enabled world. Microsoft’s Azure Data engineering team is leading the transformation of analytics in the world of data with products like databases, data integration, big data analytics, messaging & real-time analytics, and business intelligence. The products in our portfolio include Microsoft Fabric, Azure SQL DB, Azure Cosmos DB, Azure PostgreSQL, Azure Data Factory, Azure Synapse Analytics, Azure Service Bus, Azure Event Grid, and Power BI. Our mission is to build the data platform for the age of AI, powering a new class of data-first applications and driving a data culture. Within Azure Data, the data integration team builds data gravity on the Microsoft Cloud. Massive volumes of data are generated – not just from transactional systems of record, but also from the world around us. Our data integration products – Azure Data Factory and Power Query – make it easy for customers to bring in, clean, shape, and join data, to extract intelligence. We’re the team that developed the Mashup Engine (M) and Power Query. We already ship monthly to millions of users across Excel, Power/Pro BI, Flow, and PowerApps; but in many ways we’re just getting started.
We’re building new services, experiences, and engine capabilities that will broaden the reach of our technologies to several new areas – data “intelligence”, large-scale data analytics, and automated data integration workflows. We plan to use example-based interaction, machine learning, and innovative visualization to make data access and transformation even more intuitive for non-technical users. We do not just value differences or different perspectives. We seek them out and invite them in so we can tap into the collective power of everyone in the company. As a result, our customers are better served. Responsibilities Engine layer: designing and implementing components for dataflow orchestration, distributed querying, query translation, connecting to external data sources, and script parsing/interpretation. Service layer: designing and implementing infrastructure for a containerized, microservices-based, high-throughput architecture. UI layer: designing and implementing performant, engaging web user interfaces for data visualization/exploration/transformation/connectivity and dataflow management. Embody our culture and values. Qualifications Required/Minimum Qualifications Bachelor's Degree in Computer Science or a related technical discipline AND 6+ years of technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python, OR equivalent experience. Experience in data integration, data migrations, or ELT/ETL tooling is mandatory. Preferred/Additional Qualifications BS degree in Computer Science. Engine role: familiarity with data access technologies (e.g. ODBC, JDBC, OLEDB, ADO.Net, OData), query languages (e.g. T-SQL, Spark SQL, Hive, MDX, DAX), query generation/optimization, OLAP. UI role: familiarity with JavaScript, TypeScript, CSS, React, Redux, webpack. Service role: familiarity with micro-service architectures, Docker, Service Fabric, Azure blobs/tables/databases, high-throughput services. Full-stack role: a mix of the qualifications for the UX/service/backend roles. Other Requirements Ability to meet Microsoft, customer and/or government security screening requirements is required for this role. These requirements include, but are not limited to, the following specialized security screenings: Microsoft Cloud Background Check: This position will be required to pass the Microsoft Cloud background check upon hire/transfer and every two years thereafter. Equal Opportunity Employer (EOP) #azdat #azuredata #microsoftfabric #dataintegration Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.

Posted 1 day ago

Apply

3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Level AI was founded in 2019 and is a Series C startup headquartered in Mountain View, California. Level AI revolutionizes customer engagement by transforming contact centers into strategic assets. Our AI-native platform leverages advanced technologies such as Large Language Models to extract deep insights from customer interactions. By providing actionable intelligence, Level AI empowers organizations to enhance customer experience and drive growth. Consistently updated with the latest AI innovations, Level AI stands as the most adaptive and forward-thinking solution in the industry. About the Role: As an Implementation Manager, you will lead our clients’ onboarding and implementation process, ensuring they unlock the full potential of Level AI to enhance the customer experience. You will be responsible for understanding client business requirements, facilitating data integrations, and configuring and training on the Level AI products, including Auto-QA, Analytics, Voice of the Customer, Agent Assist, and Screen Recording, among others, all while driving efficient time to value.
Key Responsibilities : Serve as the primary point of contact for key client accounts, building and maintaining strong relationships with clients. Successfully handle onboarding of multiple clients simultaneously. Understand clients' business objectives. Understand clients' technical requirements, which may require leading technical discovery sessions to ensure that our AI-powered customer support solutions are configured appropriately to meet their needs. Collaborate with internal teams, including sales, product, engineering, and customer support, to address client needs and resolve technical issues. Develop and maintain a deep understanding of our AI-powered customer support solutions, and effectively communicate technical information to clients. Identify opportunities for upselling and cross-selling our solutions to existing clients. Track and report on key account metrics, such as customer satisfaction and product usage, and use this information to drive improvements in our solutions. Requirements : Bachelor's degree in Computer Science, Information Systems, or a related field, OR equivalent experience. 3+ years of experience in a hands-on technical role; 1-2+ years of experience delivering successful customer implementations. Strong technical background with knowledge of SaaS platforms, APIs, and cloud services. Excellent project management skills with the ability to juggle multiple projects simultaneously. Ability to translate complex concepts into actionable items for non-technical stakeholders. Strong communication skills in English (both written and verbal). Entrepreneurial and problem-solving attitude – self-motivated, adaptable, and resourceful in tackling implementation challenges. Comfortable working in US hours. Optional Requirements : Experience interacting with APIs and using cloud services. Experience integrating with CRMs such as Salesforce. Familiarity with intent-based and generative artificial intelligence. Experience with telephony systems such as AWS Connect, Five9, and Genesys.

Posted 1 day ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Microsoft’s Azure Data engineering team is leading the transformation of analytics in the world of data with products like databases, data integration, big data analytics, messaging & real-time analytics, and business intelligence. The products in our portfolio include Microsoft Fabric, Azure SQL DB, Azure Cosmos DB, Azure PostgreSQL, Azure Data Factory, Azure Synapse Analytics, Azure Service Bus, Azure Event Grid, and Power BI. Our mission is to build the data platform for the age of AI, powering a new class of data-first applications and driving a data culture. Within Azure Data, the data integration team builds data gravity on the Microsoft Cloud. Massive volumes of data are generated – not just from transactional systems of record, but also from the world around us. Our data integration products – Azure Data Factory and Power Query – make it easy for customers to bring in, clean, shape, and join data, to extract intelligence. We do not just value differences or different perspectives. We seek them out and invite them in so we can tap into the collective power of everyone in the company. As a result, our customers are better served. Responsibilities Build cloud-scale products with a focus on efficiency, reliability and security. Build and maintain end-to-end build, test and deployment pipelines. Deploy and manage massive Hadoop, Spark and other clusters. Contribute to the architecture & design of the products. Triage issues and implement solutions to restore service with minimal disruption to the customer and business.
Perform root cause analysis, trend analysis and post-mortems. Own components and drive them end to end, all the way from gathering requirements, development, testing, and deployment to ensuring high quality and availability post deployment. Embody our culture and values. Qualifications Required/Minimum Qualifications Bachelor's Degree in Computer Science or a related technical discipline AND 4+ years of technical engineering experience with coding in languages and frameworks like C#, React, Redux, TypeScript, JavaScript, Java or Python, OR equivalent experience. Experience in data integration, data migrations, or ELT/ETL tooling is mandatory. Other Requirements Ability to meet Microsoft, customer and/or government security screening requirements is required for this role. These requirements include, but are not limited to, the following specialized security screenings: Microsoft Cloud Background Check: This position will be required to pass the Microsoft Cloud background check upon hire/transfer and every two years thereafter. Equal Opportunity Employer (EOP) #azdat #azuredata #microsoftfabric #dataintegration Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.

Posted 1 day ago

Apply

0 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site

At Iron Mountain we know that work, when done well, makes a positive impact for our customers, our employees, and our planet. That’s why we need smart, committed people to join us. Whether you’re looking to start your career or make a change, talk to us and see how you can elevate the power of your work at Iron Mountain. We provide expert, sustainable solutions in records and information management, digital transformation services, data centers, asset lifecycle management, and fine art storage, handling, and logistics. We proudly partner every day with our 225,000 customers around the world to preserve their invaluable artifacts, extract more from their inventory, and protect their data privacy in innovative and socially responsible ways. Are you curious about being part of our growth story while evolving your skills in a culture that will welcome your unique contributions? If so, let's start the conversation. About The Role As a Management Trainee – Data Analysis, you will play a key role in transforming raw data into actionable business insights. This role is designed to give you hands-on experience across the data lifecycle—from collection and analysis to visualization and reporting—while helping drive decision-making and operational excellence. You will work closely with cross-functional teams, support business intelligence initiatives, and contribute to process improvements. This is an excellent opportunity for someone looking to build a strong foundation in data analytics, MIS reporting, and business operations.
Key Responsibilities Data Analysis Collect, clean, and analyze data from various sources to provide actionable insights for process optimization and business improvement Identify trends, patterns, and anomalies in data and present findings clearly and effectively MIS Development Design, develop, and maintain MIS reports and dashboards to monitor key performance indicators (KPIs), operational metrics, and business metrics Ensure data accuracy and completeness through regular validation and updates Data Visualization Create clear and engaging visualizations using tools like Tableau, Power BI, or similar platforms Present complex data in an easy-to-understand format for stakeholders and leadership Data Hosting and Management Manage data repositories and databases, ensuring security, integrity, and accessibility for authorized users Implement data governance protocols and ensure compliance with relevant data privacy standards Collaboration and Communication Collaborate with cross-functional teams to understand data and reporting needs Translate analytical findings into business insights for non-technical stakeholders Process Improvement Identify and implement improvements in data collection, processing, and reporting workflows Develop tools or methods to enhance data efficiency and accuracy Qualifications Master’s degree in a relevant field (e.g., Data Analytics, Statistics, Business, or Engineering) Proven experience in data analysis, visualization, and MIS development—preferably in operations or business support functions Proficiency in tools and platforms such as Google Suite (Sheets, Slides, Data Studio), Excel, and dashboards Strong analytical and problem-solving skills with a keen eye for detail Ability to work both independently and in a collaborative team setting Excellent verbal and written communication skills Knowledge of data security, governance, and compliance standards Experience with data visualization platforms like Tableau, Power BI, or similar tools 
(preferred) Experience in database management and data warehousing (preferred) Category: Administrative Services Iron Mountain is a global leader in storage and information management services trusted by more than 225,000 organizations in 60 countries. We safeguard billions of our customers’ assets, including critical business information, highly sensitive data, and invaluable cultural and historic artifacts. Take a look at our history here. Iron Mountain helps lower cost and risk, comply with regulations, recover from disaster, and enable digital and sustainable solutions, whether in information management, digital transformation, secure storage and destruction, data center operations, cloud services, or art storage and logistics. Please see our Values and Code of Ethics for a look at our principles and aspirations in elevating the power of our work together. If you have a physical or mental disability that requires special accommodations, please let us know by sending an email to accommodationrequest@ironmountain.com. See the Supplement to learn more about Equal Employment Opportunity. Iron Mountain is committed to a policy of equal employment opportunity. We recruit and hire applicants without regard to race, color, religion, sex (including pregnancy), national origin, disability, age, sexual orientation, veteran status, genetic information, gender identity, gender expression, or any other factor prohibited by law. To view the Equal Employment Opportunity is the Law posters and the supplement, as well as the Pay Transparency Policy Statement, CLICK HERE Requisition: J0090426
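The anomaly-spotting responsibility in the MIS analyst role above ("identify trends, patterns, and anomalies in data") can be sketched with a simple z-score check. A minimal sketch; the daily order series and the 2-sigma threshold are illustrative assumptions, not part of the posting:

```python
import statistics

def flag_anomalies(values, threshold=2.0):
    """Return indices of points more than `threshold` standard deviations from the mean."""
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:  # constant series: nothing can be anomalous
        return []
    return [i for i, v in enumerate(values) if abs(v - mean) / stdev > threshold]

# Hypothetical daily order counts with one obvious spike at index 5.
daily_orders = [102, 98, 105, 101, 99, 340, 97, 103]
print(flag_anomalies(daily_orders))  # → [5]
```

In a real MIS workflow the flagged indices would feed a dashboard or an exception report rather than a print statement.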

Posted 1 day ago

Apply

10.0 years

0 Lacs

Gurugram, Haryana, India

On-site

At Iron Mountain we know that work, when done well, makes a positive impact for our customers, our employees, and our planet. That’s why we need smart, committed people to join us. Whether you’re looking to start your career or make a change, talk to us and see how you can elevate the power of your work at Iron Mountain. We provide expert, sustainable solutions in records and information management, digital transformation services, data centers, asset lifecycle management, and fine art storage, handling, and logistics. We proudly partner every day with our 225,000 customers around the world to preserve their invaluable artifacts, extract more from their inventory, and protect their data privacy in innovative and socially responsible ways. Are you curious about being part of our growth story while evolving your skills in a culture that will welcome your unique contributions? If so, let's start the conversation. We are seeking an experienced and visionary leader to head our Insurance Vertical at the national level. The ideal candidate will have a proven track record in the insurance sector, particularly in working with corporate insurance clients. This role requires deep knowledge of industry operations, a strong understanding of digital transformation trends, and significant leadership experience in managing high-performing, cross-functional teams. The candidate will be responsible for driving strategic growth, deepening client relationships, and shaping the future of our insurance offerings in a rapidly evolving market. 
Key Responsibilities Strategic Leadership Define and implement the growth strategy for the insurance vertical in alignment with the company’s overall business objectives Leverage industry insights and emerging digital trends to stay ahead of the competition Client Relationship Management Build and nurture long-term relationships with key corporate clients, including insurers, brokers, and other ecosystem players Position the company as a trusted strategic partner in their digital transformation initiatives Business Growth Drive revenue growth through new client acquisition and account expansion Focus on offerings across life, health, and general insurance sectors Team Leadership & Development Lead, mentor, and develop a high-performing team including business development managers, solution architects, and account executives Foster a collaborative, accountable, and innovation-driven team culture Market Intelligence Monitor regulatory changes, industry trends, and technological advancements in the insurance sector Adapt business strategy based on insights to maintain market relevance and competitive edge Cross-Functional Collaboration Partner with product, marketing, and operations teams to ensure the successful delivery of tailored solutions Act as the voice of the customer internally, aligning solutions to client pain points and industry needs Qualifications & Skills Bachelor’s degree in Business, Technology, Insurance, or related field; MBA or equivalent preferred 10+ years of experience in the insurance sector, with demonstrated leadership in digital transformation and IT-enabled solutions Proven success in managing and scaling high-performing teams Deep understanding of insurance business models, regulations, and client expectations Strong experience in consultative/solution selling and strategic account management Excellent communication, stakeholder management, and executive presentation skills Ability to lead complex cross-functional initiatives and deliver 
tangible business outcomes Preferred Attributes Experience with enterprise technology solutions relevant to the insurance industry (e.g., document management, workflow automation, AI/ML, data analytics) Existing relationships with top-tier insurance companies, brokers, and solution partners Demonstrated success in achieving revenue targets and market expansion in competitive environments Track record of driving innovation and managing organizational change effectively Category: General Management Requisition: J0091093

Posted 1 day ago

Apply

1.0 - 31.0 years

1 - 1 Lacs

Jubilee Hills, Hyderabad Region

On-site

We are looking for a detail-oriented MIS Executive to collect, analyze, and manage data from various web sources and internal tools. The ideal candidate should have a strong understanding of computers, be proficient in Google Sheets and Excel, and assist in generating reports to support business decisions.

Key Responsibilities:

Data Collection & Management: Extract and compile data from websites, APIs, and other online sources. Maintain and update databases with accurate and relevant information. Ensure data integrity by performing regular checks and validations.

Reporting & Analysis: Generate daily/weekly/monthly reports for management. Analyze trends and provide actionable insights from collected data. Assist in preparing presentations with data-driven findings.

Required Skills & Qualifications:

Education: Bachelor’s degree in Computer Science, IT, Business Analytics, or a related field.
Experience: 1-2 years in MIS, data handling, or a similar role.
Technical Skills: Strong proficiency in Google Sheets & Excel (advanced functions, macros, automation). Basic knowledge of web scraping tools (e.g., ImportXML, Python, or no-code scrapers) is a plus. Familiarity with SQL, APIs, or data visualization tools (Power BI, Tableau) is an advantage.
Analytical Skills: Ability to interpret data and generate insights.
Attention to Detail: High accuracy in data entry and reporting.
Communication: Good verbal and written communication skills.
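The web-scraping skill mentioned in this posting (ImportXML, Python, or no-code scrapers) can be illustrated with Python's standard-library HTML parser, with no third-party dependencies. A minimal sketch; the sample HTML snippet and the "price" class name are hypothetical:

```python
from html.parser import HTMLParser

class PriceParser(HTMLParser):
    """Collect the text content of elements whose class attribute is 'price'."""
    def __init__(self):
        super().__init__()
        self.in_price = False
        self.prices = []

    def handle_starttag(self, tag, attrs):
        # attrs arrives as a list of (name, value) pairs.
        if dict(attrs).get("class") == "price":
            self.in_price = True

    def handle_data(self, data):
        if self.in_price:
            self.prices.append(data.strip())
            self.in_price = False

html = '<ul><li class="price">Rs 499</li><li class="price">Rs 899</li></ul>'
parser = PriceParser()
parser.feed(html)
print(parser.prices)  # → ['Rs 499', 'Rs 899']
```

In practice the HTML would come from an HTTP fetch, and the extracted values would be written into a Google Sheet or database for the reporting duties described above.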

Posted 1 day ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

To get the best candidate experience, please consider applying for a maximum of 3 roles within 12 months to ensure you are not duplicating efforts. Job Category Finance Job Details About Salesforce Salesforce is the #1 AI CRM, where humans with agents drive customer success together. Here, ambition meets action. Tech meets trust. And innovation isn’t a buzzword — it’s a way of life. The world of work as we know it is changing and we're looking for Trailblazers who are passionate about bettering business and the world through AI, driving innovation, and keeping Salesforce's core values at the heart of it all. Ready to level-up your career at the company leading workforce transformation in the agentic era? You’re in the right place! Agentforce is the future of AI, and you are the future of Salesforce. Description Salesforce’s Quote to Cash (QTC) Enterprise Strategy & Solutions team is hiring a Business Analyst. We’re looking for critical thinkers that want to roll up their sleeves and work on some of the most complex and visible projects currently underway. As a member of the Global Business Strategy and Operations organization, Analysts will perform a variety of responsibilities on enterprise level projects to improve and scale our internal Quote-To-Cash operations. We are seeking proactive, self-motivated individuals who are comfortable navigating ambiguity, take initiative, and consistently drive project success with minimal oversight. This role requires close, real-time collaboration with US-based counterparts—including Functional Leads, Senior Analysts, Technical Architects, and Product Managers—which necessitates aligning with US business hours as per the defined shifts. Responsibilities Coordinate with Functional Leads and Senior Analysts to understand the future state vision for L2C/QTC processes and features in order to then deliver progressive capabilities towards that end-state in each release. 
Lead the Business Requirements gathering and documentation process by collaborating with crucial subject matter experts to transform existing processes to drive the future of quoting to our customers. Diagram as-is and to-be business processes using tools like Lucidcharts. Coordinate and lead cross-functional meetings, document decisions & follow-up on actions. Engage with Technical Architects and Product Managers to create innovative, holistic solutions to deliver upon the Business Requirements and future state needs. Project management activities including reporting escalations, tracking requirements delivery, communicating cross-functional dependencies and creating status updates. Act as a subject matter expert for Salesforce internal QTC systems and processes. Develop, document, and maintain a thorough repository and understanding of business rules and process flows. Work with training & engagement specialists to create training materials to ensure successful change management results. Ad-hoc reporting and research activities as project needs dictate. Participating in user acceptance testing (UAT). 
Required Skills/Experience Experience with business requirements gathering and documentation / user story experience Excellent interpersonal skills; ability to articulate verbally and in writing; willingness to appropriately debate difficult issues; ability to think quickly; excellent listening skills; organizational skills Ability to excel in a fast-paced environment delivering accuracy while managing ambiguity and deadlines where adaptability is imperative Capacity to identify and understand broader business and financial issues from an end-user’s perspective and consider cross-functional and downstream impacts Experience successfully juggling multiple projects and tasks concurrently Extreme attention to detail with an ability to work independently and demonstrate initiative Curiosity in order to extract relevant information from subject matter experts Prior experience as a Business Analyst Preferred Skills/Experience Experience related to Configure Price Quote, Contract Lifecycle and/or Order Management processes and systems Working knowledge of Lucidcharts or similar process flow documentation software Working knowledge of Smartsheets or other project management software Experience with Salesforce products a plus Exposure to enterprise level, transformational projects Prior experience with New Product Introductions processes, Business Operations, Quote to Cash Operations and/or M&A Operations Unleash Your Potential When you join Salesforce, you’ll be limitless in all areas of your life. Our benefits and resources support you to find balance and be your best, and our AI agents accelerate your impact so you can do your best. Together, we’ll bring the power of Agentforce to organizations of all sizes and deliver amazing experiences that customers love. Apply today to not only shape the future — but to redefine what’s possible — for yourself, for AI, and the world. 
Accommodations If you require assistance due to a disability applying for open positions please submit a request via this Accommodations Request Form. Posting Statement Salesforce is an equal opportunity employer and maintains a policy of non-discrimination with all employees and applicants for employment. What does that mean exactly? It means that at Salesforce, we believe in equality for all. And we believe we can lead the path to equality in part by creating a workplace that’s inclusive, and free from discrimination. Know your rights: workplace discrimination is illegal. Any employee or potential employee will be assessed on the basis of merit, competence and qualifications – without regard to race, religion, color, national origin, sex, sexual orientation, gender expression or identity, transgender status, age, disability, veteran or marital status, political viewpoint, or other classifications protected by law. This policy applies to current and prospective employees, no matter where they are in their Salesforce employment journey. It also applies to recruiting, hiring, job assignment, compensation, promotion, benefits, training, assessment of job performance, discipline, termination, and everything in between. Recruiting, hiring, and promotion decisions at Salesforce are fair and based on merit. The same goes for compensation, benefits, promotions, transfers, reduction in workforce, recall, training, and education.

Posted 1 day ago

Apply

5.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Description GPP Database Link (https://cummins365.sharepoint.com/sites/CS38534/) Job Summary Leads projects for design, development and maintenance of a data and analytics platform. Effectively and efficiently processes, stores and makes data available to analysts and other consumers. Works with key business stakeholders, IT experts and subject-matter experts to plan, design and deliver optimal analytics and data science solutions. Works on one or many product teams at a time. Key Responsibilities Designs and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured). Designs and implements a framework to continuously monitor and troubleshoot data quality and data integrity issues. Implements data governance processes and methods for managing metadata, access and retention of data for internal and external users. Designs and provides guidance on building reliable, efficient, scalable and quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages. Designs and implements physical data models to define the database structure. Optimizes database performance through efficient indexing and table relationships. Participates in optimizing, testing, and troubleshooting of data pipelines. Designs, develops and operates large scale data storage and processing solutions using different distributed and cloud-based platforms for storing data (e.g. Data Lakes, Hadoop, Hbase, Cassandra, MongoDB, Accumulo, DynamoDB, others). Uses innovative and modern tools, techniques and architectures to partially or completely automate the most common, repeatable and tedious data preparation and integration tasks in order to minimize manual and error-prone processes and improve productivity. Assists with renovating the data management infrastructure to drive automation in data integration and management. 
Ensures the timeliness and success of critical analytics initiatives by using agile development technologies such as DevOps, Scrum and Kanban. Coaches and develops less experienced team members. Responsibilities Competencies: System Requirements Engineering - Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule, and resources; creates and maintains information linkages to related artifacts. Collaborates - Building partnerships and working collaboratively with others to meet shared objectives. Communicates effectively - Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences. Customer focus - Building strong customer relationships and delivering customer-centric solutions. Decision quality - Making good and timely decisions that keep the organization moving forward. Data Extraction - Performs data extract-transform-load (ETL) activities from a variety of sources and transforms them for consumption by various downstream applications and users using appropriate tools and technologies. Programming - Creates, writes and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements. Quality Assurance Metrics - Applies the science of measurement to assess whether a solution meets its intended outcomes using the IT Operating Model (ITOM), including the SDLC standards, tools, metrics and key performance indicators, to deliver a quality product. 
Solution Documentation - Documents information and solution based on knowledge gained as part of product development activities; communicates to stakeholders with the goal of enabling improved productivity and effective knowledge transfer to others who were not originally part of the initial learning. Solution Validation Testing - Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements. Data Quality - Identifies, understands and corrects flaws in data that supports effective information governance across operational business processes and decision making. Problem Solving - Solves problems and may mentor others on effective problem solving by using a systematic analysis process by leveraging industry standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies the systemic root causes and ensures actions to prevent problem reoccurrence are implemented. Values differences - Recognizing the value that different perspectives and cultures bring to an organization. Education, Licenses, Certifications College, university, or equivalent degree in relevant technical discipline, or relevant equivalent experience required. This position may require licensing for compliance with export controls or sanctions regulations. Experience Intermediate experience in a relevant discipline area is required. 
Knowledge of the latest technologies and trends in data engineering is highly preferred and includes: 5-8 years of experience Familiarity analyzing complex business systems, industry requirements, and/or data regulations Background in processing and managing large data sets Design and development for a Big Data platform using open source and third-party tools SPARK, Scala/Java, Map-Reduce, Hive, Hbase, and Kafka or equivalent college coursework SQL query language Clustered compute cloud-based implementation experience Experience developing applications requiring large file movement for a Cloud-based environment and other data extraction tools and methods from a variety of sources Experience in building analytical solutions Intermediate Experiences In The Following Are Preferred Experience with IoT technology Experience in Agile software development Qualifications 1) Work closely with business Product Owner to understand product vision. 2) Play a key role across DBU Data & Analytics Power Cells to define, develop data pipelines for efficient data transport into Cummins Digital Core (Azure DataLake, Snowflake). 3) Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment of DBU project data pipeline design standards. 4) Independently design, develop, test, implement complex data pipelines from transactional systems (ERP, CRM) to Datawarehouses, DataLake. 5) Responsible for creation, maintenance and management of DBU Data & Analytics data engineering documentation and standard operating procedures (SOP). 6) Take part in evaluation of new data tools, POCs and provide suggestions. 7) Take full ownership of the developed data pipelines, providing ongoing support for enhancements and performance optimization. 8) Proactively address and resolve issues that compromise data accuracy and usability. Preferred Skills Programming Languages: Proficiency in languages such as Python, Java, and/or Scala. 
Database Management: Expertise in SQL and NoSQL databases. Big Data Technologies: Experience with Hadoop, Spark, Kafka, and other big data frameworks. Cloud Services: Experience with Azure, Databricks and AWS cloud platforms. ETL Processes: Strong understanding of Extract, Transform, Load (ETL) processes. Data Replication: Working knowledge of replication technologies like Qlik Replicate is a plus API: Working knowledge of API to consume data from ERP, CRM Job Systems/Information Technology Organization Cummins Inc. Role Category Remote Job Type Exempt - Experienced ReqID 2417810 Relocation Package Yes
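The extract-transform-load flow this data engineering role centers on can be sketched end to end with only the standard library, using an in-memory SQLite table as a stand-in load target. The source rows and the sales schema are illustrative assumptions, not Cummins systems:

```python
import sqlite3

def extract():
    # Stand-in for reading from an ERP/CRM source system.
    return [("2024-01-01", "1,250"), ("2024-01-02", "980"), ("2024-01-02", "")]

def transform(rows):
    # Drop incomplete records and normalise formatted numeric strings.
    clean = []
    for day, amount in rows:
        if not amount:
            continue
        clean.append((day, int(amount.replace(",", ""))))
    return clean

def load(rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS sales (day TEXT, amount INTEGER)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # → 2230
```

A production pipeline would add the monitoring and alerting the posting asks for (row-count checks, rejected-record logging) around each stage; the three-function split keeps each stage independently testable.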

Posted 2 days ago

Apply

5.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Description GPP Database Link (https://cummins365.sharepoint.com/sites/CS38534/) Job Summary Leads projects for design, development and maintenance of a data and analytics platform. Effectively and efficiently processes, stores and makes data available to analysts and other consumers. Works with key business stakeholders, IT experts and subject-matter experts to plan, design and deliver optimal analytics and data science solutions. Works on one or many product teams at a time. Key Responsibilities Designs and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured). Designs and implements a framework to continuously monitor and troubleshoot data quality and data integrity issues. Implements data governance processes and methods for managing metadata, access and retention of data for internal and external users. Designs and provides guidance on building reliable, efficient, scalable and quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages. Designs and implements physical data models to define the database structure. Optimizes database performance through efficient indexing and table relationships. Participates in optimizing, testing, and troubleshooting of data pipelines. Designs, develops and operates large scale data storage and processing solutions using different distributed and cloud-based platforms for storing data (e.g. Data Lakes, Hadoop, Hbase, Cassandra, MongoDB, Accumulo, DynamoDB, others). Uses innovative and modern tools, techniques and architectures to partially or completely automate the most common, repeatable and tedious data preparation and integration tasks in order to minimize manual and error-prone processes and improve productivity. Assists with renovating the data management infrastructure to drive automation in data integration and management. 
Ensures the timeliness and success of critical analytics initiatives by using agile development technologies such as DevOps, Scrum and Kanban. Coaches and develops less experienced team members. Responsibilities Competencies: System Requirements Engineering - Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule, and resources; creates and maintains information linkages to related artifacts. Collaborates - Building partnerships and working collaboratively with others to meet shared objectives. Communicates effectively - Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences. Customer focus - Building strong customer relationships and delivering customer-centric solutions. Decision quality - Making good and timely decisions that keep the organization moving forward. Data Extraction - Performs data extract-transform-load (ETL) activities from a variety of sources and transforms them for consumption by various downstream applications and users using appropriate tools and technologies. Programming - Creates, writes and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements. Quality Assurance Metrics - Applies the science of measurement to assess whether a solution meets its intended outcomes using the IT Operating Model (ITOM), including the SDLC standards, tools, metrics and key performance indicators, to deliver a quality product. 
Solution Documentation - Documents information and solution based on knowledge gained as part of product development activities; communicates to stakeholders with the goal of enabling improved productivity and effective knowledge transfer to others who were not originally part of the initial learning. Solution Validation Testing - Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements. Data Quality - Identifies, understands and corrects flaws in data that supports effective information governance across operational business processes and decision making. Problem Solving - Solves problems and may mentor others on effective problem solving by using a systematic analysis process by leveraging industry standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies the systemic root causes and ensures actions to prevent problem reoccurrence are implemented. Values differences - Recognizing the value that different perspectives and cultures bring to an organization. Education, Licenses, Certifications College, university, or equivalent degree in relevant technical discipline, or relevant equivalent experience required. This position may require licensing for compliance with export controls or sanctions regulations. Experience Intermediate experience in a relevant discipline area is required. 
Knowledge of the latest technologies and trends in data engineering is highly preferred and includes: 5-8 years of experience Familiarity analyzing complex business systems, industry requirements, and/or data regulations Background in processing and managing large data sets Design and development for a Big Data platform using open source and third-party tools SPARK, Scala/Java, Map-Reduce, Hive, Hbase, and Kafka or equivalent college coursework SQL query language Clustered compute cloud-based implementation experience Experience developing applications requiring large file movement for a Cloud-based environment and other data extraction tools and methods from a variety of sources Experience in building analytical solutions Intermediate Experiences In The Following Are Preferred Experience with IoT technology Experience in Agile software development Qualifications 1) Work closely with business Product Owner to understand product vision. 2) Play a key role across DBU Data & Analytics Power Cells to define, develop data pipelines for efficient data transport into Cummins Digital Core (Azure DataLake, Snowflake). 3) Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment of DBU project data pipeline design standards. 4) Independently design, develop, test, implement complex data pipelines from transactional systems (ERP, CRM) to Datawarehouses, DataLake. 5) Responsible for creation, maintenance and management of DBU Data & Analytics data engineering documentation and standard operating procedures (SOP). 6) Take part in evaluation of new data tools, POCs and provide suggestions. 7) Take full ownership of the developed data pipelines, providing ongoing support for enhancements and performance optimization. 8) Proactively address and resolve issues that compromise data accuracy and usability. Preferred Skills Programming Languages: Proficiency in languages such as Python, Java, and/or Scala. 
Database Management: Expertise in SQL and NoSQL databases. Big Data Technologies: Experience with Hadoop, Spark, Kafka, and other big data frameworks. Cloud Services: Experience with Azure, Databricks and AWS cloud platforms. ETL Processes: Strong understanding of Extract, Transform, Load (ETL) processes. Data Replication: Working knowledge of replication technologies like Qlik Replicate is a plus API: Working knowledge of API to consume data from ERP, CRM Job Systems/Information Technology Organization Cummins Inc. Role Category Remote Job Type Exempt - Experienced ReqID 2417809 Relocation Package Yes

Posted 2 days ago

Apply

4.0 - 5.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Description GPP Database Link (https://cummins365.sharepoint.com/sites/CS38534/) Job Summary Supports, develops and maintains a data and analytics platform. Effectively and efficiently processes, stores and makes data available to analysts and other consumers. Works with the Business and IT teams to understand the requirements to best leverage the technologies to enable agile data delivery at scale. Key Responsibilities Implements and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured). Implements methods to continuously monitor and troubleshoot data quality and data integrity issues. Implements data governance processes and methods for managing metadata, access and retention of data for internal and external users. Develops reliable, efficient, scalable and quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages. Develops physical data models and implements data storage architectures as per design guidelines. Analyzes complex data elements and systems, data flow, dependencies, and relationships in order to contribute to conceptual, physical and logical data models. Participates in testing and troubleshooting of data pipelines. Develops and operates large scale data storage and processing solutions using different distributed and cloud-based platforms for storing data (e.g. Data Lakes, Hadoop, Hbase, Cassandra, MongoDB, Accumulo, DynamoDB, others). Uses agile development technologies, such as DevOps, Scrum, Kanban and continuous improvement cycle, for data driven applications. 
Responsibilities Competencies: System Requirements Engineering - Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule, and resources; creates and maintains information linkages to related artifacts. Collaborates - Building partnerships and working collaboratively with others to meet shared objectives. Communicates effectively - Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences. Customer focus - Building strong customer relationships and delivering customer-centric solutions. Decision quality - Making good and timely decisions that keep the organization moving forward. Data Extraction - Performs data extract-transform-load (ETL) activities from a variety of sources and transforms them for consumption by various downstream applications and users using appropriate tools and technologies. Programming - Creates, writes and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements. Quality Assurance Metrics - Applies the science of measurement to assess whether a solution meets its intended outcomes using the IT Operating Model (ITOM), including the SDLC standards, tools, metrics and key performance indicators, to deliver a quality product.
Solution Documentation - Documents information and solutions based on knowledge gained as part of product development activities; communicates to stakeholders with the goal of enabling improved productivity and effective knowledge transfer to others who were not originally part of the initial learning. Solution Validation Testing - Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements. Data Quality - Identifies, understands and corrects flaws in data that support effective information governance across operational business processes and decision making. Problem Solving - Solves problems and may mentor others on effective problem solving by using a systematic analysis process, leveraging industry standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies the systemic root causes and ensures actions to prevent problem recurrence are implemented. Values differences - Recognizing the value that different perspectives and cultures bring to an organization. Education, Licenses, Certifications College, university, or equivalent degree in a relevant technical discipline, or relevant equivalent experience required. This position may require licensing for compliance with export controls or sanctions regulations. Experience 4-5 years of experience. Relevant experience preferred, such as working in temporary student employment, an internship, a co-op, or other extracurricular team activities.
Knowledge of the latest technologies in data engineering is highly preferred and includes: Exposure to open-source Big Data technologies such as Spark, Scala/Java, MapReduce, Hive, HBase, and Kafka, or equivalent college coursework SQL query language Clustered compute cloud-based implementation experience Familiarity developing applications requiring large file movement for a cloud-based environment Exposure to Agile software development Exposure to building analytical solutions Exposure to IoT technology Qualifications 1) Work closely with the business Product Owner to understand the product vision. 2) Participate in DBU Data & Analytics Power Cells to define and develop data pipelines for efficient data transport into Cummins Digital Core (Azure Data Lake, Snowflake). 3) Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment with DBU project data pipeline design standards. 4) Work under limited supervision to design, develop, test, and implement complex data pipelines from transactional systems (ERP, CRM) to data warehouses and data lakes. 5) Responsible for creation of DBU Data & Analytics data engineering documentation and standard operating procedures (SOPs) with guidance and help from senior data engineers. 6) Take part in evaluation of new data tools and POCs with guidance and help from senior data engineers. 7) Take ownership of the developed data pipelines, providing ongoing support for enhancements and performance optimization under limited supervision. 8) Assist in resolving issues that compromise data accuracy and usability. Programming Languages: Proficiency in languages such as Python, Java, and/or Scala. Database Management: Intermediate-level expertise in SQL and NoSQL databases. Big Data Technologies: Experience with Hadoop, Spark, Kafka, and other big data frameworks. Cloud Services: Experience with Azure, Databricks and AWS cloud platforms. ETL Processes: Strong understanding of Extract, Transform, Load (ETL) processes.
API: Working knowledge of APIs to consume data from ERP and CRM systems. Job Systems/Information Technology Organization Cummins Inc. Role Category Remote Job Type Exempt - Experienced ReqID 2417808 Relocation Package Yes

Posted 2 days ago

Apply

3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Level AI was founded in 2019 and is a Series C startup headquartered in Mountain View, California. Level AI revolutionizes customer engagement by transforming contact centers into strategic assets. Our AI-native platform leverages advanced technologies such as Large Language Models to extract deep insights from customer interactions. By providing actionable intelligence, Level AI empowers organizations to enhance customer experience and drive growth. Consistently updated with the latest AI innovations, Level AI stands as the most adaptive and forward-thinking solution in the industry. Empowering contact center stakeholders with real-time insights, our tech facilitates data-driven decision-making for contact centers, enhancing service levels and agent performance. As a vital team member, your work will involve cutting-edge technologies and will play a high-impact role in shaping the future of AI-driven enterprise applications. You will work directly with people who've worked at Amazon, Facebook, Google, and other leading technology companies. With Level AI, you will get to have fun, learn new things, and grow along with us. Ready to redefine possibilities? Join us! We'd love to explore more about you if you have: A B.E/B.Tech/M.E/M.Tech/PhD from Tier 1 engineering institutes only, with relevant work experience at a top technology company in computer science or mathematics-related fields 3+ years of experience in AI/ML Strong coding skills in Python and familiarity with libraries like LangChain or Transformers Interest in LLMs, agents, and the evolving open-source AI ecosystem Eagerness to learn, experiment, and grow in a fast-paced environment.
Your role at Level AI includes but is not limited to Assist in building LLM-powered agents for internal tools and customer-facing products Support prompt engineering, retrieval-augmented generation (RAG), and tool integrations Collaborate on experiments with open-source and commercial LLMs (e.g., GPT, Claude, Mistral) Help implement and evaluate reasoning, planning, and memory modules for agents Work closely with senior engineers to deploy and monitor AI features in production Bonus Points Experience with open-source LLMs (LLaMA, Mistral, etc.) Basic understanding of vector search, RAG, and prompt engineering concepts Contributions to AI side projects or GitHub repos Exposure to vector databases or retrieval pipelines (e.g., FAISS, Pinecone) To Apply- https://jobs.lever.co/levelai/cc04ab77-6ee3-4078-9cfd-110cda0b1438 To learn more visit : https://thelevel.ai/ Funding : https://www.crunchbase.com/organization/level-ai LinkedIn : https://www.linkedin.com/company/level-ai/
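As a toy illustration of the retrieval step in the RAG work described above: a bag-of-words cosine-similarity retriever. Real systems use embedding models and vector databases such as FAISS or Pinecone (both named in the posting), so this stdlib-only version is purely a sketch; the documents and query are invented examples.

```python
import math
from collections import Counter

def vectorize(text):
    """Bag-of-words term-frequency vector (a crude stand-in for an embedding)."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def retrieve(query, docs, k=1):
    """Return the top-k documents most similar to the query --
    the 'R' in retrieval-augmented generation."""
    q = vectorize(query)
    ranked = sorted(docs, key=lambda d: cosine(q, vectorize(d)), reverse=True)
    return ranked[:k]

docs = [
    "refund policy for delayed shipments",
    "agent performance dashboard metrics",
    "how to reset your account password",
]
top = retrieve("customer asks about password reset", docs)
print(top[0])  # the password document ranks first
```

In a full RAG loop, the retrieved passages would be stuffed into the LLM prompt as grounding context before generation.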

Posted 2 days ago

Apply

4.0 - 7.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Responsible for developing, optimizing, and maintaining business intelligence and data warehouse systems, ensuring secure, efficient data storage and retrieval, enabling self-service data exploration, and supporting stakeholders with insightful reporting and analysis. Grade - T5 Please note that the Job will close at 12am on the Posting Close date, so please submit your application prior to the Close Date. Accountabilities What your main responsibilities are: Data Pipeline - Develop and maintain scalable data pipelines and build out new API integrations to support continuing increases in data volume and complexity Data Integration - Connect offline and online data to continuously improve overall understanding of customer behavior and journeys for personalization. Data pre-processing including collecting, parsing, managing, analyzing and visualizing large sets of data Data Quality Management - Cleanse the data and improve data quality and readiness for analysis. Drive standards, define and implement/improve data governance strategies and enforce best practices to scale data analysis across platforms Data Transformation - Process data by cleansing it and transforming it into the proper storage structure for the purpose of querying and analysis using ETL and ELT processes Data Enablement - Ensure data is accessible and usable to the wider enterprise to enable a deeper and more timely understanding of operations. Qualifications & Specifications Master's/Bachelor's degree in Engineering/Computer Science/Math/Statistics or equivalent. Strong programming skills in Python/PySpark/SAS. Proven experience with large data sets and related technologies – Hadoop, Hive, distributed computing systems, Spark optimization. Experience on cloud platforms (preferably Azure) and its services: Azure Data Factory (ADF), ADLS Storage, Azure DevOps. Hands-on experience with Databricks, Delta Lake, Workflows.
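The Data Quality Management accountability above can be sketched with a small stdlib-only cleansing routine. The column names and rules are illustrative assumptions; at the data volumes this role describes, the same logic would typically run in PySpark rather than plain Python.

```python
def cleanse(records):
    """Deduplicate, drop incomplete rows, and normalize fields so that
    downstream analysis sees consistent, ready-to-query data."""
    seen = set()
    clean = []
    for rec in records:
        # Drop rows with missing mandatory fields
        if not rec.get("customer_id") or rec.get("country") is None:
            continue
        # Normalize before deduplication so case/whitespace variants collapse
        key = (rec["customer_id"], rec["country"].strip().upper())
        if key in seen:
            continue
        seen.add(key)
        clean.append({"customer_id": rec["customer_id"],
                      "country": rec["country"].strip().upper()})
    return clean

raw = [
    {"customer_id": "C1", "country": "in "},
    {"customer_id": "C1", "country": "IN"},   # duplicate after normalization
    {"customer_id": None, "country": "US"},   # missing mandatory field
    {"customer_id": "C2", "country": "us"},
]
print(cleanse(raw))  # two unique, normalized records
```

Encoding rules like these as code (rather than ad hoc fixes) is what lets a team "drive standards" and enforce governance consistently across pipelines.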
Should have knowledge of DevOps process and tools like Docker, CI/CD, Kubernetes, Terraform, Octopus. Hands-on experience with SQL and data modeling to support the organization's data storage and analysis needs. Experience on any BI tool like Power BI (Good to have). Cloud migration experience (Good to have) Cloud and Data Engineering certification (Good to have) Working in an Agile environment 4-7 years of relevant work experience needed. Experience with stakeholder management will be an added advantage. What We Are Looking For Education: Bachelor's degree or equivalent in Computer Science, MIS, Mathematics, Statistics, or similar discipline. Master's degree or PhD preferred. Knowledge, Skills And Abilities Fluency in English Analytical Skills Accuracy & Attention to Detail Numerical Skills Planning & Organizing Skills Presentation Skills Data Modeling and Database Design ETL (Extract, Transform, Load) Skills Programming Skills FedEx was built on a philosophy that puts people first, one we take seriously. We are an equal opportunity/affirmative action employer and we are committed to a diverse, equitable, and inclusive workforce in which we enforce fair treatment, and provide growth opportunities for everyone. All qualified applicants will receive consideration for employment regardless of age, race, color, national origin, genetics, religion, gender, marital status, pregnancy (including childbirth or a related medical condition), physical or mental disability, or any other characteristic protected by applicable laws, regulations, and ordinances. Our Company FedEx is one of the world's largest express transportation companies and has consistently been selected as one of the top 10 World’s Most Admired Companies by "Fortune" magazine. Every day FedEx delivers for its customers with transportation and business solutions, serving more than 220 countries and territories around the globe. 
We can serve this global network due to our outstanding team of FedEx team members, who are tasked with making every FedEx experience outstanding. Our Philosophy The People-Service-Profit philosophy (P-S-P) describes the principles that govern every FedEx decision, policy, or activity. FedEx takes care of our people; they, in turn, deliver the impeccable service demanded by our customers, who reward us with the profitability necessary to secure our future. The essential element in making the People-Service-Profit philosophy such a positive force for the company is where we close the circle, and return these profits back into the business, and invest back in our people. Our success in the industry is attributed to our people. Through our P-S-P philosophy, we have a work environment that encourages team members to be innovative in delivering the highest possible quality of service to our customers. We care for their well-being, and value their contributions to the company. Our Culture Our culture is important for many reasons, and we intentionally bring it to life through our behaviors, actions, and activities in every part of the world. The FedEx culture and values have been a cornerstone of our success and growth since we began in the early 1970’s. While other companies can copy our systems, infrastructure, and processes, our culture makes us unique and is often a differentiating factor as we compete and grow in today’s global marketplace.

Posted 2 days ago

Apply

20.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Over the last 20 years, Ares’ success has been driven by our people and our culture. Today, our team is guided by our core values – Collaborative, Responsible, Entrepreneurial, Self-Aware, Trustworthy – and our purpose to be a catalyst for shared prosperity and a better future. Through our recruitment, career development and employee-focused programming, we are committed to fostering a welcoming and inclusive work environment where high-performance talent of diverse backgrounds, experiences, and perspectives can build careers within this exciting and growing industry. Job Description Role summary: Ares is looking for an Associate Vice President / Senior Associate to join the Mumbai Investment Operations team. The Investment Operations team works closely with business stakeholders in various lines of business, as well as various corporate functions. The ideal candidate will be responsible for overseeing the loan operations team, fund admins, custodians, etc., as well as processing all credit activity and restructures in WSO for loans across various business lines. Other responsibilities include research and escalation of loan operations issues and breaks, working in partnership with the Loan Settlements/Servicing teams in Los Angeles. Candidates must have practical experience with the loan closing and loan servicing processes; processing experience in Wall Street Office is preferred. Ares, as an alternative asset manager, has an asset mix which is comprehensive and heavily concentrated in bank debt. The ideal candidate would have experience working with diverse lines of business for a global client base including pensions, insurance, and institutional investors. The role requires a dynamic, adaptive, experienced hands-on professional to ensure best practices in a fast-paced, rapidly growing environment.
Primary Responsibilities Specific responsibilities include, but are not limited to: Serve as primary escalation contact and day-to-day manager for the loan operations team in Mumbai Facilitate training and provide ongoing support for the local team Coordinate, process, and reconcile the processing of all daily servicing events, including amendments and restructures (preparation of transaction loaders, reviewing funds flows, and more) Oversee and manage loan processing in WSO for all deals Direct third-party fund administrators and custodian banks on appropriate processing and review/reconcile processing output for accuracy, including restructures, multicurrency facility processing, non-pro rata activity, principal repayments with fees, etc. Daily review of credit events with third-party administrators and custodian banks Act as first point of escalation for high-risk breaks and identify areas for issue prevention Review daily recons between internal systems and third parties to resolve discrepancies Coordinate loan operations related audit requests Prepare KPIs on a regular basis and participate in ad hoc projects Maintain a high standard of quality controls, and work with internal and external stakeholders to enhance loan operations workflows Liaise with local finance teams, offshore partners, deal teams, investment accounting, middle office, treasury, and trustees for all portfolio-specific activity and issues, ensuring cross-communication of critical information between firm departments Manage oversight of all UK-based agents and custodians to resolve loan related issues in a timely manner Experience Required Experience in high-quality, global capital markets or investment management firms with expertise in Investment Operations and Asset Servicing related functions. Experience in Investment Operations in any middle office or back-office function. Prior experience with an alternative asset manager is preferred; broader asset management experience is also a plus.
Strong knowledge of bank loans primarily, with the willingness to cross-train and learn various asset classes Must have experience with the loan closing process in ClearPar and the loan servicing process in Wall Street Office; experience with Black Mountain (Allvue), Everest, Geneva, and/or IVP data management platforms is also preferred. Understanding of basic accounting theories. Loan Operations experience in private/middle market loans preferred, but not required. Experienced with a diverse set of investment vehicles such as Institutional Separate Accounts, SMA/Limited Partnerships, Open-End Mutual Funds, Closed-End Funds and UCITS, CLOs, and complex fund structures. Hedge fund, Credit or Private Equity experience is a plus. General Requirements Ability to extract meaningful information from extensive research and analysis to effectively present facts and findings in a digestible format, with a keen eye for attention to detail A self-directed individual with a can-do attitude, willing to work in an energetic, collaborative, and fast-paced environment, proactive in nature, and a proven ability to resolve issues with minimal supervision Proven outstanding communication (written and verbal), presentation, documentation, collaboration, and interpersonal skills A hands-on approach and ability to synthesize business operations and talent needs Ability to successfully manage multiple priorities and competing demands High accuracy and detail orientation Good judgment in terms of escalating issues vs.
solving problems independently A solutions-oriented, self-starter and ability to see the big picture Comfort in dealing with ambiguity and uncertainty in a dynamic and fast-paced environment Ability to be flexible in terms of hours to coordinate with team members across various time zones An analytical mind and a passion/interest in bringing new ideas to increase efficiency of existing processes Dependable, great attitude, highly motivated and a team player Strong Leadership Skills Reporting Relationships Associate Vice President, Global Asset Servicing & Reconciliation There is no set deadline to apply for this job opportunity. Applications will be accepted on an ongoing basis until the search is no longer active.

Posted 2 days ago

Apply

4.0 - 7.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Responsible for developing, optimizing, and maintaining business intelligence and data warehouse systems, ensuring secure, efficient data storage and retrieval, enabling self-service data exploration, and supporting stakeholders with insightful reporting and analysis. Grade - T5 Please note that the Job will close at 12am on the Posting Close date, so please submit your application prior to the Close Date. Accountabilities What your main responsibilities are: Data Pipeline - Develop and maintain scalable data pipelines and build out new API integrations to support continuing increases in data volume and complexity Data Integration - Connect offline and online data to continuously improve overall understanding of customer behavior and journeys for personalization. Data pre-processing including collecting, parsing, managing, analyzing and visualizing large sets of data Data Quality Management - Cleanse the data and improve data quality and readiness for analysis. Drive standards, define and implement/improve data governance strategies and enforce best practices to scale data analysis across platforms Data Transformation - Process data by cleansing it and transforming it into the proper storage structure for the purpose of querying and analysis using ETL and ELT processes Data Enablement - Ensure data is accessible and usable to the wider enterprise to enable a deeper and more timely understanding of operations. Qualifications & Specifications Master's/Bachelor's degree in Engineering/Computer Science/Math/Statistics or equivalent. Strong programming skills in Python/PySpark/SAS. Proven experience with large data sets and related technologies – Hadoop, Hive, distributed computing systems, Spark optimization. Experience on cloud platforms (preferably Azure) and its services: Azure Data Factory (ADF), ADLS Storage, Azure DevOps. Hands-on experience with Databricks, Delta Lake, Workflows.
Should have knowledge of DevOps process and tools like Docker, CI/CD, Kubernetes, Terraform, Octopus. Hands-on experience with SQL and data modeling to support the organization's data storage and analysis needs. Experience on any BI tool like Power BI (Good to have). Cloud migration experience (Good to have) Cloud and Data Engineering certification (Good to have) Working in an Agile environment 4-7 years of relevant work experience needed. Experience with stakeholder management will be an added advantage. What We Are Looking For Education: Bachelor's degree or equivalent in Computer Science, MIS, Mathematics, Statistics, or similar discipline. Master's degree or PhD preferred. Knowledge, Skills And Abilities Fluency in English Analytical Skills Accuracy & Attention to Detail Numerical Skills Planning & Organizing Skills Presentation Skills Data Modeling and Database Design ETL (Extract, Transform, Load) Skills Programming Skills FedEx was built on a philosophy that puts people first, one we take seriously. We are an equal opportunity/affirmative action employer and we are committed to a diverse, equitable, and inclusive workforce in which we enforce fair treatment, and provide growth opportunities for everyone. All qualified applicants will receive consideration for employment regardless of age, race, color, national origin, genetics, religion, gender, marital status, pregnancy (including childbirth or a related medical condition), physical or mental disability, or any other characteristic protected by applicable laws, regulations, and ordinances. Our Company FedEx is one of the world's largest express transportation companies and has consistently been selected as one of the top 10 World’s Most Admired Companies by "Fortune" magazine. Every day FedEx delivers for its customers with transportation and business solutions, serving more than 220 countries and territories around the globe. 
We can serve this global network due to our outstanding team of FedEx team members, who are tasked with making every FedEx experience outstanding. Our Philosophy The People-Service-Profit philosophy (P-S-P) describes the principles that govern every FedEx decision, policy, or activity. FedEx takes care of our people; they, in turn, deliver the impeccable service demanded by our customers, who reward us with the profitability necessary to secure our future. The essential element in making the People-Service-Profit philosophy such a positive force for the company is where we close the circle, and return these profits back into the business, and invest back in our people. Our success in the industry is attributed to our people. Through our P-S-P philosophy, we have a work environment that encourages team members to be innovative in delivering the highest possible quality of service to our customers. We care for their well-being, and value their contributions to the company. Our Culture Our culture is important for many reasons, and we intentionally bring it to life through our behaviors, actions, and activities in every part of the world. The FedEx culture and values have been a cornerstone of our success and growth since we began in the early 1970’s. While other companies can copy our systems, infrastructure, and processes, our culture makes us unique and is often a differentiating factor as we compete and grow in today’s global marketplace.

Posted 2 days ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics. Do You Dream Big? We Need You. Job Title: Data Scientist Location: Bangalore Reporting to: Senior Manager Analytics 1. Purpose of the role The Data Scientist will play a key role in designing and delivering data-driven solutions that enable better decision-making across the organization. This role requires strong hands-on coding skills in Python, experience with core data science libraries, and the ability to statistically validate features and models. The analyst will collaborate across teams, work efficiently with existing codebases, and apply version control and development best practices to build scalable, production-ready analytics solutions. With intermediate SQL expertise and a solid grasp of model development workflows, the role ensures robust, interpretable, and actionable outcomes from complex data. 2. Key Tasks and Accountabilities Develop and maintain data science models using Python, applying intermediate to advanced knowledge of syntax, data structures, and key libraries such as pandas and NumPy. Perform feature engineering and statistical validation of features and models to ensure robustness, accuracy, and business relevance. Write clean, modular, and well-documented code following best development practices; optionally adopt Test-Driven Development (TDD) to enable faster iteration and feedback cycles. Collaborate with cross-functional teams to understand data requirements, align on analytical solutions, and translate business problems into data science problems. Read, understand, and extend existing codebases, adapting quickly to different coding styles and project structures.
Leverage version control tools like Git for collaborative development, code management, and maintaining reproducibility of models. Write and optimize intermediate-level SQL queries to extract, transform, and analyze data from structured databases. Contribute to the deployment readiness of models, ensuring outputs are interpretable, reusable, and aligned with production or decision-support use cases. Document processes, assumptions, and outputs clearly for stakeholder transparency, reproducibility, and future reference. Stay up to date with industry trends, new tools, and emerging best practices in data science, analytics, and development methodologies. 3. Qualifications, Experience, Skills Bachelor's or Master's degree in Computer Science, Information Systems, Artificial Intelligence, Machine Learning, or a related field (B.Tech / BE / Masters in CS/IS/AI/ML). Work experience: Minimum of 3 years of hands-on experience in a data science or analytics role, with a proven track record of building and deploying data-driven solutions in real-world scenarios. Technical Skills Required: Python Programming (Intermediate to Advanced): Strong grasp of syntax, data structures, and experience with libraries like pandas and NumPy. Data Science Fundamentals: Ability to statistically validate features and models, ensuring sound analytical rigor. SQL (Intermediate): Proficiency in writing queries to extract, manipulate, and analyze data from relational databases. Version Control (Git): Familiarity with collaborative development using Git for code versioning and management. Code Adaptability: Comfortable working with and modifying existing codebases written by others. Good-to-have skills: Object-Oriented Programming (OOP) in Python: Understanding and applying OOP concepts where appropriate. Test-Driven Development (TDD): Awareness of TDD practices for faster iteration and improved code quality.
Model Deployment Lifecycle Knowledge: Familiarity with reproducibility, tracking, and maintaining deployed models (though not explicitly required, it’s a plus if known). And above all of this, an undying love for beer! We dream big to create a future with more cheers.
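The "statistically validate features" requirement above can be illustrated with a stdlib-only permutation test: it estimates whether a candidate feature separates two classes better than chance. The feature values are synthetic and the 0.05 threshold is a conventional choice, not something the posting specifies; in practice this would use scipy or statsmodels.

```python
import random
import statistics

def mean_diff(a, b):
    return statistics.mean(a) - statistics.mean(b)

def permutation_p_value(pos, neg, n_iter=2000, seed=7):
    """Estimate how often a random relabeling of the samples produces a
    class-mean gap at least as large as the observed one."""
    rng = random.Random(seed)
    observed = abs(mean_diff(pos, neg))
    pooled = pos + neg
    extreme = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        perm = abs(mean_diff(pooled[:len(pos)], pooled[len(pos):]))
        if perm >= observed:
            extreme += 1
    return extreme / n_iter

# Synthetic feature values for churned vs. retained customers (illustrative)
churned  = [4.1, 3.8, 4.5, 4.0, 3.9, 4.3]
retained = [2.0, 2.4, 1.9, 2.2, 2.1, 2.5]
p = permutation_p_value(churned, retained)
print(p)  # a small p-value suggests the feature is worth keeping
```

A check like this, run per candidate feature, is what separates "statistically validated" feature engineering from simply throwing columns at a model.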

Posted 2 days ago

Apply

0.0 - 1.0 years

0 - 0 Lacs

Kochi, Kerala

Remote

Job Title: Data Analyst and CRM Support Location: Palarivattom (Hybrid) Company: 11X Company Experience: 0–1 Year Gender Preference: Female Candidates Only Employment Type: Full-Time Job Summary: We are seeking a detail-oriented and analytical Data Analyst with exposure to CRM tools to join our growing team at 11X Company, Kerala. The ideal candidate will be responsible for collecting, processing, and analyzing data to help optimize our customer relationship strategies and business decisions. Key Responsibilities: Analyze CRM data to extract insights on customer behavior and campaign performance Assist in maintaining and updating CRM databases and dashboards Prepare regular reports and presentations for internal teams Identify trends, patterns, and areas of improvement using data analytics tools Collaborate with marketing, sales, and operations teams to streamline data flow and improve CRM effectiveness Ensure data accuracy and assist in data cleansing tasks Support ad hoc data requests from various departments Key Requirements: Bachelor’s degree in Data Science, Statistics, Computer Science, Business Analytics, or a related field 0–1 year of experience in data analysis or CRM support Familiarity with CRM tools like Zoho, Salesforce, HubSpot, or similar platforms Proficiency in MS Excel, Google Sheets, and basic knowledge of SQL or data visualization tools (Power BI/Tableau) Strong analytical and problem-solving skills Attention to detail and a proactive mindset Good communication skills and ability to collaborate with cross-functional teams Preferred Attributes: Willingness to learn and grow in a data-driven environment Time management and multitasking capabilities Female candidates preferred as per team diversity goals Job Type: Full-time Pay: ₹20,000.00 - ₹25,000.00 per month Benefits: Work from home Schedule: Evening shift Night shift Rotational shift Application Question(s): Do you have experience with advanced Excel?
Language: English (Required) Work Location: Remote

Posted 2 days ago

Apply