2.0 years
0 Lacs
Gujarat, India
Remote
About The Job

About CloudLabs: CloudLabs Inc was founded in 2014 with the mission to provide exceptional IT & business consulting services at a competitive price, helping clients realize the best value from their investments. Within a short span, CloudLabs evolved from pure-play consulting into a transformative partner for Business Acceleration Advisory, Transformative Application Development & Managed Services, enabling digital transformations, M&A transitions, automation and process-driven optimizations, and complex integration initiatives for enterprises across the globe. As a strategic planning and implementation partner for global companies, CloudLabs has seen a 200% uptake in winning high-value, high-impact, and high-risk projects that are critical for the business. With offices in the US, Canada, Mexico, and India, and a team of 200+ experienced specialists, CloudLabs is now at an inflection point and ready for its next curve of progress.

What We Offer
We welcome candidates rejoining the workforce after a career break or parental leave and support their journey to reacclimatize to corporate life. Flexible remote work. Competitive pay package. Attractive policies, medical insurance benefits, and industry-leading training.

Experience: Minimum 2-3 years of relevant experience.
Job Type: Onsite
Location: Gujarat

Job Description
We are looking for a motivated and technically sound Data Engineer with 2 to 3 years of experience to join our data engineering team. The ideal candidate will have a solid understanding of database systems, strong SQL/PL/SQL skills, and a willingness to grow in modern cloud data technologies like Snowflake.

Duties And Responsibilities
Design, develop, and maintain robust data pipelines and workflows.
Write optimized SQL/PL/SQL scripts to extract, transform, and load data.
Support data integration across systems and ensure high data quality.
Collaborate with cross-functional teams to understand data needs and deliver solutions.
Participate in performance tuning, data modeling, and code reviews.
Continuously explore and adopt cloud data technologies to improve systems and workflows.
Ensure timely delivery of data solutions and documentation.
Work from the Gujarat office (minimum 4 days per week) as part of a collaborative team environment.

What We're Looking For
2 to 3 years of experience in data engineering or database development roles.
Strong understanding of database concepts and relational data modeling.
Ability to write and troubleshoot complex SQL and PL/SQL queries.
Hands-on experience in Python.
This role requires working from our Gujarat office 4 days a week.

Preferred But Not Required Qualifications
Exposure to ETL processes and tools.
Experience working with Snowflake or other cloud data warehouse platforms.
Strong written and verbal communication skills.
Willingness to learn and complete certifications in cloud data warehouse technologies (e.g., Snowflake) with minimal supervision.

(ref:hirist.tech)
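The core task this posting describes is writing SQL scripts to extract, transform, and load data. As a rough, minimal sketch of that pattern (the table and column names here are hypothetical, not from the posting), an ETL step can be demonstrated in Python with the standard-library sqlite3 module:

```python
import sqlite3

def run_minimal_etl(conn):
    """Extract raw rows, transform them (normalize names, drop bad
    records), and load them into a clean reporting table.
    All table/column names are hypothetical examples."""
    cur = conn.cursor()
    # Extract: read rows from a (hypothetical) staging table.
    rows = cur.execute("SELECT customer, amount FROM raw_orders").fetchall()
    # Transform: trim/uppercase names, discard non-positive amounts.
    cleaned = [(name.strip().upper(), amount)
               for name, amount in rows if amount and amount > 0]
    # Load: write the cleaned rows into the reporting table.
    cur.execute("CREATE TABLE IF NOT EXISTS orders_clean (customer TEXT, amount REAL)")
    cur.executemany("INSERT INTO orders_clean VALUES (?, ?)", cleaned)
    conn.commit()
    return len(cleaned)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE raw_orders (customer TEXT, amount REAL)")
    conn.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                     [(" alice ", 10.0), ("bob", -5.0), ("carol", 2.5)])
    conn.commit()
    print(run_minimal_etl(conn))  # 2 (the negative-amount row is dropped)
```

In a production role like the one described, the same extract/transform/load shape would typically be expressed directly in SQL or PL/SQL against Oracle or Snowflake rather than in application code.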
Posted 1 day ago
6.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Microsoft is a company where passionate innovators come to collaborate, envision what can be, and take their careers further. This is a world of more possibilities, more innovation, more openness, and sky's-the-limit thinking in a cloud-enabled world. Microsoft's Azure Data engineering team is leading the transformation of analytics in the world of data with products like databases, data integration, big data analytics, messaging & real-time analytics, and business intelligence. The products in our portfolio include Microsoft Fabric, Azure SQL DB, Azure Cosmos DB, Azure PostgreSQL, Azure Data Factory, Azure Synapse Analytics, Azure Service Bus, Azure Event Grid, and Power BI. Our mission is to build the data platform for the age of AI, powering a new class of data-first applications and driving a data culture. Within Azure Data, the data integration team builds data gravity on the Microsoft Cloud. Massive volumes of data are generated – not just from transactional systems of record, but also from the world around us. Our data integration products – Azure Data Factory and Power Query – make it easy for customers to bring in, clean, shape, and join data, to extract intelligence. We're the team that developed the Mashup Engine (M) and Power Query. We already ship monthly to millions of users across Excel, Power/Pro BI, Flow, and PowerApps; but in many ways we're just getting started.
We're building new services, experiences, and engine capabilities that will broaden the reach of our technologies to several new areas: data "intelligence", large-scale data analytics, and automated data integration workflows. We plan to use example-based interaction, machine learning, and innovative visualization to make data access and transformation even more intuitive for non-technical users. We do not just value differences or different perspectives. We seek them out and invite them in so we can tap into the collective power of everyone in the company. As a result, our customers are better served.

Responsibilities
Engine layer: designing and implementing components for dataflow orchestration, distributed querying, query translation, connecting to external data sources, and script parsing/interpretation
Service layer: designing and implementing infrastructure for a containerized, microservices-based, high-throughput architecture
UI layer: designing and implementing performant, engaging web user interfaces for data visualization/exploration/transformation/connectivity and dataflow management
Embody our culture and values

Qualifications

Required/Minimum Qualifications
Bachelor's Degree in Computer Science or a related technical discipline AND 6+ years of technical engineering experience with coding in languages including, but not limited to, C, C++, C#, Java, JavaScript, or Python, OR equivalent experience
Experience in data integration, migrations, or ELT/ETL tooling is mandatory

Preferred/Additional Qualifications
BS degree in Computer Science
Engine role: familiarity with data access technologies (e.g. ODBC, JDBC, OLEDB, ADO.Net, OData), query languages (e.g. T-SQL, Spark SQL, Hive, MDX, DAX), query generation/optimization, OLAP
UI role: familiarity with JavaScript, TypeScript, CSS, React, Redux, webpack
Service role: familiarity with micro-service architectures, Docker, Service Fabric, Azure blobs/tables/databases, high-throughput services
Full-stack role: a mix of the qualifications for the UX/service/backend roles

Other Requirements
Ability to meet Microsoft, customer and/or government security screening requirements is required for this role. These requirements include, but are not limited to, the following specialized security screening: Microsoft Cloud Background Check: This position will be required to pass the Microsoft Cloud background check upon hire/transfer and every two years thereafter.

Equal Opportunity Employer (EOE)

#azdat #azuredata #microsoftfabric #dataintegration

Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
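The engine role above mentions query translation and generation. As a very simplified, illustrative sketch (this is not Microsoft's implementation; the filter format and function name are invented), translating a structured filter description into a parameterized SQL query might look like:

```python
def filters_to_sql(table, filters):
    """Translate a list of (column, op, value) filters into a
    parameterized SQL query string plus a parameter list.
    The (column, op, value) format is a hypothetical example."""
    allowed_ops = {"eq": "=", "lt": "<", "gt": ">"}
    clauses, params = [], []
    for column, op, value in filters:
        if op not in allowed_ops:
            raise ValueError(f"unsupported operator: {op}")
        # Use placeholders so values never get spliced into the SQL text.
        clauses.append(f"{column} {allowed_ops[op]} ?")
        params.append(value)
    where = " AND ".join(clauses) if clauses else "1=1"
    return f"SELECT * FROM {table} WHERE {where}", params
```

For example, `filters_to_sql("orders", [("amount", "gt", 100), ("region", "eq", "west")])` yields `("SELECT * FROM orders WHERE amount > ? AND region = ?", [100, "west"])`. Real engines such as Power Query's also handle dialect differences, expression folding, and optimization, which this sketch omits entirely.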
Posted 1 day ago
3.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Level AI was founded in 2019 and is a Series C startup headquartered in Mountain View, California. Level AI revolutionizes customer engagement by transforming contact centers into strategic assets. Our AI-native platform leverages advanced technologies such as Large Language Models to extract deep insights from customer interactions. By providing actionable intelligence, Level AI empowers organizations to enhance customer experience and drive growth. Consistently updated with the latest AI innovations, Level AI stands as the most adaptive and forward-thinking solution in the industry.

About the Role: As an Implementation Manager, you will lead our clients' onboarding and implementation process, ensuring they unlock the full potential of Level AI to enhance the customer experience. You will be responsible for understanding client business requirements, facilitating data integrations, and configuring and training on the Level AI products, including Auto-QA, Analytics, Voice of the Customer, Agent Assist, and Screen Recording, among others, all while driving efficient time to value.
Key Responsibilities:
Serve as the primary point of contact for key client accounts, building and maintaining strong relationships with clients.
Successfully handle onboarding of multiple clients simultaneously.
Understand clients' business objectives.
Understand clients' technical requirements, which may require leading technical discovery sessions to ensure that our AI-powered customer support solutions are configured appropriately to meet their needs.
Collaborate with internal teams, including sales, product, engineering, and customer support, to address client needs and resolve technical issues.
Develop and maintain a deep understanding of our AI-powered customer support solutions, and effectively communicate technical information to clients.
Identify opportunities for upselling and cross-selling our solutions to existing clients.
Track and report on key account metrics, such as customer satisfaction and product usage, and use this information to drive improvements in our solutions.

Requirements:
Bachelor's degree in Computer Science, Information Systems, or a related field, OR equivalent experience
3+ years of experience in a hands-on technical role; 1-2+ years of experience delivering successful customer implementations
Strong technical background with knowledge of SaaS platforms, APIs, and cloud services
Excellent project management skills with the ability to juggle multiple projects simultaneously
Ability to translate complex concepts into actionable items for non-technical stakeholders
Strong communication skills in English (both written and verbal)
Entrepreneurial and problem-solving attitude: self-motivated, adaptable, and resourceful in tackling implementation challenges
Comfortable working in US hours

Optional Requirements:
Experience interacting with APIs and using cloud services
Experience with integrating with CRMs such as Salesforce
Familiarity with intent-based and generative artificial intelligence
Experience with telephony systems such as AWS Connect, Five9, and Genesys
Posted 1 day ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Microsoft's Azure Data engineering team is leading the transformation of analytics in the world of data with products like databases, data integration, big data analytics, messaging & real-time analytics, and business intelligence. The products in our portfolio include Microsoft Fabric, Azure SQL DB, Azure Cosmos DB, Azure PostgreSQL, Azure Data Factory, Azure Synapse Analytics, Azure Service Bus, Azure Event Grid, and Power BI. Our mission is to build the data platform for the age of AI, powering a new class of data-first applications and driving a data culture. Within Azure Data, the data integration team builds data gravity on the Microsoft Cloud. Massive volumes of data are generated – not just from transactional systems of record, but also from the world around us. Our data integration products – Azure Data Factory and Power Query – make it easy for customers to bring in, clean, shape, and join data, to extract intelligence. We do not just value differences or different perspectives. We seek them out and invite them in so we can tap into the collective power of everyone in the company. As a result, our customers are better served.

Responsibilities
Build cloud-scale products with a focus on efficiency, reliability, and security
Build and maintain end-to-end build, test, and deployment pipelines
Deploy and manage massive Hadoop, Spark, and other clusters
Contribute to the architecture and design of the products
Triage issues and implement solutions to restore service with minimal disruption to the customer and business
Perform root cause analysis, trend analysis, and post-mortems
Own components and drive them end to end, from gathering requirements through development, testing, and deployment, to ensuring high quality and availability post-deployment
Embody our culture and values

Qualifications

Required/Minimum Qualifications
Bachelor's Degree in Computer Science or a related technical discipline AND 4+ years of technical engineering experience with coding in languages and frameworks like C#, React, Redux, TypeScript, JavaScript, Java, or Python, OR equivalent experience
Experience in data integration, data migrations, or ELT/ETL tooling is mandatory

Other Requirements
Ability to meet Microsoft, customer and/or government security screening requirements is required for this role. These requirements include, but are not limited to, the following specialized security screening: Microsoft Cloud Background Check: This position will be required to pass the Microsoft Cloud background check upon hire/transfer and every two years thereafter.

Equal Opportunity Employer (EOE)

#azdat #azuredata #microsoftfabric #dataintegration

Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
Posted 1 day ago
0 years
0 Lacs
Navi Mumbai, Maharashtra, India
On-site
At Iron Mountain we know that work, when done well, makes a positive impact for our customers, our employees, and our planet. That's why we need smart, committed people to join us. Whether you're looking to start your career or make a change, talk to us and see how you can elevate the power of your work at Iron Mountain. We provide expert, sustainable solutions in records and information management, digital transformation services, data centers, asset lifecycle management, and fine art storage, handling, and logistics. We proudly partner every day with our 225,000 customers around the world to preserve their invaluable artifacts, extract more from their inventory, and protect their data privacy in innovative and socially responsible ways. Are you curious about being part of our growth story while evolving your skills in a culture that will welcome your unique contributions? If so, let's start the conversation.

About The Role
As a Management Trainee – Data Analysis, you will play a key role in transforming raw data into actionable business insights. This role is designed to give you hands-on experience across the data lifecycle, from collection and analysis to visualization and reporting, while helping drive decision-making and operational excellence. You will work closely with cross-functional teams, support business intelligence initiatives, and contribute to process improvements. This is an excellent opportunity for someone looking to build a strong foundation in data analytics, MIS reporting, and business operations.
Key Responsibilities

Data Analysis
Collect, clean, and analyze data from various sources to provide actionable insights for process optimization and business improvement
Identify trends, patterns, and anomalies in data and present findings clearly and effectively

MIS Development
Design, develop, and maintain MIS reports and dashboards to monitor key performance indicators (KPIs), operational metrics, and business metrics
Ensure data accuracy and completeness through regular validation and updates

Data Visualization
Create clear and engaging visualizations using tools like Tableau, Power BI, or similar platforms
Present complex data in an easy-to-understand format for stakeholders and leadership

Data Hosting and Management
Manage data repositories and databases, ensuring security, integrity, and accessibility for authorized users
Implement data governance protocols and ensure compliance with relevant data privacy standards

Collaboration and Communication
Collaborate with cross-functional teams to understand data and reporting needs
Translate analytical findings into business insights for non-technical stakeholders

Process Improvement
Identify and implement improvements in data collection, processing, and reporting workflows
Develop tools or methods to enhance data efficiency and accuracy

Qualifications
Master's degree in a relevant field (e.g., Data Analytics, Statistics, Business, or Engineering)
Proven experience in data analysis, visualization, and MIS development, preferably in operations or business support functions
Proficiency in tools and platforms such as Google Suite (Sheets, Slides, Data Studio), Excel, and dashboards
Strong analytical and problem-solving skills with a keen eye for detail
Ability to work both independently and in a collaborative team setting
Excellent verbal and written communication skills
Knowledge of data security, governance, and compliance standards
Experience with data visualization platforms like Tableau, Power BI, or similar tools (preferred)
Experience in database management and data warehousing (preferred)

Category: Administrative Services

Iron Mountain is a global leader in storage and information management services trusted by more than 225,000 organizations in 60 countries. We safeguard billions of our customers' assets, including critical business information, highly sensitive data, and invaluable cultural and historic artifacts. Take a look at our history here. Iron Mountain helps lower cost and risk, comply with regulations, recover from disaster, and enable digital and sustainable solutions, whether in information management, digital transformation, secure storage and destruction, data center operations, cloud services, or art storage and logistics. Please see our Values and Code of Ethics for a look at our principles and aspirations in elevating the power of our work together. If you have a physical or mental disability that requires special accommodations, please let us know by sending an email to accommodationrequest@ironmountain.com. See the Supplement to learn more about Equal Employment Opportunity. Iron Mountain is committed to a policy of equal employment opportunity. We recruit and hire applicants without regard to race, color, religion, sex (including pregnancy), national origin, disability, age, sexual orientation, veteran status, genetic information, gender identity, gender expression, or any other factor prohibited by law. To view the Equal Employment Opportunity is the Law posters and the supplement, as well as the Pay Transparency Policy Statement, CLICK HERE

Requisition: J0090426
Posted 1 day ago
10.0 years
0 Lacs
Gurugram, Haryana, India
On-site
At Iron Mountain we know that work, when done well, makes a positive impact for our customers, our employees, and our planet. That's why we need smart, committed people to join us. Whether you're looking to start your career or make a change, talk to us and see how you can elevate the power of your work at Iron Mountain. We provide expert, sustainable solutions in records and information management, digital transformation services, data centers, asset lifecycle management, and fine art storage, handling, and logistics. We proudly partner every day with our 225,000 customers around the world to preserve their invaluable artifacts, extract more from their inventory, and protect their data privacy in innovative and socially responsible ways. Are you curious about being part of our growth story while evolving your skills in a culture that will welcome your unique contributions? If so, let's start the conversation.

We are seeking an experienced and visionary leader to head our Insurance Vertical at the national level. The ideal candidate will have a proven track record in the insurance sector, particularly in working with corporate insurance clients. This role requires deep knowledge of industry operations, a strong understanding of digital transformation trends, and significant leadership experience in managing high-performing, cross-functional teams. The candidate will be responsible for driving strategic growth, deepening client relationships, and shaping the future of our insurance offerings in a rapidly evolving market.
Key Responsibilities

Strategic Leadership
Define and implement the growth strategy for the insurance vertical in alignment with the company's overall business objectives
Leverage industry insights and emerging digital trends to stay ahead of the competition

Client Relationship Management
Build and nurture long-term relationships with key corporate clients, including insurers, brokers, and other ecosystem players
Position the company as a trusted strategic partner in their digital transformation initiatives

Business Growth
Drive revenue growth through new client acquisition and account expansion
Focus on offerings across life, health, and general insurance sectors

Team Leadership & Development
Lead, mentor, and develop a high-performing team including business development managers, solution architects, and account executives
Foster a collaborative, accountable, and innovation-driven team culture

Market Intelligence
Monitor regulatory changes, industry trends, and technological advancements in the insurance sector
Adapt business strategy based on insights to maintain market relevance and competitive edge

Cross-Functional Collaboration
Partner with product, marketing, and operations teams to ensure the successful delivery of tailored solutions
Act as the voice of the customer internally, aligning solutions to client pain points and industry needs

Qualifications & Skills
Bachelor's degree in Business, Technology, Insurance, or a related field; MBA or equivalent preferred
10+ years of experience in the insurance sector, with demonstrated leadership in digital transformation and IT-enabled solutions
Proven success in managing and scaling high-performing teams
Deep understanding of insurance business models, regulations, and client expectations
Strong experience in consultative/solution selling and strategic account management
Excellent communication, stakeholder management, and executive presentation skills
Ability to lead complex cross-functional initiatives and deliver tangible business outcomes

Preferred Attributes
Experience with enterprise technology solutions relevant to the insurance industry (e.g., document management, workflow automation, AI/ML, data analytics)
Existing relationships with top-tier insurance companies, brokers, and solution partners
Demonstrated success in achieving revenue targets and market expansion in competitive environments
Track record of driving innovation and managing organizational change effectively

Category: General Management

Iron Mountain is a global leader in storage and information management services trusted by more than 225,000 organizations in 60 countries. We safeguard billions of our customers' assets, including critical business information, highly sensitive data, and invaluable cultural and historic artifacts. Take a look at our history here. Iron Mountain helps lower cost and risk, comply with regulations, recover from disaster, and enable digital and sustainable solutions, whether in information management, digital transformation, secure storage and destruction, data center operations, cloud services, or art storage and logistics. Please see our Values and Code of Ethics for a look at our principles and aspirations in elevating the power of our work together. If you have a physical or mental disability that requires special accommodations, please let us know by sending an email to accommodationrequest@ironmountain.com. See the Supplement to learn more about Equal Employment Opportunity. Iron Mountain is committed to a policy of equal employment opportunity. We recruit and hire applicants without regard to race, color, religion, sex (including pregnancy), national origin, disability, age, sexual orientation, veteran status, genetic information, gender identity, gender expression, or any other factor prohibited by law. To view the Equal Employment Opportunity is the Law posters and the supplement, as well as the Pay Transparency Policy Statement, CLICK HERE

Requisition: J0091093
Posted 1 day ago
1.0 - 31.0 years
1 - 1 Lacs
Jubilee Hills, Hyderabad Region
On-site
We are looking for a detail-oriented MIS Executive to collect, analyze, and manage data from various web sources and internal tools. The ideal candidate should have a strong understanding of computers, be proficient in Google Sheets and Excel, and assist in generating reports to support business decisions.

Key Responsibilities:

Data Collection & Management:
Extract and compile data from websites, APIs, and other online sources.
Maintain and update databases with accurate and relevant information.
Ensure data integrity by performing regular checks and validations.

Reporting & Analysis:
Generate daily/weekly/monthly reports for management.
Analyze trends and provide actionable insights from collected data.
Assist in preparing presentations with data-driven findings.

Required Skills & Qualifications:
Education: Bachelor's degree in Computer Science, IT, Business Analytics, or a related field.
Experience: 1-2 years in MIS, data handling, or a similar role.
Technical Skills: Strong proficiency in Google Sheets & Excel (advanced functions, macros, automation). Basic knowledge of web scraping tools (e.g., IMPORTXML, Python, or no-code scrapers) is a plus. Familiarity with SQL, APIs, or data visualization tools (Power BI, Tableau) is an advantage.
Analytical Skills: Ability to interpret data and generate insights.
Attention to Detail: High accuracy in data entry and reporting.
Communication: Good verbal and written communication skills.
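The role above centers on extracting data from web pages into reports. As an illustrative sketch only (the HTML fragment and cell values are invented, not a real data source), Python's standard-library html.parser can pull tabular values out of a page without any third-party scraping library:

```python
from html.parser import HTMLParser

class TableScraper(HTMLParser):
    """Collect the text of every <td> cell, grouped by <tr> row."""
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_td = [], [], False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []       # start collecting a new row
        elif tag == "td":
            self._in_td = True   # subsequent text belongs to this cell

    def handle_endtag(self, tag):
        if tag == "td":
            self._in_td = False
        elif tag == "tr" and self._row:
            self.rows.append(self._row)

    def handle_data(self, data):
        if self._in_td and data.strip():
            self._row.append(data.strip())

# Hypothetical page fragment standing in for a real web source.
html = ("<table><tr><td>Widget</td><td>42</td></tr>"
        "<tr><td>Gadget</td><td>7</td></tr></table>")
scraper = TableScraper()
scraper.feed(html)
print(scraper.rows)  # [['Widget', '42'], ['Gadget', '7']]
```

In Google Sheets the equivalent one-liner would be an IMPORTXML formula with an XPath pointing at the table cells; the Python version is what you would reach for once validation or automation is needed.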
Posted 1 day ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
To get the best candidate experience, please consider applying for a maximum of 3 roles within 12 months to ensure you are not duplicating efforts.

Job Category: Finance

Job Details

About Salesforce
Salesforce is the #1 AI CRM, where humans with agents drive customer success together. Here, ambition meets action. Tech meets trust. And innovation isn't a buzzword; it's a way of life. The world of work as we know it is changing, and we're looking for Trailblazers who are passionate about bettering business and the world through AI, driving innovation, and keeping Salesforce's core values at the heart of it all. Ready to level up your career at the company leading workforce transformation in the agentic era? You're in the right place! Agentforce is the future of AI, and you are the future of Salesforce.

Description
Salesforce's Quote to Cash (QTC) Enterprise Strategy & Solutions team is hiring a Business Analyst. We're looking for critical thinkers who want to roll up their sleeves and work on some of the most complex and visible projects currently underway. As a member of the Global Business Strategy and Operations organization, Analysts will perform a variety of responsibilities on enterprise-level projects to improve and scale our internal Quote-to-Cash operations. We are seeking proactive, self-motivated individuals who are comfortable navigating ambiguity, take initiative, and consistently drive project success with minimal oversight. This role requires close, real-time collaboration with US-based counterparts, including Functional Leads, Senior Analysts, Technical Architects, and Product Managers, which necessitates aligning with US business hours as per the defined shifts.

Responsibilities
Coordinate with Functional Leads and Senior Analysts to understand the future-state vision for L2C/QTC processes and features in order to then deliver progressive capabilities towards that end state in each release.
Lead the business requirements gathering and documentation process by collaborating with crucial subject matter experts to transform existing processes to drive the future of quoting to our customers.
Diagram as-is and to-be business processes using tools like Lucidcharts.
Coordinate and lead cross-functional meetings, document decisions, and follow up on actions.
Engage with Technical Architects and Product Managers to create innovative, holistic solutions to deliver upon the business requirements and future-state needs.
Perform project management activities, including reporting escalations, tracking requirements delivery, communicating cross-functional dependencies, and creating status updates.
Act as a subject matter expert for Salesforce internal QTC systems and processes.
Develop, document, and maintain a thorough repository and understanding of business rules and process flows.
Work with training & engagement specialists to create training materials to ensure successful change management results.
Handle ad-hoc reporting and research activities as project needs dictate.
Participate in user acceptance testing (UAT).
Required Skills/Experience
Experience with business requirements gathering and documentation / user story experience
Excellent interpersonal skills; ability to articulate verbally and in writing; willingness to appropriately debate difficult issues; ability to think quickly; excellent listening skills; organizational skills
Ability to excel in a fast-paced environment, delivering accuracy while managing ambiguity and deadlines where adaptability is imperative
Capacity to identify and understand broader business and financial issues from an end-user's perspective and consider cross-functional and downstream impacts
Experience successfully juggling multiple projects and tasks concurrently
Extreme attention to detail with an ability to work independently and demonstrate initiative
Curiosity in order to extract relevant information from subject matter experts
Prior experience as a Business Analyst

Preferred Skills/Experience
Experience related to Configure Price Quote, Contract Lifecycle, and/or Order Management processes and systems
Working knowledge of Lucidcharts or similar process flow documentation software
Working knowledge of Smartsheets or other project management software
Experience with Salesforce products a plus
Exposure to enterprise-level, transformational projects
Prior experience with New Product Introductions processes, Business Operations, Quote to Cash Operations, and/or M&A Operations

Unleash Your Potential
When you join Salesforce, you'll be limitless in all areas of your life. Our benefits and resources support you to find balance and be your best, and our AI agents accelerate your impact so you can do your best. Together, we'll bring the power of Agentforce to organizations of all sizes and deliver amazing experiences that customers love. Apply today to not only shape the future, but to redefine what's possible, for yourself, for AI, and the world.
Accommodations If you require assistance due to a disability when applying for open positions, please submit a request via this Accommodations Request Form. Posting Statement Salesforce is an equal opportunity employer and maintains a policy of non-discrimination with all employees and applicants for employment. What does that mean exactly? It means that at Salesforce, we believe in equality for all. And we believe we can lead the path to equality in part by creating a workplace that’s inclusive and free from discrimination. Know your rights: workplace discrimination is illegal. Any employee or potential employee will be assessed on the basis of merit, competence and qualifications – without regard to race, religion, color, national origin, sex, sexual orientation, gender expression or identity, transgender status, age, disability, veteran or marital status, political viewpoint, or other classifications protected by law. This policy applies to current and prospective employees, no matter where they are in their Salesforce employment journey. It also applies to recruiting, hiring, job assignment, compensation, promotion, benefits, training, assessment of job performance, discipline, termination, and everything in between. Recruiting, hiring, and promotion decisions at Salesforce are fair and based on merit. The same goes for compensation, benefits, promotions, transfers, reduction in workforce, recall, training, and education.
Posted 1 day ago
5.0 - 8.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Description GPP Database Link (https://cummins365.sharepoint.com/sites/CS38534/) Job Summary Leads projects for design, development and maintenance of a data and analytics platform. Effectively and efficiently processes, stores and makes data available to analysts and other consumers. Works with key business stakeholders, IT experts and subject-matter experts to plan, design and deliver optimal analytics and data science solutions. Works on one or many product teams at a time. Key Responsibilities Designs and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured). Designs and implements a framework to continuously monitor and troubleshoot data quality and data integrity issues. Implements data governance processes and methods for managing metadata, access and retention of data for internal and external users. Designs and provides guidance on building reliable, efficient, scalable and quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages. Designs and implements physical data models to define the database structure, optimizing database performance through efficient indexing and table relationships. Participates in optimizing, testing, and troubleshooting of data pipelines. Designs, develops and operates large-scale data storage and processing solutions using different distributed and cloud-based platforms for storing data (e.g. Data Lakes, Hadoop, HBase, Cassandra, MongoDB, Accumulo, DynamoDB, others). Uses innovative and modern tools, techniques and architectures to partially or completely automate the most common, repeatable and tedious data preparation and integration tasks in order to minimize manual and error-prone processes and improve productivity. Assists with renovating the data management infrastructure to drive automation in data integration and management. 
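The "continuously monitor and troubleshoot data quality" responsibility above can be sketched in miniature. This is an illustrative, self-contained Python example, not Cummins code; the field names and the alert threshold are assumptions, and a production pipeline would typically implement the same checks inside an ETL/ELT tool or Spark job:

```python
# Minimal data-quality gate for an ingestion pipeline. Records failing a
# required-field check are quarantined with a reason instead of being loaded.

def validate_rows(rows, required_fields):
    """Split incoming records into clean rows and quarantined rows,
    recording which required fields each bad row is missing."""
    clean, quarantined = [], []
    for row in rows:
        missing = [f for f in required_fields if row.get(f) in (None, "")]
        if missing:
            quarantined.append({"row": row, "missing": missing})
        else:
            clean.append(row)
    return clean, quarantined

def quality_alert(quarantined, total, max_bad_ratio=0.05):
    """Return True when the bad-row ratio exceeds the alert threshold
    (threshold value is an illustrative assumption)."""
    return total > 0 and len(quarantined) / total > max_bad_ratio

records = [
    {"id": 1, "source": "ERP", "amount": 100.0},
    {"id": 2, "source": None, "amount": 55.5},   # missing source -> quarantine
]
clean, bad = validate_rows(records, required_fields=["id", "source", "amount"])
print(len(clean), len(bad))  # 1 1
```

The same split-and-alert pattern maps directly onto Spark accumulators or a dead-letter table in a distributed pipeline.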
Ensures the timeliness and success of critical analytics initiatives by using agile development technologies such as DevOps, Scrum and Kanban. Coaches and develops less experienced team members. Responsibilities Competencies: System Requirements Engineering - Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule, and resources; creates and maintains information linkages to related artifacts. Collaborates - Building partnerships and working collaboratively with others to meet shared objectives. Communicates effectively - Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences. Customer focus - Building strong customer relationships and delivering customer-centric solutions. Decision quality - Making good and timely decisions that keep the organization moving forward. Data Extraction - Performs data extract-transform-load (ETL) activities from a variety of sources and transforms them for consumption by various downstream applications and users using appropriate tools and technologies. Programming - Creates, writes and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements. Quality Assurance Metrics - Applies the science of measurement to assess whether a solution meets its intended outcomes using the IT Operating Model (ITOM), including the SDLC standards, tools, metrics and key performance indicators, to deliver a quality product. 
Solution Documentation - Documents information and solutions based on knowledge gained as part of product development activities; communicates to stakeholders with the goal of enabling improved productivity and effective knowledge transfer to others who were not originally part of the initial learning. Solution Validation Testing - Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements. Data Quality - Identifies, understands and corrects flaws in data to support effective information governance across operational business processes and decision making. Problem Solving - Solves problems and may mentor others on effective problem solving by using a systematic analysis process, leveraging industry-standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies the systemic root causes and ensures actions to prevent problem recurrence are implemented. Values differences - Recognizing the value that different perspectives and cultures bring to an organization. Education, Licenses, Certifications College, university, or equivalent degree in a relevant technical discipline, or relevant equivalent experience required. This position may require licensing for compliance with export controls or sanctions regulations. Experience Intermediate experience in a relevant discipline area is required. 
Knowledge of the latest technologies and trends in data engineering is highly preferred and includes: 5-8 years of experience Familiarity with analyzing complex business systems, industry requirements, and/or data regulations Background in processing and managing large data sets Design and development for a Big Data platform using open source and third-party tools Spark, Scala/Java, MapReduce, Hive, HBase, and Kafka or equivalent college coursework SQL query language Clustered compute cloud-based implementation experience Experience developing applications requiring large file movement for a cloud-based environment and other data extraction tools and methods from a variety of sources Experience in building analytical solutions Intermediate Experience In The Following Is Preferred Experience with IoT technology Experience in Agile software development Qualifications 1) Work closely with the business Product Owner to understand the product vision. 2) Play a key role across DBU Data & Analytics Power Cells to define and develop data pipelines for efficient data transport into Cummins Digital Core (Azure Data Lake, Snowflake). 3) Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment with DBU project data pipeline design standards. 4) Independently design, develop, test, and implement complex data pipelines from transactional systems (ERP, CRM) to data warehouses and the data lake. 5) Responsible for creation, maintenance and management of DBU Data & Analytics data engineering documentation and standard operating procedures (SOPs). 6) Take part in evaluation of new data tools and POCs and provide suggestions. 7) Take full ownership of the developed data pipelines, providing ongoing support for enhancements and performance optimization. 8) Proactively address and resolve issues that compromise data accuracy and usability. Preferred Skills Programming Languages: Proficiency in languages such as Python, Java, and/or Scala. 
Database Management: Expertise in SQL and NoSQL databases. Big Data Technologies: Experience with Hadoop, Spark, Kafka, and other big data frameworks. Cloud Services: Experience with Azure, Databricks and AWS cloud platforms. ETL Processes: Strong understanding of Extract, Transform, Load (ETL) processes. Data Replication: Working knowledge of replication technologies like Qlik Replicate is a plus API: Working knowledge of APIs to consume data from ERP and CRM systems Job Systems/Information Technology Organization Cummins Inc. Role Category Remote Job Type Exempt - Experienced ReqID 2417810 Relocation Package Yes
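The API skill listed above usually amounts to paginated pulls from an ERP/CRM REST endpoint. A minimal sketch of the paging logic follows; the HTTP call is stubbed out so the loop can be shown without a live endpoint, and the `/api/v1/accounts` URL shape and field names are hypothetical, not a real Cummins or vendor API:

```python
# Paginated extraction pattern: keep requesting pages until an empty page
# signals the end of the result set. fetch_page stands in for an HTTP GET.

def fetch_page(page, page_size=2):
    """Stub for a call like GET /api/v1/accounts?page=N (hypothetical URL);
    serves slices of a fixed 5-record dataset for illustration."""
    data = [{"account_id": i} for i in range(1, 6)]
    start = (page - 1) * page_size
    return data[start:start + page_size]

def pull_all(page_size=2):
    """Walk pages in order, accumulating records until a page comes back empty."""
    page, out = 1, []
    while True:
        batch = fetch_page(page, page_size)
        if not batch:
            break
        out.extend(batch)
        page += 1
    return out

print(len(pull_all()))  # 5
```

In a real integration the stub would be an authenticated `requests.get` (or vendor SDK call) with retry and rate-limit handling around it.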
Posted 1 day ago
5.0 - 8.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Description GPP Database Link (https://cummins365.sharepoint.com/sites/CS38534/) Job Summary Leads projects for design, development and maintenance of a data and analytics platform. Effectively and efficiently processes, stores and makes data available to analysts and other consumers. Works with key business stakeholders, IT experts and subject-matter experts to plan, design and deliver optimal analytics and data science solutions. Works on one or many product teams at a time. Key Responsibilities Designs and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured). Designs and implements a framework to continuously monitor and troubleshoot data quality and data integrity issues. Implements data governance processes and methods for managing metadata, access and retention of data for internal and external users. Designs and provides guidance on building reliable, efficient, scalable and quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages. Designs and implements physical data models to define the database structure, optimizing database performance through efficient indexing and table relationships. Participates in optimizing, testing, and troubleshooting of data pipelines. Designs, develops and operates large-scale data storage and processing solutions using different distributed and cloud-based platforms for storing data (e.g. Data Lakes, Hadoop, HBase, Cassandra, MongoDB, Accumulo, DynamoDB, others). Uses innovative and modern tools, techniques and architectures to partially or completely automate the most common, repeatable and tedious data preparation and integration tasks in order to minimize manual and error-prone processes and improve productivity. Assists with renovating the data management infrastructure to drive automation in data integration and management. 
Ensures the timeliness and success of critical analytics initiatives by using agile development technologies such as DevOps, Scrum and Kanban. Coaches and develops less experienced team members. Responsibilities Competencies: System Requirements Engineering - Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule, and resources; creates and maintains information linkages to related artifacts. Collaborates - Building partnerships and working collaboratively with others to meet shared objectives. Communicates effectively - Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences. Customer focus - Building strong customer relationships and delivering customer-centric solutions. Decision quality - Making good and timely decisions that keep the organization moving forward. Data Extraction - Performs data extract-transform-load (ETL) activities from a variety of sources and transforms them for consumption by various downstream applications and users using appropriate tools and technologies. Programming - Creates, writes and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements. Quality Assurance Metrics - Applies the science of measurement to assess whether a solution meets its intended outcomes using the IT Operating Model (ITOM), including the SDLC standards, tools, metrics and key performance indicators, to deliver a quality product. 
Solution Documentation - Documents information and solutions based on knowledge gained as part of product development activities; communicates to stakeholders with the goal of enabling improved productivity and effective knowledge transfer to others who were not originally part of the initial learning. Solution Validation Testing - Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements. Data Quality - Identifies, understands and corrects flaws in data to support effective information governance across operational business processes and decision making. Problem Solving - Solves problems and may mentor others on effective problem solving by using a systematic analysis process, leveraging industry-standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies the systemic root causes and ensures actions to prevent problem recurrence are implemented. Values differences - Recognizing the value that different perspectives and cultures bring to an organization. Education, Licenses, Certifications College, university, or equivalent degree in a relevant technical discipline, or relevant equivalent experience required. This position may require licensing for compliance with export controls or sanctions regulations. Experience Intermediate experience in a relevant discipline area is required. 
Knowledge of the latest technologies and trends in data engineering is highly preferred and includes: 5-8 years of experience Familiarity with analyzing complex business systems, industry requirements, and/or data regulations Background in processing and managing large data sets Design and development for a Big Data platform using open source and third-party tools Spark, Scala/Java, MapReduce, Hive, HBase, and Kafka or equivalent college coursework SQL query language Clustered compute cloud-based implementation experience Experience developing applications requiring large file movement for a cloud-based environment and other data extraction tools and methods from a variety of sources Experience in building analytical solutions Intermediate Experience In The Following Is Preferred Experience with IoT technology Experience in Agile software development Qualifications 1) Work closely with the business Product Owner to understand the product vision. 2) Play a key role across DBU Data & Analytics Power Cells to define and develop data pipelines for efficient data transport into Cummins Digital Core (Azure Data Lake, Snowflake). 3) Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment with DBU project data pipeline design standards. 4) Independently design, develop, test, and implement complex data pipelines from transactional systems (ERP, CRM) to data warehouses and the data lake. 5) Responsible for creation, maintenance and management of DBU Data & Analytics data engineering documentation and standard operating procedures (SOPs). 6) Take part in evaluation of new data tools and POCs and provide suggestions. 7) Take full ownership of the developed data pipelines, providing ongoing support for enhancements and performance optimization. 8) Proactively address and resolve issues that compromise data accuracy and usability. Preferred Skills Programming Languages: Proficiency in languages such as Python, Java, and/or Scala. 
Database Management: Expertise in SQL and NoSQL databases. Big Data Technologies: Experience with Hadoop, Spark, Kafka, and other big data frameworks. Cloud Services: Experience with Azure, Databricks and AWS cloud platforms. ETL Processes: Strong understanding of Extract, Transform, Load (ETL) processes. Data Replication: Working knowledge of replication technologies like Qlik Replicate is a plus API: Working knowledge of APIs to consume data from ERP and CRM systems Job Systems/Information Technology Organization Cummins Inc. Role Category Remote Job Type Exempt - Experienced ReqID 2417809 Relocation Package Yes
Posted 1 day ago
4.0 - 5.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Description GPP Database Link (https://cummins365.sharepoint.com/sites/CS38534/) Job Summary Supports, develops and maintains a data and analytics platform. Effectively and efficiently processes, stores and makes data available to analysts and other consumers. Works with the Business and IT teams to understand the requirements to best leverage the technologies to enable agile data delivery at scale. Key Responsibilities Implements and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured). Implements methods to continuously monitor and troubleshoot data quality and data integrity issues. Implements data governance processes and methods for managing metadata, access and retention of data for internal and external users. Develops reliable, efficient, scalable and quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages. Develops physical data models and implements data storage architectures as per design guidelines. Analyzes complex data elements and systems, data flow, dependencies, and relationships in order to contribute to conceptual, physical and logical data models. Participates in testing and troubleshooting of data pipelines. Develops and operates large-scale data storage and processing solutions using different distributed and cloud-based platforms for storing data (e.g. Data Lakes, Hadoop, HBase, Cassandra, MongoDB, Accumulo, DynamoDB, others). Uses agile development technologies, such as DevOps, Scrum, Kanban and the continuous improvement cycle, for data-driven applications. 
Responsibilities Competencies: System Requirements Engineering - Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule, and resources; creates and maintains information linkages to related artifacts. Collaborates - Building partnerships and working collaboratively with others to meet shared objectives. Communicates effectively - Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences. Customer focus - Building strong customer relationships and delivering customer-centric solutions. Decision quality - Making good and timely decisions that keep the organization moving forward. Data Extraction - Performs data extract-transform-load (ETL) activities from a variety of sources and transforms them for consumption by various downstream applications and users using appropriate tools and technologies. Programming - Creates, writes and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements. Quality Assurance Metrics - Applies the science of measurement to assess whether a solution meets its intended outcomes using the IT Operating Model (ITOM), including the SDLC standards, tools, metrics and key performance indicators, to deliver a quality product. 
Solution Documentation - Documents information and solutions based on knowledge gained as part of product development activities; communicates to stakeholders with the goal of enabling improved productivity and effective knowledge transfer to others who were not originally part of the initial learning. Solution Validation Testing - Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements. Data Quality - Identifies, understands and corrects flaws in data to support effective information governance across operational business processes and decision making. Problem Solving - Solves problems and may mentor others on effective problem solving by using a systematic analysis process, leveraging industry-standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies the systemic root causes and ensures actions to prevent problem recurrence are implemented. Values differences - Recognizing the value that different perspectives and cultures bring to an organization. Education, Licenses, Certifications College, university, or equivalent degree in a relevant technical discipline, or relevant equivalent experience required. This position may require licensing for compliance with export controls or sanctions regulations. Experience 4-5 years of experience. Relevant experience preferred, such as work in temporary student employment, internships, co-ops, or other extracurricular team activities. 
Knowledge of the latest technologies in data engineering is highly preferred and includes: Exposure to Big Data open source tools Spark, Scala/Java, MapReduce, Hive, HBase, and Kafka or equivalent college coursework SQL query language Clustered compute cloud-based implementation experience Familiarity developing applications requiring large file movement for a cloud-based environment Exposure to Agile software development Exposure to building analytical solutions Exposure to IoT technology Qualifications 1) Work closely with the business Product Owner to understand the product vision. 2) Participate in DBU Data & Analytics Power Cells to define and develop data pipelines for efficient data transport into Cummins Digital Core (Azure Data Lake, Snowflake). 3) Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment with DBU project data pipeline design standards. 4) Work under limited supervision to design, develop, test, and implement complex data pipelines from transactional systems (ERP, CRM) to data warehouses and the data lake. 5) Responsible for creation of DBU Data & Analytics data engineering documentation and standard operating procedures (SOPs) with guidance and help from senior data engineers. 6) Take part in evaluation of new data tools and POCs with guidance and help from senior data engineers. 7) Take ownership of the developed data pipelines, providing ongoing support for enhancements and performance optimization under limited supervision. 8) Assist in resolving issues that compromise data accuracy and usability. Programming Languages: Proficiency in languages such as Python, Java, and/or Scala. Database Management: Intermediate-level expertise in SQL and NoSQL databases. Big Data Technologies: Experience with Hadoop, Spark, Kafka, and other big data frameworks. Cloud Services: Experience with Azure, Databricks and AWS cloud platforms. ETL Processes: Strong understanding of Extract, Transform, Load (ETL) processes. 
API: Working knowledge of APIs to consume data from ERP and CRM systems Job Systems/Information Technology Organization Cummins Inc. Role Category Remote Job Type Exempt - Experienced ReqID 2417808 Relocation Package Yes
Posted 1 day ago
3.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Level AI was founded in 2019 and is a Series C startup headquartered in Mountain View, California. Level AI revolutionizes customer engagement by transforming contact centers into strategic assets. Our AI-native platform leverages advanced technologies such as Large Language Models to extract deep insights from customer interactions. By providing actionable intelligence, Level AI empowers organizations to enhance customer experience and drive growth. Consistently updated with the latest AI innovations, Level AI stands as the most adaptive and forward-thinking solution in the industry. Empowering contact center stakeholders with real-time insights, our tech facilitates data-driven decision-making for contact centers, enhancing service levels and agent performance. As a vital team member, you will work with cutting-edge technologies and play a high-impact role in shaping the future of AI-driven enterprise applications. You will work directly with people who've worked at Amazon, Facebook, Google, and other leading technology companies. With Level AI, you will get to have fun, learn new things, and grow along with us. Ready to redefine possibilities? Join us! We'd love to explore more about you if you have a B.E/B.Tech/M.E/M.Tech/PhD from a Tier 1 engineering institute with relevant work experience at a top technology company in computer science or mathematics-related fields. 3+ years of experience in AI/ML Strong coding skills in Python and familiarity with libraries like LangChain or Transformers Interest in LLMs, agents, and the evolving open-source AI ecosystem Eagerness to learn, experiment, and grow in a fast-paced environment. 
Your role at Level AI includes but is not limited to Assist in building LLM-powered agents for internal tools and customer-facing products Support prompt engineering, retrieval-augmented generation (RAG), and tool integrations Collaborate on experiments with open-source and commercial LLMs (e.g., GPT, Claude, Mistral) Help implement and evaluate reasoning, planning, and memory modules for agents Work closely with senior engineers to deploy and monitor AI features in production Bonus Points Experience with open-source LLMs (LLaMA, Mistral, etc.) Basic understanding of vector search, RAG, and prompt engineering concepts Contributions to AI side projects or GitHub repos Exposure to vector databases or retrieval pipelines (e.g., FAISS, Pinecone) To apply: https://jobs.lever.co/levelai/cc04ab77-6ee3-4078-9cfd-110cda0b1438 To learn more, visit: https://thelevel.ai/ Funding: https://www.crunchbase.com/organization/level-ai LinkedIn: https://www.linkedin.com/company/level-ai/
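The retrieval step behind the RAG work mentioned above can be illustrated without any model at all: bag-of-words vectors and cosine similarity stand in for a real embedding model and a vector store such as FAISS or Pinecone. The documents and query are invented examples, not Level AI data:

```python
# Toy RAG retrieval step: score candidate documents against a query and
# return the top-k to be stuffed into an LLM prompt as context.
import math
from collections import Counter

def embed(text):
    """Bag-of-words 'embedding' (a stand-in for a real embedding model)."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

docs = [
    "refund policy for enterprise customers",
    "resetting an agent password",
    "contact center escalation workflow",
]

def retrieve(query, k=1):
    """Rank documents by similarity to the query; top-k become LLM context."""
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

print(retrieve("how do I reset my password"))  # ['resetting an agent password']
```

Swapping `embed` for a neural embedding and the `sorted` scan for an approximate-nearest-neighbor index is what turns this sketch into a production retrieval pipeline.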
Posted 1 day ago
4.0 - 7.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Responsible for developing, optimizing, and maintaining business intelligence and data warehouse systems, ensuring secure, efficient data storage and retrieval, enabling self-service data exploration, and supporting stakeholders with insightful reporting and analysis. Grade - T5 Please note that the job will close at 12am on the posting close date, so please submit your application prior to the close date. Accountabilities What your main responsibilities are: Data Pipeline - Develop and maintain scalable data pipelines and build out new API integrations to support continuing increases in data volume and complexity Data Integration - Connect offline and online data to continuously improve overall understanding of customer behavior and journeys for personalization; data pre-processing including collecting, parsing, managing, analyzing and visualizing large sets of data Data Quality Management - Cleanse the data and improve data quality and readiness for analysis; drive standards, define and implement/improve data governance strategies and enforce best practices to scale data analysis across platforms Data Transformation - Process data by cleansing it and transforming it into the proper storage structure for the purpose of querying and analysis using ETL and ELT processes Data Enablement - Ensure data is accessible and usable to the wider enterprise to enable a deeper and more timely understanding of operations. Qualifications & Specifications Master's/Bachelor's degree in Engineering/Computer Science/Math/Statistics or equivalent. Strong programming skills in Python/PySpark/SAS. Proven experience with large data sets and related technologies - Hadoop, Hive, distributed computing systems, Spark optimization. Experience with cloud platforms (preferably Azure) and their services: Azure Data Factory (ADF), ADLS Storage, Azure DevOps. Hands-on experience with Databricks, Delta Lake, Workflows. 
Should have knowledge of DevOps processes and tools like Docker, CI/CD, Kubernetes, Terraform, Octopus. Hands-on experience with SQL and data modeling to support the organization's data storage and analysis needs. Experience with a BI tool such as Power BI (good to have). Cloud migration experience (good to have). Cloud and Data Engineering certification (good to have). Working in an Agile environment. 4-7 years of relevant work experience needed. Experience with stakeholder management will be an added advantage. What We Are Looking For Education: Bachelor's degree or equivalent in Computer Science, MIS, Mathematics, Statistics, or similar discipline. Master's degree or PhD preferred. Knowledge, Skills And Abilities Fluency in English Analytical Skills Accuracy & Attention to Detail Numerical Skills Planning & Organizing Skills Presentation Skills Data Modeling and Database Design ETL (Extract, Transform, Load) Skills Programming Skills FedEx was built on a philosophy that puts people first, one we take seriously. We are an equal opportunity/affirmative action employer and we are committed to a diverse, equitable, and inclusive workforce in which we enforce fair treatment, and provide growth opportunities for everyone. All qualified applicants will receive consideration for employment regardless of age, race, color, national origin, genetics, religion, gender, marital status, pregnancy (including childbirth or a related medical condition), physical or mental disability, or any other characteristic protected by applicable laws, regulations, and ordinances. Our Company FedEx is one of the world's largest express transportation companies and has consistently been selected as one of the top 10 World’s Most Admired Companies by "Fortune" magazine. Every day FedEx delivers for its customers with transportation and business solutions, serving more than 220 countries and territories around the globe. 
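The data-cleansing and transformation work described in the accountabilities above can be sketched in plain Python. Field names and rules are illustrative assumptions, and at this scale the same step would typically run as a PySpark/Databricks job rather than in-process:

```python
# Minimal cleanse-and-transform step from an ETL flow: normalize the key,
# drop duplicates, reject rows failing a basic check, and cast types so the
# output is ready for loading into a warehouse table.

def cleanse(rows):
    seen, out = set(), []
    for row in rows:
        cust = (row.get("customer_id") or "").strip().upper()
        if not cust:
            continue                      # reject: missing business key
        if cust in seen:
            continue                      # reject: duplicate after normalization
        seen.add(cust)
        out.append({
            "customer_id": cust,
            "spend": round(float(row.get("spend", 0)), 2),  # cast + round
        })
    return out

raw = [
    {"customer_id": " c001 ", "spend": "12.5"},
    {"customer_id": "C001", "spend": "99"},   # duplicate of c001 once trimmed
    {"customer_id": "", "spend": "4"},        # missing key
]
print(cleanse(raw))  # [{'customer_id': 'C001', 'spend': 12.5}]
```

Expressed in PySpark the same logic becomes a `select`/`filter`/`dropDuplicates` chain, but the normalization-then-dedupe ordering is the part that actually matters.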
We can serve this global network due to our outstanding team of FedEx team members, who are tasked with making every FedEx experience outstanding. Our Philosophy The People-Service-Profit philosophy (P-S-P) describes the principles that govern every FedEx decision, policy, or activity. FedEx takes care of our people; they, in turn, deliver the impeccable service demanded by our customers, who reward us with the profitability necessary to secure our future. The essential element in making the People-Service-Profit philosophy such a positive force for the company is where we close the circle, returning these profits back into the business and investing in our people. Our success in the industry is attributed to our people. Through our P-S-P philosophy, we have a work environment that encourages team members to be innovative in delivering the highest possible quality of service to our customers. We care for their well-being, and value their contributions to the company. Our Culture Our culture is important for many reasons, and we intentionally bring it to life through our behaviors, actions, and activities in every part of the world. The FedEx culture and values have been a cornerstone of our success and growth since we began in the early 1970s. While other companies can copy our systems, infrastructure, and processes, our culture makes us unique and is often a differentiating factor as we compete and grow in today’s global marketplace.
Posted 1 day ago
20.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Over the last 20 years, Ares’ success has been driven by our people and our culture. Today, our team is guided by our core values – Collaborative, Responsible, Entrepreneurial, Self-Aware, Trustworthy – and our purpose to be a catalyst for shared prosperity and a better future. Through our recruitment, career development and employee-focused programming, we are committed to fostering a welcoming and inclusive work environment where high-performance talent of diverse backgrounds, experiences, and perspectives can build careers within this exciting and growing industry. Job Description Role summary: Ares is looking for an Associate Vice President / Senior Associate to join the Mumbai Investment Operations team. The Investment Operations team works closely with business stakeholders in various lines of business, as well as various corporate functions. The ideal candidate will be responsible for overseeing the loan operations team, fund administrators, custodians, etc., as well as processing all credit activity and restructures in WSO for loans across various business lines. Other responsibilities include researching and escalating loan operations issues and breaks, working in partnership with the Loan Settlements/Servicing teams in Los Angeles. The candidate must have practical experience with the loan closing and loan servicing processes; processing experience in Wall Street Office is preferred. As an alternative asset manager, Ares has a comprehensive asset mix that is heavily concentrated in bank debt. The ideal candidate will have experience working with diverse lines of business for a global client base including pensions, insurance, and institutional investors. The role requires a dynamic, adaptive, experienced, hands-on professional to ensure best practices in a fast-paced, rapidly growing environment. 
Primary Responsibilities Specific responsibilities include, but are not limited to: Serve as the primary escalation contact and day-to-day manager for the loan operations team in Mumbai Facilitate training and provide ongoing support for the local team Coordinate, process, and reconcile all daily servicing events, including amendments and restructures (preparation of transaction loaders, reviewing funds flows, and more) Oversee and manage loan processing in WSO for all deals Direct third-party fund administrators and custodian banks on appropriate processing and review/reconcile processing output for accuracy, including restructures, multicurrency facility processing, non-pro-rata activity, principal repayments with fees, etc. Conduct daily reviews of credit events with third-party administrators and custodian banks Act as the first point of escalation for high-risk breaks and identify areas for issue prevention Review daily reconciliations between internal systems and third parties to resolve discrepancies Coordinate loan operations related audit requests Prepare KPIs on a regular basis and participate in ad hoc projects Maintain a high standard of quality controls, and work with internal and external stakeholders to enhance loan operations workflows Liaise with local finance teams, offshore partners, deal teams, investment accounting, middle office, treasury, and trustees for all portfolio-specific activity and issues, ensuring cross-communication of critical information between firm departments Manage oversight of all UK-based agents and custodians to resolve loan related issues in a timely manner Experience Required Experience in high-quality, global capital markets or investment management firms with expertise in Investment Operations and Asset Servicing related functions. Experience in Investment Operations in middle office or back-office functions. Prior experience with an alternative asset manager is preferred; broader asset management experience is also valued. 
Strong knowledge of bank loans, with the willingness to cross-train and learn other asset classes Must have experience with the loan closing process in ClearPar and the loan servicing process in Wall Street Office; experience with Black Mountain (Allvue), Everest, Geneva, and/or IVP data management platforms is preferred. Understanding of basic accounting theories. Loan Operations experience in private/middle market loans preferred, but not required. Experienced with a diverse set of investment vehicles such as Institutional Separate Accounts, SMA/Limited Partnerships, Open-End Mutual Funds, Closed-End Funds and UCITS, CLOs, and complex fund structures. Hedge fund, Credit or Private Equity experience is a plus. General Requirements Ability to extract meaningful information from extensive research and analysis to effectively present facts and findings in a digestible format, with a keen eye for detail A self-directed individual with a can-do attitude, willing to work in an energetic, collaborative, and fast-paced environment, proactive in nature, with a proven ability to resolve issues with minimal supervision Proven outstanding communication (written and verbal), presentation, documentation, collaboration, and interpersonal skills A hands-on approach and the ability to synthesize business operations and talent needs Ability to successfully manage multiple priorities and competing demands High accuracy and detail orientation Good judgment in terms of escalating issues vs. 
solving problems independently A solutions-oriented self-starter with the ability to see the big picture Comfort in dealing with ambiguity and uncertainty in a dynamic and fast-paced environment Ability to be flexible in terms of hours to coordinate with team members across various time zones An analytical mind and a passion for bringing new ideas to increase the efficiency of existing processes Dependable, great attitude, highly motivated and a team player Strong leadership skills Reporting Relationships Associate Vice President, Global Asset Servicing & Reconciliation There is no set deadline to apply for this job opportunity. Applications will be accepted on an ongoing basis until the search is no longer active.
Posted 1 day ago
4.0 - 7.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Responsible for developing, optimizing, and maintaining business intelligence and data warehouse systems, ensuring secure, efficient data storage and retrieval, enabling self-service data exploration, and supporting stakeholders with insightful reporting and analysis. Grade - T5 Please note that the job will close at 12am on the posting close date, so please submit your application prior to the close date. Accountabilities What your main responsibilities are: Data Pipeline - Develop and maintain scalable data pipelines and build out new API integrations to support continuing increases in data volume and complexity Data Integration - Connect offline and online data to continuously improve overall understanding of customer behavior and journeys for personalization. Data pre-processing including collecting, parsing, managing, analyzing and visualizing large sets of data Data Quality Management - Cleanse the data and improve data quality and readiness for analysis. Drive standards, define and implement/improve data governance strategies and enforce best practices to scale data analysis across platforms Data Transformation - Process data by cleansing and transforming it into a proper storage structure for querying and analysis using ETL and ELT processes Data Enablement - Ensure data is accessible and usable to the wider enterprise to enable a deeper and more timely understanding of operations. Qualifications & Specifications Master's/Bachelor's degree in Engineering/Computer Science/Math/Statistics or equivalent. Strong programming skills in Python/PySpark/SAS. Proven experience with large data sets and related technologies – Hadoop, Hive, distributed computing systems, Spark optimization. Experience on cloud platforms (preferably Azure) and their services: Azure Data Factory (ADF), ADLS Storage, Azure DevOps. Hands-on experience with Databricks, Delta Lake, Workflows. 
Should have knowledge of DevOps processes and tools like Docker, CI/CD, Kubernetes, Terraform, Octopus. Hands-on experience with SQL and data modeling to support the organization's data storage and analysis needs. Experience with any BI tool like Power BI (good to have). Cloud migration experience (good to have). Cloud and Data Engineering certification (good to have). Experience working in an Agile environment. 4-7 years of relevant work experience required. Experience with stakeholder management will be an added advantage. What We Are Looking For Education: Bachelor's degree or equivalent in Computer Science, MIS, Mathematics, Statistics, or similar discipline. Master's degree or PhD preferred. Knowledge, Skills And Abilities Fluency in English Analytical Skills Accuracy & Attention to Detail Numerical Skills Planning & Organizing Skills Presentation Skills Data Modeling and Database Design ETL (Extract, Transform, Load) Skills Programming Skills FedEx was built on a philosophy that puts people first, one we take seriously. We are an equal opportunity/affirmative action employer and we are committed to a diverse, equitable, and inclusive workforce in which we enforce fair treatment, and provide growth opportunities for everyone. All qualified applicants will receive consideration for employment regardless of age, race, color, national origin, genetics, religion, gender, marital status, pregnancy (including childbirth or a related medical condition), physical or mental disability, or any other characteristic protected by applicable laws, regulations, and ordinances. Our Company FedEx is one of the world's largest express transportation companies and has consistently been selected as one of the top 10 World’s Most Admired Companies by "Fortune" magazine. Every day FedEx delivers for its customers with transportation and business solutions, serving more than 220 countries and territories around the globe. 
We can serve this global network thanks to our outstanding team of FedEx team members, who are tasked with making every FedEx experience outstanding. Our Philosophy The People-Service-Profit philosophy (P-S-P) describes the principles that govern every FedEx decision, policy, or activity. FedEx takes care of our people; they, in turn, deliver the impeccable service demanded by our customers, who reward us with the profitability necessary to secure our future. The essential element in making the People-Service-Profit philosophy such a positive force for the company is how we close the circle: we return these profits back into the business and invest back in our people. Our success in the industry is attributed to our people. Through our P-S-P philosophy, we have a work environment that encourages team members to be innovative in delivering the highest possible quality of service to our customers. We care for their well-being and value their contributions to the company. Our Culture Our culture is important for many reasons, and we intentionally bring it to life through our behaviors, actions, and activities in every part of the world. The FedEx culture and values have been a cornerstone of our success and growth since we began in the early 1970s. While other companies can copy our systems, infrastructure, and processes, our culture makes us unique and is often a differentiating factor as we compete and grow in today’s global marketplace.
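The extract-transform-load duties this posting lists (cleansing data, transforming it into a proper storage structure, and querying it for analysis) can be sketched with a minimal, standard-library-only example. The posting's actual stack is PySpark/Databricks on Azure; here `csv` and `sqlite3` stand in, and the `shipments` table and its columns are invented for illustration:

```python
import csv
import io
import sqlite3

def run_etl(raw_csv: str) -> list[tuple]:
    """Tiny ETL sketch: extract rows from CSV text, cleanse them,
    load them into SQLite, and return an aggregate for analysis."""
    # Extract: parse the raw CSV feed.
    rows = list(csv.DictReader(io.StringIO(raw_csv)))
    # Transform: drop rows with missing amounts, normalise region names.
    cleaned = [
        (r["region"].strip().upper(), float(r["amount"]))
        for r in rows
        if r["amount"].strip()
    ]
    # Load: write into a queryable store, then aggregate.
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE shipments (region TEXT, amount REAL)")
    con.executemany("INSERT INTO shipments VALUES (?, ?)", cleaned)
    result = con.execute(
        "SELECT region, SUM(amount) FROM shipments GROUP BY region ORDER BY region"
    ).fetchall()
    con.close()
    return result

raw = "region,amount\n apac ,10\nEMEA,5\napac,2.5\nEMEA,\n"
print(run_etl(raw))  # [('APAC', 12.5), ('EMEA', 5.0)]
```

In a real pipeline the same extract/cleanse/load/aggregate shape would be expressed with Spark DataFrames and Delta tables rather than an in-memory database.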
Posted 1 day ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics. Do You Dream Big? We Need You. Job Title: Data Scientist Location: Bangalore Reporting to: Senior Manager Analytics 1. Purpose of the Role The Data Scientist will play a key role in designing and delivering data-driven solutions that enable better decision-making across the organization. This role requires strong hands-on coding skills in Python, experience with core data science libraries, and the ability to statistically validate features and models. The analyst will collaborate across teams, work efficiently with existing codebases, and apply version control and development best practices to build scalable, production-ready analytics solutions. With intermediate SQL expertise and a solid grasp of model development workflows, the role ensures robust, interpretable, and actionable outcomes from complex data. 2. Key Tasks and Accountabilities Develop and maintain data science models using Python, applying intermediate to advanced knowledge of syntax, data structures, and key libraries such as pandas and NumPy. Perform feature engineering and statistical validation of features and models to ensure robustness, accuracy, and business relevance. Write clean, modular, and well-documented code following best development practices; optionally adopt Test-Driven Development (TDD) to enable faster iteration and feedback cycles. Collaborate with cross-functional teams to understand data requirements, align on analytical solutions, and translate business problems into data science problems. Read, understand, and extend existing codebases, adapting quickly to different coding styles and project structures. 
Leverage version control tools like Git for collaborative development, code management, and maintaining reproducibility of models. Write and optimize intermediate-level SQL queries to extract, transform, and analyze data from structured databases. Contribute to the deployment readiness of models, ensuring outputs are interpretable, reusable, and aligned with production or decision-support use cases. Document processes, assumptions, and outputs clearly for stakeholder transparency, reproducibility, and future reference. Stay up to date with industry trends, new tools, and emerging best practices in data science, analytics, and development methodologies. 3. Qualifications, Experience, Skills Bachelor’s or master’s degree in Computer Science, Information Systems, Artificial Intelligence, Machine Learning, or a related field (B.Tech / BE / Master’s in CS/IS/AI/ML). Work Experience: Minimum of 3 years of hands-on experience in a data science or analytics role, with a proven track record of building and deploying data-driven solutions in real-world scenarios. Technical Skills Required: Python Programming (Intermediate to Advanced): Strong grasp of syntax, data structures, and experience with libraries like pandas and NumPy. Data Science Fundamentals: Ability to statistically validate features and models, ensuring sound analytical rigor. SQL (Intermediate): Proficiency in writing queries to extract, manipulate, and analyze data from relational databases. Version Control (Git): Familiarity with collaborative development using Git for code versioning and management. Code Adaptability: Comfortable working with and modifying existing codebases written by others. Good-to-have skills: Object-Oriented Programming (OOP) in Python: Understanding and applying OOP concepts where appropriate. Test-Driven Development (TDD): Awareness of TDD practices for faster iteration and improved code quality. 
Model Deployment Lifecycle Knowledge: Familiarity with reproducibility, tracking, and maintaining deployed models (not explicitly required, but a plus if known). And above all of this, an undying love for beer! We dream big to create a future with more cheers.
Posted 1 day ago
0.0 - 1.0 years
0 - 0 Lacs
Kochi, Kerala
Remote
Job Title: Data Analyst and CRM Support Location: Palarivattom (Hybrid) Company: 11X Company Experience: 0–1 Year Gender Preference: Female Candidates Only Employment Type: Full-Time Job Summary: We are seeking a detail-oriented and analytical Data Analyst with exposure to CRM tools to join our growing team at 11X Company, Kerala. The ideal candidate will be responsible for collecting, processing, and analyzing data to help optimize our customer relationship strategies and business decisions. Key Responsibilities: Analyze CRM data to extract insights on customer behavior and campaign performance Assist in maintaining and updating CRM databases and dashboards Prepare regular reports and presentations for internal teams Identify trends, patterns, and areas of improvement using data analytics tools Collaborate with marketing, sales, and operations teams to streamline data flow and improve CRM effectiveness Ensure data accuracy and assist in data cleansing tasks Support ad hoc data requests from various departments Key Requirements: Bachelor’s degree in Data Science, Statistics, Computer Science, Business Analytics, or a related field 0–1 year of experience in data analysis or CRM support Familiarity with CRM tools like Zoho, Salesforce, HubSpot, or similar platforms Proficient in MS Excel, Google Sheets, and basic knowledge of SQL or data visualization tools (Power BI/Tableau) Strong analytical and problem-solving skills Attention to detail and a proactive mindset Good communication skills and ability to collaborate with cross-functional teams Preferred Attributes: Willingness to learn and grow in a data-driven environment Time management and multitasking capabilities Female candidates preferred as per team diversity goals Job Type: Full-time Pay: ₹20,000.00 - ₹25,000.00 per month Benefits: Work from home Schedule: Evening shift Night shift Rotational shift Application Question(s): Do you have experience with advanced Excel? 
Language: English (Required) Work Location: Remote
Posted 1 day ago
5.0 years
0 Lacs
India
On-site
What You'll Do Avalara is an AI-first company. We expect every employee to actively leverage AI to enhance productivity, quality, innovation, and customer value. AI is embedded in our workflows and products, and success at Avalara requires embracing AI as an essential capability, not an optional tool. We are looking for an experienced Oracle Cloud ERP Techno-Functional Consultant to join our team. You have experience with Oracle Cloud ERP and Oracle EBS, specifically the Cash, Procure to Pay, and Tax modules. You have an understanding of core Oracle technology, Oracle business processes, and multiple integration tools, and the ability to collaborate with partners. You will report to the Senior Technical Lead. Responsibilities What Your Responsibilities Will Be Technical Expertise: Programming skills in relevant technologies like Java, SQL, PL/SQL, XML, RESTful APIs, JavaScript, ADF, and web services. Develop custom solutions, extensions, and integrations to meet specific requirements. Reports and Analytics: Proficiency in creating custom reports, dashboards, and analytics using Oracle BI Publisher, Oracle OTBI (Oracle Transactional Business Intelligence), and other reporting tools. Experience reviewing code to find and address potential issues and defects; hands-on experience with BI Publisher, OTBI, and Data Models. Data Integration and Migration: Experience in data integration between Oracle Fusion applications and other systems, and data migration from legacy systems to Oracle Fusion. Knowledge of ETL (Extract, Transform, Load) tools. Customization and Extensions: Customize and extend Oracle Fusion applications using tools like Oracle JDeveloper, Oracle ADF, and Oracle Application Composer to tailor the software to meet needs. Oracle Fusion Product Knowledge: Expertise in Oracle Fusion Financials, Oracle Fusion SCM (Supply Chain Management), Oracle Fusion Procurement, and Oracle Fusion Tax. 
Security and Access Control: Knowledge of security models, user roles, and access controls within Oracle Fusion applications to ensure data integrity and compliance. Performance Tuning and Optimization: Skills in diagnosing and resolving performance issues, optimizing database queries, and ensuring smooth operation of Oracle Fusion applications. Problem Troubleshooting: Experience approaching a problem from different angles and analyzing the pros and cons of different solutions to identify and address technical issues, system errors, and integration challenges. Experience communicating updates and resolutions to customers and other partners, gathering requirements from clients, explaining technical solutions to non-technical partners, and collaborating with teams. What You’ll Need To Be Successful Qualifications Minimum 5+ years of experience with Oracle Cloud ERP Financials Minimum 5+ years of experience with Oracle EBS Financials Bachelor's degree in Computer Science, Information Technology, or a related field. Previous experience implementing Tax modules in Oracle Cloud ERP and Oracle EBS. Experience and desire to work in a global delivery environment. Experience with the latest integration methodologies. Proficiency in CI/CD tools (Jenkins, GitLab, etc.) How We’ll Take Care Of You Total Rewards In addition to a great compensation package, paid time off, and paid parental leave, many Avalara employees are eligible for bonuses. Health & Wellness Benefits vary by location but generally include private medical, life, and disability insurance. Inclusive culture and diversity Avalara strongly supports diversity, equity, and inclusion, and is committed to integrating them into our business practices and our organizational culture. We also have a total of 8 employee-run resource groups, each with senior leadership and exec sponsorship. What You Need To Know About Avalara We’re defining the relationship between tax and tech. 
We’ve already built an industry-leading cloud compliance platform, processing over 54 billion customer API calls and over 6.6 million tax returns a year. Our growth is real - we're a billion dollar business - and we’re not slowing down until we’ve achieved our mission - to be part of every transaction in the world. We’re bright, innovative, and disruptive, like the orange we love to wear. It captures our quirky spirit and optimistic mindset. It shows off the culture we’ve designed, that empowers our people to win. We’ve been different from day one. Join us, and your career will be too. We’re An Equal Opportunity Employer Supporting diversity and inclusion is a cornerstone of our company — we don’t want people to fit into our culture, but to enrich it. All qualified candidates will receive consideration for employment without regard to race, color, creed, religion, age, gender, national origin, disability, sexual orientation, US Veteran status, or any other factor protected by law. If you require any reasonable adjustments during the recruitment process, please let us know.
Posted 1 day ago
0 years
0 Lacs
India
On-site
🔷 Job role: Partnership Outreach Intern 📍 Duration: 2 Months (this is a performance-based internship) At Steller Sprangs, our approach goes beyond conventional marketing. We specialize in crafting narratives that resonate, leveraging influencer partnerships that matter, and orchestrating PR campaigns that make waves. Our team is a blend of creativity, strategy, and innovation, ensuring that every project not only meets but exceeds expectations. In an era where digital presence is paramount, we are dedicated to boosting profiles and increasing visibility. As a thought leader in the media and marketing space, we strive to foster meaningful connections, share insights, and contribute to the ongoing discourse within our industry. 🔷 Perks Included: 🔸 Certificate of Completion from our company 🔸 Letter of Recommendation for exceptional performance 🔸 Reference platform recommendations 🔸 Flexible work timing 🔷 Responsibilities Include: 🔸 Research and identify potential leads through LinkedIn, Google, industry directories, and other online platforms. 🔸 Extract and maintain lead data using Excel/CRM tools. 🔸 Conduct cold outreach via email, LinkedIn, or calls under guidance. 🔸 Set up appointments and demos for the sales team. 🔸 Collaborate with marketing to align lead generation strategies. 🔸 Track outreach efforts and report weekly lead generation progress. 🔸 Maintain and update lead data in CRM platforms like HubSpot, Pipedrive, or Notion (as per company use) Application link - https://forms.gle/rqcvFzFAsbz1Wi9C8
Posted 1 day ago
2.0 - 4.0 years
12 - 15 Lacs
Mumbai Metropolitan Region
On-site
Job Title: Chief of Staff – Founder’s Office (Strategy & Execution) Location: Kandivali, Mumbai Industry: Manufacturing – Jewellery Qualification: BE, IITian, IIMs Experience Required: 2 to 4 Years Reports To: Director / Founder CTC: Open to Discussion Working Days: 6 Days (Monday to Saturday) Working Hours: 9:00 AM to 5:30 PM Industry Preference: Any (Jewellery industry preferred) Key Responsibilities Hands-on Use of Latest Tech Tools Utilize AI, Power BI, ERP, and other relevant tools for data analysis, insight generation, and decision support in jewellery industry operations. Analytical Mindset for Decision Making Apply an analytical mindset to extract insights using AI and Power BI, aiding the Founder in strategic and operational decisions. ERP Management and Integration Explore, manage, and ensure smooth integration of ERP systems for inventory, sales, and production planning operations within the jewellery sector. Data Analysis and Reporting Analyze business data using Power BI and provide actionable insights and reports on operations, sales trends, and production efficiency. AI-Driven Insights for Operations Leverage AI tools for predictive analytics and pattern recognition in industry-specific data to support informed decision-making. Collaboration with Cross-Functional Teams Work closely with departments such as design, production, and sales to ensure effective implementation of data-driven strategies. Others Provide administrative and operational support to the Founder. Assist in managing key projects, initiatives, and assigned tasks. Maintain confidentiality and handle sensitive information with discretion. Take a proactive, organized approach in managing tasks and responsibilities. Work closely with the Founder on strategic projects and provide regular updates and insights. Requirements Industry Experience: Preferred experience in the jewellery industry or a related manufacturing domain with exposure to technology and analytics. 
Technical Skills: Proficiency in AI, Power BI, ERP systems, and data analysis for deriving business insights. Analytical Abilities: Strong analytical and problem-solving mindset to support strategic decisions. Communication: Excellent communication skills to convey insights to leadership and across departments. Adaptability: Willingness to stay updated with evolving tech tools and analytics trends in the jewellery industry. Education: Technical background preferred (BE, IIT/IIM young graduates). Skills: stakeholder communication, data analysis, analytics, travel booking, project management, stakeholder management, communications, competitive analysis, dashboarding, m&a advisory services, strategy building, business insight generation, strategic thinking, problem-solving, dashboard building, erp, bi tools, executive assistant, business strategy, dashboards, mba, ai, business, strategy, erp systems, adaptability, power bi, executive support, high-growth, data-driven mindset, jewellery, project analysis, collaboration, communication & stakeholder management, analytical mindset, project, d2c, communication skills, communication, strategic business enablement, excel, manufacturing, founder, office, fundraising, calendar planning, cross-functional execution, google workspace, operations, market research, executive administrative assistance, administrative, cross-functional collaboration, btech, projects, presentation skills, performance tracking, performance metrics tracking, travel assistance, sales
Posted 1 day ago
1.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About The Role We are looking for a motivated and detail-oriented Business Intelligence (BI) Analyst to join our growing team in Bangalore. In this role, you will work closely with cross-functional teams to transform data into insightful dashboards, KPIs, and reports that drive strategic decisions. Key Responsibilities Design, build, and maintain interactive dashboards and reports using tools like Power BI or Tableau. Work with stakeholders to identify, define, and track KPIs and business metrics. Write SQL queries to extract and manipulate data from relational databases. Use Python for data analysis, automation, and ad-hoc reporting scripts, as needed. Collaborate with data engineering teams to ensure data availability, quality, and consistency. Analyze large datasets to identify trends, patterns, and actionable insights. Ensure timely delivery of BI outputs in line with business goals and project timelines. Maintain documentation for processes, models, and dashboards. Assist in improving existing dashboards and processes for better scalability and performance. Required Skills And Qualifications Bachelor’s degree in Computer Science, Information Systems, Statistics, Mathematics, or a related field. 1+ year of hands-on experience in Business Intelligence or Data Analysis. Strong proficiency in SQL, with the ability to write advanced queries. Experience in building dashboards using Power BI, Tableau, or similar BI tools. Knowledge of Python for data manipulation and automation. Ability to translate business requirements into effective data visualizations and reports. Strong understanding of data modeling, ETL processes, and data quality concepts. Excellent communication, problem-solving, and time management skills. Ability to work independently and collaboratively in a fast-paced environment. Nice To Have Experience working in an Agile or fast-paced startup environment. Knowledge of cloud platforms like AWS, GCP, or Azure. 
Familiarity with version control tools (e.g., Git). What We Offer Competitive salary and benefits Opportunity to work on high-impact projects Supportive and results-driven work culture Learning and growth opportunities in a data-centric environment
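A minimal sketch of the SQL-based KPI work this role describes (writing queries to extract and aggregate business metrics), using `sqlite3` as a stand-in database. The `events` table and the monthly-active-users metric are invented for illustration:

```python
import sqlite3

def monthly_active_users(rows: list[tuple[str, str]]) -> list[tuple[str, int]]:
    """KPI sketch: count distinct active users per month from
    (user_id, event_date) rows; names are invented for illustration."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE events (user_id TEXT, event_date TEXT)")
    con.executemany("INSERT INTO events VALUES (?, ?)", rows)
    # Bucket ISO dates by their YYYY-MM prefix and count unique users.
    kpi = con.execute(
        """
        SELECT substr(event_date, 1, 7) AS month,
               COUNT(DISTINCT user_id) AS active_users
        FROM events
        GROUP BY month
        ORDER BY month
        """
    ).fetchall()
    con.close()
    return kpi

events = [("u1", "2024-01-05"), ("u2", "2024-01-20"),
          ("u1", "2024-01-25"), ("u1", "2024-02-02")]
print(monthly_active_users(events))  # [('2024-01', 2), ('2024-02', 1)]
```

In practice the same query would run against the warehouse and feed a Power BI or Tableau dashboard rather than being printed.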
Posted 1 day ago
10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Business Unit: Cubic Transportation Systems Company Details: When you join Cubic, you become part of a company that creates and delivers technology solutions in transportation to make people’s lives easier by simplifying their daily journeys, and defense capabilities to help promote mission success and safety for those who serve their nation. Led by our talented teams around the world, Cubic is committed to solving global issues through innovation and service to our customers and partners. We have a top-tier portfolio of businesses, including Cubic Transportation Systems (CTS) and Cubic Defense (CD). Explore more on Cubic.com. Job Details: ESSENTIAL DUTIES AND RESPONSIBILITIES: Extract and interpret technical and commercial requirements from complex contracts involving clients, partners, and subcontractors. Collaborate closely with contract managers, legal counsel, engineers, and the PMO to validate and structure requirements effectively. Use IBM DOORS to capture, organize, and trace requirements throughout the lifecycle—from definition through to validation and sign-off. Identify and link contract requirements to specific deliverables, such as: Design and build documents Equipment specifications Test plans and procedures Operations and Maintenance manuals Project schedules and implementation plans Acceptance certificates and supporting documentation Utilize the full suite of DOORS tools to: Manage requirement changes through configurable workflows Link requirements to test plans, design elements, and documentation Collaborate with internal teams and external suppliers Enable cross-functional collaboration via DOORS Web Access (DWA) Integrate with change and quality management tools Track the completion of all deliverables, ensuring evidence is gathered to support requirement closure and final client acceptance. Provide accurate traceability matrices, compliance documentation, and audit-ready reports to support project close-out. 
BACKGROUND AND EXPERIENCE:
- Bachelor’s degree in Engineering, Systems Engineering, Project Management, or a related discipline.
- 10+ years of experience in a requirements management or project delivery role, preferably in complex, contract-driven environments.
- Advanced user of IBM Engineering Requirements Management DOORS, with deep familiarity in: requirement linking, version control, and change management; DOORS Web Access and collaborative tools; integration with Rational and third-party lifecycle tools; and the Requirements Interchange Format (RIF) and supplier collaboration.
- Demonstrated ability to trace requirements across the full project lifecycle, including design, testing, implementation, and client acceptance.
- Strong analytical, documentation, and reporting skills.
- Experience working within structured project frameworks (e.g., V-model, systems engineering lifecycles).

Worker Type: Employee
Posted 1 day ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Location - Hyderabad / Delhi NCR

Description - External

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Positions in this function are responsible for the management and manipulation of mostly structured data, with a focus on building business intelligence tools, conducting analysis to distinguish patterns and recognize trends, performing normalization operations, and assuring data quality. Depending on the specific role and business line, example responsibilities in this function could include creating specifications to bring data into a common structure, creating product specifications and models, developing data solutions to support analyses, performing analysis, interpreting results, developing actionable insights and presenting recommendations for use across the company. Roles in this function could partner with stakeholders to understand data requirements and develop tools and models such as segmentation, dashboards, data visualizations, decision aids and business case analysis to support the organization. Other roles could include producing and managing the delivery of activity and value analytics to external stakeholders and clients. Team members will typically use business intelligence, data visualization, query, analytic and statistical software to build solutions, perform analysis and interpret data.
Positions in this function work on predominantly descriptive and regression-based analytics and tend to leverage subject matter expert views in the design of their analytics and algorithms. This function is not intended for employees performing the following work: production of standard or self-service operational reporting; causal-inference-led analysis (healthcare analytics) or data pattern recognition (data science) analysis; and/or image or unstructured data analysis using sophisticated theoretical frameworks.

Primary Responsibilities:
- Analyze data and extract actionable insights, findings, and recommendations
- Develop data validation strategies to ensure the accuracy and reliability of data
- Communicate data and findings effectively to internal and external senior executives with clear supporting evidence
- Relate analysis to the organization's overall business objectives
- Implement generative AI techniques to reduce manual effort and automate processes
- Analyze and investigate issues, providing explanations and interpretations within the area of expertise
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment).
The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Qualifications - External

Required Qualifications:
- Undergraduate degree or equivalent experience
- Experience in creating summarized reports of findings and recommendations
- Proficiency in SQL, Snowflake, and AI techniques
- Solid skills in Microsoft Excel, Word, PowerPoint, and Visio
- Ability to multitask, take initiative, and adapt to changing priorities
- Proven self-motivated team player with solid problem-solving and analytical skills

Preferred Qualifications:
- 3+ years of work experience
- 2+ years of experience working with a healthcare consulting firm
- Experience in data analytics and hands-on experience in Python programming, SQL, and Snowflake
- Proven creative, strategic thinker with excellent critical thinking skills

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health, which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
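The "data validation strategies" responsibility above pairs naturally with the SQL/Snowflake proficiency the role asks for. Below is an illustrative sketch only, not Optum's actual stack: two common validation checks (null values and duplicate keys) expressed in SQL, run here against an in-memory SQLite table with hypothetical claims data; on Snowflake the same queries would run largely unchanged.

```python
import sqlite3

# Hypothetical claims table for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (claim_id TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO claims VALUES (?, ?)",
    [("C1", 120.0), ("C2", None), ("C2", 80.0)],  # one NULL, one duplicate id
)

# Validation 1: amount should never be NULL.
null_count = conn.execute(
    "SELECT COUNT(*) FROM claims WHERE amount IS NULL"
).fetchone()[0]

# Validation 2: claim_id should be unique.
dup_count = conn.execute(
    "SELECT COUNT(*) FROM (SELECT claim_id FROM claims "
    "GROUP BY claim_id HAVING COUNT(*) > 1)"
).fetchone()[0]

print(f"null amounts: {null_count}, duplicate ids: {dup_count}")
# prints "null amounts: 1, duplicate ids: 1"
```

In practice such checks would be scheduled against production tables and any non-zero counts surfaced in a data-quality report rather than printed.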
Posted 1 day ago
2.0 - 4.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Summary

We are seeking a motivated and detail-oriented junior project associate to join our dynamic IT team. This entry-level position is an excellent opportunity for a recent graduate to launch a career in PMO. The successful candidate will learn from senior staff and assist in the day-to-day operations of our organization's Project Management Office (PMO) processes.

Qualification: Graduate from a reputed college / university

Preferred Skillset:
- MS Office expertise, PMO work experience, and prior MNC customer-facing roles.
- 2-4 years of IT infrastructure project experience and report management.
- Basic IT networking knowledge: routers, switches, firewalls, wireless access points, etc.
- Basic project management knowledge.
- Excellent MS Office skills (intermediate to advanced in MS Excel, PowerPoint, Word, Google Sheets, etc.).
- Intermediate experience with Excel formulas and managing large data files; able to extract required data from these files.
- Strong skills in file and folder management and in tracking versions of and changes to files.
- Good communication and presentation skills.
- Ability to work collaboratively within a team environment.
- Open to working the US shift (India night shift).
- Minimum 2 years of experience coordinating with international clients (MNCs).

Preferred Qualifications: Engineering graduate

Key Responsibilities:
- Asset Management: The candidate shall be responsible for managing the asset tracker and working with different stakeholders to gather data and update trackers.
- Training Compliance: The candidate shall be responsible for managing training compliance for a large pool of network engineers and working with different stakeholders to share and gather data.
- Onboarding/Offboarding: Help new customer associates with onboarding, handle offboarding for leavers, and manage their status. The candidate shall be responsible for managing compliance.
- Prepare decks and reports.
- Prepare SOPs with the help of the delivery team.
Posted 1 day ago
0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Job description:

Job Title: Project Engineer
Location: Kanjurmarg (West), Mumbai
Company Name: VALAD Infotech Solutions Pvt. Ltd.

About VALAD Infotech Solutions: Valad Infotech Solutions, a leading marine technology consulting firm, specializes in a spectrum of services including technical asset and data management, on-board inventory management and barcoding, digitization, back-office services, training, and consulting & analytics. Located in Mumbai, we pride ourselves on our diverse client base, both in India and internationally. Our team comprises marine professionals adept in IT, with expertise in areas like Planned Maintenance System (PMS), Inventory Management, Business Process as a Service (BPaaS), and Computer-Based Training (CBT).

Key Accountabilities:
- Work on varied maritime, shipping, and oil and gas PMS projects.
- Extract maintenance jobs, routines, and spare parts data from ship machinery manuals.
- Update, amend, and consolidate system databases.
- Create PMS databases for new fleet vessels, and amend work procedures as needed.
- Monitor PMS work schedules, make necessary adjustments, and prepare status reports.
- Liaise with Chief Engineers and Masters, briefing them on PMS during ship visits.
- Collaborate closely with Valad’s Project Managers, proposing and managing system improvements.

Required Skills:
- Degree/Diploma in Marine Engineering or Mechanical Engineering (freshers welcome).
- Minimum 6 months’ experience as a 4th Engineer/Junior Engineer.
- Proficiency in mentoring and overseeing a data entry team.
- Excellent communication skills in English.
- Strong computer literacy, including MS Office.
- Experience in data extraction from various maritime sources and familiarity with PMS software like AMOS, Sertica, Shipnet, DNV-GL, KAPA, Ship Manager, & NS5.

Additional Qualifications:
- Possession of an Indian CDC is advantageous.
- Willingness to make inventory check visits to ships (2-3 weeks).
- Additional experience with PMS systems, databases, and maintenance jobs is beneficial.
Working Culture: We offer a 5-day work week with annual leave that is among the best in the industry for all employees, along with performance-based increments and incentives. VALAD Infotech Solutions is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status.
Posted 1 day ago