5.0 - 8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Description GPP Database Link (https://cummins365.sharepoint.com/sites/CS38534/) Leads projects for design, development and maintenance of a data and analytics platform. Effectively and efficiently process, store and make data available to analysts and other consumers. Works with key business stakeholders, IT experts and subject-matter experts to plan, design and deliver optimal analytics and data science solutions. Works on one or many product teams at a time. Key Responsibilities Designs and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured). Designs and implements framework to continuously monitor and troubleshoot data quality and data integrity issues. Implements data governance processes and methods for managing metadata, access, retention to data for internal and external users. Designs and provide guidance on building reliable, efficient, scalable and quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages. Designs and implements physical data models to define the database structure. Optimizing database performance through efficient indexing and table relationships. Participates in optimizing, testing, and troubleshooting of data pipelines. Designs, develops and operates large scale data storage and processing solutions using different distributed and cloud based platforms for storing data (e.g. Data Lakes, Hadoop, Hbase, Cassandra, MongoDB, Accumulo, DynamoDB, others). Uses innovative and modern tools, techniques and architectures to partially or completely automate the most-common, repeatable and tedious data preparation and integration tasks in order to minimize manual and error-prone processes and improve productivity. Assists with renovating the data management infrastructure to drive automation in data integration and management. Ensures the timeliness and success of critical analytics initiatives by using agile development technologies such as DevOps, Scrum, Kanban Coaches and develops less experienced team members. Responsibilities Competencies: System Requirements Engineering - Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule, and resources; creates and maintains information linkages to related artifacts. Collaborates - Building partnerships and working collaboratively with others to meet shared objectives. Communicates effectively - Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences. Customer focus - Building strong customer relationships and delivering customer-centric solutions. Decision quality - Making good and timely decisions that keep the organization moving forward. Data Extraction - Performs data extract-transform-load (ETL) activities from variety of sources and transforms them for consumption by various downstream applications and users using appropriate tools and technologies. 
Programming - Creates, writes and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements. Quality Assurance Metrics - Applies the science of measurement to assess whether a solution meets its intended outcomes using the IT Operating Model (ITOM), including the SDLC standards, tools, metrics and key performance indicators, to deliver a quality product. Solution Documentation - Documents information and solution based on knowledge gained as part of product development activities; communicates to stakeholders with the goal of enabling improved productivity and effective knowledge transfer to others who were not originally part of the initial learning. Solution Validation Testing - Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements. Data Quality - Identifies, understands and corrects flaws in data that supports effective information governance across operational business processes and decision making. Problem Solving - Solves problems and may mentor others on effective problem solving by using a systematic analysis process by leveraging industry standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies the systemic root causes and ensures actions to prevent problem reoccurrence are implemented. Values differences - Recognizing the value that different perspectives and cultures bring to an organization. Education, Licenses, Certifications College, university, or equivalent degree in relevant technical discipline, or relevant equivalent experience required. This position may require licensing for compliance with export controls or sanctions regulations. Experience Intermediate experience in a relevant discipline area is required. Knowledge of the latest technologies and trends in data engineering is highly preferred and includes: 5-8 years of experience Familiarity with analyzing complex business systems, industry requirements, and/or data regulations Background in processing and managing large data sets Design and development for a Big Data platform using open source and third-party tools SPARK, Scala/Java, Map-Reduce, Hive, Hbase, and Kafka or equivalent college coursework SQL query language Clustered compute cloud-based implementation experience Experience developing applications requiring large file movement for a Cloud-based environment and other data extraction tools and methods from a variety of sources Experience in building analytical solutions Intermediate Experiences In The Following Are Preferred Experience with IoT technology Experience in Agile software development Qualifications Work closely with business Product Owner to understand product vision. Play a key role across DBU Data & Analytics Power Cells to define, develop data pipelines for efficient data transport into Cummins Digital Core (Azure DataLake, Snowflake). Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment of DBU project data pipeline design standards. Independently design, develop, test, implement complex data pipelines from transactional systems (ERP, CRM) to Datawarehouses, DataLake. 
Responsible for creation, maintenance and management of DBU Data & Analytics data engineering documentation and standard operating procedures (SOP). Take part in evaluation of new data tools, POCs and provide suggestions. Take full ownership of the developed data pipelines, providing ongoing support for enhancements and performance optimization. Proactively address and resolve issues that compromise data accuracy and usability. Preferred Skills Programming Languages: Proficiency in languages such as Python, Java, and/or Scala. Database Management: Expertise in SQL and NoSQL databases. Big Data Technologies: Experience with Hadoop, Spark, Kafka, and other big data frameworks. Cloud Services: Experience with Azure, Databricks and AWS cloud platforms. ETL Processes: Strong understanding of Extract, Transform, Load (ETL) processes. Data Replication: Working knowledge of replication technologies like Qlik Replicate is a plus. API: Working knowledge of APIs to consume data from ERP and CRM systems.
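For illustration only (not part of the original posting): a minimal PySpark sketch of the kind of ETL/ELT pipeline this role describes — reading a table from a relational (ERP-style) source over JDBC, applying a light transformation, and writing partitioned Parquet to a data-lake path. The connection URL, credentials, table, column names, and target path are hypothetical placeholders.

```python
# Minimal illustrative ETL sketch; all connection details, names, and paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl_sketch").getOrCreate()

# Extract: read a table from a relational source over JDBC (placeholder database).
orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:postgresql://erp-host:5432/erp")
    .option("dbtable", "sales.orders")
    .option("user", "etl_user")
    .option("password", "****")
    .load()
)

# Transform: standardize a column name and derive a load date used for partitioning.
curated = (
    orders.withColumnRenamed("ORDER_AMT", "order_amount")
          .withColumn("load_date", F.current_date())
)

# Load: append partitioned Parquet to a data-lake location (placeholder path).
(
    curated.write.mode("append")
           .partitionBy("load_date")
           .parquet("abfss://curated@datalake.dfs.core.windows.net/sales/orders")
)
```

In practice a job like this would sit inside an orchestrator with monitoring and alerting around row counts and failures, as the responsibilities above describe.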
Posted 2 days ago
4.0 - 5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Description GPP Database Link (https://cummins365.sharepoint.com/sites/CS38534/) Supports, develops and maintains a data and analytics platform. Effectively and efficiently process, store and make data available to analysts and other consumers. Works with the Business and IT teams to understand the requirements to best leverage the technologies to enable agile data delivery at scale. Key Responsibilities Implements and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured). Implements methods to continuously monitor and troubleshoot data quality and data integrity issues. Implements data governance processes and methods for managing metadata, access, retention to data for internal and external users. Develops reliable, efficient, scalable and quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages. Develops physical data models and implements data storage architectures as per design guidelines. Analyzes complex data elements and systems, data flow, dependencies, and relationships in order to contribute to conceptual physical and logical data models. Participates in testing and troubleshooting of data pipelines. Develops and operates large scale data storage and processing solutions using different distributed and cloud based platforms for storing data (e.g. Data Lakes, Hadoop, Hbase, Cassandra, MongoDB, Accumulo, DynamoDB, others). Uses agile development technologies, such as DevOps, Scrum, Kanban and continuous improvement cycle, for data driven application. Responsibilities Competencies: System Requirements Engineering - Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule, and resources; creates and maintains information linkages to related artifacts. Collaborates - Building partnerships and working collaboratively with others to meet shared objectives. Communicates effectively - Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences. Customer focus - Building strong customer relationships and delivering customer-centric solutions. Decision quality - Making good and timely decisions that keep the organization moving forward. Data Extraction - Performs data extract-transform-load (ETL) activities from variety of sources and transforms them for consumption by various downstream applications and users using appropriate tools and technologies. Programming - Creates, writes and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements. Quality Assurance Metrics - Applies the science of measurement to assess whether a solution meets its intended outcomes using the IT Operating Model (ITOM), including the SDLC standards, tools, metrics and key performance indicators, to deliver a quality product. 
Solution Documentation - Documents information and solution based on knowledge gained as part of product development activities; communicates to stakeholders with the goal of enabling improved productivity and effective knowledge transfer to others who were not originally part of the initial learning. Solution Validation Testing - Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements. Data Quality - Identifies, understands and corrects flaws in data that supports effective information governance across operational business processes and decision making. Problem Solving - Solves problems and may mentor others on effective problem solving by using a systematic analysis process by leveraging industry standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies the systemic root causes and ensures actions to prevent problem reoccurrence are implemented. Values differences - Recognizing the value that different perspectives and cultures bring to an organization. Education, Licenses, Certifications College, university, or equivalent degree in relevant technical discipline, or relevant equivalent experience required. This position may require licensing for compliance with export controls or sanctions regulations. Experience 4-5 Years of experience. Relevant experience preferred such as working in a temporary student employment, intern, co-op, or other extracurricular team activities. Knowledge of the latest technologies in data engineering is highly preferred and includes: Exposure to Big Data open source SPARK, Scala/Java, Map-Reduce, Hive, Hbase, and Kafka or equivalent college coursework SQL query language Clustered compute cloud-based implementation experience Familiarity developing applications requiring large file movement for a Cloud-based environment Exposure to Agile software development Exposure to building analytical solutions Exposure to IoT technology Qualifications Work closely with business Product Owner to understand product vision. Participate in DBU Data & Analytics Power Cells to define, develop data pipelines for efficient data transport into Cummins Digital Core ( Azure DataLake, Snowflake). Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment of DBU project data pipeline design standards. Work under limited supervision to design, develop, test, implement complex data pipelines from transactional systems (ERP, CRM) to Datawarehouses, DataLake. Responsible for creation of DBU Data & Analytics data engineering documentation and standard operating procedures (SOP) with guidance and help from senior data engineers. Take part in evaluation of new data tools, POCs with guidance and help from senior data engineers. Take ownership of the developed data pipelines, providing ongoing support for enhancements and performance optimization under limited supervision. Assist to resolve issues that compromise data accuracy and usability. Programming Languages: Proficiency in languages such as Python, Java, and/or Scala. Database Management: Intermediate level expertise in SQL and NoSQL databases. Big Data Technologies: Experience with Hadoop, Spark, Kafka, and other big data frameworks. Cloud Services: Experience with Azure, Databricks and AWS cloud platforms. 
ETL Processes: Strong understanding of Extract, Transform, Load (ETL) processes. API: Working knowledge of APIs to consume data from ERP and CRM systems.
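Purely as an illustration of the "consume data from ERP/CRM via API" requirement above (not part of the posting): a small Python sketch that pages through a hypothetical CRM REST endpoint and collects records for downstream loading. The URL, token, and pagination scheme are assumptions, not a real vendor API.

```python
# Illustrative sketch of paged API consumption from a hypothetical CRM endpoint.
import requests

BASE_URL = "https://crm.example.com/api/v1/accounts"  # hypothetical endpoint
HEADERS = {"Authorization": "Bearer <token>"}          # placeholder credential


def fetch_all_accounts(page_size: int = 100) -> list:
    """Collect all account records by following simple page-number pagination."""
    records, page = [], 1
    while True:
        resp = requests.get(
            BASE_URL,
            headers=HEADERS,
            params={"page": page, "per_page": page_size},
            timeout=30,
        )
        resp.raise_for_status()
        batch = resp.json()
        if not batch:  # an empty page signals the end of the data set
            break
        records.extend(batch)
        page += 1
    return records


if __name__ == "__main__":
    accounts = fetch_all_accounts()
    print(f"Fetched {len(accounts)} account records for downstream loading")
```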
Posted 2 days ago
0.0 - 15.0 years
0 Lacs
Assam
On-site
AB Sun Life Insurance Co Ltd Regional Training Manager - Guwahati Location: Rohini, Guwahati, Assam Job Purpose Description Job Context & Major Challenges Job Context: To impart training to FLS and Advisors in order to upgrade their job knowledge through the induction and to develop their skills through periodic interventions in the areas of recruitment and selling skills workshops, which would impact the territory’s productivity. Job Challenges: • Getting trainees (especially advisors) into the training room, as they are not on the payrolls of the company and are not willing to invest time in building up their capabilities. • The span being large (managing multiple branches across spread-out locations and a large team of existing advisors plus new hires), training delivery is affected. • Geographical distribution, training infrastructure, and insufficient training enablers. Key Result Areas KRA (Accountabilities) and Supporting Actions: KRA1 Implementation of training architecture at the regions to ensure that the right learning happens, which leads to desired capability and performance. 1. Publish and implement the monthly training calendar for branches based on the training architecture, e.g. licensing training, advisor induction, selling skills & domain training for advisors, product refreshers, etc. 2. Maintain strong contracting with the sales hierarchy to ensure implementation of learning initiatives and their follow-up activities. 3. Create awareness and drive usage of various sales tools & aids. 4. Play critical roles during important events (e.g. new product & fund launches, regulatory changes, etc.); the task involves organizing and executing the launch plans, creating awareness with speed & accuracy. 5. Implement the region's learning interventions that lead to solving regional problems and grabbing regional opportunities. 6. Implement training initiatives that support seasonal business opportunities. KRA2 Create measurable impact on productivity. 1. Ensure a satisfactory pass percentage of advisors who attend the 4-day refresher training, at a level of at least 60% of attendees passing the exam. 2. Manage 1st-month performance of newly licensed advisors (measured through RCM) to the level of 80% active in the RCM period with a minimum defined modal premium; this directly contributes to topline. 3. Manage 3-month consistency in activization of new advisors (measured through RCM STAR) to the level of 40%; this directly contributes to topline and also creates a pool of advisors to qualify for the entry level of advisor club programs. 4. Manage new FLS production up to 6 months from joining (measured through the GSG program) to the level of 40% qualification; this directly impacts topline, better engagement of new FLS and their vintage with the organization, and reduced attrition also directly impacts cost. 5. Achieve all of this through effective training delivery by team members, goal setting, stakeholder alignment and ground-level support. KRA3 Managing training administration. 1. Ensuring that self and team members follow the process of planning, record keeping, expense control, etc. Minimum Experience Level 5 - 15 years Job Qualifications Graduate
Posted 2 days ago
0.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
Software Asset Manager Unlock your potential with Dassault Systèmes, a global leader in Scientific Software Engineering, as a Software Asset Manager in Pune, Maharashtra! Role Description & Responsibilities: Lead or support the drafting, evaluation, and negotiation of software contracts with external vendors. Ensure consistency and compliance of contractual provisions across all software agreements. Identify contractual, financial, and commercial risks, and propose appropriate mitigation measures in coordination with the relevant support functions of the 3DS Company. Act as a liaison between software vendors and internal stakeholders (Procurement, Legal, IT, etc.). Optimize software acquisition and maintenance costs while ensuring compliance with usage rights and support terms. Track software usage and support software audit activities in collaboration with Legal and Procurement teams. Maintain and update the software asset catalog in collaboration with project teams and register licenses in the appropriate asset management tools. Ensure timely renewal of all software assets under maintenance, with approvals from Legal and Cybersecurity teams. Contribute to budgeting activities related to software renewal expenses. Define, maintain, and improve Software Asset Management (SAM) processes and tools for ongoing optimization. Collaborate with global and cross-functional teams to ensure consistency and best practices in SAM initiatives. Qualifications: Education: Bachelor’s or Master’s degree in Engineering or a related field. Experience: 5 to 7 years of experience in Software Asset Management or a similar IT role. Strong understanding of IT environments and enterprise software ecosystems. Proven experience in software license management, contract negotiation, and risk identification. Familiarity with software publishers and licensing models. Strong analytical and problem-solving mindset. Excellent communication skills (written and verbal) in English. Ability to work collaboratively with global teams and across functions. Proficient in using SAM tools and maintaining software asset catalogs. What is in it for you? Work for one of the biggest software companies. Work in a culture of collaboration and innovation. Opportunities for personal development and career progression. Chance to collaborate with various internal users of DASSAULT SYSTEMES as well as stakeholders of various internal and partner projects. Inclusion statement As a game-changer in sustainable technology and innovation, Dassault Systèmes is striving to build more inclusive and diverse teams across the globe. We believe that our people are our number one asset and we want all employees to feel empowered to bring their whole selves to work every day. It is our goal that our people feel a sense of pride and a passion for belonging. As a company leading change, it’s our responsibility to foster opportunities for all people to participate in a harmonized Workforce of the Future.
Posted 2 days ago
21.0 years
0 Lacs
Vishakhapatnam, Andhra Pradesh, India
On-site
Working Title: Police Officer (Pool position) Classification Title: Police Officer Department Name: Police Department Time Base: Full-time Pay Plan: 12 month Bargaining Unit: 8 (SUPA) Employment Type: Probationary/Permanent Salary Range Hiring salary is anticipated at $96,828 - $113,616 annually, commensurate with education and experience. CSU Salary Range: $77,016 - $113,616 annually Benefits: Premium benefit package includes outstanding health, dental, and vision plans; life and disability insurances; pension (CalPERS); tuition fee waiver; and 14 paid holidays per year. See our benefits website for additional information. Application Deadline: Open until filled. Initial review of applications will begin Thursday, August 14, 2025. Position Summary: Under the general supervision of the Sergeant, the Police Officer independently performs a variety of general law enforcement and patrol duties. Additionally, the Police Officer actively participates in the Community Oriented Policing Strategies employed at Cal Poly Maritime Academy. About The Cal Poly Maritime Academy Cal Poly Maritime Academy, a campus of the California State University, is the only degree-granting maritime academy on the West Coast. Located on the scenic Vallejo waterfront, the campus serves a student population of approximately 1000 undergraduates and 50 graduate students. Cal Poly Maritime Academy offers seven baccalaureate degrees in Business Administration, Global Studies and Maritime Affairs, Facilities Engineering Technology, Marine Engineering Technology, Mechanical Engineering, Marine Transportation, and Oceanography. The undergraduate curriculum includes licensing programs for future merchant marine, coast guard, and naval reserve officers. Cal Poly Maritime Academy also offers a Master of Science in Transportation and Engineering Management degree, as well as a number of extended learning programs and courses. Major Responsibilities General Law Enforcement Protect students, faculty, staff, campus visitors, property, and facilities from accidents, bodily harm, fire, theft, vandalism, and illegal entry. Enforce laws, traffic, and parking regulations. Issue citations for violations within department jurisdictions. Apprehend violators, make arrests, and appear in court as required. Provide general information and assistance to the public. Assist in investigations, administrative assignments, projects, and other duties as assigned. Patrol Operations Proactively patrol residence halls, campus buildings, and other facilities and grounds by foot, vehicle, or bicycle. Emphasize community-policing activities such as introducing oneself to others, casual conversation, joining in social events or games, inquiring about safety needs, and offering police assistance. Establish positive relationships with students, furthering their success. Maintain crowd control during assemblies, sporting events, emergencies, and disturbances. Guard property, including vessel screening & facility security as needed for compliance with maritime security regulations (33 CFR parts 104 and 105). Guard and transport cash funds. Investigate, gather evidence, and prepare reports on accidents, property damage, fires, law violations, thefts, and disturbances of the peace. 
Respond to campus disasters, including but not limited to: fire, earthquake, active shooter, and potentially violent civil unrest, taking a leadership role as appropriate, and ensuring that appropriate resources are effectively applied to minimize loss of life, injury, property damage, and risk/liability. Carry out follow-up investigation and documentation and make notifications as appropriate. Administer first aid to injured persons. Direct traffic and enforce traffic infractions. Enforce parking regulations. Effectively use a two-way radio. Perform other related duties as assigned. Community-Oriented Policing Interact with the community, establishing a sense of personal safety and promoting the public trust throughout the community. Patrols may include a mixture of vehicular, bike, foot, or stationary assignments. Integrate professional knowledge and duties with the culture of the academic environment to accomplish department goals and objectives. Facilitate programs, meetings, and other community activities or projects in support of the department’s mission and safety awareness for the campus community. Promote a positive work environment. Be creative and innovative in suggesting crime prevention and other educational presentations and participate in community events that encourage the same. Training Participate in training which includes classroom and on-the-job instruction. Employees must demonstrate proficiency in the use of firearms and participate in defensive tactics and physical fitness training. Minimum firearms qualification scores must be attained as a condition of continued employment. Must meet all training requirements established by the Commission on Peace Officer Standards and Training, and Department Policy. Advanced Training Maintain proficiency in professional training and/or be delegated responsibility for additional work assignments that include: range master, field training officer, defensive tactics instructor, motorcycle patrol, bicycle patrol, crime prevention officer, and Critical Response Unit (CRU) member. Required Qualifications At least 21 years of age. High School diploma or equivalent. Graduation from a Peace Officer Standards and Training (P.O.S.T.) academy, including obtaining a Basic Course Certificate. Must possess and maintain a valid California State Driver's license in satisfactory standing. Working knowledge of current criminal codes and laws. Working knowledge of investigation techniques and procedures. Working knowledge of current law enforcement methods and procedures. Ability to quickly learn and apply campus rules and regulations related to work performed. Ability to proactively identify, observe and investigate potentially hazardous conditions or activities. Ability to exercise tact, courtesy, alertness, and good judgment in making decisions according to laws, regulations, policies, and supervisory expectations and in responding to others. Ability to operate in an environment that requires discretion and confidentiality. Demonstrate ability to think and act decisively and effectively in emergency and sensitive situations. Demonstrate a willingness to confront problems. Ability to take initiative in developing and improving skills; demonstrate dependability, integrity, good observation skills, and professional bearing; be able to enjoy working with people; and possess credibility as a witness in a court of law. Ability to communicate effectively with diverse student, faculty, staff, and community populations. 
Ability to work effectively both independently and as part of a team within the department, with a diverse campus community, and with members/agencies outside the CMA community. Ability to read, write, and orally communicate in a clear and concise manner. Ability to understand and carry out oral and written instructions. Ability to prepare concise and accurate reports. Possess intermediate computer proficiency with diverse programs, including Microsoft Office Suite, Internet, and email software. Ability to learn office technology systems. Preferred Qualifications Prior law enforcement agency experience. Law enforcement experience in a college or university setting. Associate's degree or higher. Special Conditions: Successful completion of a physical agility test, oral interview, written examination, comprehensive background investigation, physical examination, psychological examination, and drug screening required. Must be able to obtain/maintain a Transportation Worker Identification Credential (TWIC) card for work aboard our Training Ship. Hours of Work/Travel: Overtime, travel, travel outside of business hours, and shift work may be required. Physical, Mental And Environmental Conditions Up to 40% of the activities involve sitting, standing, squatting, kneeling or walking; involves lifting heavy objects limited to 50 pounds; may involve pushing and pulling objects within the weight limits. Is exposed to excessive noise; is around moving machinery; is exposed to dust, fumes, gases, radiation, and microwaves; drives motorized equipment. Title IX: Please view the Notice of Non-Discrimination on the Basis of Gender or Sex and Contact Information for Title IX Coordinator at: https://www2.calstate.edu/titleix Equal Opportunity and Excellence in Education and Employment This position is open and available to all regardless of race, sex, color, ethnicity or national origin. Consistent with California law and federal civil rights laws, Cal Poly Maritime Academy provides equal opportunity in education and employment without unlawful discrimination or preferential treatment based on race, sex, color, ethnicity, or national origin. Our commitment to equal opportunity means ensuring that every student and employee has access to the resources and support they need to thrive and succeed in a university environment and in their communities. Cal Poly Maritime complies with Title VI of the Civil Rights Act of 1964, Title IX of the Education Amendments of 1972, the Americans with Disabilities Act (ADA), Section 504 of the Rehabilitation Act, the California Equity in Higher Education Act, California’s Proposition 209 (Art. I, Section 31 of the California Constitution), other applicable state and federal anti-discrimination laws, and CSU’s Nondiscrimination Policy. We prohibit discriminatory preferential treatment, segregation based on race or any other protected status, and all forms of discrimination, harassment, and retaliation in all university programs, policies, and practices. Cal Poly Maritime Academy is a diverse community of individuals who represent many perspectives, beliefs and identities, committed to fostering an inclusive, respectful, and intellectually vibrant environment. We cultivate a culture of open dialogue, mutual respect, and belonging to support educational excellence and student success. Through academic programs, student organizations and activities, faculty initiatives, and community partnerships, we encourage meaningful engagement with diverse perspectives. 
As a higher education institution, we are dedicated to advancing knowledge and empowering individuals to reach their full potential by prioritizing inclusive curriculum development, faculty and staff training, student mentorship, and comprehensive support programs. At Cal Poly Maritime Academy, excellence is built on merit, talent, diversity, accessibility, and equal opportunity for all. Supplemental Information Background Check: Satisfactory completion of a background check (including a criminal records check, DMV records check, physical, drug screening, and fingerprinting) is required for employment. CSU will make a conditional offer of employment, which may be rescinded if the background check reveals disqualifying information, and/or it is discovered that the candidate knowingly withheld or falsified information. Failure to satisfactorily complete the background check may affect the continued employment of a current CSU employee who was conditionally offered the position. Mandated Reporter: The person holding this position is considered a ‘mandated reporter’ under the California Child Abuse and Neglect Reporting Act and is required to comply with the requirements set forth in CSU Executive Order 1083 revised July 21, 2017, as a condition of employment. Eligibility to Work: Applicants must provide proof of U.S. citizenship or authorization to work in the United States within three days of the date of hire. Integration: The California State University Board of Trustees has approved the integration of the California State University Maritime Academy with California Polytechnic State University, San Luis Obispo, effective July 1, 2025. The two campuses will form one academic institution operated as California Polytechnic State University, San Luis Obispo and all employment positions with California State University Maritime Academy are subject to this transition. Employees hired in the period after the Trustees’ approval on November 21, 2024 and prior to the integration date on July 1, 2025 will have their employment transitioned to California Polytechnic State University, San Luis Obispo. Employment shall continue to be subject to California State University policies and applicable collective bargaining agreements. Any changes in organizational structure, reporting relationships, or employment conditions will be communicated as details are finalized. Application Procedure: Click "APPLY NOW" to complete the Cal Poly Maritime Academy Online Employment Application and attach the following documents: cover letter and resume. Disclaimer: The provisions of this job bulletin do not constitute an expressed or implied contract and any provisions contained may be modified or changed.
Posted 3 days ago
0.0 - 31.0 years
2 - 3 Lacs
Koramangala, Bengaluru/Bangalore
On-site
📢 Job Opening: Relationship Executive (Senior Officer) – Max Life Insurance (MLI) 📍 Location: Field + Customer Meetings 💼 Type: Full-time | Entry Level (0–1.5 yrs experience) ✨ What You’ll Do: 👥 Manage and build strong bonds with existing customers (Database provided – Book of Relations) 📞 Be their go-to contact for queries & support 🎯 Understand financial needs and offer smart planning solutions 📊 Cross-sell/Up-sell Insurance & Investment products 📅 Fix daily appointments from shared data 🏆 Help drive retention and satisfaction 🛠️ Requirements: ✅ Age: 20–25 years ✅ Experience: Freshers/ 0–1.5 yrs in Insurance, MF, CASA, Loans, or Real Estate ✅ Must have own conveyance 🗣️ Strong communication skills a must! 📜 Willing to complete IRDA licensing as per MLI standards 🎁 Perks: 🔹 Company-generated leads 🔹 Flexible growth in financial domain 🔹 Excellent brand to work with – Max Life Insurance! 💬 Interested? Apply now and kickstart your career with a trusted name in insurance!
Posted 3 days ago
1.0 - 31.0 years
2 - 3 Lacs
Jaipur
On-site
Job Summary: The Agency Development Manager is responsible for building and managing a team of insurance agents/advisors to promote and sell health insurance policies. This role focuses on agent recruitment, training, motivation, and achieving sales targets through the agency distribution model. Key Responsibilities: Agent Recruitment & Onboarding: Identify and recruit potential insurance advisors/agents. Conduct onboarding and licensing of agents as per IRDAI guidelines. Training & Development: Provide regular training to agents on product knowledge, sales techniques, and compliance. Conduct joint field calls to support and mentor agents. Sales Target Achievement: Drive health insurance sales through agents. Ensure achievement of monthly and annual sales goals. Monitor and support agents in achieving their individual targets. Relationship Management: Build long-term relationships with agents and customers. Address and resolve agent/customer queries effectively. Compliance & Documentation: Ensure all sales are compliant with regulatory and company policies. Maintain accurate records of agent licensing, sales, and commissions. Key Skills & Competencies: Strong communication and interpersonal skills Team building and leadership abilities Goal-oriented and self-motivated Understanding of health insurance products and regulatory norms Sales acumen and customer-centric approach Eligibility Criteria: Education: Graduate in any discipline (Preferred: MBA/PG in Sales/Marketing) Experience: 1–3 years in sales, preferably in insurance, BFSI, or direct marketing Benefits: Fixed salary + attractive performance-based incentives Insurance benefits (medical/life) Career growth opportunities within the organization Training and certification support
Posted 3 days ago
0 years
0 Lacs
Deoghar, Jharkhand, India
On-site
Company Description We suggest you enter details here. Role Description This is a full-time on-site role for an MBBS doctor from AIIMS at Apollo Clinic Deoghar. The doctor will be responsible for conducting patient consultations, diagnosing medical conditions, and providing appropriate treatments and follow-ups. Additional responsibilities include managing patient records, collaborating with other healthcare professionals, and maintaining high standards of patient care and confidentiality. Qualifications MBBS degree from AIIMS or equivalent institution Strong diagnostic and clinical skills Effective communication and interpersonal skills Ability to manage patient records and maintain confidentiality Familiarity with medical software and electronic health records (EHR) systems is a plus Commitment to patient care and adaptability to collaborative environments Professional certification and licensing required to practice in India
Posted 3 days ago
5.0 - 8.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Description GPP Database Link (https://cummins365.sharepoint.com/sites/CS38534/) Job Summary Leads projects for design, development and maintenance of a data and analytics platform. Effectively and efficiently process, store and make data available to analysts and other consumers. Works with key business stakeholders, IT experts and subject-matter experts to plan, design and deliver optimal analytics and data science solutions. Works on one or many product teams at a time. Key Responsibilities Designs and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured). Designs and implements framework to continuously monitor and troubleshoot data quality and data integrity issues. Implements data governance processes and methods for managing metadata, access, retention to data for internal and external users. Designs and provide guidance on building reliable, efficient, scalable and quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages. Designs and implements physical data models to define the database structure. Optimizing database performance through efficient indexing and table relationships. Participates in optimizing, testing, and troubleshooting of data pipelines. Designs, develops and operates large scale data storage and processing solutions using different distributed and cloud based platforms for storing data (e.g. Data Lakes, Hadoop, Hbase, Cassandra, MongoDB, Accumulo, DynamoDB, others). Uses innovative and modern tools, techniques and architectures to partially or completely automate the most-common, repeatable and tedious data preparation and integration tasks in order to minimize manual and error-prone processes and improve productivity. Assists with renovating the data management infrastructure to drive automation in data integration and management. Ensures the timeliness and success of critical analytics initiatives by using agile development technologies such as DevOps, Scrum, Kanban Coaches and develops less experienced team members. Responsibilities Competencies: System Requirements Engineering - Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule, and resources; creates and maintains information linkages to related artifacts. Collaborates - Building partnerships and working collaboratively with others to meet shared objectives. Communicates effectively - Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences. Customer focus - Building strong customer relationships and delivering customer-centric solutions. Decision quality - Making good and timely decisions that keep the organization moving forward. Data Extraction - Performs data extract-transform-load (ETL) activities from variety of sources and transforms them for consumption by various downstream applications and users using appropriate tools and technologies. 
Programming - Creates, writes and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements. Quality Assurance Metrics - Applies the science of measurement to assess whether a solution meets its intended outcomes using the IT Operating Model (ITOM), including the SDLC standards, tools, metrics and key performance indicators, to deliver a quality product. Solution Documentation - Documents information and solution based on knowledge gained as part of product development activities; communicates to stakeholders with the goal of enabling improved productivity and effective knowledge transfer to others who were not originally part of the initial learning. Solution Validation Testing - Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements. Data Quality - Identifies, understands and corrects flaws in data that supports effective information governance across operational business processes and decision making. Problem Solving - Solves problems and may mentor others on effective problem solving by using a systematic analysis process by leveraging industry standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies the systemic root causes and ensures actions to prevent problem reoccurrence are implemented. Values differences - Recognizing the value that different perspectives and cultures bring to an organization. Education, Licenses, Certifications College, university, or equivalent degree in relevant technical discipline, or relevant equivalent experience required. This position may require licensing for compliance with export controls or sanctions regulations. Experience Intermediate experience in a relevant discipline area is required. Knowledge of the latest technologies and trends in data engineering is highly preferred and includes: 5-8 years of experience Familiarity with analyzing complex business systems, industry requirements, and/or data regulations Background in processing and managing large data sets Design and development for a Big Data platform using open source and third-party tools SPARK, Scala/Java, Map-Reduce, Hive, Hbase, and Kafka or equivalent college coursework SQL query language Clustered compute cloud-based implementation experience Experience developing applications requiring large file movement for a Cloud-based environment and other data extraction tools and methods from a variety of sources Experience in building analytical solutions Intermediate Experiences In The Following Are Preferred Experience with IoT technology Experience in Agile software development Qualifications 1) Work closely with business Product Owner to understand product vision. 2) Play a key role across DBU Data & Analytics Power Cells to define, develop data pipelines for efficient data transport into Cummins Digital Core (Azure DataLake, Snowflake). 3) Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment of DBU project data pipeline design standards. 4) Independently design, develop, test, implement complex data pipelines from transactional systems (ERP, CRM) to Datawarehouses, DataLake. 
5) Responsible for creation, maintenance and management of DBU Data & Analytics data engineering documentation and standard operating procedures (SOP). 6) Take part in evaluation of new data tools, POCs and provide suggestions. 7) Take full ownership of the developed data pipelines, providing ongoing support for enhancements and performance optimization. 8) Proactively address and resolve issues that compromise data accuracy and usability. Preferred Skills Programming Languages: Proficiency in languages such as Python, Java, and/or Scala. Database Management: Expertise in SQL and NoSQL databases. Big Data Technologies: Experience with Hadoop, Spark, Kafka, and other big data frameworks. Cloud Services: Experience with Azure, Databricks and AWS cloud platforms. ETL Processes: Strong understanding of Extract, Transform, Load (ETL) processes. Data Replication: Working knowledge of replication technologies like Qlik Replicate is a plus. API: Working knowledge of APIs to consume data from ERP and CRM systems. Job Systems/Information Technology Organization Cummins Inc. Role Category Remote Job Type Exempt - Experienced ReqID 2417810 Relocation Package Yes
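As an illustration of the "continuously monitor and troubleshoot data quality" responsibility listed in this posting (not part of the original text): a small PySpark sketch that computes null-key and duplicate-key counts on a curated table and fails the run when thresholds are breached, so the scheduler can raise an alert. The table path, key column, and thresholds are hypothetical assumptions.

```python
# Illustrative data-quality check; the path, key column, and thresholds are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq_check_sketch").getOrCreate()

df = spark.read.parquet("abfss://curated@datalake.dfs.core.windows.net/sales/orders")

total = df.count()
null_keys = df.filter(F.col("order_id").isNull()).count()         # completeness check
duplicate_keys = total - df.dropDuplicates(["order_id"]).count()   # uniqueness check

print(f"rows={total}, null_keys={null_keys}, duplicate_keys={duplicate_keys}")

# Failing the job surfaces an alert in whatever orchestrator schedules the pipeline.
if total == 0 or null_keys > 0 or duplicate_keys > 0:
    raise ValueError("Data quality check failed: null or duplicate order_id values detected")
```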
Posted 3 days ago
5.0 - 8.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Description GPP Database Link (https://cummins365.sharepoint.com/sites/CS38534/) Job Summary Leads projects for design, development and maintenance of a data and analytics platform. Effectively and efficiently process, store and make data available to analysts and other consumers. Works with key business stakeholders, IT experts and subject-matter experts to plan, design and deliver optimal analytics and data science solutions. Works on one or many product teams at a time. Key Responsibilities Designs and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured). Designs and implements framework to continuously monitor and troubleshoot data quality and data integrity issues. Implements data governance processes and methods for managing metadata, access, retention to data for internal and external users. Designs and provide guidance on building reliable, efficient, scalable and quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages. Designs and implements physical data models to define the database structure. Optimizing database performance through efficient indexing and table relationships. Participates in optimizing, testing, and troubleshooting of data pipelines. Designs, develops and operates large scale data storage and processing solutions using different distributed and cloud based platforms for storing data (e.g. Data Lakes, Hadoop, Hbase, Cassandra, MongoDB, Accumulo, DynamoDB, others). Uses innovative and modern tools, techniques and architectures to partially or completely automate the most-common, repeatable and tedious data preparation and integration tasks in order to minimize manual and error-prone processes and improve productivity. Assists with renovating the data management infrastructure to drive automation in data integration and management. Ensures the timeliness and success of critical analytics initiatives by using agile development technologies such as DevOps, Scrum, Kanban Coaches and develops less experienced team members. Responsibilities Competencies: System Requirements Engineering - Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule, and resources; creates and maintains information linkages to related artifacts. Collaborates - Building partnerships and working collaboratively with others to meet shared objectives. Communicates effectively - Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences. Customer focus - Building strong customer relationships and delivering customer-centric solutions. Decision quality - Making good and timely decisions that keep the organization moving forward. Data Extraction - Performs data extract-transform-load (ETL) activities from variety of sources and transforms them for consumption by various downstream applications and users using appropriate tools and technologies. 
Programming - Creates, writes and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements. Quality Assurance Metrics - Applies the science of measurement to assess whether a solution meets its intended outcomes using the IT Operating Model (ITOM), including the SDLC standards, tools, metrics and key performance indicators, to deliver a quality product. Solution Documentation - Documents information and solution based on knowledge gained as part of product development activities; communicates to stakeholders with the goal of enabling improved productivity and effective knowledge transfer to others who were not originally part of the initial learning. Solution Validation Testing - Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements. Data Quality - Identifies, understands and corrects flaws in data that supports effective information governance across operational business processes and decision making. Problem Solving - Solves problems and may mentor others on effective problem solving by using a systematic analysis process by leveraging industry standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies the systemic root causes and ensures actions to prevent problem reoccurrence are implemented. Values differences - Recognizing the value that different perspectives and cultures bring to an organization. Education, Licenses, Certifications College, university, or equivalent degree in relevant technical discipline, or relevant equivalent experience required. This position may require licensing for compliance with export controls or sanctions regulations. Experience Intermediate experience in a relevant discipline area is required. Knowledge of the latest technologies and trends in data engineering are highly preferred and includes: 5-8 years of experience Familiarity analyzing complex business systems, industry requirements, and/or data regulations Background in processing and managing large data sets Design and development for a Big Data platform using open source and third-party tools SPARK, Scala/Java, Map-Reduce, Hive, Hbase, and Kafka or equivalent college coursework SQL query language Clustered compute cloud-based implementation experience Experience developing applications requiring large file movement for a Cloud-based environment and other data extraction tools and methods from a variety of sources Experience in building analytical solutions Intermediate Experiences In The Following Are Preferred Experience with IoT technology Experience in Agile software development Qualifications Work closely with business Product Owner to understand product vision. 2) Play a key role across DBU Data & Analytics Power Cells to define, develop data pipelines for efficient data transport into Cummins Digital Core ( Azure DataLake, Snowflake). 3) Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment of DBU project data pipeline design standards. 4) Independently design, develop, test, implement complex data pipelines from transactional systems (ERP, CRM) to Datawarehouses, DataLake. 
5) Responsible for the creation, maintenance and management of DBU Data & Analytics data engineering documentation and standard operating procedures (SOPs). 6) Participate in evaluating new data tools and POCs, and provide recommendations. 7) Take full ownership of the developed data pipelines, providing ongoing support for enhancements and performance optimization. 8) Proactively address and resolve issues that compromise data accuracy and usability. Preferred Skills Programming Languages: Proficiency in languages such as Python, Java, and/or Scala. Database Management: Expertise in SQL and NoSQL databases. Big Data Technologies: Experience with Hadoop, Spark, Kafka, and other big data frameworks. Cloud Services: Experience with Azure, Databricks and AWS cloud platforms. ETL Processes: Strong understanding of Extract, Transform, Load (ETL) processes. Data Replication: Working knowledge of replication technologies like Qlik Replicate is a plus. API: Working knowledge of APIs for consuming data from ERP and CRM systems. Job Systems/Information Technology Organization Cummins Inc. Role Category Remote Job Type Exempt - Experienced ReqID 2417809 Relocation Package Yes
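This posting also stresses monitoring and alert mechanisms and resolving issues that compromise data accuracy. A rough sketch of the kind of data-quality gate a pipeline might run before publishing data; the column names, thresholds and failure action are assumptions, not a prescribed standard:

```python
# Illustrative data-quality gate: fail loudly if the loaded dataset looks wrong.
# Path, key column and thresholds are placeholder assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq_check").getOrCreate()
df = spark.read.parquet("/lake/raw/erp/sales_orders/")  # placeholder path

total = df.count()
null_keys = df.filter(F.col("order_id").isNull()).count()
dupes = total - df.dropDuplicates(["order_id"]).count()

issues = []
if total == 0:
    issues.append("no rows loaded")
if total and null_keys / total > 0.01:          # more than 1% null keys
    issues.append(f"{null_keys} null order_id values")
if dupes > 0:
    issues.append(f"{dupes} duplicate order_id values")

if issues:
    # Hook this into the team's real alerting channel (email, Teams, PagerDuty, ...).
    raise RuntimeError("Data-quality check failed: " + "; ".join(issues))
print(f"DQ check passed: {total} rows")
```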
Posted 3 days ago
4.0 - 5.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Description GPP Database Link (https://cummins365.sharepoint.com/sites/CS38534/) Job Summary Supports, develops and maintains a data and analytics platform. Effectively and efficiently process, store and make data available to analysts and other consumers. Works with the Business and IT teams to understand the requirements to best leverage the technologies to enable agile data delivery at scale. Key Responsibilities Implements and automates deployment of our distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured). Implements methods to continuously monitor and troubleshoot data quality and data integrity issues. Implements data governance processes and methods for managing metadata, access, retention to data for internal and external users. Develops reliable, efficient, scalable and quality data pipelines with monitoring and alert mechanisms that combine a variety of sources using ETL/ELT tools or scripting languages. Develops physical data models and implements data storage architectures as per design guidelines. Analyzes complex data elements and systems, data flow, dependencies, and relationships in order to contribute to conceptual physical and logical data models. Participates in testing and troubleshooting of data pipelines. Develops and operates large scale data storage and processing solutions using different distributed and cloud based platforms for storing data (e.g. Data Lakes, Hadoop, Hbase, Cassandra, MongoDB, Accumulo, DynamoDB, others). Uses agile development technologies, such as DevOps, Scrum, Kanban and continuous improvement cycle, for data driven application. Responsibilities Competencies: System Requirements Engineering - Uses appropriate methods and tools to translate stakeholder needs into verifiable requirements to which designs are developed; establishes acceptance criteria for the system of interest through analysis, allocation and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of changes to system requirements on project scope, schedule, and resources; creates and maintains information linkages to related artifacts. Collaborates - Building partnerships and working collaboratively with others to meet shared objectives. Communicates effectively - Developing and delivering multi-mode communications that convey a clear understanding of the unique needs of different audiences. Customer focus - Building strong customer relationships and delivering customer-centric solutions. Decision quality - Making good and timely decisions that keep the organization moving forward. Data Extraction - Performs data extract-transform-load (ETL) activities from variety of sources and transforms them for consumption by various downstream applications and users using appropriate tools and technologies. Programming - Creates, writes and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements. Quality Assurance Metrics - Applies the science of measurement to assess whether a solution meets its intended outcomes using the IT Operating Model (ITOM), including the SDLC standards, tools, metrics and key performance indicators, to deliver a quality product. 
Solution Documentation - Documents information and solution based on knowledge gained as part of product development activities; communicates to stakeholders with the goal of enabling improved productivity and effective knowledge transfer to others who were not originally part of the initial learning. Solution Validation Testing - Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements. Data Quality - Identifies, understands and corrects flaws in data that supports effective information governance across operational business processes and decision making. Problem Solving - Solves problems and may mentor others on effective problem solving by using a systematic analysis process by leveraging industry standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies the systemic root causes and ensures actions to prevent problem reoccurrence are implemented. Values differences - Recognizing the value that different perspectives and cultures bring to an organization. Education, Licenses, Certifications College, university, or equivalent degree in relevant technical discipline, or relevant equivalent experience required. This position may require licensing for compliance with export controls or sanctions regulations. Experience 4-5 Years of experience. Relevant experience preferred such as working in a temporary student employment, intern, co-op, or other extracurricular team activities. Knowledge of the latest technologies in data engineering is highly preferred and includes: Exposure to Big Data open source SPARK, Scala/Java, Map-Reduce, Hive, Hbase, and Kafka or equivalent college coursework SQL query language Clustered compute cloud-based implementation experience Familiarity developing applications requiring large file movement for a Cloud-based environment Exposure to Agile software development Exposure to building analytical solutions Exposure to IoT technology Qualifications Work closely with business Product Owner to understand product vision. 2) Participate in DBU Data & Analytics Power Cells to define, develop data pipelines for efficient data transport into Cummins Digital Core ( Azure DataLake, Snowflake). 3) Collaborate closely with AAI Digital Core and AAI Solutions Architecture to ensure alignment of DBU project data pipeline design standards. 4) Work under limited supervision to design, develop, test, implement complex data pipelines from transactional systems (ERP, CRM) to Datawarehouses, DataLake. 5) Responsible for creation of DBU Data & Analytics data engineering documentation and standard operating procedures (SOP) with guidance and help from senior data engineers. 6) Take part in evaluation of new data tools, POCs with guidance and help from senior data engineers. 7) Take ownership of the developed data pipelines, providing ongoing support for enhancements and performance optimization under limited supervision. 8) Assist to resolve issues that compromise data accuracy and usability. Programming Languages: Proficiency in languages such as Python, Java, and/or Scala. Database Management: Intermediate level expertise in SQL and NoSQL databases. Big Data Technologies: Experience with Hadoop, Spark, Kafka, and other big data frameworks. 
Cloud Services: Experience with Azure, Databricks and AWS cloud platforms. ETL Processes: Strong understanding of Extract, Transform, Load (ETL) processes. API: Working knowledge of APIs for consuming data from ERP and CRM systems. Job Systems/Information Technology Organization Cummins Inc. Role Category Remote Job Type Exempt - Experienced ReqID 2417808 Relocation Package Yes
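The "API" skill in this junior posting (consuming data from ERP/CRM systems) typically means paginated REST extraction. A minimal sketch assuming a hypothetical endpoint and bearer-token auth; real ERP/CRM APIs (for example SAP OData or Salesforce REST) differ in detail:

```python
# Illustrative paginated extraction from a CRM-style REST API.
# Endpoint, auth scheme and paging parameters are hypothetical placeholders.
import requests

BASE_URL = "https://crm.example.com/api/v1/accounts"   # placeholder
HEADERS = {"Authorization": "Bearer <token>"}           # placeholder

def fetch_all_accounts(page_size: int = 200) -> list[dict]:
    records, page = [], 1
    while True:
        resp = requests.get(
            BASE_URL,
            headers=HEADERS,
            params={"page": page, "page_size": page_size},
            timeout=30,
        )
        resp.raise_for_status()
        batch = resp.json().get("items", [])
        if not batch:
            break
        records.extend(batch)
        page += 1
    return records

if __name__ == "__main__":
    accounts = fetch_all_accounts()
    print(f"Fetched {len(accounts)} CRM accounts")
```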
Posted 3 days ago
0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
We are a technology-led healthcare solutions provider. We are driven by our purpose to enable healthcare organizations to be future-ready. We offer accelerated, global growth opportunities for talent that's bold, industrious, and nimble. With Indegene, you gain a unique career experience that celebrates entrepreneurship and is guided by passion, innovation, collaboration, and empathy. To explore exciting opportunities at the convergence of healthcare and technology, check out www.careers.indegene.com. Looking to jump-start your career? We understand how important the first few years of your career are, which create the foundation of your entire professional journey. At Indegene, we promise you a differentiated career experience. You will not only work at the exciting intersection of healthcare and technology but also will be mentored by some of the most brilliant minds in the industry. We are offering a global fast-track career where you can grow along with Indegene's high-speed growth. We are purpose-driven. We enable healthcare organizations to be future-ready and our customer obsession is our driving force. We ensure that our customers achieve what they truly want. We are bold in our actions, nimble in our decision-making, and industrious in the way we work. Must Have Prepare and compile global regulatory dossiers in eCTD and non-eCTD format in accordance with HA legislation and client-specific requirements. Excellent working knowledge of regional regulations and guidance as they pertain to format and submission structure is required. Understand, interpret, and apply Agency regulations and guidelines related to submissions. Thorough understanding of all aspects of the publishing software, tools, process, and output requirements. Perform publishing QC tasks within the electronic publishing system and QC of the published output to ensure high submission quality. Perform document quality control checks for others in the department, promptly communicating irregularities in documents and coordinating issue resolution. Dispatch submissions to Regulatory Authorities via agency portal, through customers or directly. Publish clinical documents (taking into account complexity and size) in accordance with department standards and organization KPIs. Ensure published clinical documents meet current internal and external quality standards for electronic and/or paper HA submissions, including minimizing publishing-related technical QC findings and no rework once finalized. Ensure timeliness of deliverables meets both individual document and overall project timelines. Experience with global regulatory submission formats, including familiarity with submission publishing activities. Experience with CSR document publishing, including familiarity with Word and PDF formatting. Thorough knowledge of major HA global/regional/national country requirements/regulatory affairs procedures for initial submission, licensing, and post-approval submission management. Markets Handled: EU, US, Canada and GCC. Good to have EQUAL OPPORTUNITY Indegene is proud to be an Equal Employment Employer and is committed to the culture of Inclusion and Diversity. We do not discriminate on the basis of race, religion, sex, colour, age, national origin, pregnancy, sexual orientation, physical ability, or any other characteristics. All employment decisions, from hiring to separation, will be based on business requirements, the candidate's merit and qualifications. We are an Equal Opportunity Employer. 
All qualified applicants will receive consideration for employment without regard to race, colour, religion, sex, national origin, gender identity, sexual orientation, disability status, protected veteran status, or any other characteristics. Locations Mumbai, MH, IN
Posted 3 days ago
5.0 years
0 Lacs
South Delhi, Delhi, India
On-site
Hi Professionals, We are looking for the Compliance Specialist. Interested candidates can share their resume on info@thedigigen.com or can reach on +91-9811233735 Job Description: Customs & Foreign Trade Compliance Specialist 📍 Location: Delhi (On-site, Non-hybrid) 💼 Experience Required: 5+ years in Customs Compliance, SEZ, and FTP Advisory 📅 Joining: Immediate preferred 🔍 Key Responsibilities: Preparation and filing for: • SVB (Special Valuation Branch) registration and renewals • Bonded warehouse licensing and renewals • AEO (Authorized Economic Operator) certification • RoDTEP, RoSCTL, EPCG, and other DGFT scheme applications Handling client-side documentation for: • Imports under duty exemption schemes • Customs refunds and drawback claims • Import-export reconciliations and warehousing compliances Advisory and filing support for: • Foreign Trade Policy (FTP) benefits • EOU, SEZ, and Advance Authorization schemes Interaction with Customs, DGFT, ICEGATE, SEZ authorities, and NSDL Maintaining MIS and compliance trackers for client regulatory filings Drafting replies and assisting legal teams on customs matters, if required 🧾 Preferred Qualifications: Graduate in Commerce / Law / MBA (Finance) 5+ years hands-on experience in customs compliance and DGFT-related work Working knowledge of ICD/port procedures, documentation, and ICEGATE portal operations 🤝 What We Offer: Exposure to leading Indian and foreign manufacturing clients An opportunity to lead the Customs vertical with autonomy Supportive CA team for legal/financial collaboration Performance-linked incentives and a growth path into leadership roles
Posted 3 days ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
OpenGov is the leader in AI and ERP solutions for local and state governments in the U.S. More than 2,000 cities, counties, state agencies, school districts, and special districts rely on the OpenGov Public Service Platform to operate efficiently, adapt to change, and strengthen the public trust. Category-leading products include enterprise asset management, procurement and contract management, accounting and budgeting, billing and revenue management, permitting and licensing, and transparency and open data. These solutions come together in the OpenGov ERP, allowing public sector organizations to focus on priorities and deliver maximum ROI with every dollar and decision in sync. Learn about OpenGov’s mission to power more effective and accountable government and the vision of high-performance government for every community at OpenGov.com. Role Summary The Marketing Manager will oversee asset quality assurance, cross-team project coordination, and executional alignment across marketing initiatives. This individual will play a foundational role in campaign governance and vendor operations while supporting analytics reporting. Key Responsibilities Develop and execute multi-channel demand generation campaigns (email, paid media, webinars, and content syndication). Collaborate with marketing and sales development teams to create strategies for attracting and converting qualified leads and hitting pipeline targets. ABM Support: Contribute to Account-Based Marketing (ABM) strategies by executing targeted campaigns for high-value accounts. SEO & SEM: Collaborate with digital marketing experts to ensure campaigns are optimized for search engine performance. Quality assurance for all promotional and campaign assets. Project management in Jira, ensuring timely execution and proper alignment of resources. Submission coordination with external vendors; ensure delivery of assets and campaign briefs. Research and vet new marketing vendors to support program expansion. Conduct analysis and prepare monthly product check-in reports. Track performance metrics across campaigns and identify opportunities for process optimization. Over time, assume responsibilities as local site lead for Pune-based marketing functions. Ideal Qualifications Bachelor’s degree in Marketing, Advertising, Digital Media, Business Analytics, or equivalent is required. A minimum of 4 years of B2B/B2G SaaS advertising experience is required, preferably at a high-growth company with an emphasis on account-based marketing at scale. Proficiency with Salesforce and Marketo (or comparable tools) is required; experience with ABM platforms is a plus. Experience with Google Analytics, Google Ads, and social advertising platforms is preferred. A goal-driven mindset with a keen interest in search and digital marketing is required. Demonstrated ability to manage multiple campaigns with varying objectives simultaneously is required. Strong analytical skills with the ability to interpret data and identify actionable insights are required. Effective interpersonal and communication skills, with the ability to collaborate across teams and with external partners, are required. Strong writing, grammar, spelling, proofreading, and researching skills are required. Why OpenGov? A Mission That Matters. At OpenGov, public service is personal. We are passionate about our mission to power more effective and accountable government. Government that operates efficiently, adapts to change, and strengthens public trust. Some people say this is boring. 
We think it’s the core of our democracy. Opportunity to Innovate The next great wave of innovation is unfolding with AI, and it will impact everything—from the way we work to the way governments interact with their residents. Join a trusted team with the passion, technology, and expertise to drive innovation and bring AI to local government. We’ve touched 2,000 communities so far, and we’re just getting started. A Team of Passionate, Driven People This isn’t your typical 9-to-5 job; we operate in a fast-paced, results-driven environment where impact matters more than simply clocking in and out. Our global team of 800+ employees is united in our commitment to challenge the status quo. OpenGov is headquartered in San Francisco and has offices in Atlanta, Boston, Buenos Aires, Chicago, Dubuque, Plano, and Pune. A Place to Make Your Mark We pride ourselves on our performance-based culture, where every employee is encouraged to jump in head-first and take action to help us improve. If you have a great idea, we want to hear it. Excellent performance is recognized and rewarded, and we love to promote from within. Benefits That Work For You Enjoy an award-winning workplace with the benefits to match, including: Comprehensive healthcare options for individuals and families. Flexible vacation policy and paid company holidays 401(k) with company match (USA only) Paid parental leave, wellness stipends, and HSA contributions Professional development and growth opportunities A collaborative office environment with weekly catered lunches
Posted 3 days ago
0 years
0 Lacs
Delhi, India
On-site
About The Company Tata Communications Redefines Connectivity with Innovation and Intelligence. Driving the next level of intelligence powered by Cloud, Mobility, Internet of Things, Collaboration, Security, Media services and Network services, we at Tata Communications are envisaging a New World of Communications. Job Description Role Overview: We are an international telecommunications/commtech company providing various services including international and domestic voice services, MPLS, Internet transit, VoIP, IoT, Mobility, CPaaS, CaaS, managed security, cloud, subsea cable capacity and/or other cutting-edge commtech services. We are seeking a dynamic regulatory professional to join our Legal & Regulatory Affairs team. The role is designed to provide end-to-end support across two key focus areas: Dedicated regulatory advisory and support to the Product Office Operational regulatory responsibilities including litigation and compliance tracking Key Responsibilities: I. Regulatory Product Support Act as the primary regulatory liaison to the Product Office, working closely with Product, Network, Legal, and Technology teams. Provide regulatory risk assessments and compliance inputs during product conceptualization, design, and go-to-market stages. Advise on applicable licensing requirements, statutory obligations, and service rollout norms. Maintain a product-wise regulatory tracker and risk matrix. Provide guidance on OSP (Other Service Provider) regulations and DoT guidelines, and support product and operations teams in adhering to them. Liaise with the regulator for approvals and clarifications, if required. II. Regulatory Operations Assist in handling regulatory and legal proceedings before TRAI, DoT, TDSAT and other bodies. Support vetting of commercial documentation from a regulatory standpoint, including LOUs, LOAs, MSAs, customer agreements, etc. Track changes to regulatory frameworks and draft internal briefing notes or impact memos. Contribute to regulatory submissions, consultation responses, and DoT/TRAI filings. Maintain internal documentation for OSP compliance and UL license requirements, and support any audits or inspections. Provide support for RFPs, customer queries, or onboarding reviews from a regulatory perspective. Skills & Competencies Required Strong understanding of Indian telecom regulations, licensing frameworks, and digital sector compliance requirements Proven ability to interpret legal requirements and apply them to business scenarios, especially in product or technology contexts Excellent drafting, documentation, and stakeholder coordination skills Familiarity with regulatory bodies and procedures (e.g., TRAI, DoT, TDSAT) Proactive mindset, attention to detail, and ability to balance long-term strategy with day-to-day execution
Posted 3 days ago
2.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Join a Team That’s Passionate About Making Lives Better! At Bill Gosling Outsourcing, we believe that success starts with an amazing team. We are a global leader in outsourcing solutions, we focus on making lives better, one connection at a time. We provide tailored solutions to businesses around the globe, specializing in customer care, sales, and financial services. We’re looking for enthusiastic, driven individuals to join our dynamic work environment where fun meets results ! Interacts with customers/ customer information on a daily basis, promptly responding to all inquiries in a courteous and efficient manner. Provides information to customers about product features. Helps customers when they are faced with problems or need information and/or ensure customer information is updated accurately. What You'll Do Interacts with customers/ customer information on a daily basis, promptly responding to all inquiries in a courteous and efficient manner. Provides information to customers about product features. Helps customers when they are faced with problems or need information and/or ensure customer information is updated accurately. Help customers with complaints and questions, give customers information about products and services Ensure to deliver BGO and client metrics and expectations on a regular basis. Ensure customer satisfaction and provide professional customer support Update customer files with appropriate information and ensure information being placed in customer files follow regulatory, client specific, and corporate guidelines Engage with customers on all inbound/outbound calls, emails, and other channels of communication applicable and/ or supports to update customer’s account information Champion company core values and other company programs Other duties as assigned Education North America - Minimum High School Diploma or equivalent is required Philippines – Minimum of 2 years post-secondary or equivalent is required Costa Rica – No Minimum requirement United Kingdom – No Minimum requirement Trinidad & Tobago – Minimum 3 CSEC passes English is compulsory in all locations Experience Previous experience in an Agent/Customer Service Representative is preferred but not required Certificates/Licenses There are no personal certification or licensing requirements for this job. What We're Looking For INFORMATION SECURITY RESPONSIBILITIES All Information security responsibilities can be located in The Book of Bill (Global) and The Book of Bill (Global) – French. Please note that Information security responsibilities are based on role. Why Join Us? Growth Opportunities: We believe in promoting from within and providing opportunities for career advancement. Comprehensive Training: We offer extensive paid training to ensure you’re equipped for success. Team-Oriented Culture: Work in a collaborative, supportive environment with peers who are passionate about what they do. Diversity & Inclusion: We celebrate the unique perspectives and contributions of all our employees. Fun Workplace: Join a vibrant team that knows how to have fun! From team engagement activities to social events, we foster a lively and inclusive work environment where you’ll build strong connections. State-of-the-Art Offices: Work in our modern, well-equipped offices designed to enhance collaboration and productivity. Rewarding Work: Help businesses grow while making a real difference in people’s lives! Get to Know Us Better! 
Follow us to get an insider view of our team in action, our values in motion, and a sneak peek into what makes us an awesome place to work! Twitter & Instagram: bgocareers Facebook: Bill Gosling Outsourcing LinkedIn: Bill Gosling Outsourcing Website – https://www.billgosling.com/careers By applying to this position, you acknowledge that you have read and understood Bill Gosling Outsourcing’s Privacy Policy and consent to the collection, use, and storage of personal information in accordance with the policy. At Bill Gosling Outsourcing, we believe that diversity makes us stronger. We welcome applicants from all backgrounds and are committed to creating an inclusive and supportive workplace where everyone can thrive. Regardless of your race, gender, age, ability status, or any other characteristic, you are valued here. If you require accommodations at any stage of the hiring process, we are happy to work with you to ensure you have the support you need – just let us know. Bill Gosling Outsourcing – Where your career thrives!
Posted 3 days ago
4.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Job Description Join our dynamic and fast paced team in Legal function. This is a unique opportunity for you to be a part of our Legal function in India and partner with Line's of Businesses. Job Summary As an Associate in Global Financial Crimes Legal at JPMorgan Chase, you will gain exposure to diverse products/services across all Lines of Business and APAC locations. Collaborate with Lines of Business Legal, Compliance, and Business teams to ensure comprehensive legal support and compliance with global standards, contributing to global legal strategies. Job Responsibilities Provide legal advisory support to regional and global Global Financial Crimes Legal colleagues on projects and matters related to know-your-client (“KYC”), anti-money laundering (“AML”), economic sanctions laws and regulations (“Sanctions”), anti-bribery and corruption (“ABC”), export controls and other areas in relation to global financial crimes (“Financial Crimes”). Advise Legal, Compliance, and Line of Business stakeholders on Financial Crimes related risks in capital markets, lending, asset and wealth management, strategic investment and other transactions. This includes reviewing and analyzing due diligence information and advising on client and counterparty representations, warranties, and undertakings to mitigate risks. Offer advisory services on assurances and undertakings provided to third parties regarding JPMorgan’s Financial Crimes related policies and controls. Track and report on industry and regulatory developments, including emergent geopolitical risks to the firm, in Financial Crimes, providing insights and advice to internal stakeholders and management as required Lead advisory efforts on special projects related to the administration of global Financial Crimes programs. Provide legal advice on policy development and periodic reviews, and support multi-jurisdictional legal surveys. Advise on group workflow, communications, and special projects within the Legal Department, ensuring alignment with advisory objectives. Provide advisory input on drafting, reviewing, and negotiating legal agreements and documentation as needed and other matters assigned by the Legal Department from time to time. Required Qualifications, Capabilities, And Skills Minimum 4 years post-qualification experience. Experience in transactional, litigation, and/or financial services regulatory matters in a major law firm and/or large multinational corporation. Strong knowledge of financial institution products, services, and transactions. Strong written and oral communication skills, including legal research and drafting. Ability to manage complex and time-sensitive projects. Ability to develop and maintain client relationships. Confidence in translating complex legal concepts into practical solutions. Ability to collaborate in a multi-functional, multi-jurisdictional environment. Creative solution and problem-solving skills. All candidates for roles in the Legal Department must successfully complete a conflicts of interest clearance review prior to commencement of employment. JD or educational equivalent required. Attorney candidates must be in compliance with all relevant licensing requirements including the requirements of the jurisdiction where the role will be located prior to commencement of employment. 
Preferred Qualifications, Capabilities, And Skills Prior experience with US, EU, and UN Sanctions programs, international KYC/AML standards, and ABC legislation (such as the US Foreign Corrupt Practices Act or UK Bribery Act) is strongly preferred but not essential. ABOUT US JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world’s most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management. We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation. About The Team Our professionals in our Corporate Functions cover a diverse range of areas from finance and risk to human resources and marketing. Our corporate teams are an essential part of our company, ensuring that we’re setting our businesses, clients, customers and employees up for success. With large, global operations, the Legal team tackles complex issues and helps shape the regulations that affect the businesses. The group is organized into practice groups that align with the lines of business and corporate staff areas, which encourages collaboration on legal, regulatory and business developments as they arise.
Posted 3 days ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Description GPP Database Link (https://cummins365.sharepoint.com/sites/CS38534/) Responsible for developing software programs per technical specifications following programming standards and procedures, performing testing, executing program modifications, and responding to problems by diagnosing and correcting errors in logic and coding. Key Responsibilities Applies secure coding and UI standards and best practices to develop, enhance, and maintain IT applications and programs. Assists with efforts to configure, analyze, design, develop, and maintain program code and applications. Performs unit testing and secure code testing, and issue resolution. Follows the process for source code management. Participates in integration, systems, and performance testing and tuning of code. Participates in peer secure code reviews. Harvests opportunities for re-usability of code, configurations, procedures, and techniques. Responsibilities Competencies: Action oriented - Taking on new opportunities and tough challenges with a sense of urgency, high energy, and enthusiasm. Balances stakeholders - Anticipating and balancing the needs of multiple stakeholders. Business insight - Applying knowledge of business and the marketplace to advance the organization’s goals. Drives results - Consistently achieving results, even under tough circumstances. Plans and aligns - Planning and prioritizing work to meet commitments aligned with organizational goals. Tech savvy - Anticipating and adopting innovations in business-building digital and technology applications. Performance Tuning - Conceptualizes, analyzes and solves application, database and hardware problems using industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements. Programming - Creates, writes and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements. Solution Configuration - Configures, creates and tests a solution for commercial off-the-shelf (COTS) applications using industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements. Solution Functional Fit Analysis - Composes and decomposes a system into its component parts using procedures, tools and work aides for the purpose of studying how well the component parts were designed, purchased and configured to interact holistically to meet business, technical, security, governance and compliance requirements. Solution Validation Testing - Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements. Values differences - Recognizing the value that different perspectives and cultures bring to an organization. Education, Licenses, Certifications College, university, or equivalent degree in Computer Science, Information Technology, Business, or related subject, or relevant equivalent experience required. This position may require licensing for compliance with export controls or sanctions regulations. Experience Intermediate level of relevant work experience required. 3-5 years of experience. 
Qualifications Candidate must have more than 3 years of hands-on working experience as a Salesforce developer in Classic or Lightning Web Components. Able to migrate custom components to Lightning Web Components. Knowledge of the Angular framework and Java. Use of code analysis tools to analyze custom code and its efficiency. Design solutions on the principles of configuration and use of out-of-the-box (OOB) features to ensure scalability of the org. Use Lightning Flows to automate processing needs. Bachelor’s degree - ideally in Computer Science, Engineering or MIS. 6+ years of experience in Force.com/Lightning/LWC/Apex; CI/CD (Copado/Jenkins/DevOps) is mandatory. Experience with the Salesforce.com platform: Sales Cloud, Service Cloud, CPQ, Experience Cloud, etc. Experience with Lightning Pages, Visualforce, Triggers, SOQL, SOSL, API, Flows, LWC, Web Services (SOAP & REST). Salesforce Certified Platform Developer I & II, Salesforce Certified App Builder. Proficiency in data manipulation and analysis using SQL. Experience with Angular framework/Java. Experience with data visualization tools like Tableau, Power BI, or similar. Good to have: Airflow, Tableau. Agile methodologies and well versed with GUS/JIRA. Strong communication skills to be able to communicate at all levels. Should have a proactive approach to problem-solving. Follow the release management CI/CD code deployment process to migrate code changes. Attend daily scrum calls and work in a global model. Familiarity with JavaScript, CSS, Splunk Analytics, Visual Studio Code/GitHub/versioning/packaging.
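Much of the integration work named in this posting (SOQL, REST web services) boils down to querying org data over the Salesforce REST API. A minimal Python sketch, assuming an already-obtained OAuth access token; the instance URL and API version are placeholders, and a real client would also follow nextRecordsUrl paging:

```python
# Illustrative SOQL query over the Salesforce REST API. Instance URL, API version
# and token are placeholders; obtain the token via your org's OAuth flow.
import requests

INSTANCE = "https://yourorg.my.salesforce.com"   # placeholder
TOKEN = "<access-token>"                          # placeholder (OAuth)
SOQL = "SELECT Id, Name, StageName FROM Opportunity WHERE IsClosed = false LIMIT 50"

resp = requests.get(
    f"{INSTANCE}/services/data/v59.0/query",      # API version is an assumption
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"q": SOQL},
    timeout=30,
)
resp.raise_for_status()
for record in resp.json().get("records", []):
    print(record["Id"], record["Name"], record["StageName"])
```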
Posted 3 days ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Description GPP Database Link (https://cummins365.sharepoint.com/sites/CS38534/) Responsible for developing software programs per technical specifications following programming standards and procedures, performing testing, executing program modifications, and responding to problems by diagnosing and correcting errors in logic and coding. Key Responsibilities Applies secure coding and UI standards and best practices to develop, enhance, and maintain IT applications and programs. Assists with efforts to configures, analyzes, designs, develops, and maintains program code and applications. Performs unit testing and secure code testing, and issues resolution. Follow the process for source code management. Participate in integration, systems, and performance testing and tuning of code. Participates in peer secure code reviews. Harvest opportunities for re-usability of code, configurations, procedures, and techniques. Responsibilities Competencies: Action oriented - Taking on new opportunities and tough challenges with a sense of urgency, high energy, and enthusiasm. Balances stakeholders - Anticipating and balancing the needs of multiple stakeholders. Business insight - Applying knowledge of business and the marketplace to advance the organization’s goals. Drives results - Consistently achieving results, even under tough circumstances. Plans and aligns - Planning and prioritizing work to meet commitments aligned with organizational goals. Tech savvy - Anticipating and adopting innovations in business-building digital and technology applications. Performance Tuning - Conceptualizes, analyzes and solves application, database and hardware problems using industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements. Programming - Creates, writes and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements. Solution Configuration - Configures, creates and tests a solution for commercial off-the-shelf (COTS) applications using industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements. Solution Functional Fit Analysis - Composes and decomposes a system into its component parts using procedures, tools and work aides for the purpose of studying how well the component parts were designed, purchased and configured to interact holistically to meet business, technical, security, governance and compliance requirements. Solution Validation Testing - Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements. Values differences - Recognizing the value that different perspectives and cultures bring to an organization. Education, Licenses, Certifications College, university, or equivalent degree in Computer Science, Information Technology, Business, or related subject, or relevant equivalent experience required. This position may require licensing for compliance with export controls or sanctions regulations. Experience Intermediate level of relevant work experience required. 3-5 years of experience. 
Qualifications Candidate must have working hands on experience of more than 3 years as a Salesforce developer in Classic or Lightning Web Component. Able to migrate custom components to Lightning Web Component. Knowledge on Angular framework, Java. Code analysis tools to analyze custom codes and their efficiency Design solution on the principles of configuration and use of OOB features to ensure scalability of the org. Use Lightning Flows to Automate some of processing needs Bachelor’s degree - ideally in Computer Science, Engineering or MIS 4+ years of experience in Force.com/Lightning/LWC/Apex, CICD/COPADO/jenkins/DevOps is mandatory Experience with the Salesforce.com platform. Sales Cloud, Service Cloud, CPQ, Experience Cloud etc Experience with Lightning Pages, Visualforce, Triggers, SOQL, SOSL, API, Flows, LWC, Web Services (SOAP & REST) Salesforce Certified Platform Developer-I & II , Salesforce Certified App Builder" Proficiency in data manipulation and analysis using SQL. Experience with Angular framework/Java. Experience with data visualization tools like Tableau, Power BI, or similar. Good to have Airflow, Tableau Agile Methodologies and well versed with GUS/JIRA Strong communication skills to be able to communicate at all levels. Should have a proactive approach to problem-solving. Follow release management CI/CD code deployment process to migrate the code changes Attend the daily scrum calls and working in Global model Familiarity with Javascript, CSS, Splunk Analytics Visual Studio Code/GitHub/Versioning/Packaging
Posted 3 days ago
4.0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
Logistics Coordinator Primary Responsibilities / Key Result Areas Plans and executes national and international shipments from/to internal and external customers and the associated record management. Creates and processes export shipping documents within the SAP environment, attaches them to the physical shipment, and coordinates pickup and movement of the shipment with customers and carriers/forwarders. Work cross-functionally with all relevant functions to ensure that all global sales and GW orders are delivered within agreed targets of delivery performance, compliant with the contractual agreements and within budgeted cost levels. Recommend optimal transportation modes, routing, and frequency. Establish & monitor specific supply chain-based performance measurement systems. Monitor product import or export processes to ensure compliance with regulatory or legal requirements. Maintain outstanding shipping files. Record inventory items in SAP and perform (remote) GR (Goods receipt): registering equipment assets / MRP, assessing POD validity, requesting duly signed and dated packing lists / delivery notes and equipment details, follow up, remote label printing in SAP for various sites worldwide, including remote MRP GR for direct deliveries + requesting STO (Stock Transfer Order) transfers to Material Management Team and OBDs (Outbound Deliveries) when needed. Transacting out MRP equipment to assets for various internal projects (on demand) Perform Other Tasks As Required Keep up-to-date knowledge on SES processes, systems and governmental regulations Ensure timely and accurate Goods receipt within SAP to enable internal financial transactions and release payments towards vendors. Qualifications & Experience Education Bachelor's in Logistics or equivalent through work experience Minimum 4+ years relevant experience in similar functions Solid experience working with freight forwarders & export/import brokers Trained in export compliance and licensing Deep knowledge of Import/Export processes, related regulations and procedures Enhanced Microsoft Office and SAP user-knowledge level (e.g. GR Goods receipt) Experience of working in a customer focused, dynamic and international environment Fluency in English COMPETENCIES Sense of urgency, initiative and competitive drive to get things done. Strong sense of ensuring compliance: comply with all standard operating procedures and requirements. Excellent oral and written communication skills, customer service, and organizational skills with a strong attention to detail Ability to resourcefully work through or around anything blocking things that need to be accomplished Ability to work on one’s own initiative, and without direct or little supervision Strong attention to detail Ability to handle multiple tasks effectively and prioritize the various duties and responsibilities required of the position Must be a team player Pro-active and independent attitude and result-oriented approach with the ability to work at distance (time zone and geographically) with other departments and companies Embark on a career with us, where diversity isn't just a buzzword – it's our driving force. We are crafting a workplace mosaic that values every hue, background, and perspective. Join a global team where inclusivity sparks innovation, and individuality is not only embraced but celebrated. At SES we are committed to hiring inspiring individuals from all backgrounds. 
We take great pride in creating safe and inclusive processes and we support the recruitment, retention, and evolution of all employees irrespective of gender, colour, race, ethnicity, religion, sexual orientation, disability, veteran or marital status, background or walk in life. SES is an Equal Opportunity Employer and welcomes diversity! For more information on SES, click here.
Posted 3 days ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Description GPP Database Link (https://cummins365.sharepoint.com/sites/CS38534/) Job Summary Responsible for developing software programs per technical specifications following programming standards and procedures, performing testing, executing program modifications, and responding to problems by diagnosing and correcting errors in logic and coding. Key Responsibilities Applies secure coding and UI standards and best practices to develop, enhance, and maintain IT applications and programs. Assists with efforts to configures, analyzes, designs, develops, and maintains program code and applications. Performs unit testing and secure code testing, and issues resolution. Follow the process for source code management. Participate in integration, systems, and performance testing and tuning of code. Participates in peer secure code reviews. Harvest opportunities for re-usability of code, configurations, procedures, and techniques. Responsibilities Competencies: Action oriented - Taking on new opportunities and tough challenges with a sense of urgency, high energy, and enthusiasm. Balances stakeholders - Anticipating and balancing the needs of multiple stakeholders. Business insight - Applying knowledge of business and the marketplace to advance the organization’s goals. Drives results - Consistently achieving results, even under tough circumstances. Plans and aligns - Planning and prioritizing work to meet commitments aligned with organizational goals. Tech savvy - Anticipating and adopting innovations in business-building digital and technology applications. Performance Tuning - Conceptualizes, analyzes and solves application, database and hardware problems using industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements. Programming - Creates, writes and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements. Solution Configuration - Configures, creates and tests a solution for commercial off-the-shelf (COTS) applications using industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements. Solution Functional Fit Analysis - Composes and decomposes a system into its component parts using procedures, tools and work aides for the purpose of studying how well the component parts were designed, purchased and configured to interact holistically to meet business, technical, security, governance and compliance requirements. Solution Validation Testing - Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements. Values differences - Recognizing the value that different perspectives and cultures bring to an organization. Education, Licenses, Certifications College, university, or equivalent degree in Computer Science, Information Technology, Business, or related subject, or relevant equivalent experience required. This position may require licensing for compliance with export controls or sanctions regulations. Experience Intermediate level of relevant work experience required. 3-5 years of experience. 
Qualifications Candidate must have working hands on experience of more than 3 years as a Salesforce developer in Classic or Lightning Web Component. Able to migrate custom components to Lightning Web Component. Knowledge on Angular framework, Java. Code analysis tools to analyze custom codes and their efficiency Design solution on the principles of configuration and use of OOB features to ensure scalability of the org. Use Lightning Flows to Automate some of processing needs Bachelor’s degree - ideally in Computer Science, Engineering or MIS 4+ years of experience in Force.com/Lightning/LWC/Apex, CICD/COPADO/jenkins/DevOps is mandatory Experience with the Salesforce.com platform. Sales Cloud, Service Cloud, CPQ, Experience Cloud etc Experience with Lightning Pages, Visualforce, Triggers, SOQL, SOSL, API, Flows, LWC, Web Services (SOAP & REST) Salesforce Certified Platform Developer-I & II , Salesforce Certified App Builder" Proficiency in data manipulation and analysis using SQL. Experience with Angular framework/Java. Experience with data visualization tools like Tableau, Power BI, or similar. Good to have Airflow, Tableau Agile Methodologies and well versed with GUS/JIRA Strong communication skills to be able to communicate at all levels. Should have a proactive approach to problem-solving. Follow release management CI/CD code deployment process to migrate the code changes Attend the daily scrum calls and working in Global model Familiarity with Javascript, CSS, Splunk Analytics Visual Studio Code/GitHub/Versioning/Packaging Job Systems/Information Technology Organization Cummins Inc. Role Category Hybrid Job Type Exempt - Experienced ReqID 2417818 Relocation Package Yes
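The release-management requirement in this posting (CI/CD with Copado/Jenkins and the Salesforce CLI) is often scripted as a pipeline stage. A hedged sketch of such a wrapper; the sfdx command and flag names vary by CLI version and should be verified against the installed CLI before use:

```python
# Illustrative wrapper a CI/CD stage (e.g. a Jenkins job) might run to deploy
# Salesforce metadata via the Salesforce CLI. Command and flag names are
# assumptions (older sfdx style); newer CLIs use "sf project deploy start".
import subprocess
import sys

def deploy(manifest: str = "manifest/package.xml", org_alias: str = "uat") -> None:
    cmd = [
        "sfdx", "force:source:deploy",      # assumption: verify against your CLI version
        "--manifest", manifest,
        "--targetusername", org_alias,
        "--testlevel", "RunLocalTests",
        "--json",
    ]
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(result.stdout)
    if result.returncode != 0:
        print(result.stderr, file=sys.stderr)
        sys.exit(result.returncode)

if __name__ == "__main__":
    deploy()
```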
Posted 3 days ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Description GPP Database Link (https://cummins365.sharepoint.com/sites/CS38534/) Job Summary Responsible for developing software programs per technical specifications following programming standards and procedures, performing testing, executing program modifications, and responding to problems by diagnosing and correcting errors in logic and coding. Key Responsibilities Applies secure coding and UI standards and best practices to develop, enhance, and maintain IT applications and programs. Assists with efforts to configures, analyzes, designs, develops, and maintains program code and applications. Performs unit testing and secure code testing, and issues resolution. Follow the process for source code management. Participate in integration, systems, and performance testing and tuning of code. Participates in peer secure code reviews. Harvest opportunities for re-usability of code, configurations, procedures, and techniques. Responsibilities Competencies: Action oriented - Taking on new opportunities and tough challenges with a sense of urgency, high energy, and enthusiasm. Balances stakeholders - Anticipating and balancing the needs of multiple stakeholders. Business insight - Applying knowledge of business and the marketplace to advance the organization’s goals. Drives results - Consistently achieving results, even under tough circumstances. Plans and aligns - Planning and prioritizing work to meet commitments aligned with organizational goals. Tech savvy - Anticipating and adopting innovations in business-building digital and technology applications. Performance Tuning - Conceptualizes, analyzes and solves application, database and hardware problems using industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements. Programming - Creates, writes and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements. Solution Configuration - Configures, creates and tests a solution for commercial off-the-shelf (COTS) applications using industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements. Solution Functional Fit Analysis - Composes and decomposes a system into its component parts using procedures, tools and work aides for the purpose of studying how well the component parts were designed, purchased and configured to interact holistically to meet business, technical, security, governance and compliance requirements. Solution Validation Testing - Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements. Values differences - Recognizing the value that different perspectives and cultures bring to an organization. Education, Licenses, Certifications College, university, or equivalent degree in Computer Science, Information Technology, Business, or related subject, or relevant equivalent experience required. This position may require licensing for compliance with export controls or sanctions regulations. Experience Intermediate level of relevant work experience required. 3-5 years of experience. 
Qualifications Candidate must have more than 3 years of hands-on experience as a Salesforce developer in Classic or Lightning Web Components. Able to migrate custom components to Lightning Web Components. Knowledge of the Angular framework and Java. Uses code analysis tools to analyze custom code and its efficiency. Designs solutions on the principles of configuration and use of out-of-the-box (OOB) features to ensure scalability of the org. Uses Lightning Flows to automate processing needs. Bachelor’s degree, ideally in Computer Science, Engineering, or MIS. 6+ years of experience in Force.com/Lightning/LWC/Apex; CI/CD with COPADO/Jenkins/DevOps is mandatory. Experience with the Salesforce.com platform: Sales Cloud, Service Cloud, CPQ, Experience Cloud, etc. Experience with Lightning Pages, Visualforce, Triggers, SOQL, SOSL, API, Flows, LWC, Web Services (SOAP & REST). Salesforce Certified Platform Developer I & II and Salesforce Certified App Builder. Proficiency in data manipulation and analysis using SQL. Experience with the Angular framework/Java. Experience with data visualization tools such as Tableau, Power BI, or similar. Good to have: Airflow, Tableau. Agile methodologies and well versed with GUS/JIRA. Strong communication skills to communicate at all levels. Should have a proactive approach to problem-solving. Follows the release management CI/CD code deployment process to migrate code changes. Attends daily scrum calls and works in a global model. Familiarity with JavaScript, CSS, Splunk Analytics, Visual Studio Code/GitHub/versioning/packaging. Job Systems/Information Technology Organization Cummins Inc. Role Category Hybrid Job Type Exempt - Experienced ReqID 2417820 Relocation Package Yes
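To illustrate the SOQL and REST API skills this role calls out, here is a minimal sketch, assuming the open-source simple-salesforce Python client and placeholder credentials; the object and field names are illustrative only and do not reflect any specific org.

```python
# Minimal sketch: querying Salesforce over the REST API with SOQL.
# Assumes the simple-salesforce client (pip install simple-salesforce)
# and placeholder credentials; object/field names are illustrative.
from simple_salesforce import Salesforce

sf = Salesforce(
    username="user@example.com",   # placeholder
    password="password",           # placeholder
    security_token="token",        # placeholder
)

# Run a SOQL query and iterate the returned records.
result = sf.query("SELECT Id, Name FROM Account ORDER BY Name LIMIT 5")
for record in result["records"]:
    print(record["Id"], record["Name"])

# Create a record through the sObject REST endpoint.
sf.Contact.create({"LastName": "Sample", "Email": "sample@example.com"})
```

In practice, calls like these would be wrapped in version-controlled scripts and run through the CI/CD pipeline (COPADO/Jenkins), with credentials supplied from a secrets store rather than hard-coded literals.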
Posted 3 days ago
4.0 - 10.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
This individual will play a crucial, client-facing role in Application Performance Monitoring (APM), User Experience Monitoring (UEM), and Site Reliability Engineering (SRE) solutions, translating client requirements into scalable and effective implementations. A valid Dynatrace certification is mandatory. Take complete charge of the Dynatrace architecture, provide recommendations, and design rightly sized infrastructure for Dynatrace in the respective regions across the globe. Manage a sound strategy for optimal license consumption under an enterprise licensing model. Create well-considered standards around Real User Monitoring (RUM). Performance tuning, updating baseline monitoring, and assisting in filling any gaps in the existing monitoring environment. Provide hands-on administrator support for Dynatrace with hands-on coding skills (Java/Python/shell scripting). Create API-based self-services for mass updates to metadata, alerting profiles, notification systems, SLOs, and other anomaly detection rules. Create standards around cloud monitoring via Dynatrace and cater to the needs in AWS/Azure and other public clouds. Create standards around OS, application, DB, network, storage, and Kubernetes monitoring via Dynatrace and cater to the needs of full-stack observability. Dashboarding, management zones, tagging, alerting profiles, and integrations for new application onboardings. Configuration of settings for monitoring, services, log analytics, anomaly detection, integration with 3rd-party services, and general preferences. Analyze APM metrics to identify bottlenecks, latency issues, and slow database queries. Utilize visualizations and data provided by Dynatrace to deliver application and infrastructure monitoring information to key stakeholders and other technical resources. Utilize Dynatrace artificial intelligence to identify problems and root causes. Collaborate with staff in diagnosing and resolving issues, engaging other technical resources as needed. Create and maintain technical documentation and operating procedures. Present performance reports and recommendations to leadership teams. Conduct training and knowledge transfer for staff. Ability to work within an offshore/onshore team structure. Knowledgeable about SRE tools, technologies, and best practices. 4-10 years of experience with Dynatrace APM and UEM products. Experience in production support and scalable architecture implementations. Structured approach, analytical thinking, and accuracy. Architecture, deployment, configuration, and maintenance of OneAgent, ActiveGate, Real User Monitoring (RUM), and agentless RUM. Container, cloud, and virtual machine monitoring via Dynatrace. Should have experience with CI/CD tools such as Jenkins or Bamboo. Dynatrace Associate Certification is good to have. Experience with version control tools (GitHub/Bitbucket) is good to have. Implement auto-remediation scripts using Dynatrace APIs, ServiceNow, Ansible, or Terraform. Knowledge of AI tools and technologies for creating Dynatrace solutions is good to have. Should currently be working as an SRE engineer, and a valid Dynatrace certification is mandatory. Hybrid mode - 3 days from office - Noida. Please share CVs at ankit.kumar@celsiortech.com
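As a hedged illustration of the API-based self-service work described above, the following is a minimal sketch, assuming a Dynatrace SaaS environment URL and an API token with read scopes; endpoint paths, selector syntax, and response fields can vary by Dynatrace version, so treat them as assumptions rather than a definitive integration.

```python
# Minimal sketch: reading open problems and SLOs from the Dynatrace API v2.
# Assumes a placeholder SaaS environment URL and API token with read scopes;
# endpoint paths and field names may differ across Dynatrace versions.
import requests

BASE_URL = "https://abc12345.live.dynatrace.com"  # placeholder environment
API_TOKEN = "dt0c01.EXAMPLE"                      # placeholder token
HEADERS = {"Authorization": f"Api-Token {API_TOKEN}"}

# List currently open problems detected by the Davis AI engine.
problems = requests.get(
    f"{BASE_URL}/api/v2/problems",
    headers=HEADERS,
    params={"problemSelector": 'status("OPEN")'},
    timeout=30,
).json()
for p in problems.get("problems", []):
    print(p.get("title"), "-", p.get("severityLevel"))

# List SLOs so their targets can be reviewed or bulk-updated later.
slos = requests.get(
    f"{BASE_URL}/api/v2/slo",
    headers=HEADERS,
    params={"pageSize": 25},
    timeout=30,
).json()
for slo in slos.get("slo", []):
    print(slo.get("name"), slo.get("status"))
```

Mass updates (for example to alerting profiles or SLO targets) would follow the same pattern with POST/PUT calls, ideally driven from version-controlled definitions so changes remain auditable.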
Posted 3 days ago
0 years
4 Lacs
India
On-site
URGENT REQUIREMENT: R&D Manager. LOCATION: NATURAL HERBS & FORMULATION PVT LTD, Raipur (Bhagwanpur). SALARY RANGE: 5 to 10 LPA. EXPERIENCE: 5-10 years. 1- Knowledge of herbs and their formulation development for dosage forms: bolus, tablets, creams, ointments, gels, powders, oral liquids. 2- Knowledge of R&D setup. 3- R&D raw material sample indenting and analysis. 4- Documenting licensing and statutory requirements, i.e., Ayurvedic product approvals and R&D audits. 5- Developing the Ayurveda category by creating or improving formulations that are compliant for the Indian and export markets. 6- Conducting research to check product viability before market introduction, along with product stability studies. 7- Working closely with global counterparts on formulation/R&D projects, while sharing technical knowledge with them as required. 8- Creating product specifications to ensure products are manufactured consistently and safely. Job Type: Full-time Pay: ₹40,000.00 per month Work Location: In person
Posted 3 days ago